Moved blogging engine… again

OK, I bit the bullet and moved to WordPress. I’ve been avoiding it because it’s the main engine people use and I like being a little different. In the end, though, the support for third-party publishing apps, the number of plugins and themes, etc. meant that I’ve finally done it.

All of the posts, comments, RSS feeds, etc. should now have been moved across, but let me know if you notice any problems.

Useful VS2010 plug-ins

VS2010’s built-in Extension Manager is a really nice addition, and there are already a lot of very good plug-ins available through it. Here are some of the ones I have installed and find useful; you might too. The normal caveats apply about me not being responsible if one of these plug-ins causes your computer to catch fire, runs off with your wife/husband, etc.

(Image: the list of VS2010 plug-ins I have installed.)

*updated* with new plug-ins.

  1. GOPI: Hi, where can I get those plugins?

    • Robert Garfoot: The plugins are all available within Visual Studio 2010 from the extensions manager.

Setting up Mercurial Under IIS 6

I was just looking at setting up Mercurial on my Windows Home Server and found some good blog articles. The first was on Jeremy Skinner’s blog, which shows how to configure it under CGI on IIS 7. I’d like to run it as an ISAPI extension, though, since that would be much faster; I then found a blog article from Matt Hawley which shows how to configure Mercurial as an ISAPI extension under IIS 7.

The final blog article was from another Matt and shows how to install it under IIS 6. Now, I’d already figured out most of this, but I was looking for more info because of a little problem: I kept getting 404 errors whenever I accessed the URL, and I couldn’t see anything wrong…

Eventually, after much wailing and gnashing of teeth, I realised my stupidity in a real DOH! moment. I thought I’d share this, though, since you might also come across it. I forgot that with IIS 6, by default, all ISAPI and CGI extensions are blocked unless they are explicitly added to an allow list, and I’d forgotten to add the Mercurial DLL to that list. A quick trip to the IIS admin console’s web service extensions section to add _hgwebdir_wsgi.dll as an allowed extension and all is now well. Hours wasted on a silly mistake; I’ll remember that one for a while!
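
For reference, the same change can be scripted with the iisext.vbs admin script that ships with IIS 6. This is a hedged sketch; the DLL path is illustrative, so use wherever your _hgwebdir_wsgi.dll actually lives:

    rem Add the Mercurial ISAPI DLL to the IIS 6 web service extensions allow list.
    rem The path and the "Mercurial" group name are illustrative.
    cscript %windir%\system32\iisext.vbs /AddFile "C:\inetpub\hg\_hgwebdir_wsgi.dll" 1 Mercurial 1 "Mercurial"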

  1. Ben Alabaster: I wrote a 4 part blog post about this a few months back that goes over everything from the basics,…

Flexible Data Template Support in Silverlight

WPF has a great feature called data templates. These allow you to specify the visual appearance for a data object; you can either place them in a control or put them in a resources section to reuse them across multiple controls.

The real benefit of them, in my opinion, is that you can tag them with the type of the data object that they display; WPF will then automatically select the appropriate data template when it needs to render an item of that type. This makes building controls that display heterogeneous data structures really easy.

I was recently working on a Silverlight application in which I had a three-pane view: one pane contained the main content and the other two contained navigation elements, in a typical master/details type scenario. The items in the navigation pane were of different types, all deriving from a common base class. When I selected an item in the navigation pane I wanted to display its details in the main pane; however, since each item was of a different type, each needed to display differently in the main pane.

Fantastic, I thought, I can use data templates and this’ll be easy. That was when I found out that Silverlight doesn’t support data templates that vary by type. You can define them and share them using a named key, but unfortunately you cannot assign a target type to them like you can in WPF.

This wouldn’t be much of a blog article unless I’d found a way around this, though, so here we go. There are probably several different ways you could do this, but this is the way I chose, and it seems to work quite nicely, at least for me.
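
The core of the approach is a value converter that looks up a DataTemplate in the resources, keyed by the bound object’s type name. Here’s a minimal sketch; the class name and resource-key convention here are my own, and the full sample may differ:

    // Hedged sketch: pick a DataTemplate from application resources based on
    // the runtime type of the bound value. Assumes one template per concrete
    // type, keyed by the type's short name (e.g. x:Key="CustomerItem").
    using System;
    using System.Globalization;
    using System.Windows;
    using System.Windows.Data;

    public class DataTemplateSelectorConverter : IValueConverter
    {
        public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
        {
            if (value == null)
                return null;

            // Look the template up by the item's type name.
            return Application.Current.Resources[value.GetType().Name] as DataTemplate;
        }

        public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
        {
            throw new NotSupportedException();
        }
    }

You then bind a ContentControl’s ContentTemplate through the converter, e.g. ContentTemplate="{Binding SelectedItem, Converter={StaticResource templateSelector}}", with one template per concrete type in the resources.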


  1. Andris: Hi, Rob! I try create combo box with different items' lookup (depending of model type). Data source for combo box is…

    • Robert Garfoot: Hi Andris, better late than never but I've uploaded a full sample now that shows it working with combo boxes…

  2. admin: Thanks for the comment Andris. I've not tried using it with a combo box so it's possible that there is…

    • Keyon: Thanks a lot - your answer solved all my problems after several days struggling

  3. Ralph Shillington: This worked like a charm. Thanks for posting it.

  4. Chris: Handy idea, I am trying to do the same thing, and so figured I would try your code. I am…

    • Robert Garfoot: ListBoxes and ComboBoxes aren't something that I tried this with so I'm not surprised that there are a few issues…

    • Robert Garfoot: Hi Chris, I've just updated the sample to include a list box as well.

  5. CSMAC's Blog » Post Topic » WPF VS Silverlight 1: Data Template Selector: [...] For a simple implementation of the DataTemplateSelectorConverter, check out this posting. Data Template Selector for Silverlight [...]


Performance Profiling .NET Applications Using the Visual Studio Profiler

When I consult with development teams, at some point performance always rears its ugly head. When I review code I often find it overly complex, with lots of caching, “clever” code, etc., and when I ask why the code is this complex the answer is usually performance, e.g. “we need to cache the results as that’s an expensive operation”.

At this point I usually ask if they have any evidence to show how expensive the operation is; all too often the answer is “no”. Once again, another developer is the victim of premature optimisation.

Now don’t get me wrong, optimising code is important, and things like caching can indeed help reduce the cost of expensive operations. However, if that operation is only called once, is caching really going to help? If the operation is “expensive” but only takes 2ms to execute and is called next to an operation that takes 2s, is optimising it really going to help? I hope you all answered no there!

So how do we find out if we need to optimise? Where do we focus our effort, with limited time and resources, to get the best return? That’s where profilers come in. If you’ve not used one before, they are applications that monitor how a program runs, usually capturing data such as the number of calls made, the time taken, the memory allocated, etc.

Visual Studio Team System comes with some pretty good profiling tools that can really help. There is a nice Performance Wizard on the Analyze menu that does much of the work for you. But as nice as the wizard is, it doesn’t fit all situations. If you need to profile services, code running as different users, or code on servers without VS installed, or you just need more control, then the command-line tools are where it’s at. You can download a standalone version from http://www.microsoft.com/downloads/details.aspx?familyid=fd02c7d6-5306-41f2-a1be-b7dcb74c9c0b&displaylang=en for installation on servers or machines without VS.

In this post I’m going to go through the (simple) steps needed to profile an application; I’ll leave interpreting the reports to another post.
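
To give a flavour of the command-line workflow, here’s a minimal sampling-mode session. This is a hedged sketch; the executable name and output path are placeholders:

    rem Enable sampling of managed (.NET) code in processes launched from this shell.
    vsperfclrenv /sampleon

    rem Launch the application under the sampling profiler, writing to myapp.vsp.
    vsperfcmd /start:sample /output:myapp.vsp /launch:MyApp.exe

    rem ...exercise the application, close it, then shut the profiler down...
    vsperfcmd /shutdown

    rem Restore the environment.
    vsperfclrenv /off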


  1. Performance Profiling .NET Applications Using the Visual Studio Profiler (Part 2) | Rob Garfoot's Blog: [...] the first part of this article I discussed how to profile applications using the command line tools for Visual…

Activating AHCI mode after installing Windows on IDE mode

I recently rebuilt my home PC and installed Windows 7 on it. It was a nice, fast, smooth install and generally went without a hitch.

That is, without a hitch until I discovered I’d installed Windows whilst the motherboard’s SATA controller was in IDE mode and not AHCI mode. Why is this important? Well, without AHCI you don’t get nice things like power saving and native command queuing (NCQ), and that impacts the power consumption, speed and noise of your drives.

Of course, that does rely on having a drive that supports those features; since, like most newish drives, mine do, I wanted to benefit.

Unfortunately, just changing the setting in the BIOS causes both Windows 7 and Windows Vista to blue screen (BSOD) at boot with the error STOP 0x0000007B INACCESSIBLE_BOOT_DEVICE. That’s because Windows doesn’t have the AHCI drivers enabled, since I installed it with IDE drivers.

How to get around this? Well, it’s surprisingly simple. After reading lots of hairy articles about hacking in drivers, I found this knowledge base article: http://support.microsoft.com/kb/922976. One simple registry setting and Windows enables its default AHCI driver. You can then reboot, change your BIOS settings from IDE to AHCI, and Windows will boot and redetect your controller and drives.
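
For reference, the change from KB922976 boils down to one value; this makes the inbox msahci driver load at boot (the article also covers the iaStorV value for Intel controllers):

    rem Set the Start value of the msahci service to 0 (boot start).
    reg add "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\msahci" /v Start /t REG_DWORD /d 0 /f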

Once that was done, I rebooted to finish the install, then installed the Intel Matrix RAID drivers so I had the actual manufacturer’s drivers on rather than the generic Windows ones.

All sorted now, and it saved me a reinstall. I’ve just copied 20GB of data from partition to partition on the same drive; it was much faster and quieter with AHCI than without. I guess that’s down to NCQ being able to reorder the reads and writes into something a bit more efficient.

Moved blogging engine

It’s been a little while since I last blogged. Well, I’ve decided to change the blogging engine that I use from Graffiti to Subtext. Why? No real reason other than that I’m not great at web design and there are more themes available for Subtext that I like. I reserve the right to move back, though!

I did have some issues moving posts, since Graffiti doesn’t have a BlogML export function, only import. Fortunately I was able to find a little utility on Curt’s blog at http://darkfalz.com/post/2009/04/14/Graffiti-To-BlogML-Exporter.aspx that allowed me to export my blog and comments, although it did lose some of the names against the comments; sorry about that, to those who posted.

  1. Curt C: I'm glad the utility (mostly) worked for you. It was a pain I had to go through when I converted…

Installing CrashPlan on WHS part 2

In my previous post “Installing CrashPlan on WHS” I looked at installing CrashPlan on my home server and making it use UNC paths. In this post I was originally intending to look at what was needed to perform a backup between two home servers using a backup seeded from a portable hard disk.

I did manage to get all of that set up and working and was about to post when my friend noticed a problem. For some reason his backup files got erased and CrashPlan started a full backup over the net; this was going to take nearly two months.

After restoring the backup from the portable hard disk and starting things off again, it did the same thing. So, for now, we’ve decided to uninstall CrashPlan until we have time to look into what’s going on.

  1. Anonymous: Disregard my post on your previous blog entry :) Just found this one. Unfortunate, needless to say.

  2. garfoot: I replied in your original comment on the part 1 article.

Installing CrashPlan on WHS

I’ve been looking at a backup solution for my home server for a bit. Now, I’ll forgive you at this point if you’re thinking: why do you need a backup solution for WHS when it does duplication?

Well, several reasons, but the main one being that I want an offsite copy of most of the server contents in case of theft or fire. There are plenty of solutions out there for this, but most of them require you to upload your files to the cloud, and that is rather slow when trying to send a TB or so over an 800kbps uplink.

CrashPlan has the nice feature that you can back up to a friend’s machine and you can seed the backup locally. So the plan is to get a new removable hard disk, back up to it, take it to my friend’s, then update over the net.

Installing CrashPlan

I downloaded CrashPlan (www.crashplan.com) and installed it on my home server. I went to add my folders to the sources for backup and immediately hit a problem: CrashPlan doesn’t let you specify a UNC path for the source (or the backup archive folder, for that matter). Now, I could just use D:, but recommended practice for WHS is to always access data through the UNC paths, so that’s what I’d like to do.

The reason for this is that CrashPlan runs as SYSTEM and this account doesn’t have network privileges and thus cannot access UNC paths.

To get around this, I changed the service to run as administrator and manually edited the %programfiles%/crashplan/conf/my.service.xml file to use UNC paths. The easiest way to do this is to add a dummy directory using the UI, then find it in the config and change it. Use / instead of \ in the config file, as it’s a Java app and expects forward slashes.
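
Changing the service account can also be scripted. A hedged sketch follows; the service name “CrashPlanService” is my assumption, so check the actual name with sc query first:

    rem Run the CrashPlan service as the local administrator instead of SYSTEM.
    rem NOTE: "CrashPlanService" is an assumed name; the spaces after obj= and
    rem password= are required by sc's parser.
    sc config CrashPlanService obj= ".\Administrator" password= "YourPasswordHere"
    sc stop CrashPlanService
    sc start CrashPlanService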

I’ve only tested scanning to see if it can read the files, and it seems to work; getting a removable drive and doing a backup is the next step.

Issues

Doing the above has two main issues: first, you are running a service that provides a remote access interface on the internet as administrator; second, administrator may not have permission to read the files being backed up.

To be honest, both of these issues are present with CrashPlan anyway: it runs as SYSTEM and exposes the remote interface, and SYSTEM can be denied permission in ACLs, preventing backups from working. Fortunately WHS seems to create directories with ACLs that grant both SYSTEM and Administrator Full Control, so it should work; I’ll just have to remember not to change the ACLs on any files to remove those permissions.

Ideally, the service interface would run in a separate service with limited permissions, and the backup engine service would run as a user in the Backup Operators group.

I did initially try creating a new user in the Backup Operators group and running CrashPlan as that user. Unfortunately, CrashPlan doesn’t use backup semantics when opening files for backup, so the ACL bypass granted to the Backup Operators group doesn’t kick in and it still can’t access protected files.
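
“Backup semantics” here means passing FILE_FLAG_BACKUP_SEMANTICS to the Win32 CreateFile call so the kernel honours the caller’s SeBackupPrivilege instead of checking ACLs. Here’s a minimal sketch of what that looks like from .NET; this is not CrashPlan’s code, and the path is a placeholder:

    // Hedged sketch: open a file with backup semantics via P/Invoke.
    using System;
    using System.Runtime.InteropServices;
    using Microsoft.Win32.SafeHandles;

    class BackupSemanticsDemo
    {
        const uint GENERIC_READ = 0x80000000;
        const uint FILE_SHARE_READ = 0x00000001;
        const uint OPEN_EXISTING = 3;
        // Asks the kernel to honour SeBackupPrivilege and skip the ACL check.
        const uint FILE_FLAG_BACKUP_SEMANTICS = 0x02000000;

        [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
        static extern SafeFileHandle CreateFile(string fileName, uint access,
            uint share, IntPtr security, uint disposition, uint flags, IntPtr template);

        static void Main()
        {
            // Only bypasses ACLs if the process holds SeBackupPrivilege (e.g. it
            // runs as a Backup Operator) and has enabled it in its token.
            using (SafeFileHandle h = CreateFile(@"\\server\share\protected.dat",
                GENERIC_READ, FILE_SHARE_READ, IntPtr.Zero,
                OPEN_EXISTING, FILE_FLAG_BACKUP_SEMANTICS, IntPtr.Zero))
            {
                Console.WriteLine(h.IsInvalid ? "Open failed" : "Opened with backup semantics");
            }
        }
    }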

I have suggested to the developers that they split the remote interface from the backup engine and use backup semantics when calling CreateFile() to access the files during backup. I’ll let you know if I hear anything from them.

In the meantime, I’m hoping the configuration I’ve built works OK; I’ll follow this post up once I’ve got my external drive and have a backup done.

  1. Anonymous: I too have played with Crashplan, and am also looking at the setup/configuration for WHS.Any specific reason you couldn't backup…

  2. garfoot: I did think of using the d:\shares folders but as per the WHS docs (from www.microsoft.com/.../details.aspx) you are supposed to…

  3. Anonymous: Rob,Now two months later are you satisfied with your solution? I'm about to purchase a WHS and this offsite backup…

  4. garfoot: @Jason, as you've seen from my 2nd post I have been unable to get CrashPlan to work properly on WHS…

  5. rajdude: There is no need to do all this mapping and such. WHS is just a GUI on top of the…

    • Robert Garfoot: WHS isn't just a GUI, it also provides Drive Extender which manages file replication between volumes and also provides a…

