I used the BlogML importer plugin from http://dillieodigital.net/2010/07/03/blogml-importer to import my BlogML from Subtext. However, the output from Subtext included Base64-encoded content, so I wrote a little C# application to Base64-decode the content before importing it.
The source code is below; the usual "if it blows up your computer, don't blame me" caveats apply.
using System;
using System.Linq;
using System.Text;
using System.Xml.Linq;

static void Main(string[] args)
{
    XDocument doc = XDocument.Load(@"blogml-b64.xml");
    XNamespace blogML = "http://www.blogml.com/2006/09/BlogML";
    var query = doc.Descendants(blogML + "content")
        .Where(node => (string)node.Attribute("type") == "base64");
    // Decode each Base64 content node back to plain text in place.
    foreach (var item in query)
        item.Value = Encoding.UTF8.GetString(Convert.FromBase64String(item.Value));
    doc.Save(@"blogml-decoded.xml");
    Console.WriteLine("Press a key to continue");
    Console.ReadKey();
}
Ok, I bit the bullet and moved to WordPress. I've been avoiding it because it's the main one people use and I like being a little different. In the end, though, the support for third-party publishing apps and the number of plugins and themes meant that I've finally done it.
All of the posts, comments, RSS feeds etc. should now have been moved across, but let me know if you notice any problems.
VS2010's built-in Extension Manager is a really nice addition, and there are already a lot of very good plug-ins available through it. Here are some of the ones I have installed and find useful; you might too. Normal caveats apply about me not being responsible if one of these plug-ins causes your computer to catch fire and run off with your wife/husband etc.
I was just looking at setting up Mercurial on my Windows Home Server and found some good blog articles. The first was on Jeremy Skinner's blog here, which shows how to configure it under CGI on IIS 7. I'd like to run it as an ISAPI extension though, since that would be much faster, and I then found this blog article from Matt Hawley which shows how to configure Mercurial as an ISAPI extension under IIS 7.
The final blog article was from another Matt here that shows how to install it under IIS 6. Now, I'd already figured out most of this, but I was looking for more info because of a little problem: I kept getting 404 errors whenever I accessed the URL, and I couldn't see anything wrong…
Eventually, after much wailing and gnashing of teeth, I realised my stupidity in a real DOH! moment. I thought I'd share this since you might also come across it. I'd forgotten that IIS 6 by default blocks all ISAPI and CGI extensions unless they are explicitly added to an allow list, and I hadn't added the Mercurial DLL. A quick trip to the web service extensions section of the IIS admin console to add _hgwebdir_wsgi.dll as an allowed extension and all is now well. Hours wasted on a silly mistake; I'll remember that one for a while!
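If you're doing this on a headless box, the same allow-list entry can be added from the command line with the iisext.vbs script that ships with IIS 6, rather than through the admin console. This is a sketch; the DLL path and group name here are illustrative, so adjust them to wherever _hgwebdir_wsgi.dll actually lives on your server.

```shell
:: Add the Mercurial ISAPI DLL to the IIS 6 Web Service Extensions allow list.
:: Arguments: file path, enabled (1), group ID, deletable (1), friendly name.
cscript %SystemRoot%\system32\iisext.vbs /AddFile C:\hg\_hgwebdir_wsgi.dll 1 Mercurial 1 "Mercurial"

:: List the extension files so you can confirm the DLL is now allowed.
cscript %SystemRoot%\system32\iisext.vbs /ListFile
```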
WPF has a great feature called data templates. These allow you to specify the visual appearance of a data object; you can either place them in a control or put them in a resources section to reuse them across multiple controls.
The real benefit of them, in my opinion, is that you can tag them with the type of the data object that they display; WPF will then automatically select the appropriate data template when it needs to render an item of that type. This makes building controls that display heterogeneous data structures really easy.
I was recently working on a Silverlight application in which I had a 3-pane view: one pane contained the main content and the other two contained navigation elements in a typical master/details scenario. The items in the navigation pane were all of different types deriving from a common base class. When I selected an item in the navigation pane I wanted to display its details in the main pane; however, since each item was of a different type, each needed to display differently.
Fantastic, I thought, I can use data templates and this'll be easy. That was when I found out that Silverlight doesn't support data templates that vary by type. You can define them and share them using a named key, but unfortunately you cannot assign a target type to them like you can in WPF.
This wouldn’t be much of a blog article unless I’d found a way around this though so here we go. There are probably several different ways you could do this but this is the way I chose and it seems to work quite nicely, at least for me.
When I consult with development teams, at some point performance always rears its ugly head. When I code review something, I often find it overly complex, with lots of caching, “clever” code etc., and when I ask why the code is this complex the answer is usually performance, e.g. “we need to cache the results as that’s an expensive operation”.
At this point I usually ask if they have any evidence to show how expensive the operation is; all too often the answer is “no”. Once again, another developer is the victim of premature optimisation.
Now don’t get me wrong, optimising code is important, and things like caching can indeed help reduce the cost of expensive operations. However, if that operation is only called once, is caching really going to help? If the operation is “expensive” but only takes 2ms to execute and sits next to an operation that takes 2s, is optimising it really going to help? I hope you all answered no there!
So how do we find out if we need to optimise, where do we focus our effort with limited time and resources to get the best return? That’s where profilers come in. If you’ve not used one before, they are applications that monitor how an application runs, usually capturing data such as number of calls, time taken, memory allocated etc.
Visual Studio comes with some pretty good profiling tools in the Team System version that can really help. There is a nice Performance Wizard on the Analyze menu that does much of the work for you. But as nice as the wizard is, it doesn’t fit all situations. If you need to profile services, code running as different users, or code on servers without VS installed, or you just need more control, then the command-line tools are where it’s at. You can download a standalone version from http://www.microsoft.com/downloads/details.aspx?familyid=fd02c7d6-5306-41f2-a1be-b7dcb74c9c0b&displaylang=en for installation on servers or machines without VS.
In this post I’m going to go through the (simple) steps needed to profile an application, I’ll leave interpreting the reports to another post.
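To give a flavour of the command-line route, a minimal sampling session with the standalone tools looks roughly like this. The application name and report name are placeholders; substitute your own.

```shell
:: Start the profiler in sampling mode and name the output report file.
VSPerfCmd /start:sample /output:MyReport.vsp

:: Launch the application under the profiler, then exercise your scenario.
VSPerfCmd /launch:MyApp.exe

:: Detach and shut the profiler down, flushing the .vsp report to disk.
VSPerfCmd /shutdown

:: Turn the raw report into human-readable summary files.
VSPerfReport /summary:all MyReport.vsp
```

For managed applications you may also need to run VSPerfCLREnv /sampleon beforehand so the CLR exposes the data the sampling profiler needs.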
I recently rebuilt my home PC and installed Windows 7 on it. It was a fast, smooth install and generally went without a hitch.
That is, without a hitch until I discovered I’d installed Windows whilst the motherboard’s SATA controller was in IDE mode rather than AHCI mode. Why is this important? Well, without AHCI you don’t get nice things like power saving, native command queuing (NCQ) etc., and that impacts the power consumption, speed and noise of your drives.
Of course, that relies on having a drive that supports those features; since mine do, like most newish drives, I wanted the benefit.
Unfortunately, just changing the setting in the BIOS causes both Windows 7 and Windows Vista to blue screen (BSOD) at boot with the error STOP 0x0000007B INACCESSIBLE_BOOT_DEVICE. That’s because Windows doesn’t have the AHCI drivers loaded, since I installed it with IDE drivers.
How to get around this? Well, it’s surprisingly simple. After reading lots of hairy articles about hacking in drivers etc., I found this knowledge base article: http://support.microsoft.com/kb/922976. One simple registry setting and Windows enables its default AHCI driver. You can then reboot, change your BIOS settings from IDE to AHCI, and Windows will boot and redetect your controller and drives.
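For reference, the fix in the KB article amounts to one value change, which can be made from an elevated command prompt before you touch the BIOS. On Windows 7 and Vista the inbox AHCI driver service is msahci; check the KB article for your exact setup.

```shell
:: Set the Microsoft AHCI driver service to start at boot (Start = 0).
reg add "HKLM\System\CurrentControlSet\Services\msahci" /v Start /t REG_DWORD /d 0 /f
```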
Once that was done, I rebooted to finish the install, then installed the Intel Matrix RAID drivers so I had the actual manufacturer’s drivers rather than the generic Windows ones.
All sorted now, and it saved me a reinstall. I’ve just copied 20GB of data from partition to partition on the same drive; it’s much faster and quieter with AHCI than without. I guess that’s down to NCQ being able to reorder the reads and writes into something a bit more efficient.
It’s been a little while since I last blogged. Well I’ve decided to change the blogging engine that I use from Graffiti to Subtext. Why? Well no real reason other than I’m not great at web design and there are more themes available for Subtext that I like. I reserve the right to move back though!
I did have some issues moving posts since Graffiti doesn’t have a BlogML export function, only import. Fortunately, I was able to find a little utility on Curt’s blog at http://darkfalz.com/post/2009/04/14/Graffiti-To-BlogML-Exporter.aspx that allowed me to export my blog and comments, although it did lose some of the names against the comments; sorry about that to those who posted.
In my previous post “Installing CrashPlan on WHS” I looked at installing CrashPlan on my home server and making it use UNC paths. In this post I was originally intending to look at what was needed to perform a backup between two home servers using a backup seeded from a portable hard disk.
I did manage to get all of that set up and working and was about to post when my friend noticed a problem. For some reason his backup files got erased and CrashPlan started a full backup over the net, which was going to take nearly two months.
After restoring the backup from portable hard disk again and starting things off it did the same thing again. So for now we’ve decided to uninstall CrashPlan until we have time to look into what’s going on.