Saturday, September 12, 2009

Behold, the Meta-stamp

Checking my paper mail, I noticed that Australia Post is peddling a new stamp.  Except it looked like two stamps.  Now, usually stamps are tiny postcards of cultural history – endangered species, famous people and places, and the like – but one day you run out of ideas, and that's how we get this: the Meta-stamp – a stamp depicting… errrm… a stamp!


I guess it’s kind of similar to when Hollywood remakes classic movies. 

Can’t wait for the next one – the stamp of a stamp of a stamp. A bit like the infinite cat project.

Monday, May 25, 2009

Robot arm ride

This was forwarded to me a little while ago, and by now it's making the rounds through the “tubes”.  Not sure if it's a group of factory workers with too much time on their hands (a spectacular breach of Occupational Health & Safety, surely), or an equally unsafe attraction where you actually pay for the privilege of having your face millimetres away from being ground into the pavement by a giant robot arm.  Step right up.

Thursday, May 14, 2009

Printing MSDN Magazine articles from Firefox

I like to read developer articles.  Some of the best dev articles (in my opinion) for the .NET space are published in the MSDN magazine, available online for free.  Sometimes it’s convenient to print them, to read on the train, or to make notes in the margins.  But Microsoft wants to make that difficult for you.

They do this in two ways:

1. Refusal to remove a print redirection tag in the header of the articles.  The print redirection tag exists to suggest to browsers where they should go in order to get a more printer-friendly version of the page. For example, say you look at this article.  If you are in IE7 or IE8, when you go to File->Print Preview, you'll see a version of the page that's entirely unlike what the original page looked like.  That was achieved via this directive in the header:  <link rel="alternate" media="print" href="/en-us/magazine/dd569757(printer).aspx" />.  The browser sees this, takes you to the “print version”, and prints that instead.  Note that Firefox seems to ignore this, and stubbornly refuses to acknowledge that the creator of the page went to the effort of providing an optimised print page. 

But ignoring that… so far so good – but then the web designers did something weird.  The print version of the page (which you can get to by clicking the “Print View” button on the page) has a print redirection link in its header… pointing back to itself.  This has the effect of cancelling any changes you made on the page, like expanding code regions – the page reloads just before printing. 

Recently, MS seems to have realized this, but their fix was nothing short of bizarre – adding a script to automatically expand all the collapsed regions on load.  Why does the print view redirect to the print view at all??  Anyhow, the bizarre fix kind of works: in the past you couldn't actually expand the collapsed regions, as they'd collapse again as soon as you tried to print (because the page reloads); now they're open by default.  But this doesn't work in Firefox – when you print in Firefox, the regions are collapsed, the script for some reason doesn't execute, and you're off trying to expand all the regions by hand.

2.  Somebody forgot their stylesheets.  The generated page is full of class attributes, but the stylesheet definitions for a lot of them are missing:


There is no differentiation between headings and text, even though the class attributes are there.

So what to do?  Greasemonkey, that's what.  Greasemonkey is an add-on for Firefox that allows you to execute custom scripts after the page loads.  You can do funky things like tweaking specific HTML elements, adjusting styles, etc.  There is a handy online reference that gets you started using it in no time.  So now, every time I visit an MSDN print page, the styles are magically fixed.

The resulting script improves the page in more ways than the style fixes above – I've changed the code font to Courier, as that's a far more common programmer font family, removed items that “float” (which upset printing), and removed the Copy Code link.

Here are some snapshots of the results, side by side (“before” on the left, “after” on the right):

Header and “Sidebar” entries (I’ve moved the “Sidebar” element to appear inline, for better printing):


Section headings and code snippets (no need for the grey background in my opinion, or the “Copy Code” link):


The entire script is below, enjoy (feel free to comment on any styles that I might have missed):

The obvious barrier of entry is that you have to know how to actually add a script to Greasemonkey.  An exercise left to the reader.

// ==UserScript==
// @name           MSDN Magazine
// @namespace      a
// @include        */*(printer).aspx
// ==/UserScript==

//this function is straight from the DiveIntoGreasemonkey tutorial
function addGlobalStyle(css) {
    var head, style;
    head = document.getElementsByTagName('head')[0];
    if (!head) { return; }
    style = document.createElement('style');
    style.type = 'text/css';
    style.innerHTML = css;
    head.appendChild(style);
}

//get rid of the copy code header
addGlobalStyle(".CodeSnippetTitleBar {display: none !important;}");

//code snippets
addGlobalStyle("pre.libCScode {background:#FFFFFF none repeat scroll 0 0 !important;border-top:1px solid #C8CDDE !important;border-bottom:1px solid #C8CDDE !important;border-left:1px solid #C8CDDE !important;border-right:1px solid #C8CDDE !important;display:block !important;font-family:courier,monospace !important;margin:0 0 10px !important;padding-left:5px !important;padding-right:5px !important;padding-top:5px !important;}");

//article type title
addGlobalStyle(".ArticleTypeTitle {color:#008080 !important;font-family:'Segoe UI',Arial !important;font-size:14px !important;font-style:normal !important;font-variant:normal !important;font-weight:bold !important;margin-top:3px !important;text-transform:none !important;}");

//article section title
addGlobalStyle(".ColumnTypeTitle, .FeatureSmallHead {border-bottom:2px solid #008080 !important;color:#003399 !important;font-family:'Segoe UI',Arial !important;font-size:36px !important;font-style:normal !important;font-variant:normal !important;font-weight:normal !important;line-height:36px !important;margin-bottom:4px !important;padding-bottom:2px !important;text-decoration:none !important;text-transform:uppercase !important;}");

//article subtitle
addGlobalStyle(".ColumnTypeSubTitle, .FeatureHeadline {color:#000000 !important;font-family:'Segoe UI',Arial !important;font-size:20px !important;font-style:normal !important;font-variant:normal !important;font-weight:normal !important;line-height:20px !important;margin-bottom:8px !important;text-decoration:none !important;}");

//author name in header
addGlobalStyle(".ColumnByLine, .FeatureByLine {color:#000000 !important;font-family:'Segoe UI',Arial;font-size:16px !important;font-style:normal;font-variant:normal;font-weight:normal !important;text-decoration:none !important;}");

//sidebar header
addGlobalStyle(".SidebarHeadline {color:#008080;font-family:'Segoe UI',Arial;font-size:18px;font-weight:bold;line-height:20px;}");

//recently they've started using floating divs as well - this gets rid of them
addGlobalStyle("div {float:none !important; height:auto !important; width:auto !important;}");

//show all hidden panels
var allLinks, thisLink;
allLinks = document.evaluate(
    "//div[@style]", //any style, as they start off invisible
    document,
    null,
    XPathResult.ORDERED_NODE_SNAPSHOT_TYPE,
    null);
for (var i = 0; i < allLinks.snapshotLength; i++) {
    thisLink = allLinks.snapshotItem(i);
    thisLink.style.display = 'block'; //make the panel visible
}

Sunday, April 5, 2009

MIX 09 Videos

The MIX 09 conference is well over in the US, and all the content is now available online – all the presentations were recorded.  You can even get them all on one page, instead of searching through the presentations for each day:

Monday, March 23, 2009

iPod lap timer parser

The iPod Nano is a beautiful thing – unless you want to do something custom, that is.  So it is with its lap timer: there is no obvious way of synchronising the timer logs with the computer.  Even if there is a way in iTunes (I'm pretty sure there isn't), the point is moot, as I don't use iTunes (instead I'm using the wonderful ml_ipod plugin for Winamp).  This is “suboptimal” if you want to, for example, use the lap timer to chart your jogging progress over the course of a year.  You want to crunch those stats in Excel or some such; you certainly aren't going to copy them off the screen.

The information on the web is a little sparse on this matter, but there are a number of applications that will let you do this.  The one that caught my eye luckily provided source code (Mac OS X only – aren't we feeling exclusive).  From there it was a relatively trivial exercise in parsing.  The timer format is loosely described in TimerFormat.txt in the zipped source – at least in enough detail for me to whip up an implementation in C#.  Hence this post.

The underlying format is a binary file, so it's time to dust off .NET's BinaryReader.  The method ReadTimerEntries(string filePath) below will read in an iPod timer file (on my Nano it's /IPod_Control/device/timer) and produce a list containing the date of each recording and the associated laps.  From there you have all the info you need.  Anyway, it works for me on an iPod Nano (3rd gen), and I don't imagine the file structure changes much between models.

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Windows.Forms;

public partial class Form1 : Form
{
    public Form1()
    {
        InitializeComponent();
    }

    private void button1_Click(object sender, EventArgs e)
    {
        OpenFileDialog ofd = new OpenFileDialog();
        ofd.FileName = Application.StartupPath;
        DialogResult dr = ofd.ShowDialog();
        if (dr == DialogResult.OK)
        {
            List<TimeEntry> entries = ReadTimerEntries(ofd.FileName);
            foreach (TimeEntry entry in entries)
            {
                System.Diagnostics.Debug.WriteLine(entry.EntryDate.ToString());
                foreach (TimeSpan ts in entry.Laps)
                {
                    System.Diagnostics.Debug.WriteLine(ts.ToString());
                }
            }
        }
    }

    public List<TimeEntry> ReadTimerEntries(string filePath)
    {
        List<TimeEntry> results = new List<TimeEntry>();
        //open the file
        var stream = File.OpenRead(filePath);
        BinaryReader br = new BinaryReader(stream);
        br.ReadBytes(4); //start file
        while (!br.ReadBytes(4).SequenceEqual(new byte[] { 0x13, 0, 0, 0xC0 }))
        {
            //first 4 bytes are already consumed
            ReadRound(br, results);
        }
        br.Close(); //also closes the underlying stream
        return results;
    }

    private void ReadRound(BinaryReader br, List<TimeEntry> itemList)
    {
        br.ReadBytes(16 * 5); //guff
        br.ReadBytes(8); //start date
        byte seconds = br.ReadByte();
        byte minutes = br.ReadByte();
        byte hour = br.ReadByte();
        byte day = br.ReadByte();
        byte month = br.ReadByte();
        int year = br.ReadInt16();
        br.ReadByte(); //unknown use
        TimeEntry entry = new TimeEntry();
        entry.EntryDate = new DateTime(year, month, day, hour, minutes, seconds);
        itemList.Add(entry);
        br.ReadBytes(4); //end date
        br.ReadBytes(4); //start laps
        while (!br.ReadBytes(4).SequenceEqual(new byte[] { 0x10, 0, 0, 0xC0 }))
        {
            br.ReadBytes(4); //start lap
            int lapMilliseconds = br.ReadInt32(); //4 bytes
            entry.Laps.Add(TimeSpan.FromMilliseconds(lapMilliseconds));
            br.ReadBytes(4); //end lap
        }
        br.ReadBytes(4); //end round
    }
}

public class TimeEntry
{
    public DateTime EntryDate;
    public List<TimeSpan> Laps = new List<TimeSpan>();
}

Sunday, March 22, 2009

Terminator Salvation – coming to cinemas May 21

Ahh, the Terminator movie that we’ve always wanted – entirely set in the future, focusing on the war between man and machine.  I’ve always wondered when they would show the future they hinted at, a void slightly filled by the Terminator TV series (Sarah Connor Chronicles).  Another point of note is that this coming instalment is but the beginning of a Terminator revival – part of a new trilogy apparently.  And you can’t go wrong with Christian Bale as John Connor.

Thursday, March 5, 2009

Reading database tables in Powershell

Firstly, a correction.  Last time I posted about Powershell, I came up with something that did the job, but was, well, a tad long-winded.  Kind of like building a skyscraper so that you can store garden tools in the basement.  Basically I didn't realise that the Sort cmdlet already had a -Unique switch.  Thanks to Stephen Mills for pointing it out:

import-csv c:\mydata.csv | select Category, Subcategory | sort category, subcategory -Unique | Group-Object -Property category

Now to the business at hand – reading database tables.  Consider a situation where you're asked to turn the data in a database table into some readable form, for reference, printing or whatnot.  Rather than using SQL Management Studio to query the results, selecting the entire results grid, copying it, pasting it into Excel (if it's available on the same machine) or into a text file and then getting it into Excel – you could just use Powershell to connect to the DB and dump the results into an HTML table.  Behold:

# Parse Database tables
# This will connect to a database, do a "select *" on a table or view, and produce a html file with the data in a table.
# Note: don't forget to loosen up your execution policy "Set-ExecutionPolicy Unrestricted"
# and the connection string is ADO style: "Data Source=database;Initial Catalog=oesc_offerman;User Id=username;Password=password;Trusted_Connection=False;"
param(
    $connectionString = $(throw "Specify connection string"),
    $tableName = $(throw "Specify a table or view name"),
    $outputPath = $(throw "Specify output path")
)
echo ("Processing " + $tableName)
$table = new-object System.Data.DataTable;
$sqlConn = new-object System.Data.SqlClient.SqlConnection($connectionString);
$sqlConn.Open();
$adapter = new-object System.Data.SqlClient.SqlDataAdapter(("select * from " + $tableName), $sqlConn);
$adapter.Fill($table);
$sqlConn.Close();
$table.Rows | ConvertTo-Html | out-file $outputPath

Wednesday, March 4, 2009

Fish blimp

Had to share this one – from a homemade airship competition, footage of a small blimp that uses a tail fin for propulsion.  It looks very graceful, especially for a blimp – sure, it's unencumbered, with no wind, but it's still mesmerising to watch:

Air Art from flip on Vimeo.

Saturday, February 21, 2009

“SELECT DISTINCT” in Excel and Powershell

Just recently at work we had to import a large dataset into a database, but two of the columns were “category” and “subcategory”.  Naturally we wanted these in a separate table, so we needed a way of parsing through the existing data to get the unique categories, and the unique matching subcategories for each.


Firstly, there’s the Excel ‘07 way (most likely also possible in other versions too):

Select the two columns that hold your categories and subcategories. Go to the “Advanced” menu item in Filtering:


The presented dialog has the option of “Unique Records Only”


Done – it's the equivalent of doing a SELECT DISTINCT Category, Subcategory query in SQL.  Now sort the data, and you're ready to go.


But no discussion of parsing/manipulating data is complete without mentioning Powershell – surely we can do this in PS.  Let’s assume that the data was saved in csv form (you could always save your Excel spreadsheet as .csv).

Luckily, there's already a cmdlet that can parse CSVs for us: Import-Csv.  This cmdlet will parse the CSV and create objects that have properties named after the columns – in this case, each object gets a “Category” and a “SubCategory” property.


Now that we have this collection of objects, use your favourite way of weeding out duplicates.  I'm keen on building up a new hashtable, checking on each insert that the same entry doesn't already exist:

$temp = @{}
$data | foreach {
    if ($temp.ContainsKey($_.Category))
    {
        #check if subcategory exists, insert subcategory if it doesn't
        if ($temp[$_.Category].ContainsValue($_.Subcategory) -eq $false)
        {
            $temp[$_.Category].Add($_.Subcategory, $_.Subcategory);
        }
    }
    else
    {
        #put it in:
        $temp.Add($_.Category, @{$_.Subcategory = $_.Subcategory});
    }
}

This is all well and good, but there are issues.  The “Category” property names are hardcoded, and the script depends on an existing $data variable.  Ideally we want a cmdlet that can support piping, and will let us specify the names of the columns.

Luckily for us, Powershell is able to support expanding a variable into a property access.  By putting the variable in brackets, PS will evaluate the variable and substitute its value into the script.  For example, I created a variable $somestring with the string value “Subcategory”; when I used it in brackets, as $data[2].($somestring), it expanded out to “Subcategory” and functioned in the same way as $data[2].Subcategory.  Neat.  This is an invaluable feature in scripting.

So we replace all the places where we previously had “Category” and “Subcategory” with expansions, and add the column names as required parameters.

Last thing: we need to get rid of the assumption of a pre-existing $data variable.  Ideally the input will be piped in.  No problem – the reserved $input variable holds whatever was piped in.

So now we can call it like so:

import-csv c:\MyTest.csv | c:\categoryParser.ps1 "Category" "Subcategory"

The final script looks like this:

param(
    $categoryName = $(throw "Specify category column name"),
    $subcategoryName = $(throw "Specify subcategory column name")
)
$temp = @{}
$input | foreach {
    if ($temp.ContainsKey($_.($categoryName)))
    {
        #check if subcategory exists, insert subcategory if it doesn't
        if ($temp[$_.($categoryName)].ContainsValue($_.($subcategoryName)) -eq $false)
        {
            $temp[$_.($categoryName)].Add($_.($subcategoryName), $_.($subcategoryName));
        }
    }
    else
    {
        #put it in:
        $temp.Add($_.($categoryName), @{$_.($subcategoryName) = $_.($subcategoryName)});
    }
}
#dump out the output
$temp

And it produces something like this: a hashtable keyed by category name, in which each value is a hashtable of that category's subcategories. 


From here it would be easy to traverse them all with two nested foreach loops, and do whatever with them, like generating INSERT statements.
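As a sketch of that last step (with a hypothetical “Categories” table, and sample data standing in for the script's output), the nested loops might look like this:

```powershell
# $temp here stands in for the hashtable produced by the script above;
# the "Categories" table and its column names are hypothetical.
$temp = @{ "Hardware" = @{ "Mice" = "Mice"; "Keyboards" = "Keyboards" } }

foreach ($category in $temp.Keys) {
    foreach ($subcategory in $temp[$category].Keys) {
        # emit one INSERT statement per category/subcategory pair
        "INSERT INTO Categories (Category, Subcategory) VALUES ('$category', '$subcategory')"
    }
}
```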

Friday, February 6, 2009

Minotaur in a China Shop

I have a soft spot for independent games – possibly the last bastion of innovation in computer gaming.  Came across a very amusing little independent game about a Minotaur running a fine china shop.  The game is 3D, with an isometric perspective, and runs in the browser.  The minotaur isn't exactly light on his feet, and handles like a fridge – thus there are two ways to win: trash your own shop to get an insurance payoff, or serve customers.  Give it a go – most amusing.

Minotaur China Shop Trailer from Flashbang Studios on Vimeo.

Saturday, January 31, 2009

On Precision

Do YOU know your car’s height down to the millimetre?  You’d better…

Saw this in Southbank:


Thursday, January 29, 2009

Blade Runner 2, Brave New World, Forever War – coming to a cinema near you… umm.. soon…

In case you missed it, Slashdot had coverage of the fact that a Blade Runner sequel may be in the works, as would a film based on Brave New World (by Aldous Huxley), and Forever War (by Joe Haldeman).

Brave New World didn’t really do much for me, but Forever War, if properly done, could be the serious sci-fi movie people have been waiting for – they seem a bit thin on the ground these days.  More importantly, all this is supposedly to be directed by Ridley Scott (director of Blade Runner, Alien, Gladiator and Black Hawk Down among others).  Apparently he’s secured the rights to Forever War, and is now looking for a script.

Read all about it:

Tuesday, January 27, 2009

Chernoff Faces and Data Visualization

While we might have a good feel for what makes a good UI, few of us give much thought to the niche area of data visualization.  Too often we're content to present users with drab, boring datagrids and lists.  While on one side there is the issue of familiarity (everyone has seen tables), the usability of grids for slicing and dicing data is limited at best.  Filtered and sorted columns can only get you so far, and certainly don't provide you with a “feel” for the data.

I've just finished reading Blindsight by Peter Watts (freely available online – not too bad, with an interesting central idea: that self-awareness may be unnecessary for intelligence), in which one of the characters makes use of Chernoff faces to visualise data.

Chernoff faces are an attempt to harness, for data visualization, the special brain circuitry we have for recognizing faces.  The idea is to present the user with a field of stylized faces, associate a specific data dimension with a facial feature, and let the brain do the rest.  The example below maps the data onto a real geographical space, but that is not strictly necessary – the map could just as well have been a set of two-dimensional data of a more abstract nature.  The following key is used to map facial features:


You can for example sweep your eye across the map looking for all the “big eared” faces to give an idea of crime rate.  Or you could look for “frowning faces with big ears” (which would map to high unemployment and high crime rate).
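To make the mapping concrete, here's a rough sketch in Python – the dimension names, value ranges and feature assignments are entirely made up for illustration, not taken from any particular Chernoff implementation:

```python
def normalize(value, lo, hi):
    """Clamp and scale a raw data value into the 0..1 range."""
    if hi == lo:
        return 0.5
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

def chernoff_features(row, ranges):
    """Map data dimensions onto facial feature parameters.

    row:    dict of dimension name -> raw value for one region
    ranges: dict of dimension name -> (min, max) over the whole dataset
    Returns a dict of feature name -> parameter in 0..1, where e.g.
    ear_size could drive the radius of the drawn ears.
    """
    # The dimension -> feature assignments below are illustrative only.
    mapping = {
        "crime_rate":   "ear_size",
        "unemployment": "mouth_frown",
        "income":       "eye_size",
    }
    face = {}
    for dim, feature in mapping.items():
        lo, hi = ranges[dim]
        face[feature] = normalize(row[dim], lo, hi)
    return face

# A region with high crime and unemployment: big ears, a deep frown.
face = chernoff_features(
    {"crime_rate": 9.0, "unemployment": 12.0, "income": 30000},
    {"crime_rate": (0, 10), "unemployment": (0, 15), "income": (20000, 80000)},
)
```

Drawing the actual faces is then just a matter of feeding these 0..1 parameters into whatever rendering you have – ear radius, mouth curvature, eye size and so on.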




While not bad on paper, it appears the applications are quite limited – the data still doesn't “jump out” at you, and there is the difficult issue of non-intuitiveness when dealing with abstract data sets (what facial feature should be “average sales”?).  Your mileage may vary – it still seems quite good for visual searches for specific patterns (for example, consider the lot of faces with brown skin and short hair – a higher percentage of college graduates, but with lower incomes – what gives?).

Regardless, this is but one attempt at visualizing data in a non-tabular way.  There are other stabs at doing this, some of which even add motion to the visualization to suggest another data dimension.  Consider “Anymails”, where email is visualized as various critters moving around in the visual space – older email moves more slowly, and various characteristics of the email are encoded into the shape and color of the critters:


(there's also a video of the application in action at the source site, I believe)

Why am I bringing this up?  Well, it's all to do with the trend we're seeing in UI technologies.  Silverlight and WPF are both platforms in which animation and custom drawing are readily available, giving us the freedom to easily create “compelling” user interfaces – the ability to present data in a more intuitive way, as well as interaction options that would have been too “expensive” to create in the past.  This will require more focus on what users are trying to achieve, as opposed to regurgitating the same dropdowns and text-boxes.  It will take years for this trend to develop, as the area appears to be largely unexplored in mainstream software design, and would be classified as risky. 

Regardless, creative thought in software engineering is back, and we should use the flexibility of SL and WPF to bring users the UIs that will give them truly better productivity, not just proxies of paper-based processes – Minority Report UI here we come!

Sunday, January 25, 2009

Functionality vs Prettiness

Although I've traditionally been a proponent of Windows Mobile as a platform for pocket devices, the arrival of the iPhone certainly shakes the foundations of that conviction.  Watching the rapid spread of the iPhone now, I wonder how something like that could become so popular.  After all, PDA devices that mix PDA functionality with a phone are nothing new – consider the O2 XDA range, for example, which has been out for years.  Nor is the tilt sensor anything unique – the Nokia N95 did it earlier.  GPS – others had it.  We can argue that all these features finally came together in one reasonable package, and we'd be right.  Mostly.  There is, however, another feature that I think gets to the bottom of the iPhone sales mystery – the UI.  Apple recognises that great functionality only gets you so far – the average user is a sucker for a pretty UI.  The iPhone is simply a pleasure to look at and operate.  Everything from the sliding menus to the gradient background screams “luxury”.

For a lark, consider a unit/currency conversion application, and let's ignore multi-touch, GPS and tilt sensors.  Here's a screenshot of an existing iPhone app that does this:


And here’s the same functionality, mocked up in a Windows Mobile 5 emulator under Visual Studio 2008, using the Compact Framework:


Sure, I could have made custom controls to beautify the thing, but the point is how far one can get with mostly out-of-the-box tools – and there is no contest.  I'd buy the mobile that gives me the great-looking UI, given the same functionality.  In fact, I suspect a lot of people will take a hit on functionality for the sake of a snazzy UI.  This is one of the reasons Silverlight on Windows Mobile is a big deal, and I look forward to it in “Q1 of 2009”.  No doubt MS will be playing catch-up to Apple for a few years.

PS: I don’t actually own an iPhone, the above is strictly based on short impressions, however I’d argue that a lot of purchasing decisions are made that way, for the majority of the shopping populace.

Tuesday, January 6, 2009

Silverlight 2 for mobile devices

Just finished watching a presentation from PDC2008, and apparently Silverlight 2 will be available as a tech preview on the mobile in the first quarter of 2009.  This is the first time I've heard of SL2 being released for mobile – previous announcements have centred around SL1, not version 2, and version 2 was previously discussed as a distant alpha, with no indication of timelines.  Silverlight 1 always seemed like a sad excuse for a framework, so I can't say I was looking forward to a mobile release.  But the release of SL2 will finally give us the ability to design iPhone-looking applications for the mobile.

As expected, designing will be available from Blend as well as from Visual Studio.  According to the demo, the development experience will be very similar.

See session PC10 for the announcement. (

Performance will of course be an interesting issue, as will the experience on a non-touch-enabled phone, e.g. a Smartphone.