Monthly Archives: February 2004

Don’t build a framework!

I’ve been helping folks with .NET for a little while now, and one of the recurring themes I see, especially in larger organisations, is the tendency to indulge in a lengthy and expensive framework-building process.

Quite often I hear about the “productivity improvements” that teams are getting out of these frameworks. It’s not uncommon to hear about projects taking anywhere from two to ten times as long to reach successful completion. It’s hard not to see where the attraction is, right?

Martin Fowler has some excellent (and short) writing on the subject of harvested frameworks vs. foundation frameworks. So why am I making this post? It’s simple: I want the IT services firms out there getting involved in these projects to focus on delivering products that relate to their customers’ core business – I want them to “wait and see” whether frameworks emerge. Keep it simple, stupid.

*sniff* no architect summit for me!

I’ve been looking forward to getting down to Melbourne for the upcoming architect summit, but it appears that I somehow forgot to submit my registration. Well, registration is now closed and it’s a full house, so I won’t be going. It’s a real bummer because Pat Helland was going to be presenting. Simon Guest also did a great presentation last year – in fact his lip-synced media player demo was a riot. The flights are booked, so I guess I’ll find something else constructive to do with the time.

AFL: A good excuse to talk about Longhorn.

Frank is talking about AFL again. Being born in Queensland, I’m much more interested in rugby and league, although it is pleasing to see the Bears (er, Lions) doing so well. However, when I moved to Victoria I was forced to choose a team. Knowing nothing about the sport I referred to as “aerial ping pong”, I chose to support the Bombers – I thought their patch looked cool. Although it could have been the kiss of death for them…

Longhorn? Well, I had this idea a few weeks ago about how the Australian edition of Longhorn should ship with a number of tiles and pop-up toast applications to integrate sport into the shell. So how about a Baggy Green tile and an AFL ladder tile (at least!)? I just need to unpack that Longhorn VM. Sounds like a good excuse to geek out on Wednesday night next week.

P.S. Frank: I’m sure you have some partners you need to see early on Monday morning, which means you would absolutely, positively have to be in Melbourne over the weekend 😛

Temperature: 18 degrees Celsius. Woohoo!

This has been my first summer in Canberra since I moved up here with Monash.NET. Before that I was living in Melbourne, and I grew up in Brisbane (Caboolture, actually). In Queensland summer is humid, but at least there is a good chance of rain most evenings. Melbourne is much drier during summer, but cool changes come through quite regularly and the temperature is very predictable – look at the weather in Adelaide and add a day.

Canberra, however, has been a real shock to the system. Because there is no body of water to regulate the temperature, it can climb and climb and climb. For the last month or so we have fairly consistently been in the 30°C to 40°C range (that’s up to 104°F if you don’t have a converter handy). While it’s not the surface of the sun, it does make getting a good night’s sleep quite difficult – you always feel tired.

Well, today I had to put on a jumper. In fact I am wearing this spiffy Canterbury MVP jersey that Rose (my local MVP contact) sent me. If you have an appreciation of real football (apologies to soccer fans – this isn’t directed at you), then you will know just how warm these things can be. The current temperature is down to 18°C, and while it might climb during the day I can’t see it getting over 30°C.

I’m looking forward to looking out on the snow-frosted hills this winter. Last year it snowed a number of times, but I was unlucky enough to be delivering training courses or working interstate each time.

So, who spotted the bugs in <csunit />?

When I first introduced my task to the world I mentioned that there were a few things I wanted to tackle in future releases. Here is a bit of a refresher for those of you who missed my earlier post.

  • Write automated unit tests using csUnit.
  • Write automated build script to run unit tests and build a release.
  • Improve API documentation for the source code using XML comments.
  • Make the NAnt output look prettier.

I am now pleased to announce that in the 1.1.0.0 drop I have started to address the first two items on that list. There is now a bunch of unit tests in the NotGartner.Build.Tasks.Tests project to exercise the task and weed out any problems with future modifications and enhancements I make. The act of writing the unit tests actually uncovered a number of bugs, which are now fixed.

  • When input validation failed, clean-up of temporary files still occurred and threw an exception, masking the underlying exception that was raised as a result of the invalid arguments (there’s a sketch of the fix pattern below).
  • Test summary output failed when tests passed (can you believe I missed this!).

Both of these errors meant I wasn’t running a full suite of tests as the task evolved, so things slipped through the cracks. Anyway, give this drop a crack!
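For the curious, the first of those bugs boils down to a classic try/finally trap. I haven’t pasted the real task code here; the sketch below just shows the shape of the fix, with made-up type and member names.

    using System;
    using System.IO;

    // Illustrative sketch only - the names are not from the actual task source.
    public class CsUnitRunnerSketch
    {
        public void Execute(string assemblyPath)
        {
            // Validate before creating anything that would need cleaning up.
            if (assemblyPath == null || assemblyPath.Length == 0)
                throw new ArgumentException("An assembly must be specified.");

            string tempRecipe = null;
            try
            {
                tempRecipe = Path.GetTempFileName();    // stand-in for "generate temporary recipe"
                // ... write the recipe and shell out to csUnitCmd.exe here ...
            }
            finally
            {
                // Only delete what was actually created, and never let a clean-up failure
                // mask the exception that brought us into this finally block.
                if (tempRecipe != null)
                {
                    try { File.Delete(tempRecipe); }
                    catch (IOException) { /* ignore: preserve the original exception */ }
                }
            }
        }
    }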

Google Juice

I was reading one of FrankArr’s blog entries where he was checking the level of GoogleJuice for the bits that he puts out there. Of course there are no prizes for guessing what the first link for my surname is. Sometimes I wonder whether, based on that association, I’ll get stopped at US customs for acts of terrorism on US soil. Mind you, that is based on second-hand information. I’ve seen the signs, just never had the pleasure.

Searching for my first name produces the desired results; however, the cache that Google has shows the site as being in error. Not surprising given the reliability of my hosting provider. If the site isn’t down, the database is down, and if the database isn’t down, files are getting zero-byted. Groan.

Are you registered for the Australian security conference?

The folks at Microsoft are taking to the road and putting on a security conference for developers and IT professionals. I’m going to be at the Canberra one on the 5th of March; dates for the rest of the country are also available. Security has been under the spotlight recently, and while the press would have you believe all the problems are in the OS, most vulnerabilities exist in home-grown application code.

About a year ago Jeff Prosise and Dan Green did a tour of Australia. Jeff showed folks how easy it was to compromise a poorly designed application, and I think I saw a collective jaw drop several times that day. Anyway, be there, or hand in your geek card!

Introducing the Newcastle .NET User Group!

A good friend of mine, Peter Champness, has recently moved to Newcastle and is starting up a .NET user group for folks who live in the area. Currently the homepage has a form you can fill in as an expression of interest, to let Peter know which .NET-related subjects you are interested in.

If you live in or near Newcastle, why not let Peter know you are out there? If you know anyone who lives in the Newcastle area, please forward them a link to the Newcastle .NET User Group homepage. And if you have a weblog, why not put up a post and help get the word out!

A <csunit /> NAnt task for csUnit fans!

Update: This task has since been updated, and I strongly recommend you download that drop instead of this one. You can read all about it.

Yesterday I got an e-mail from Steve Smith of ASPalliance fame (among other things). Steve wanted to know whether I had ever seen a NAnt task for a unit testing tool called csUnit. I knew csUnit existed, but being a long-time devotee of NUnit (another unit testing framework) I’d never found the need to go hunting for a task to plug it into my build scripts.

I asked Steve what his reasons were for preferring csUnit over NUnit. He listed a few features, but the one that really stuck out in my mind was the [FixtureSetUp] and [FixtureTearDown] attributes, which you can apply to methods in your test fixtures so that they are executed when the fixture is created and discarded respectively. This is a feature I have long felt NUnit needed, and something I could have used in my test code more than once.
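For anyone who hasn’t bumped into them, a fixture using those attributes looks roughly like the sketch below. I’m going from memory of the csUnit 1.9.x API here, so treat the namespace and assertion spellings as approximate.

    using csUnit;   // namespace and Assert API from memory - double-check against your csUnit drop

    [TestFixture]
    public class CustomerRepositoryTests
    {
        [FixtureSetUp]
        public void FixtureSetUp()
        {
            // Runs once, when the fixture is created - open an expensive resource here.
        }

        [FixtureTearDown]
        public void FixtureTearDown()
        {
            // Runs once, when the fixture is discarded - tear that resource back down.
        }

        [Test]
        public void SomethingStillWorks()
        {
            Assert.True(1 + 1 == 2);    // per-test assertions as usual
        }
    }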

As a direct result of all this, I spent last night writing version one of a task which allows me to easily integrate tests designed to work with csUnit into my NAnt build scripts. You can grab a copy of the source and try it for yourself. Please keep in mind that this is just an initial drop, and it’s missing a number of very important features. I have the following features on my TODO list.

  • Write automated unit tests using csUnit (chicken and egg scenario).
  • Write automated build script to run unit tests and build a release.
  • Improve (have some) API documentation for the source code using XML comments.
  • Make the NAnt output look prettier (it looks like a dog’s breakfast at the moment).

Hooking up the task is as simple as dropping the NotGartner.Build.Tasks.dll assembly into the same directory as the NAnt executable. If you can’t do that, you can load it from another location that you specify inside the build script. The task itself is very simple to use: “just” provide an assembly attribute pointing at the assembly that contains the unit tests and you are away.
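For the simple case, usage in a build file should look something like the fragment below. The target name and paths are made up for illustration, and treat the exact attribute spelling as a sketch based on the description above rather than gospel.

    <target name="test">
      <!-- Run the tests in a single assembly (path is illustrative only). -->
      <csunit assembly="build\MyProject.Tests.dll" />
    </target>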

If you have more than one assembly that you want to run tests against, you can use a fileset with includes and excludes. When using the fileset the assembly attribute cannot be used; that’s just an arbitrary limitation I put in to avoid any confusion (good or bad?).
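Something along these lines, in other words. Again this is only a sketch: the nested element is the assemblies fileset I mention below, the paths are made up, and depending on your NAnt version the fileset children may be spelled <includes>/<excludes> rather than <include>/<exclude>.

    <target name="test">
      <!-- Run every test assembly the build produced; don't combine this with the assembly attribute. -->
      <csunit>
        <assemblies>
          <include name="build\*.Tests.dll" />
          <exclude name="build\Experimental.Tests.dll" />
        </assemblies>
      </csunit>
    </target>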

Established users of csUnit might have a set of “recipe” files. Instead of referencing the assemblies directly, you can specify that the task use a recipe file instead. Like the assembly attribute and the assemblies fileset, this option is exclusive of the others.
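So an existing recipe can be pointed at directly, roughly like this (the recipe file name is made up, and the attribute name is simply as described above):

    <!-- Reuse an existing csUnit recipe instead of listing assemblies in the build file. -->
    <csunit recipe="MyProject.recipe" />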

Under the covers, using the assembly attribute or the assemblies fileset results in a temporary recipe file being generated. The recipe file is then passed to the csUnitCmd.exe executable to have the tests run. This means that the task does not compile against the csUnit libraries at all, which should allow more flexibility in the future as framework versions get out of sync.

The task tries to guess the location of the csUnitCmd.exe executable by finding out where the official “Program Files” directory is and then appending “csunit.org\csUnit 1.9.4\csUnitCmd.exe” to it. If you haven’t installed csUnit, or have installed it into another location, you can override this behaviour by specifying a program attribute on the task.
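For example, if you keep a copy of csUnit checked in under the source tree, the override might look like this sketch (the paths are illustrative only):

    <!-- csUnit lives somewhere other than the default Program Files location. -->
    <csunit program="tools\csUnit\csUnitCmd.exe"
            assembly="build\MyProject.Tests.dll" />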

This is great if you want to lock in the version of the unit testing tool you are using along with the source. I do a similar thing with NAnt and NUnit already. Speaking of versions, I have only manually tested this with version 1.9.4 of csUnit; I believe the earlier version 1.9.2 had a problem with the XML reporting function that this task relies on (can anyone confirm or deny this?).

By default csUnitCmd.exe spits out an XML file containing the test results next to the unit test assemblies. The task forces the output to a temporary file, which it removes (along with the temporary recipe file, if one needed to be created) after the results have been retrieved. If you need to keep the test results, you can override this behaviour by providing an output attribute on the task.
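So keeping the results around, say for a build report, looks roughly like this (file names are illustrative, and the attribute name is as described above):

    <!-- Write the csUnit XML results to a known location instead of a temporary file. -->
    <csunit assembly="build\MyProject.Tests.dll"
            output="build\MyProject.Tests.results.xml" />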

When this attribute is specified, the file is not deleted when the task finishes.

Well, that’s just about it. I hope you csUnit fans out there find this useful. If you encounter any problems, please do shoot me an e-mail and let me know; I’ll add any bugs encountered to my TODO list and try to ship an updated version at some point in the future.

Forgiving XML Parsers?

No, it’s not a new feature in Whidbey, but should it be considered? The folks on the [aus-dotnet] mailing list have been having a few problems getting the list made accessible as an RSS feed. Threads like this don’t usually catch my eye, but Stewart Johnson posted a link to an excellent blog entry on the history of XML with respect to how it copes with errors.

To be honest, I always thought that forgiving HTML parsers were a bit of a mistake that we are still paying for, but perhaps I was too hasty in coming to that conclusion. I strongly recommend that you read the entry linked to above; it is truly a fascinating read.

One of the things I pondered was: what would an error-correcting parser look like? I drew up a bit of a diagram showing what a forgiving XML stack might look like. Error correction would have to involve a set of rules on how to handle certain types of errors, and the behaviour would need to be configurable by the application developer.

It would also need to be layered, so the first and simplest check, “is the document well-formed?”, would be done first. The error-correction rules would come into play and attempt to fix things up before the document got passed up the stack – for example, to a validating parser. Rules would hook in at that layer too.
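To make that a little more concrete, here is a very rough C# sketch of the rule-hooking idea. None of these types exist anywhere; it’s just my scribbled diagram translated into code, using the plain XmlDocument loader as the “is it well-formed?” layer.

    using System.Collections;
    using System.Xml;

    // Purely hypothetical sketch of the layered, forgiving stack described above.
    public interface IErrorCorrectionRule
    {
        // Return true if the rule recognised the problem and repaired the raw markup.
        bool TryCorrect(ref string rawXml, string parserError);
    }

    public class ForgivingXmlReader
    {
        private ArrayList rules = new ArrayList();   // IErrorCorrectionRule instances
        public bool Forgiving = false;               // the "forgiving bit" - off by default

        public void AddRule(IErrorCorrectionRule rule) { rules.Add(rule); }

        // Layer one: well-formedness. If it fails and we are forgiving, give the rules a
        // chance to repair the markup before the document is handed further up the stack
        // (for example, to a validating parser).
        public XmlDocument Load(string rawXml)
        {
            XmlDocument document = new XmlDocument();
            try
            {
                document.LoadXml(rawXml);
            }
            catch (XmlException error)
            {
                if (!Forgiving) throw;

                foreach (IErrorCorrectionRule rule in rules)
                {
                    if (rule.TryCorrect(ref rawXml, error.Message))
                    {
                        document.LoadXml(rawXml);    // try again with the repaired markup
                        return document;
                    }
                }
                throw;   // nothing could fix it up
            }
            return document;
        }
    }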

The thing is, most developers are under amazing time pressures to get the job done, so for the average in-house corporate developer I doubt whether they would set the forgiving bit. Certainly in the first release of any application I produced, I would be fairly strict in what I accepted, if only because its easier on me.