Monthly Archives: September 2009

Observations on Scrum, Timesheets and Estimation

As a professional services provider, Readify is well practised in the process of filling in timesheets to support billing functions within the business. Consulting tends to be a T&M-style engagement, so turning those time entries into money isn’t a major problem (although you wouldn’t have AR/AP staff if it was a complete no-brainer :P).

Moving into the project delivery space, as Readify is doing, puts a slightly different spin on timesheets. We effectively engage on T&M projects, but because we utilise Scrum (or a Scrum-butt variant, depending on the client) we run into an interesting situation when comparing time estimates against sprint backlog items and timesheet actuals.

Going through the sprint planning exercise recently we set our ideal day at about five productive hours of development per day, meaning that there were about three hours of other stuff not accounted for (not necessarily unproductive, but potentially unforeseen).

The problem is, when you have a team of four people that means you might end up with 60 hours of “other stuff” over a week (four people × three hours × five days), and the client rightfully questions why they are paying for this.

It’s About Business Value

As we talked through the issue I pointed out that we really need to compare the submitted timesheets against the business value delivered, not the submitted timesheets against the estimates. Submitted timesheets include the 3 hours of “other stuff” which is difficult to track in a software project.

At the end of the day, dividing submitted hours by the estimated hours in a sprint backlog really just gives you a number and completely ignores the business value delivered. I think this is probably the biggest argument for using a non-numerical sizing mechanism like T-shirt sizing (S, M, L, XL), rock-paper-scissors style, over planning poker.

I’d much rather have a conversation about value delivered to determine in the first instance whether there was some, and secondly how to optimise it sprint on sprint. I think when you get back to this you start being able to focus on what is really important. If you didn’t get much value out of this sprint then maybe the product owner didn’t prioritise correctly.

Whilst it is a bit of an overhead, the team has started tracking time in a timesheet to do some task analysis. Hopefully after a few sprints we’ll be able to talk intelligently about what some of this “other stuff” is and suggest ways to optimise it.

Converting people to Windows 7 one MacBook at a time.

Yesterday I was sitting in the lobby of the hotel waiting for my ride to a client meeting. A MacBook user walked up to me and commented on the slightly different power adapter on my MacBook Air (as compared to, say, a MacBook Pro). We talked about the relative performance of a few of the different models and then I mentioned that I was actually running Windows 7.

A lot of people aren’t even aware that Windows 7 is on its way. I ended up having a bit of a conversation about how Windows 7 runs very well on my MacBook Air – much better than Vista.

Overall I think the feeling around Windows 7 is very positive in the technical community, and with that behind it I think Microsoft is going to be very successful with this release. Vista got such a bad name so early on that it was impossible to recover from, and as much as people despised Vista, their XP environments are starting to get a bit long in the tooth.

Besides – if you are a Mac OS X user, Windows 7 is much more interesting than Apple’s latest offering. Their latest marketing material reads like a bunch of “mediocre and proud of it” statements.

Windows Image Acquisition via Silverlight

Yesterday I wrote a blog post about the possibilities & problems around hardware integration from Silverlight using its sockets capabilities. This evening, whilst I was watching some television, I started playing with the idea.

In the end I created a simple HttpListener (in a console application) which listens on port 8888 and serves up the client access policy file. When the Silverlight application makes a subsequent WebClient request, a call to the Windows Image Acquisition (WIA) API is used to grab an image and transfer it into the Silverlight application.
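
The skeleton of the listener looks roughly like the following. This is a from-memory sketch rather than the exact demo code (the /scan endpoint name is just something I’ve picked for illustration), but it shows the two moving parts: serving clientaccesspolicy.xml so the cross-domain call is allowed, and calling WIA (via a COM reference to the WIA Automation Library, wiaaut.dll) to acquire the image.

    using System;
    using System.IO;
    using System.Net;
    using System.Text;

    class ScannerHost
    {
        // Served from http://localhost:8888/clientaccesspolicy.xml so the
        // Silverlight WebClient is allowed to make the cross-domain call.
        const string Policy = @"<?xml version=""1.0"" encoding=""utf-8""?>
    <access-policy>
      <cross-domain-access>
        <policy>
          <allow-from http-request-headers=""*""><domain uri=""*"" /></allow-from>
          <grant-to><resource path=""/"" include-subpaths=""true"" /></grant-to>
        </policy>
      </cross-domain-access>
    </access-policy>";

        static void Main()
        {
            // On Vista/Windows 7 you may need to run elevated (or add a urlacl)
            // to open an HttpListener on this prefix.
            var listener = new HttpListener();
            listener.Prefixes.Add("http://localhost:8888/");
            listener.Start();
            Console.WriteLine("Listening on http://localhost:8888/ ...");

            while (true)
            {
                HttpListenerContext context = listener.GetContext();
                string path = context.Request.Url.AbsolutePath.ToLowerInvariant();

                if (path == "/clientaccesspolicy.xml")
                    WriteResponse(context.Response, Encoding.UTF8.GetBytes(Policy), "text/xml");
                else if (path == "/scan") // endpoint name chosen for this sketch
                    WriteResponse(context.Response, AcquireImage(), "image/jpeg");
                else
                {
                    context.Response.StatusCode = 404;
                    context.Response.Close();
                }
            }
        }

        // Pops the standard WIA acquisition UI and returns the scanned image
        // as JPEG bytes. Requires a COM reference to the WIA Automation Library.
        static byte[] AcquireImage()
        {
            var dialog = new WIA.CommonDialogClass();
            WIA.ImageFile image = dialog.ShowAcquireImage(
                WIA.WiaDeviceType.UnspecifiedDeviceType,
                WIA.WiaImageIntent.UnspecifiedIntent,
                WIA.WiaImageBias.MaximizeQuality,
                "{B96B3CAE-0728-11D3-9D7B-0000F81EF32E}", // wiaFormatJPEG
                false, true, false);

            string temp = Path.GetTempFileName() + ".jpg";
            image.SaveFile(temp);
            return File.ReadAllBytes(temp);
        }

        static void WriteResponse(HttpListenerResponse response, byte[] body, string contentType)
        {
            response.ContentType = contentType;
            response.ContentLength64 = body.Length;
            response.OutputStream.Write(body, 0, body.Length);
            response.Close();
        }
    }

On the Silverlight side, a WebClient.OpenReadAsync against http://localhost:8888/scan pulls the bytes back, and the resulting stream can be fed straight into a BitmapImage via SetSource.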

You can download my demo code over at MSDN. The reason I’ve been playing with this stuff is that I’d like to deliver a web-based Point-of-Sale experience but there are some requirements for hardware integration at particular workstations.

With a little bit more work I think it might be possible to come up with a really elegant solution that works across multiple sites. So if you were building a Silverlight application and wanted to support some image capture, you could just launch a hyperlink to a ClickOnce application (from a trusted source) which would download a small-footprint HTTP server to provide the client-side integration.

One of the key requirements would be security, so you could set it up so that each time an application requested a scan it could pop up an approval dialog for that domain. That would then tweak the access policy file that it dished up, and Silverlight would then make the actual request to scan the image.
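
As a rough sketch of that idea (the in-memory approval store and the prompt are purely hypothetical, and a real version would need to persist approvals somewhere sensible), the listener could keep a list of approved origins and only emit allow-from entries for those:

    using System.Collections.Generic;
    using System.Text;
    using System.Windows.Forms; // for the approval prompt (reference System.Windows.Forms)

    static class DomainApproval
    {
        // Origins the user has already approved. Purely in-memory here.
        static readonly HashSet<string> Approved = new HashSet<string>();

        public static bool IsApproved(string domain)
        {
            if (Approved.Contains(domain))
                return true;

            DialogResult answer = MessageBox.Show(
                "Allow " + domain + " to scan an image on this machine?",
                "Silverlight hardware access",
                MessageBoxButtons.YesNo);

            if (answer == DialogResult.Yes)
                Approved.Add(domain);

            return answer == DialogResult.Yes;
        }

        // Builds a client access policy that only grants the approved origins,
        // rather than the blanket <domain uri="*" /> from the earlier sketch.
        public static string BuildPolicy()
        {
            var domains = new StringBuilder();
            foreach (string domain in Approved)
                domains.AppendFormat("        <domain uri=\"{0}\" />\n", domain);

            return
                "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n" +
                "<access-policy>\n" +
                "  <cross-domain-access>\n" +
                "    <policy>\n" +
                "      <allow-from http-request-headers=\"*\">\n" +
                domains.ToString() +
                "      </allow-from>\n" +
                "      <grant-to>\n" +
                "        <resource path=\"/\" include-subpaths=\"true\" />\n" +
                "      </grant-to>\n" +
                "    </policy>\n" +
                "  </cross-domain-access>\n" +
                "</access-policy>\n";
        }
    }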

Silverlight: Hardware Integration Possibilities & Problems

One of the interesting capabilities in Silverlight is the ability to communicate cross-domain with socket hosts, provided that certain policy requirements are satisfied. Specifically, a policy file needs to be served up on port 943, and then a port in the range 4502-4534 can be used.
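
The handshake itself is simple: Silverlight opens a TCP connection to port 943, sends the string <policy-file-request/>, and expects the policy document back before it will talk to anything in the 4502-4534 range. A minimal policy server (this sketch just grants the whole range to any caller) looks something like this:

    using System;
    using System.Net;
    using System.Net.Sockets;
    using System.Text;

    class SocketPolicyServer
    {
        // Grants any calling domain access to the whole Silverlight socket range.
        const string Policy = @"<?xml version=""1.0"" encoding=""utf-8""?>
    <access-policy>
      <cross-domain-access>
        <policy>
          <allow-from><domain uri=""*"" /></allow-from>
          <grant-to><socket-resource port=""4502-4534"" protocol=""tcp"" /></grant-to>
        </policy>
      </cross-domain-access>
    </access-policy>";

        static void Main()
        {
            var listener = new TcpListener(IPAddress.Any, 943);
            listener.Start();
            Console.WriteLine("Serving socket policy on port 943...");

            while (true)
            {
                using (TcpClient client = listener.AcceptTcpClient())
                using (NetworkStream stream = client.GetStream())
                {
                    // Silverlight sends "<policy-file-request/>" and waits for the XML.
                    var request = new byte[1024];
                    stream.Read(request, 0, request.Length);

                    byte[] response = Encoding.UTF8.GetBytes(Policy);
                    stream.Write(response, 0, response.Length);
                }
            }
        }
    }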

One of the possibilities this creates is the ability to run daemons on desktop computers that serve up access to hardware devices. This is useful for scenarios where the majority of the functionality can be delivered in a web browser and only specific parts of the application need access to things like serial ports.

One problem with this architecture, though, is that if over time we get lots of applications using this integration strategy, we are going to get contention between programs wanting to dish up policy on port 943.

It’d be great to have some kind of plug-in framework where the Silverlight runtime dropped a default listener on port 943 on the local machine. When hit, the socket would serve up a policy file granting access to specific services in the 4502-4534 port range. Hardware vendors could then write Silverlight-aware drivers which registered themselves with the out-of-the-box listener.
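
The vendor-facing contract could be as simple as something like this (the names here are entirely hypothetical; nothing like it exists in the Silverlight runtime today):

    using System.Collections.Generic;

    // Hypothetical contract a hardware vendor would implement; the shared
    // policy listener on port 943 would enumerate the registered drivers and
    // grant their ports in the 4502-4534 range.
    public interface ISilverlightDeviceDriver
    {
        string Name { get; }   // e.g. "Acme Barcode Scanner"
        int Port { get; }      // somewhere in 4502-4534
        void Start();          // begin listening for Silverlight clients
        void Stop();
    }

    public class DriverRegistry
    {
        readonly List<ISilverlightDeviceDriver> drivers = new List<ISilverlightDeviceDriver>();

        public void Register(ISilverlightDeviceDriver driver)
        {
            drivers.Add(driver);
            driver.Start();
        }

        // The policy listener would call this to build its <socket-resource /> grants.
        public IEnumerable<int> GrantedPorts()
        {
            foreach (var driver in drivers)
                yield return driver.Port;
        }
    }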

Might be worth prototyping.

Using T4 for E-mail Generation and Reporting?

I’ve just been looking into a few options for reporting and e-mail generation within .NET applications. In the past I’ve used the Text Template Transformation Toolkit (T4) templating engine to generate e-mails to good effect and I’m looking to use it again on a little pet project. I’ve also got a need for a simple report generation capability within the application and was thinking that T4 could come in useful again.
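
For anyone who hasn’t played with T4, a template for a simple notification e-mail looks something like the following. The customer name and item count are hard-coded placeholders here; in a real template they’d come from whatever is hosting the transformation:

    <#@ template language="C#" #>
    <#@ import namespace="System" #>
    Hi <#= CustomerName #>,

    Thanks for your order. It contains <#= ItemCount #> item(s) and was
    processed on <#= DateTime.Now.ToString("d MMMM yyyy") #>.

    Regards,
    The Team
    <#+
        // Placeholder data; a real template would get these from the host
        // or a model object rather than hard-coding them.
        string CustomerName = "Jane";
        int ItemCount = 3;
    #>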

Hopefully it’ll give me the capabilities I need right now until I can find a nice web-based report building tool that can be exposed to end users. The feedback that I’ve gotten from the user is “I’d like it to be like QuickBooks”.

T4 is good – but it obviously lacks a web-based designer.

Catching up with peeps at Tech.Ed 2009.

Ah, you have to love Tech.Ed, it is one of the few events every year where the majority of the .NET developer community makes an attempt to get in the same place at the same time. Today was the academic day – and a big deal for MrAndyPuppy, and tonight the majority of delegates arrived for the welcome party.

Things started to wrap up at the venue at about 9pm but quite a few people kicked on – I suspect that there will be a few sore heads at the keynote tomorrow morning. Rumour has it that we are really going to want to be there.

So far the event has been great. Jordan posted some awesome stats about the bandwidth and I have to admit so far I’ve been impressed. Going to spend some more time down on the floor tomorrow to see what kind of warez the exhibitors are showing off.