Monthly Archives: May 2013

Internet of Things Hitting Mainstream

Yesterday I caught up with my boss for my fortnightly review meeting. He was nice enough to head down to Mornington near where I live to have a coffee and a bite to eat. I’m used to being mobile when I work so obviously I cracked open my tablet and started reviewing and responding to e-mails, cutting code and making notes.

One of the e-mails was from Graeme Strange (CEO of Readify) asking for some thoughts on what people are calling the “Internet of Things”. Internet of Things (or IOT for acronym fans) is really just an umbrella term for the way everyday devices are being connected up to networks and linked together to behave in intelligent ways.

My Personal IOT Journey

As it happens I’m interested in home automation and there is a strong overlap between IOT and the technologies traditionally used in home automation. Not only that, IOT technologies are raising abstractions that make dealing with physical hardware easier, so that software developers like me, and even complete novices, can achieve some pretty spectacular results.

Over the years I’ve played with Arduinos, Netduinos, Raspberry Pis and NinjaBlocks. At each step of the way the level of abstraction has increased, to the point that it is pretty much point and click, and anyone who understands the basics of cause and effect can work with it.

Of course these devices are really just the distributed brains of IOT technology. They need devices to connect to and control. Even here there have been some interesting developments. Two that have really piqued my interest are the Philips Hue lightbulbs and the upcoming LIFX lightbulbs. The former is a light bulb that can be controlled centrally from a base station and can display any colour you want. The latter is basically the same thing, but where it is different (and in my opinion where it is better) is that the bulb itself includes WiFi connectivity, so no base station is required.

Both of these devices can be hacked into an IOT infrastructure so that when you walk down the hallway in the middle of the night it puts on a low intensity glow to light your path. I’m looking forward to the LIFX shipping and am extra excited because it has Australian founders.
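The hallway scenario boils down to a simple rule: on motion at night, send the bulb a dim state. Here is a minimal sketch in Python – the bridge address and light number mentioned in the comment are hypothetical, and the payload shape follows the Hue-style `bri`/`hue`/`sat` light-state convention:

```python
def night_glow_state(brightness=30, hue=8000, sat=140):
    """Build a Hue-style light state for a dim, warm glow.

    Brightness is on the bridge's 1-254 scale; hue/sat pick a warm white.
    """
    return {"on": True, "bri": brightness, "hue": hue, "sat": sat}

def handle_motion(hour, send_state):
    """If motion occurs overnight (10pm-6am), push the glow state."""
    if hour >= 22 or hour < 6:
        send_state(night_glow_state())
        return True
    return False

# In a real setup, send_state would PUT the JSON to something like
# http://<bridge>/api/<username>/lights/<id>/state (address hypothetical).
events = []
handle_motion(23, events.append)
```

The nice part is that the rule lives in your own infrastructure, not in the bulb – swap the lambda that sends the state and the same logic drives any connected device.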

These kinds of LED bulbs are quite expensive (around $70 per unit) but remember that an LED bulb should last significantly longer than a traditional bulb and also consume less power, so you may even come out ahead in the long run.

Business, the next IOT Frontier

Business already uses process automation technology, particularly in manufacturing and mining. However, IOT technology is going to make it cheaper and easier to integrate systems, which will make similar technologies accessible in other industries. As I mentioned to Graeme, I think the industry areas to watch around commercial application of IOT are:

  • Environmental control systems on commercial real-estate.
  • Process automation in manufacturing facilities.
  • Metered services (parking etc).
  • Medical device integration.

All of these sectors are already consumers of technology in some way, but they are not as well integrated as they could be.

A Doctor’s Perspective

I recently underwent some minor hand surgery and during the process had to fill in about a gazillion forms (you might think I am exaggerating). Yesterday I caught up with the surgeon to review the results and I mentioned that I think the booking process for surgery is a complete nightmare (there are four different parties that you have to deal with: surgeon, hospital, anaesthetist and health insurer).

He agreed and said that he has long advocated improved use of technology in hospitals. He remarked that he has previously told hospital administrators proud of their latest and greatest new building that they have succeeded in building a facility that hasn’t changed much since the 1900s.

One example he cited was the integration of various specialist devices (such as automatic drip feeders) into some kind of overall connected system.

From my point of view this is exactly where IOT technology would help. Each one of these devices sends a continuous feed of structured data into a bus which then analyses it against a set of rules (and previous results) and then pushes messages to other devices for display or action.
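As a sketch of that flow (the device names, thresholds and message shapes here are all hypothetical), each device pushes structured readings onto a bus, rules are evaluated against each reading, and the resulting messages are forwarded to interested displays or devices:

```python
class DeviceBus:
    """Toy message bus: devices publish readings, rules turn them into alerts."""

    def __init__(self):
        self.rules = []        # (predicate, message-builder) pairs
        self.outbox = []       # messages pushed to downstream devices/displays

    def add_rule(self, predicate, build_message):
        self.rules.append((predicate, build_message))

    def publish(self, reading):
        """Evaluate every rule against the incoming reading."""
        for predicate, build_message in self.rules:
            if predicate(reading):
                self.outbox.append(build_message(reading))

bus = DeviceBus()
# Hypothetical rule: a drip feeder running low should alert the ward display.
bus.add_rule(
    lambda r: r["device"] == "drip-feeder" and r["ml_remaining"] < 50,
    lambda r: {"to": "ward-display",
               "alert": f"{r['device']} low: {r['ml_remaining']}ml"},
)
bus.publish({"device": "drip-feeder", "ml_remaining": 30})
```

A real system would also fold in previous results (trends, not just point readings), but the shape is the same: structured feeds in, rules in the middle, messages out.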

Keystone Technologies

Overall I think all the technology exists today to make this possible but it is important to call out a few of them (or associated trends) which I think are really the keystones:

  • Miniaturisation of entire PC-like devices to the size of a credit card.
  • Standardised network connectivity (specifically TCP/IP).
  • Simple key-exchange for WiFi devices (push a button to connect).
  • NFC tags for negotiating casual connections to devices.
  • Increased solar panel efficiency for outdoor applications.
  • Reliable mobile data communications.

I think that these are the key ingredients. The next step is for intelligent device manufacturers (makers of devices with flexible microprocessors and connectivity options) to come up with a way of exchanging data about their devices with control systems in terms of “messages sent” and “commands accepted”.
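One plausible shape for that exchange (entirely illustrative – no real standard is implied) is a small self-describing document each device serves up, listing the messages it emits and the commands it accepts:

```python
import json

# Hypothetical capability descriptor a smart bulb might advertise.
descriptor = {
    "device": "smart-bulb",
    "messages_sent": [
        {"name": "state-changed", "fields": ["on", "brightness"]},
    ],
    "commands_accepted": [
        {"name": "set-state", "params": ["on", "brightness"]},
    ],
}

def accepts(descriptor, command):
    """Check whether a control system may send this command to the device."""
    return any(c["name"] == command for c in descriptor["commands_accepted"])

encoded = json.dumps(descriptor)  # what would travel over the wire
```

With something like this, a control system can discover a new device and wire it into its rules without a firmware-specific driver.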


Right now I think that IOT is still on the leading edge of the wave. But I think it is becoming more mainstream, particularly in the consumer space. I think that business will start to take advantage of loosely coupled IOT technology in the years ahead which will lead to cheaper implementation costs, better integration and ultimately better services.

I would go one step further and say that adopting these technologies to improve efficiency is critical for Australia if we want to maintain first-world status.

ODF, OOXML and Standard Operating Environments at AGIMO

Part of my role at Readify is keeping abreast of what is going on in the IT sector from a broad perspective and then understanding how it will impact our customers and by extension the Readify business itself.

Over the past couple of days I’ve received a few questions from various people inside the business to provide some input on various topics. One of those topics was the recent announcement by AGIMO (Australian Government Information Management Office).

Specifically AGIMO reversed a decision to promote OOXML over ODF, and instead now recommends ODF. The change was part of a larger document which provides guidance on what departments should consider when defining a SOE (Standard Operating Environment).

Tale of Two Formats

Both ODF and OOXML are specifications which describe how documents of various types can be stored physically in digital media. Both specifications seek to solve the same problem, but differ in their origins and therefore their implementation details.

OOXML (or Office Open XML) has been the standard file format for Microsoft Office products since Office 2007. It is effectively a group of files contained within a ZIP file (but renamed to a specific extension such as *.docx). One of the files inside the archive is an XML file which contains the actual document content, and other files (such as images) are referenced from it.
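You can see that structure with nothing more than a ZIP library. The sketch below builds a minimal docx-shaped archive in memory and reads it back – the two parts shown are a bare-bones illustration, not a spec-complete document:

```python
import io
import zipfile

# Build a minimal docx-shaped archive in memory.
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as docx:
    docx.writestr("[Content_Types].xml", "<Types/>")  # part manifest
    docx.writestr("word/document.xml",
                  "<w:document><w:body>Hello</w:body></w:document>")

# Reading it back is ordinary ZIP handling - no Office software required.
with zipfile.ZipFile(io.BytesIO(buffer.getvalue())) as docx:
    parts = docx.namelist()
    body = docx.read("word/document.xml").decode()
```

Rename a real *.docx to *.zip and open it in any archive tool and you’ll see the same layout.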

ODF (or OpenDocument Format) also uses XML to persist document data, and it can be either compressed with a ZIP algorithm or uncompressed. In this way it is very similar to OOXML. The exact structure of the XML is where the two formats differ. ODF has its origins with Open Office, originally part of Sun Microsystems, then Oracle (which later stopped commercial support for the product). Since then a number of other products have added support for the format.

War of Words

Within the IT community, those who care argue bitterly about which format is best: ODF or OOXML. The ODF proponents argue that even though OOXML is a standard it has tight dependencies on Office/Windows. On the other hand the proponents of OOXML (mostly those who use Office) argue that it doesn’t matter, since people want 100% fidelity between what they see in their Office products and what is stored to disk.

In this war of words (pun intended) what seems to be missing is a bit of a reality check about software development, proprietary formats, preferred formats and lossy format conversion. The truth is that every document creation product on the market is initially designed to persist data in a format which it can faithfully read back to reproduce the document on the screen just as it was when persisted.

Tendency to Diversification of Formats

If you have two word processing products, you are going to have two file formats, and whilst both products might support the other file format you can bet that information loss will occur when product A saves in file format B and vice versa. The argument is that by introducing a standard file format C that both products support, the problem is solved. Unfortunately this isn’t the case: now what you have is two products, three file formats, and imperfect support for the standard format from product to product.

Whilst it is possible to achieve good standards support over time (just look at HTML and CSS over the years) it requires a massive effort that vendors, and by extension their customers, would probably be unwilling to pay for. So we are where we are.

Putting my .NET developer hat on for a second I personally would prefer to work with OOXML because it is possible for me to (relatively) easily interact with OOXML based documents using the System.IO.Packaging namespace. In fact Readify has worked on a number of applications where templates are produced as *.docx files and we process the file in .NET to spit out the finished product. I am sure that others could make similar arguments for ODF too.
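The template trick is easy to sketch outside .NET as well: unzip, rewrite word/document.xml, re-zip. The {{placeholder}} convention below is purely illustrative of the approach, not Readify’s actual implementation:

```python
import io
import zipfile

def fill_template(docx_bytes, values):
    """Replace {{placeholders}} in word/document.xml, return a new archive."""
    src = zipfile.ZipFile(io.BytesIO(docx_bytes))
    out_buffer = io.BytesIO()
    with zipfile.ZipFile(out_buffer, "w") as out:
        for name in src.namelist():
            data = src.read(name)
            if name == "word/document.xml":
                text = data.decode()
                for key, value in values.items():
                    text = text.replace("{{" + key + "}}", value)
                data = text.encode()
            out.writestr(name, data)   # copy every other part untouched
    return out_buffer.getvalue()
```

Because every other part of the archive is copied through untouched, styles, images and metadata in the template survive the merge.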

In the end I don’t think it matters. I think that your average user will focus on what they can create with the tools at their disposal and then pick the file format that most faithfully represents that document on disk. Within the Australian Government it is fair to say that Microsoft Office is the dominant player and so if there is a desire to push another file format then what that really means is users putting down their favourite tools such as Word, Excel and PowerPoint in favour of alternatives. That is a lot of inertia to overcome.

In Case of Emergency

It is important to remember that AGIMO is an advisory body which provides recommendations to government agencies. Whether they pick them up is another question and in my experience various agencies fall to various degrees outside the guidance provided. In some senses it would be good to see AGIMO have a little bit more influence across agencies because it would force many agencies to upgrade their technology stacks.

If that did come to pass you can be sure that organisations like Microsoft might introduce measures into their products where the end-user pre-determines their target document persistence format (instead of when it is first saved) and the document creation tool would then restrict what features are available.

In reality I don’t think it would happen because some proprietary features are just too useful to sacrifice in the name of open standards (that is the whole reason organisations buy specific software packages).

New Hope In The Cloud

In the end the argument about document file formats for presentations, spreadsheets and word processors might end not because we reach agreement on file formats but because the very concept of files goes away. With the propagation of cloud technologies within businesses it is becoming increasingly common for a file to be hosted on a web-server and converted to HTML for display to the end user. Google Docs and Office Web Apps are both good examples of this. And whilst you could upload an Office Document to Google Docs and get render issues (quite a few in fact), if you forget about moving files around and instead expose the file directly from where it was authored then the respective cloud platform takes care of the lossless conversion to HTML on the server.

These days I think of documents less as a sequence of bytes on a disk and more as a “service endpoint” that my client software (e.g. Microsoft Word) talks to in a series of document updates. This approach has enabled cloud-hosted document editors to support multiple people editing a document. Watching a group of people update a Word document hosted on SharePoint 2013 is quite a sight. The fact that a file might be transferred from the server to the client is just a side effect of a data caching process which enables you to keep working when connectivity to the cloud service is lost.
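You can model that “stream of updates” view with a tiny operation log. This is a deliberately naive sketch – real co-authoring protocols deal with concurrent, conflicting edits far more carefully – but it shows why the file on disk becomes a secondary artefact:

```python
def apply_update(text, update):
    """Apply one update operation to the document text."""
    if update["op"] == "insert":
        pos = update["pos"]
        return text[:pos] + update["text"] + text[pos:]
    if update["op"] == "delete":
        pos, length = update["pos"], update["len"]
        return text[:pos] + text[pos + length:]
    raise ValueError(f"unknown op: {update['op']}")

def replay(updates, text=""):
    """The server-side document is just the fold of all updates so far."""
    for update in updates:
        text = apply_update(text, update)
    return text

# A hypothetical edit session: two authors, three operations.
log = [
    {"op": "insert", "pos": 0, "text": "Hello world"},
    {"op": "delete", "pos": 5, "len": 6},
    {"op": "insert", "pos": 5, "text": ", Word"},
]
```

Any client that has seen the same log has the same document; the occasional cached file is just a snapshot of the fold at one moment.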

AGIMO Announcement in Context

What I thought was particularly interesting is that not much was said about the rest of the document in which these recommendations were couched. The broader document speaks specifically to the provision of Standard Operating Environments within government agencies and, if you are so inclined, it’s interesting to look through some of the recommendations and think about how they might impact your productivity.

As a software developer, working within some of the constraints in that document wouldn’t be practical, so it is likely that certain groups of users simply would not be able to work within the COE (Common Operating Environment).

I’m also interested in how AGIMO might approach developing a BYOD policy (another emerging trend in commercial organisations).

In Summary (and an Ironic Observation)

I personally think that broad statements about preferred document formats (ODF, OOXML or something else) won’t have much impact because ultimately it is the tools that users master which impact the formats they use. If another vendor wants to get a better showing inside Australian Government around document creation, then they are going to have to produce better tools that users love. When that happens – file formats will change.

The ironic thing I observed was that the published SOE build guidelines are in the proprietary Word 97-2003 format (so neither OOXML nor ODF). That’s OK, I was able to convert the document to ODF with only minor conversion errors.

Three Windows 8 Application Archetypes

Over the past six months I’ve been working with various Readify customers helping them understand how they can take advantage of Windows 8 within their organisations. In particular I’ve been working with commercial and/or government entities who are looking at deploying the new operating system and want to better understand what benefits the new application model might provide.

From Windows 7 to Windows 8

For those of you just getting familiar with Windows 8, it is important to understand that Windows 8 builds upon the foundation of Windows 7: if an application worked with Windows 7, it’ll work with Windows 8. Windows 8 represents a superset of Windows 7 functionality (as you would expect). One of the significant things that Microsoft added to Windows 8 was an additional model for building applications. These are called “Windows Store Apps”, but the title is somewhat misleading because you can actually build these kinds of applications and deploy them within an organisation without ever publishing them on the public Windows Store. For the remainder of this post I’m going to refer to this new kind of application as a “Modern App” and the old style of application as a “Legacy App”.

Impact of Mobility and Form Factor

There is no doubt that mobile computing has had a major impact on our society and I think I could argue that the trend hasn’t even hit the corner of the hockey-stick yet. We are seeing increased adoption of touch-based tablet devices and in the consumer space these devices are displacing traditional family computers. The problem is that the legacy approach to building desktop applications just doesn’t scale down well to these kinds of touch screen devices – and so a new approach was needed.

Microsoft has developed the new modern application framework as part of a larger overall rethink on how users want to interact with their PC today and into the future. Microsoft now sees Windows running on high-end engineering PCs used for CAD and software development all the way down to handheld devices used for casual access. Prior to Windows 8 these lower end devices represented a problem for Microsoft because their OS couldn’t be scaled to run efficiently on this kind of hardware. Windows 8 changed all that – the operating system is now significantly more frugal with battery consumption, and with the modern UI of new Windows 8 applications it works much better on a smaller form factor (whilst maintaining its mouse/keyboard friendly capabilities).

Taking Advantage of the Platform

While it is possible for enterprises to roll out Windows 8 and not redevelop any of their internal business systems I recommend that organisations take a look at their existing business systems and think about the kinds of systems that could benefit from the new platform capabilities. Any sizable business is likely to have dozens of applications that sit within this category.

No doubt organisations will already have Windows 8 application projects underway with more in the pipeline. One of my concerns is that customers are going to start building large monolithic Windows 8 applications which span multiple departments and have relatively large footprints. One of the benefits of the new Modern App model is that you can break applications down into smaller pieces and link them together with techniques like protocol activation and sharing contracts.

The Three Archetypes

What is really needed is a shorthand for talking about the kind of application that is being built to put a fence around the functionality that should be contained within it. After a good period of time working with customers, and thinking about the problem I think I’ve come up with three different application types. They are:

  1. The Activity Hub
  2. The Gateway Application
  3. The Role/Departmental Application

What I’ll do here is expand on each of them a little bit.

The Activity Hub

The Activity Hub is a “one-off” kind of application within any organisation (where an organisation might be an entire business or a significant business unit). It is the one application that every employee/contractor uses several times a day to keep abreast of what is happening across the organisation. In an increasingly mobile and distributed world it is the equivalent of the notice board in the coffee room.

Many organisations that are getting started with Windows 8 have this idea to replace the Windows Store with their own company specific store. Whilst this is a laudable goal I think that it is a case of re-inventing the wheel. I would rather see organisations use technologies such as Windows Intune or System Center Configuration Manager to deploy applications to user devices and instead re-purpose this hub as a central point for discovery of business intelligence, notifications and other generally available information.

The Gateway Application

The gateway application is a thin layer over an existing line of business application. Rather than exposing much task-oriented functionality, this kind of application is focused on getting data in and out of the existing system via the various shell extension points available to Modern Apps (such as sharing contracts, file pickers, search etc).

Gateway applications are important because they represent a relatively cheap way to get data into the hands of people that need it without fully replacing the existing application.

The Role/Departmental Application

The role-focused or departmental application is a fully featured application focused on helping a particular kind of user within the business get their job done quickly. It might include forms for capturing data, or take advantage of capabilities such as NFC (Near Field Communication) for bringing up records for physical items.

Role/Departmental applications represent a bigger investment than gateway applications and would probably be developed initially for those people who need to be mobile (whether that is inside the building, or out on the road). These applications become a rethink of how users work with backend systems on modern devices which might work as a desktop one minute, and as a tablet the next.

Basis for Estimation and Higher Success Rates

One of the interesting side effects of breaking high level application requirements into these three categories is it gives you a practical way to weigh up the likely implementation costs. Most activity hub development projects are going to be very similar in size, as are gateway applications. Departmental applications might be the area of most significant deviation, but I believe that organisations should limit the size of these applications to increase the speed with which they can be pushed out, and also to realise some of the benefits of building smaller applications focused on very specific areas and then linking them together.

For example, you might have a common concept of a customer across two departments, and instead of building screens for customers in two different applications you build a single “customer micro-app” which both departmental applications can launch into when necessary.
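Linking micro-apps that way comes down to agreeing on an activation URI scheme. A small sketch of the idea (the customerapp scheme and the entity/id route shape are invented purely for illustration):

```python
from urllib.parse import urlparse

def build_activation_uri(app, entity, entity_id):
    """Compose the URI one app would launch to hand off to another."""
    return f"{app}://{entity}/{entity_id}"

def parse_activation_uri(uri):
    """Decompose an activation URI back into (app, entity, id)."""
    parsed = urlparse(uri)
    entity_id = parsed.path.lstrip("/")
    return parsed.scheme, parsed.netloc, entity_id

# A departmental app hands the current customer to the customer micro-app.
uri = build_activation_uri("customerapp", "customer", "42")
```

The launching app never needs to know how the micro-app stores or renders customers – the URI is the whole contract between them.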


I think this is going to be the year of Windows 8 adoption within the enterprise. As part of that I expect many businesses to start embracing a more mobile style of working where all staff take their devices with them wherever they go. To make this work they are going to need new kinds of applications that work well in this style of working. I think that breaking applications down into these three application archetypes will help.

If you are about to kick-off a deployment of Windows 8 in your organisation and you want someone to help come up with an overall approach get in touch. I’ve got some work that I’ve already done in this space which might help.

Nine Years of Application Lifecycle Management

When I first started working with .NET in 2000 the “process tooling” for the platform didn’t exist – heck, the platform wasn’t even released. It didn’t take long, however, and immigrants from other platforms were getting involved building out the tool-chain required for agile software development, including build scripting languages, unit test frameworks and CI build engines.

Having come from other development platforms I was familiar with the alternatives to these tools and was comfortable stringing them together, and I adapted my knowledge to work with .NET. From my very first .NET project I made sure that assets were in version control, that builds could occur automatically and that we got a shippable product every time someone checked in. My tools of choice were NAnt, NUnit, Draco.NET, and CVS and/or Subversion.

I introduced these tools to many software development teams and they appeared to help the team produce software on a regular basis. This is really when I started to “care about ALM” although the acronym ALM wasn’t really defined back then.

In late 2004 I became aware of a new product that was being developed at Microsoft (code-named Whitehorse from memory). It was a set of extensions to Visual Studio which brought team collaboration directly into the IDE. It combined Microsoft’s interpretations of all of the above tool-types and baked in a choice of development processes.

Around the same time I had started experimenting with Scrum as a way of focusing development teams that had been under-performing (for various reasons) so the idea of a product that combined a process and tooling appealed to me. Of course Team Foundation Server and Team Explorer shipped with an updated version of Visual Studio and I changed from being an ASP.NET MVP to an ALM MVP.

Fast forward to the present day and I am just as interested in ALM as I was back then. The big difference is that the tooling has improved significantly (including cloud support) and now has direct support for Scrum as a methodology in the product. With these new capabilities the expectations on teams have increased and we are now talking about DevOps as a way of reducing the length of the feedback cycle.

Agile methods themselves are now finding their way into enterprise organisations (even outside of software development teams). This is creating a demand for training and certification so that businesses can have the confidence that the new methods are embraced and well understood.

Nine years in I still think that ALM tooling and good agile processes are the key to successful projects (good people help too!). Looking forward to what the next nine years bring.