I happen to be staying in a hotel tonight that has a wireless hotspot (amazing). Anyway, I noticed on the Azure (that's their network provider) sign-in page that you could opt to listen to podcasts by The Podcast Network.
Unfortunately I hadn't been aggregating these guys' feed, so I wasn't aware of their latest news – but it's pretty cool. You can read more about it here.
I thought that $20 a month for 400MB was a little steep given the number of shows that the TPN has and how big podcast files tend to be, but I guess it's a start. Maybe they could reduce costs by pushing some of the data out to caches on the hotspot nodes – hrm.
I picked up this link to Anina’s blog from Scoble. Up until today I had no idea who Anina was, but she seems pretty cool. She is a model but also a geek – so she is into gadgets and, apparently, posing with gadgets!
What I can’t understand is why her agency doesn’t want her to blog. Agencies demand that models provide portfolios to get work – and a blog is a kind of portfolio; it just proves that whilst you may have legs that go all the way up to there, you also have a brain and an opinion. Maybe it’s having an opinion that is the problem.
Personally I think a gadget company (or group of companies) needs to pick her up as a spokesperson to not only model, but also write about her digital lifestyle – in essence, do what she does now, except allow her to charge BIG$$$ for it.
This interesting article on XHTML 2.0 up on the IBM developerWorks site got me thinking. For me the acid test for XHTML 2.0 will be how strictly it parses content and how reliable the renderings are across the various browsers that are out there today.
I know a guy who works on a fairly popular web browser, at the layer that sits between the browser itself and the abstract model on top of it, and he pointed out that the W3C really hasn’t had a good set of test cases for how things should work and render.
In fact, in one scenario they had to take test cases from a later specification and dumb them down to conform to the older spec, and even then there was a lot of wriggle room.
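That strictness question is easy to demonstrate. A minimal sketch in Python – nothing XHTML-2.0-specific, just a generic XML parser standing in for a strict user agent, with markup invented for the demo:

```python
# A strict XML parser rejects malformed markup outright; classic
# tag-soup HTML parsers would silently recover instead.
import xml.etree.ElementTree as ET

well_formed = "<p>Hello, <em>world</em>!</p>"
malformed = "<p>Hello, <em>world!</p>"  # <em> is never closed

ET.fromstring(well_formed)  # parses without complaint

try:
    ET.fromstring(malformed)
    print("accepted")
except ET.ParseError:
    print("rejected")  # a strict user agent stops right here
```

With two parsers agreeing on "rejected", writing conformance tests is easy; it is the recovery behaviour of lenient parsers that the W3C never pinned down with good test cases.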
I am in Sydney for the week presenting the first delivery of our latest course – PROFESSIONAL .NET. I’m looking forward to getting into some of the material, although funnily enough this first pass is in VB.NET, and those who know me will know how amusing that is.
Anyway – I’m looking forward to getting along to the Sydney Deep .NET User Group on Thursday evening and meeting up with the usual suspects!
I love it when the crew from Microsoft post details about their internal processes. This time it’s Brian Harry, who has posted about their existing source branch structure and what they hope to do when they move to TFS internally.
One thing that they do is break feature development up into branches to reduce contention and the risk of someone inadvertently breaking the build of another group in DevDiv (Brian outlines some good stats on why this is a real issue). If you want to know more about branching you can read my recent posts on the subject (1, 2, 3, 4), which cover off some of the techniques that the guys at Microsoft use to control the complexity of building what they do.
Anyway – some of the comments on Brian’s post are worth reading, including some of the critical ones. All in all, however, I think that once a development team reaches a certain size branching is inevitable, and if you ever want to produce patches for your software without slipping in lots of bugs you are going to need it – even for a one-man development band!
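Brian’s post is about TFS, but the feature-branch pattern itself is tool-agnostic. A minimal sketch of it, driving git from Python (the repository location, branch name, and file names are all invented for illustration):

```python
# Sketch: one stable trunk, one branch per feature, merge when done.
# git is used purely as the example tool here, not as TFS's mechanism.
import pathlib
import subprocess
import tempfile

def git(*args, repo):
    subprocess.run(["git", *args], cwd=repo, check=True,
                   capture_output=True, text=True)

repo = pathlib.Path(tempfile.mkdtemp(prefix="branch-demo-"))
git("init", "-q", repo=repo)
git("config", "user.email", "demo@example.com", repo=repo)  # illustrative
git("config", "user.name", "Demo", repo=repo)

# The main line stays stable...
(repo / "product.txt").write_text("v1\n")
git("add", "product.txt", repo=repo)
git("commit", "-qm", "baseline", repo=repo)
trunk = subprocess.run(["git", "symbolic-ref", "--short", "HEAD"],
                       cwd=repo, check=True, capture_output=True,
                       text=True).stdout.strip()

# ...while each feature team works in its own branch, so a broken
# build only affects that team, not everyone else.
git("checkout", "-qb", "feature/widgets", repo=repo)
(repo / "widgets.txt").write_text("widgets\n")
git("add", "widgets.txt", repo=repo)
git("commit", "-qm", "widgets work in progress", repo=repo)

# Only once the feature meets its quality bar does it merge back.
git("checkout", "-q", trunk, repo=repo)
git("merge", "-q", "--no-ff", "-m", "integrate widgets",
    "feature/widgets", repo=repo)
print((repo / "widgets.txt").exists())  # True: feature now on trunk
```

The cost, of course, is the merge step at the end – which is exactly the complexity the branch structure in Brian’s post is designed to keep under control.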
Brian Sherwin has posted an insightful piece about VSTS not being able to help you unless you change your habits. In many respects Team System is a bit like SAP (except heaps cheaper) in that it offers an amazing opportunity to customise, and that’s what a lot of early adopters are going to do.
Once again, like SAP users, VSTS users are going to hit a stumbling block, but in this case I don’t think it’s going to be upgrade blockers; instead I think it’s going to be the fact that they have tried to graft their old processes onto a product which may have served them better in its default configuration.
One of the things that I have tried explaining to people is that VSTS provides an opportunity to reduce the dead tree usage in your organisation and store design and project management information as first class (and strongly typed) entities in the various TFS data stores.
If you insist on using the tools the way that they are supposed to be used AND continue to produce your outdated design artifacts, then you are just wasting your time.