When to re-invent the wheel.

Software development is full of decisions. Some classic ones:

  • Which logging framework should we use?
  • Which IoC container should we use?
  • Which packaging tool should we use?
  • Which database deployment tool should we use?
  • Which navigation framework should we use?

For all of these there are multiple choices we could make, but they tend to fall into a few categories:

  • The default choice.
  • The popular choice.
  • The weird choice.

Notice that there is no right or wrong choice in that list; there are just choices. By definition the default and weird choices aren’t as popular as the popular choice, although they probably both have some level of support from someone on the team.

The challenge is: which choice should you make? Well, you really need to evaluate it against a set of criteria:

  • Is the choice going to work for us now?
  • Is the choice going to work for us later?

But I want to step back a little bit and look at the Default Choice again. For example, in the case of database deployments, the default choice for me on the .NET platform is to try to use vsdbcmd.exe. I get a bit of pushback about that because people don’t trust what it is doing, but what other choices are there?

  • The popular choice: change scripts, manually applied to the database.
  • The weird choice: build our own database deployment tool.

I usually hear the popular choice coming from DBAs, and I usually hear the weird choice coming from the software development team members. But both stem from a lack of trust in what Microsoft has developed with DBPro. I get that, I’ve seen some of the issues myself, but overall it does work.

To me, for a .NET development team targeting a SQL Server database backend, vsdbcmd.exe is actually the default choice and you need a really good reason not to use it. The popular choice is my second favourite because it is often a reality when you are dealing with customer DBAs, but I don’t think the weird choice (the re-invent-the-wheel choice) is the way to go because you aren’t going to be able to build a tool powerful enough in the time you have available.

9 thoughts on “When to re-invent the wheel.”

  1. Andrew Tobin

    The default choice.
    The popular choice.
    The weird choice.
    The one that Paul Stovell has written.😉

  2. David Gardiner

    I guess we went with the ‘weird’ choice as vsdbcmd.exe wasn’t an option for us.

    We used Red Gate’s SQL Compare SDK to create our own change script generator. It worked really well.

    Sounds like between vsdbcmd and the new Data-tier Application features in SQL 2008 R2 deploying/upgrading databases is getting easier.

    -dave

  3. Paul Stovell

The interesting discussion here is how something came to become the default. In the database deployment and upgrade case, change scripts and migration-based approaches have a long history of being used, being reliable, and just working. The “diff and generate” approach is much newer, much less tested and, on the face of it, much more complicated, thus creating more room for failure.

    So how did a new approach embodied in a new tool suddenly become the default choice? It generally comes down to one of:

    1) Someone has a preference for it
    2) Microsoft made it, therefore it must be awesome

    The idea that someone has to justify a deviation from the default, when the default choice isn’t deserving of the title, is where a lot of problems begin.

  4. Mitch Denny Post author

    Hi Paul,

    When you create your change scripts do you use a differencing tool to validate them against the target system and the source system? Or do you hand craft them?

    Tools like RedGate and DBPro have a market precisely because they help deal with the complexities of building change scripts. Even DBAs use them.

    So – why not just get rid of the middleman? If we run into a _specific_ problem then we can work around it, but 90% of the time (or higher) it should just work. And most issues should be caught in testing anyway.

I’ll have to check with VS2010, but I think that it now handles the rename scenario anyway, and even if it doesn’t there are some workarounds involving adding a column and copying the data across, then later dropping the original column, which is what a DBA might do anyway.
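    The add-a-column-and-copy workaround just described can be sketched as two change-script steps. SQLite stands in for SQL Server here purely so the snippet is self-contained, and the table and column names are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, cust_nm TEXT)")
conn.execute("INSERT INTO orders (cust_nm) VALUES ('Acme'), ('Initech')")

# Release N: add the correctly named column and copy the data across.
conn.execute("ALTER TABLE orders ADD COLUMN customer_name TEXT")
conn.execute("UPDATE orders SET customer_name = cust_nm")
conn.commit()

# Release N+1, once nothing reads the old column (T-SQL form):
#   ALTER TABLE orders DROP COLUMN cust_nm
```

    Splitting the rename across two releases means either version of the application can run against the database during the upgrade window.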

  5. Paul Stovell

I think we’re getting into comparing specific approaches rather than the general reasoning that the post started with. There’s no reason you couldn’t still use Red Gate/DB Pro to help you create change scripts.

    The fundamental difference in the approach is that what gets checked into source control is a change script, not a database definition. That means the exact same T-SQL is executed:

    1) Every time a developer gets latest
    2) Every time integration tests run
    3) Every time you deploy to non-prod environments
    4) Every time you deploy to prod environments

The risk I see in the VSDBCMD.exe-in-production approach is that there’s always a possibility the generated diff script will be different in prod than it was in all other environments.

    >> I don’t think the weird choice (the re-invent the wheel choice) is the way to go because you aren’t going to be able to build a tool powerful enough in the time you have available.

    There are dozens of products and open source projects that embody the change script approach – just because you elect not to use VSDBCMD.exe in production doesn’t mean you have to start with Notepad.exe and ILASM🙂
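    A minimal sketch of the approach Paul describes, with the change scripts checked into source control and each one applied exactly once, in order. Python and SQLite stand in for a real runner and SQL Server, and all names here are made up for illustration:

```python
import sqlite3

def apply_migrations(conn, migrations):
    """Apply numbered change scripts exactly once, in order.

    `migrations` is a list of (version, sql) pairs checked into
    source control; applied versions are tracked in the database
    itself, so the same script never runs twice.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_versions (version INTEGER PRIMARY KEY)"
    )
    applied = {row[0] for row in conn.execute("SELECT version FROM schema_versions")}
    for version, sql in sorted(migrations):
        if version in applied:
            continue  # already ran in this environment
        conn.executescript(sql)
        conn.execute("INSERT INTO schema_versions (version) VALUES (?)", (version,))
    conn.commit()

# The exact same scripts run in dev, test, and prod.
migrations = [
    (1, "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);"),
    (2, "ALTER TABLE customers ADD COLUMN email TEXT;"),
]

conn = sqlite3.connect(":memory:")
apply_migrations(conn, migrations)
apply_migrations(conn, migrations)  # second run is a no-op
```

    Because the runner replays whatever is in source control, every environment executes the same T-SQL, which is exactly the property Paul is arguing for over a per-environment generated diff.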

  6. Damian

    I’m with Paul

    We have a problem with vsdbcmd on a very regular basis.

    “Error SQL01268: .Net SqlClient Data Provider: Msg 50000, Level 16, State 127, Line 12 Rows were detected. The schema update is terminating because data loss might occur.”

    That’s in dev, making it entirely useless in production.

If I’m going to have to reverse engineer the diffs to create a change script anyway, why not capture that intent up front?

(As an aside, why do we expect that competent .NET devs should be able to get by not understanding basic SQL DDL?)

A couple of other problems: a project being developed in VS 2010 but deployed into an environment with only 3.5 installed doesn’t work, because, in a neat little blow to the usefulness of this tool, the vs9 version of the tool won’t read the vs10 scripts; no thought was given to backwards compatibility.

    So we’re left either abandoning the tool when we want it the most, or trying to persuade people to install a pre-release .NET 4.0 on their production servers just to allow a DB migration that nobody really trusts.

    I love the concept of datadude, it’s sexy and shiny and makes for a great demo. Yet like so many shiny demoable tools, it will really bite you in that final run when you really just want your tools to work.

    For me it’s back to change scripts, or maybe something like fluent migrations.

  7. Mitch Denny Post author

    *sigh*

    All you ALT.NET communists🙂

    I actually think things like fluent migrator are pretty cool. But I’ve also seen them bite you for the same reason you guys are saying that DBPro might bite you.

    I guess there are two issues in the particular case we are talking about:

    1. We can’t deploy some pre-release tooling, so we need to fall back to another approach.

2. What is a good general approach to things like database migrations?

Now I suggest that DB Pro, being Microsoft’s tool, and given that a lot of .NET solutions use SQL Server, is probably going to be fairly well integrated into the SQL space in the coming years. That is why I put it in the default choice category.

    Change scripts are the popular choice, as is RedGate actually as a tool that DBAs love.

I’d really like to crack the deployment problem across various projects, because this isn’t the only project that has it. In a few months DB Pro will be a viable choice; in the meantime it’ll get you to the point where the popular choice (change scripts) can be used.

    Now – moving forward, I think something like Fluent Migrator has a role to play – but not just in the database space, but more in the general servicing story for applications.

    We know that applications generally need to be able to be:

    – upgraded (and occasionally downgraded)
    – configured
    – installed
    – dependency checked

It’d be great to have a framework that we could use post-install that did all this stuff. We have one, in a way (for installs/upgrades/dependency checking), in System.Configuration.Install, if you have ever spent any time in there (I have spent a lot of time there).
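    A framework like that could start as small as a list of servicing steps whose dependency checks all run before any step executes. This is a hypothetical Python sketch of the shape, not anything System.Configuration.Install actually provides:

```python
class ServicingStep:
    """One unit of post-install work: a dependency check plus an action."""
    def __init__(self, name, check, action):
        self.name, self.check, self.action = name, check, action

def run_servicing(steps):
    # Check every step's dependencies first, so we fail before changing anything.
    missing = [s.name for s in steps if not s.check()]
    if missing:
        raise RuntimeError("dependency check failed for: " + ", ".join(missing))
    for step in steps:
        step.action()
    return [s.name for s in steps]

# Made-up example: install, then configure.
state = {}
steps = [
    ServicingStep("install",   lambda: True, lambda: state.update(installed=True)),
    ServicingStep("configure", lambda: True, lambda: state.update(configured=True)),
]
print(run_servicing(steps))  # ['install', 'configure']
```

    Checking all dependencies up front, before running any action, is what makes upgrades safe to abort: either everything was eligible to run, or nothing was touched.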

  8. silky

The deployment “problem” isn’t so hard to crack; in fact, it’s basically solved.

    The “problem” is actually using the solution. And the problem becomes harder, when lots of databases are involved.

    But there is really only one option: get a diff script somehow (via some custom or existing tool) and execute it (via some custom tool or some existing strategy).

    Pretty trivial.
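    The “get a diff script somehow” step can be sketched naively: compare the column lists of a source and target schema and emit the statements that close the gap. This is a toy stand-in for what tools like SQL Compare or DBPro do far more carefully (it ignores drops, renames, type changes, indexes, constraints, and data-loss checks entirely):

```python
def diff_schemas(source, target):
    """Emit SQL that brings `target` up to `source`.

    Both schemas are dicts of table name -> list of (column, type).
    Only handles added tables and added columns.
    """
    script = []
    for table, columns in source.items():
        if table not in target:
            cols = ", ".join(f"{name} {typ}" for name, typ in columns)
            script.append(f"CREATE TABLE {table} ({cols});")
            continue
        existing = {name for name, _ in target[table]}
        for name, typ in columns:
            if name not in existing:
                script.append(f"ALTER TABLE {table} ADD COLUMN {name} {typ};")
    return script

source = {"customers": [("id", "INT"), ("name", "TEXT"), ("email", "TEXT")]}
target = {"customers": [("id", "INT"), ("name", "TEXT")]}
print(diff_schemas(source, target))
# ['ALTER TABLE customers ADD COLUMN email TEXT;']
```

    The hard part, as silky says, isn’t the diff itself; it’s handling the cases a naive diff gets wrong, and having the discipline to run the result everywhere.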

    The issue is really around having the patience (and time) to write the proper DB scripts (and, obviously, scheduling it at the right time, or however your deployed environment handles updates).

    Nevertheless, it’s “solved”, just not “done”😛

  9. Pingback: Database Migrations vs. Database Differencing « notgartner
