
This page shows the source for this entry, with WebCore formatting language tags and attributes highlighted.

Title

Was software good enough 5 years ago?

Description

<a href="http://www.neilgunton.com/rewrites_harmful/" author="Neil Gunton" title="Rewrites Considered Harmful?: When is 'good enough' enough?">Rewrites Considered Harmful?</a> argues that developers, especially open-source ones, make a mistake when they move to a completely new code-base. You see, the old code-base has a lot of testing time in it and is, if not more stable, at least a known quantity with known bugs. This is a good argument and one that most project managers are aware of. However, stability is relative to the given feature set. A product is well tested with feature set A; within 3 years, the product will need feature set B. What must a project manager ask him/herself?

<ol>
Is it possible (technically) to get feature set B into this code-base?
How much will it cost to develop feature set B with the existing code-base?
How interested are developers in continuing work on this code-base?
How flexible is the code-base (if new, similar feature requests come up, will it be possible to integrate them quickly and cheaply)?
</ol>

If the answers to these questions are too negative, then it may simply be cheaper to start from scratch, knowing full well how much testing time is being thrown away with the old code-base.

His most salient example is Apache 1.x vs. Apache 2, a migration that is taking longer than many expected. I believe that is only because their expectations were too high. Which situation would he rather have? That Apache 2 has a slow adoption rate, but is completely developed and in beta long before it's actually needed? Or that Apache 1.x has long outlived its usefulness, but there is no replacement on the horizon and Apache 2 will be done 'whenever'? I find it nice that Apache 2 is there when the cracks in Apache 1 start to show.

As noted above, his article completely ignores the need for maintainability and flexibility in software. In describing the Apache codebase, he says of threading, <iq>Couldn't we have instead built on the existing code, which by now is extremely robust (if a little messy)...</iq>. I know software and I know threading, and I <i>know</i> that it is very easy to make a design that is completely and utterly thread-incompatible. At that point, a patch would touch so much of the existing code that a rewrite is probably the safer proposition. There are things that are impossible, no matter how many people and how much time you throw at them.

The human element is also missing from his analysis. These open-source projects are done by software professionals --- for free. No one likes to play around with badly-designed APIs and spaghetti code, especially if you're smart enough to create a better design and smart enough to implement it.

<bq>if software works and is used by lots and lots of people quite successfully, then why abandon all the hard work that went into the old codebase, which inevitably includes many, many code fixes, optimizations, and other small bells and whistles that make life better for the user.</bq>

Because it's fun. I know that's not a good answer, but you're <i>never</i> going to find a good programmer who's happy adding new functionality or optimizations to a horrible code-base. It sucks. The testing alone is nightmarish. The <iq>code fixes, optimizations, and other small bells and whistles</iq> are inevitably an unreusable, incoherent mass that just happens, through massive testing, to work.
It's just a matter of time before you come up against a bug in that code-base that you <i>just can't fix</i> (for a perfect example, ask yourself why Microsoft hasn't made a single fix to their renderer in IE for almost 4 years).

He also asks <iq>Are we really doing anything all that different today than what we were doing ten years ago? Or even 20 years ago?</iq>, then mentions that <iq>[He] use[s] Netscape 4.80 on a daily basis - not because [he] love[s] it, but simply because it is fast</iq>. I'm not sure which websites he's visiting, but Netscape is an unpredictable renderer and often simply 'hangs' on pages, which detracts from its speed somewhat. This goes hand in hand with his critique of the advances in specification languages:

<bq>Some of the changes to HTML were done in a way that shouldn't break old browsers, but as I said before, I am increasingly seeing websites that don't render properly in Netscape 4.x - and believe me, when I see them in Mozilla, they are really not doing anything that couldn't be achieved very readily with "old" HTML. So apparently the FONT tag is deprecated - now we have to use style sheets and whatnot to do something that was originally very simple - e.g. making some text red. Why? We sacrifice simplicity in the name of an intellectual goal that promises greater consistency, but at the expense of being able to do simple things quickly.</bq>

This is a classic misunderstanding of standards. The standards are not necessarily there to make web pages prettier today; mostly, they increase maintainability by several orders of magnitude. Most sites don't look very different with proper standards applied, but some do ... and they certainly behave much better when viewed with other browsers (how well does HTML 4.0 with a bazillion nested tables render on a phone?). This site takes advantage of standards to offer users themes for the web site, so each can customize the view. Other sites do amazing things with presentation. That will become more and more prevalent.

<bq>And still, all most people really want to do is browse the web...</bq>

Yes, but someone has to build a web to browse ... and building large sites with HTML only (no stylesheets, no XML, etc.) would involve a lot of repetitive work and would be very error-prone. Not to mention no fun at all. You see, when you're using a free browser to browse free content, you really have no right to bitch that the people providing that content aren't working on stuff you find useful.

I get a distinct sense of 'settling' in Neil's attitude: the classic case of "I've learned enough about computers, now stop advancing so I don't get left behind." He says it's a <iq>subtle</iq> point he's making, that <iq>[o]ften, the rewrite process seems to be driven by a desire to make the product somehow more theoretically consistent and complete</iq>, but that's not it. Software is made by capitalist companies, for the most part, and these companies make new versions because it is profitable. That means users are demanding new features and that those features are more expensive to build into the existing codebase. Either that, or it's flat-out impossible to do at all. Often the increased maintainability and flexibility of better-designed (or better than not-designed-at-all) code far outweighs the cost of retesting.

That's another thing to note about this article: he has no patience. I know we've been trained to have a pathetic attention span, but transition doesn't happen instantaneously ... nor even within 3, 4 or 5 years.
The change is gradual. This web site is developed with PHP 4.x. PHP 5 is now in Beta 3. I won't be porting this app to PHP 5 anytime soon because it would involve too many changes to the code base and would take a lot of testing. That doesn't mean that PHP 5 is useless or unnecessary. I could have built my apps more easily had I had PHP 5 to begin with, but PHP 4 wasn't crap either. PHP 5 simply has useful advances that make development easier. Change happens. Change is good. Change is fun.
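To make that last point concrete, here is a minimal sketch of the kind of thing PHP 5 makes easier (the class and method names are hypothetical, not taken from this site's code). PHP 4 classes rely on 'var' members, a constructor named after the class and error return values; PHP 5 adds visibility keywords, __construct() and exceptions.

<?php

// PHP 4 style: 'var' members are effectively public, the constructor shares
// the class name, and failures are signalled with return values.
class ArticleStore
{
  var $db;

  function ArticleStore($db)
  {
    $this->db = $db;
  }

  function load_article($id)
  {
    if (!$this->db)
      return false;  // every caller has to remember to check this

    return array('id' => $id);  // ... fetch the real article here ...
  }
}

// PHP 5 style: visibility keywords, __construct() and exceptions keep the
// internals private and make the failure path explicit.
class ArticleStore5
{
  private $db;

  public function __construct($db)
  {
    $this->db = $db;
  }

  public function load_article($id)
  {
    if (!$this->db)
      throw new Exception('No database connection.');

    return array('id' => $id);  // ... fetch the real article here ...
  }
}

?>

Nothing in the first class is broken; the second just fails loudly instead of quietly and keeps its internals to itself. That's the sort of advance that makes new code easier to write without making the old version crap.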