Chad Perrin: SOB

17 May 2007

a brief history of Windows on the desktop for people with faulty memories

Filed under: Geek — apotheon @ 01:34

I’ve been an on-again, off-again computer geek throughout my life. Since about 1997, however, I’ve been on-again the whole time. While I missed wide swaths of the history of the advancement of PCs, thanks to the off-again periods, I’ve also seen a lot — particularly since 1998, when I actually reached a level of computer knowledge one might call “professional”. I’m regularly amused by the way everyone around me seems to forget the lessons of the past, and the historical context of our computing environment, even among those who have lived through the same events I have.

In the interests of correcting some common errors of forgetfulness, I think I’ll provide a very brief run-down of some developments in the history of the MS Windows operating system that are, from what I’ve seen, most often forgotten:

  1. MS Windows 3.11 was really the first useful release of MS Windows in the sense of a modern, network-capable OS. It used a WIMP GUI that ran on top of the operating system, which interacted with the standard command line interface of the OS beneath it — DOS. The GUI was reasonably stable, but did have its little glitches. Luckily, such a glitch didn’t tend to kill the entire OS, but instead just killed the GUI and anything that specifically depended on it.
  2. The first MS Windows 95 release was still a 16-bit OS with a GUI overlaid, but in this case most of the useful userland bits of the system wouldn’t run outside the GUI. A failure of the GUI was effectively a failure of the OS as a whole, despite the fact that the GUI wasn’t technically part of the OS (yet). This sort of thing was part of the reason I never really used Win95 — I stuck with 3.11 basically until Win98 was available. I wasn’t the only one, either.
  3. The second Win95 release was finally a 32-bit OS, and it actually integrated the GUI with the rest of the OS rather than simply running it as a separate piece of software over the top of the OS. Considering the way the first Win95 release was pretty much unusable without the GUI, though, this wasn’t a huge problem. Overall, Win95’s second release was an improvement over the first release, but mostly only because of the ability to support larger filesystems.
  4. MS Windows NT 4.0 was a huge improvement over Win95 of any vintage. It was more stable, more capable, and better designed. Though NTFS was by no stretch up to the standards of certain non-Microsoft filesystems, it was a tremendous technical advance over Win95. There was actually some kind of rudimentary attention to security in the design of NT4, as well, which was certainly different from Win95.
  5. MS Windows 98 was a flash-and-glitter update more than anything else. It provided better graphical capabilities than either Win95 or NT4. Unfortunately, it was also less stable than NT4 by a long shot, provided no kind of security at all, and had passed the point of no return on crufty feature-creep. NT4 had been pushing the feature-creep problem, but hadn’t quite gotten past the point of no return yet, in my estimation. Win98 blew it out of the water on that score, including so much bloat in the core system that it was no longer a truly maintainable piece of software. It was no longer small enough or simple enough to refactor when and how needed. There was one substantial benefit to Win98 outside of the flash and glitter, however: hardware support. While its “plug and play” capabilities were flaky and prone to failure at times, they still provided a tremendous advance over the difficulty of support for new hardware from earlier versions. This was really the point of no return for the “Windows is easier!” argument, too, as people started expecting “plug and play” capabilities from everything. The fact the rest of the industry caught up almost immediately, and in some cases provided even better functionality in this area, did nothing to dispel the mystique of the first across the line.
  6. MS Windows NT 4.0 Service Packs increased the capabilities of NT4, providing it with much of what it lacked in comparison with Win98. Unfortunately, as many of us discovered, to achieve that end it provided not only greater graphical capabilities but also the greater instability of the Win98 OS. Suddenly, NT4 was no longer the rock-solid stability poster child of the Microsoft lineup of OSes. Some wondered: Is this the beginning of the end for Microsoft as a business-worthy OS vendor? That was an interesting bit of timing, considering that it wasn’t until the latter days of Microsoft Windows NT 3.5 that anyone really considered Microsoft capable of providing real business-class computing in the first place. That would have made Microsoft a bit of a one-hit wonder.
  7. MS Windows 2000 was, luckily for Microsoft, something of a reprieve on the business-class computing front. Initially it was a disaster, with pathetically poor hardware support, and it was marketed to both business and home users as an attempt to unify the two separate lines of Microsoft desktop OSes, with incredibly poor results for a number of reasons. That problem was soon defused, however, by a combination of a distraction (MS Windows ME) and a distinct improvement in hardware support. Once people were able to actually get it working on their hardware, and once the security-averse general public was steered away from Win2k, those who were using it discovered that it was a return to the halcyon days of stability and security of NT4 before all the service packs gummed up the works. Better yet, it was a return to those standards of security and stability without giving up high-end (for MS Windows) graphics support.
  8. MS Windows Millennium Edition was an attempt to drag the home user market into the realm of new OS license purchases. Originally, it was hoped that the improved security and stability of Win2k over Win98 would suffice to do this, but Microsoft had indoctrinated its primary home user customer base so successfully with its “security doesn’t matter” approach to software development that they never quite grasped the advantages of Win2k, and the clumsy way it had been released with poor hardware support and uncertain, largely invisible (to the end user) benefits ensured its failure on that front. WinME was a desperation play, getting something out the door as quickly as possible that provided some kind of notable “upgrade” over Win98 without the coma-inducing focus on those pesky issues of stability and security that had preoccupied Win2k’s development. Unfortunately, after initial success with its newfangled home user OS, Microsoft’s WinME quickly became the red-headed stepchild of the MS Windows family — it lacked security and stability so badly that it proved to be the wake-up call MS Windows home users needed before something like Win2k could become popular. Because Win2k was an “older” OS (despite being based on newer technologies than WinME), though, the home desktop market wouldn’t take advantage of Win2k’s stability and security. Something new was needed — and even Microsoft wanted to pretend WinME didn’t exist at this point.
  9. MS Windows XP was proof that Microsoft hadn’t given up on unifying its business and home desktop lines. In fact, WinXP was nothing more than Win2k with a Fisher-Price widget set and other “ease of use” interface features slapped onto it. The point, of course, was that Microsoft had learned from the failure of Win2k, and was making up for it. This was, in short, the Win2k it had meant to release in the first place — though Microsoft didn’t realize at first that this was what it really wanted. It had extensive hardware support (because Win2k did, after all that improvement), a friendlier interface (because that’s all Microsoft really did to change Win2k in the interim), and a less business-oriented image (because it wasn’t available as a dedicated server OS and had a less austere, functional appearance).
  10. MS Windows XP Service Packs have changed the face of WinXP drastically. In particular, SP2 replaced a lot of WinXP’s guts. Ultimately, it was all accomplished for no better reason than to make WinXP more marketable. Security features were added, but better security architecture was not — the deep changes to the OS itself were really only for the purpose of introducing greater integration between the OS and delivered feature sets. This provided all the problems of a rearchitecting of the OS (a bunch of software and hardware stopped working with it, for example), but not much in the way of the expected benefits.

Because this is about the history of MS Windows, and not its present, there’s no need to discuss MS Windows Vista. People look at Vista now and know exactly what they’re looking at. They look backward, and they don’t see the way things were. I recently read a discussion forum post in which someone was disappointed that “Device driver support has also kept Windows popular, and Vista goofed here as well,” as though extensive hardware support existed before Win98, and as though Win2k didn’t have any driver problems when it was first released.

Keep some of these facts in mind the next time you want to reminisce about the past successes and failures of MS Windows as compared with Vista, or whatever comes next as Microsoft’s newest desktop OS. The past may not be exactly as you remember it.

a sickness in the software industry

Filed under: Geek — apotheon @ 12:07

In a TechRepublic article titled Server ‘roles’ to save companies headaches, not cash, I read that Microsoft will be offering slimmed-down installs of Windows Server 2008. I also read that these slimmed-down versions of the OS will not get you any discounts on the price you pay for server licenses.

I’m reminded of something similar in the food service industry — specifically, Ruby Tuesday. There’s an article in Time Magazine about Big Chain Restaurants’ New Small Portions that discusses how TGI Friday’s and the Cheesecake Factory are offering smaller portion sizes for entrees, with an attendant reduction in price as compared with the full-size portions they offer. In 2004, Ruby Tuesday tried doing that as well, complete with nutritional information printed in the menu — but, like Microsoft, it didn’t cut back on the prices any when it cut back on your return on investment. You would have paid the exact same price for the privilege of being given less of the same food, all in the name of “health”.

Unsurprisingly, Ruby Tuesday’s little plan backfired, and everyone hated the idea. Ruby Tuesday no longer offers those reduced portions with nutritional information in the menu. This is a sign of a fairly healthy industry, even if it’s not exactly a health industry — when someone tries to squeeze extra profit out of you without providing at least the same value for your dollars spent, you’re likely to say “No thanks, I’m going somewhere else.” The software industry, unfortunately, is not so healthy — Microsoft sells you less for the same price, and you think it’s normal, or even some kind of improved value.

From the article at TechRepublic:

Laing said such pricing would also further confuse Microsoft’s product lineup, which already has different pricing options based on the scale of the server. With Windows Server 2003, there are standard, enterprise and data center versions of the main OS, as well as a separate storage server, compute cluster edition and small-business version.

“It’s very hard for everyone to manage if you go by scale and then the specific role they are going to run,” Laing said. “It would be very hard to do.”

I think it’s a sign of how sick Microsoft has made the software industry that statements like this can be made with no irony or humor, in all seriousness, and people just accept it as a reasonable statement. Think about it for a moment:

Microsoft already charges you for your server OS software based on how much use the system will get. If the system gets more use, it needs to be licensed by a more expensive agreement. In other words, the more use you get out of it, the more it costs you.

On the other hand, you do not have to pay specifically for its capabilities. If it is of more limited capabilities, you still pay the same price. This is, according to Laing, because it would be too “confusing” to combine pricing differences for different ranges of capability with pricing differences for how much you use it.

When you buy a chainsaw, you pay for it based on its capabilities — not based on how many trees you’ll cut down. When you buy a car, you pay for it based on its performance, efficiency, and safety characteristics — not how many miles you’re going to drive it. When you buy a laptop, you pay for it based on its performance and resource specs, peripherals, and physical characteristics (keyboard, screen size, weight, et cetera), not based on whether you use your laptop all day for work, how many hours of World of Warcraft you’re going to play on it per week, or whether you’re planning to loan it to family members now and then. When you buy a meal at TGI Friday’s, you expect to pay a lower price for smaller portions — you’re paying for the quantity of food you get, all else being equal, not for how much of it you actually eat.

Software, however, is different — and Microsoft made it that way. Yeah, the software market is pretty damned sick, and Microsoft is spreading the plague.

All original content Copyright Chad Perrin: Distributed under the terms of the Open Works License