Attitudes on Bloat
One comment complained about the bloat:
1. I hate that VS.net (as wonderful as it is), is so frickin bloated.
To which Robert replied:
1) I’m learning C# and using Notepad. Is that bloated? And who cares about bloat when a gig of RAM is less than $100?
I’m not attacking Scoble, as I do respect him even if we live on different ends of the Microsoft Love spectrum, and I do admire that he’s using Notepad. However, if this is the attitude about using RAM around the Redmond campus, we are in trouble. Actually, it’s not even Microsoft; it’s the vibe I get from a lot of programmers and companies. I will now proceed to ramble about the old days and how things were so much better then, and how you kids are spoiled with all your fancy gizmos… why, back in my day…
Back in the day there was hardly any memory to be had in computers, so programmers used wonderful tricks to get around it. Sure, it didn’t always work out in the long run (for example, storing the year as only two digits), but for the most part it created smart people and brilliant hacks. Now RAM, and hard drive space, is stupidly cheap. There’s no reason a new machine shouldn’t come with at least 1GB of RAM and at least a 120GB hard drive. Processors run in the 2GHz range and can move data around in that memory and drive space pretty fast.
There are gobs (well, at least three or four) of different widget sets and programming tools that give you the building blocks to write rich and interesting applications without having to worry about the nitty-gritty. Hell, even in 1994–96ish (ah, the college days), using Borland Delphi 1.0 I could create an application without even typing (well, maybe not a full one, but there was a lot of clicking instead of typing).
Compare this to the old days, when you could fit a full pinball game in the space of an audio tape and load it into 4K of memory! Programmers did things in assembler because they needed to squeeze every ounce of performance they could out of the hardware.
Which is better, though? I know Yoda said, “The dark side is quicker, easier, more seductive.” Clicking the mouse is a heck of a lot easier than actually learning assembler and taking the time to move memory around yourself.
The attitude of “hey, there’s lots of memory, we don’t have to optimize or worry about bloat” is worrisome, though. It leads to more and more memory and hard drive space being used. It leads to other hardware having to speed up to make it less painful (i.e., CD-ROM/DVD drives have to get faster to move the program from the install medium to the hard drive… or the bandwidth has to increase). On the surface it might seem like software simply following the hardware’s capabilities, but to me it seems that software is getting slower instead of faster. Each successive version of Windows has needed more and more hardware behind it. Yes, each version does do more. But as a consumer, just once I’d like to install $CURRENTVERSION+1 and say “wow, that’s nice and fast,” instead of “shit, guess it’s time to upgrade again.”
If each version does more, and with each version the programming is getting better, faster, and more efficient, shouldn’t these two factors cancel each other out and let me run $CURRENTVERSION+1 on $CURRENTHARDWARE? Windows 98 flies on a K7-900 with 512 megs of RAM. Windows XP (and most apps) is acceptable on the same hardware. Will the hyped-up Longhorn run on it as well? I’m willing to guess it won’t run better; in fact, I’ll guess that the hardware requirements for the new WinFS filesystem (based on the much-bloated MS SQL Server, IIRC) will be high enough that I’d need to upgrade.
I know I’m not saying it eloquently or rationally, but I know that developers’ reliance on cheap hardware and cheap memory always being there is wrong. Not optimizing where you should because hey, the user will have lots of memory, is wrong. Writing inefficient code because no one will notice, because hey, everyone’s running at 1GHz+ these days, is bad.
Shit I do ramble on don’t I?