The now-famous Law of Computer Bloat, as shown graphically in this Gizmodo article, doesn't quite seem to cut it any more.
It portrays the difference between potential performance & actual performance as a constant proportion, but the gap seems to be widening.
Take, for example, the specifications & performance of the three main PCs I've owned:
1. Windows 95 on a 233MHz processor with 128MB RAM, acquired 1997:
Took up to five seconds to launch an app, but was completely stable & ran pretty fast for a machine of its era. Booted in about thirty seconds, shut down in five. Never crashed or BSoD'd until the BIOS chip on its cheap motherboard wore out.
2. Windows XP on a 2.1GHz processor with 512MB RAM, acquired 2002:
Launched apps almost instantly, though it sometimes froze for a couple of seconds while saving work. That said, it went like a rocket most of the time. Took about twenty seconds to boot & ten to shut down. Never crashed or BSoD'd until I got a virus after accidentally disabling my AV software while updating a third-party application.
3. Windows 7 on a 2.4GHz processor with 1.5GB of RAM, acquired 2010:
Can take five to ten seconds to launch apps, always freezes while saving substantial changes to files, and can lag quite badly when playing media in spite of a reasonable graphics card. Often will not boot at all, requiring the repair CD; when it does decide to boot it can take nearly a minute, and as much as thirty seconds to shut down. Has only BSoD'd once in all the time I've had it (when I attached a second monitor).
Why is it that, as my computers get exponentially more powerful in accordance with Moore's Law, the actual productivity they support seems to be dropping off in a linear fashion that implies my computer will be more-or-less unusable by about 2015?!