For those that don't know, Moore's Law says the number of transistors that can be put on an integrated circuit (and, presumably, computing power) will double at the same cost roughly every 24 months. Considering Mr. Moore predicted this in 1965, I'd say he's been remarkably prescient.
I say this while writing on a computer purchased in early 2002, with only minor RAM and hard disk upgrades since.
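Just to put a rough number on that doubling rate, here's a back-of-envelope sketch. It assumes a strict 24-month doubling and glosses over the difference between transistor counts and real-world performance; the little Python helper and the figures in it are mine, purely for illustration, not anything official.

# Back-of-envelope sketch of Moore's Law as "doubling every 24 months".
# Illustrative only; moores_law_factor is a made-up helper, not a real API.

def moores_law_factor(years, doubling_period_years=2.0):
    """Expected improvement factor after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

# 1965 prediction to 2007: 42 years, 21 doublings -- about a 2-million-fold gain.
print(round(moores_law_factor(2007 - 1965)))   # 2097152

# Early 2002 to mid-2007: roughly 5.5 years -- only about a 6-7x gap.
print(round(moores_law_factor(5.5), 1))        # ~6.7

In other words, the curve says the state of the art is nominally six or seven times faster than this box, and I still don't feel the gap.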
I bought my machine, a Sony VAIO, sometime in January 2002. It came with a 1.8 GHz P4 CPU, 128 MB of RAM, and Windows ME. The video card had 64 MB of RAM, and the hard disk held 80 GB.
WinME was killed immediately in favor of XP. The hard disk developed a weird hiccup in early 2004, so I substituted a 160 GB WD drive for the main drive. Shortly thereafter I upgraded the RAM (RAMBUS, and expensive) to 512 MB. In early 2005 I bought and installed a new video card with 256 MB of video RAM. Total cost to upgrade: about $500, mainly due to the expensive, and weird, RAM.
So now I sit with a fully capable machine and no real inclination to replace it. And I see nothing, except for some nifty games, to compel me to upgrade in the next 12-24 months. When I finally do get a new machine, this baby will probably be 7+ years old.
WTF? Is the technology now so mature that we can take our time to upgrade, secure in the knowledge that we won't be obsolete anytime soon? Seems that way. So, what does all the new 64-bit and dual core technology buy us? I'm not sure. I'm running Vista and VS 2005 without complaint.
This must be freaking out the marketing boys at Intel and AMD.
John's Law used to be: replace your machine every 18 months or you'll fall so far behind the tech curve you're in trouble. It's hard to see an argument that it still applies.
Of course, my VFP friends know that VFP 9 happily percolates on any machine running Win2K (actually Win95, if you overlook some UI issues), which translates to a base of a 60 MHz Pentium and about 96 MB of RAM.
Amazing.