Wednesday, April 16, 2008

Office 2007 = 12 times more memory than Office 2000?

Fat, Fatter, Fattest: Microsoft's Kings of Bloat
posted by Thom Holwerda on Tue 15th Apr 2008 20:12 UTC, submitted by Craig Barth
"What Intel giveth, Microsoft taketh away. Such has been the conventional wisdom surrounding the Windows/Intel (aka Wintel) duopoly since the early days of Windows 95. In practical terms, it means that performance advancements on the hardware side are quickly consumed by the ever-increasing complexity of the Windows/Office code base. Case in point: Microsoft Office 2007, which, when deployed on Windows Vista, consumes more than 12 times as much memory and nearly three times as much processing power as the version that graced PCs just seven short years ago, Office 2000. Despite years of real-world experience with both sides of the duopoly, few organizations have taken the time to directly quantify what my colleagues and I at Intel used to call The Great Moore's Law Compensator (TGMLC). In fact, the hard numbers above represent what is perhaps the first-ever attempt to accurately measure the evolution of the Windows/Office platform in terms of real-world hardware system requirements and resource consumption. In this article I hope to further quantify the impact of TGMLC and to track its effects across four distinct generations of Microsoft's desktop computing software stack."

See the full InfoWorld article here.

One interesting comment:

You think it's just Office? It's ALL SOFTWARE. The dirty secret that nobody wants to talk about is that it's the OOP software paradigm. It's been a dismal failure, and your benchmarks prove it. OOP was supposed to make software development faster and easier. In fact it simply made software bigger and harder to work on. Easily 80% of all clock cycles are consumed diving through ever-deeper hierarchy trees full of do-nothing, duplicated getters and setters at every level. Moore's Law is the only reason anything still works. Imagine where we'd be if we didn't spend all those cycles trying to vindicate a failed programming paradigm. Back in the '80s people thought true artificial intelligence would be possible when computers achieved 2 GB of memory. Today my desktop has 4 GB, plus four 3.2-gigahertz processors and more disk space than my employer owned in the entire company in the '80s. But its performance barely matches the TRS-80 I learned BASIC on. It makes me wish I had done something useful with my life -- like Walmart greeter.
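To make the commenter's complaint concrete, here is a deliberately contrived Java sketch of the kind of delegation chain being described: the same value read directly versus through several layers of do-nothing getters. The class names, layer count, and timing loop are purely illustrative (nothing here comes from the article's measurements), and in practice a modern JIT will inline most of these trivial accessors away, which is itself a caveat to the commenter's 80% figure.

// Contrived illustration of "do-nothing, duplicated getters at every level":
// the same value read directly vs. through a chain of delegating accessors.
class Config {
    private final int value;
    Config(int value) { this.value = value; }
    int getValue() { return value; }
}

class Layer1 {
    private final Config config;
    Layer1(Config config) { this.config = config; }
    int getValue() { return config.getValue(); }   // pure delegation
}

class Layer2 {
    private final Layer1 inner;
    Layer2(Layer1 inner) { this.inner = inner; }
    int getValue() { return inner.getValue(); }    // pure delegation
}

class Layer3 {
    private final Layer2 inner;
    Layer3(Layer2 inner) { this.inner = inner; }
    int getValue() { return inner.getValue(); }    // pure delegation
}

public class AccessorChainDemo {
    public static void main(String[] args) {
        Config direct = new Config(42);
        Layer3 wrapped = new Layer3(new Layer2(new Layer1(direct)));

        long sum = 0;

        long start = System.nanoTime();
        for (int i = 0; i < 100_000_000; i++) {
            sum += wrapped.getValue();              // four method calls per read
        }
        long chained = System.nanoTime() - start;

        start = System.nanoTime();
        for (int i = 0; i < 100_000_000; i++) {
            sum += direct.getValue();               // one method call per read
        }
        long single = System.nanoTime() - start;

        System.out.println("chained: " + chained / 1_000_000 + " ms, "
                + "direct: " + single / 1_000_000 + " ms, "
                + "checksum: " + sum);
    }
}

Don't read too much into the printed numbers: with no warm-up and HotSpot inlining the trivial getters, the two loops may time out nearly identically. The point is only to show the shape of the indirection the commenter is ranting about, not to reproduce his 80% claim.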