• Desktop PC Platform: Killed By Overclocking


    Topic #470961

    Desktop PC Platform: Killed By Overclocking

    Articles – Opinion & Editorials
    Written by Olin Coles
    Monday, 09 August 2010

    In my first entry to this series, Desktop PC Platform: Fears and Predictions, you were introduced to the basic framework of threats surrounding the desktop market segment. That article wasn’t meant to be a self-sufficient story, but to illustrate the chain of events that have combined to create the perfect storm. Desktop PCs are our lifeblood, after all, and you wouldn’t be here unless you held a vested interest in the future of this platform. I’ve already got more content prepared in support of my initial post, but this article will focus on one of the lesser-known threats: overclocking.

    No, it’s not the act of overclocking itself that threatens the survival of desktop computers as a platform; it’s the overclocking market that’s killing the industry. Allow me to illustrate my point with a few passages from our recent Best CPU Cooler Performance series:

    Why do we overclock? It’s really a very simple question, but one that has found new meaning over the years. It used to be that computer hardware enthusiasts had very few options when it came to choosing a processor, and building your own custom system was simply not possible. You looked for the best pre-built system and compared kilobytes of memory between choices. Those days are behind us, and now the computer hardware industry offers hundreds of processor, motherboard, memory, and peripheral hardware options. But the question still remains.

    In this paragraph, I state how overclocking desktop computer hardware was born from need, not packaged as a product. I go on to demonstrate how the industry picked up on this enthusiast hobby:

    It’s been more than a decade, but I still remember why I began overclocking: it was out of necessity, because my computer operated close to a modern-day speed limit. This was back in the day when computers featured a ‘Turbo’ button, and overclocking from 33 to 66 MHz was a click away. It wasn’t until around 1998 that I began visiting ‘enthusiast’ websites and found myself overclocking a pathetic Cyrix M-II 233MHz processor. My pursuit of speed risked an entire Packard Bell computer system for the purpose of finishing reports faster. Back then, overclocking the CPU could push clock speeds past any production level. Today the market is different, and overclocking the processor could result in very little additional performance.

    So overclocking began when enthusiasts simply needed hardware that could drive at the speed limit, not because they wanted to exceed any reasonable need for speed. That’s when the component hardware industry stepped in to make a profit:

    Nowadays I’m fortunate enough to afford top-end hardware, and so I no longer overclock out of need. With so many dual-, quad-, and hexa-core processors sold on the open market, it seems unnecessary to overclock for the sake of productivity. Overclocking has transformed itself from a tool to help people work faster into a hobby for enthusiasts. There’s a level of overclocking for every enthusiast, from simple speed bumps to record-breaking liquid-nitrogen extreme projects. Overclocking is addictive, and before you know it the bug has you looking at hardware that might cost as much as a low-end computer system.

    At its inception, overclocking computer hardware was a tool for making the incapable capable. Professionals, students, enthusiasts, and countless personal users all found that using the computer was more enjoyable when it kept up with the demands placed on it. For the longest time, the industry couldn’t sell a piece of hardware that satisfied the fast-paced tasks a user could throw at it. When that slowly began to change (a subjective point, given individual perceptions of need), the computer component industry created an entire market segment dedicated to hardware enthusiasts and overclockers.

    The age of overclocking hardware was born. Effectively standardized overnight, computer hardware components were separated into various categories of quality: budget, mainstream, professional, and then enthusiast. We’ve witnessed this trend for years now, as graphics solutions, processors, system memory, motherboards, and even power supplies have all been segregated by class. That’s when overclocking stopped being the solution, and became the problem.

    The examples are everywhere: Intel’s $1000+ ‘Extreme Edition’ desktop processors, Gigabyte’s $700 GA-X58A-UD9 motherboard, and $300 system memory kits for overclockers. While there are people willing to buy these items, they often lose sight of the original purpose behind overclocking: making something slow become fast, and getting something more for no added cost. Tacking $2000 onto the price tag of your computer system is hardly keeping in the spirit of overclocking, and is more closely identified with showing off how much money you can spend. The problem only gets worse.

    Back when I was taking my first baby steps into overclocking by risking everything to push a lousy Cyrix M-II 233MHz processor an extra 33MHz, the reward was a 15% bump in speed and a noticeable increase in performance. That was before computer hardware could keep up with user demands. These days, most hardware components are faster than you’ll ever need. Enthusiast-branded products simply mean you’re paying a premium for the privilege to own hardware capable of yielding an overclock… but once you’ve paid their price there’s no guarantee you’ll experience any difference.

    At some point the computer industry went from asking consumers to pay more for the faster products, to paying more for products you might be able to make faster. This runs counter to other industrial markets, which is why manufacturers have spent so much of that added cost on convincing you that the purchase was necessary. Intel’s Core i7-980X 6-Core CPU was advertised as the “Ultimate Gaming Weapon”, but testing proved it did nothing at all for video game performance when paired with a suitable (and much less expensive) video card. The same message is parroted by memory manufacturers, who have notoriously labeled their products as gamer this-or-that. So how long can this business model last?

    Replies
    • #1238691

      I remember overclocking, or should I say full clocking, an early IBM PC-AT. IBM installed 6 MHz (yes, that is only 6) clock crystals to play it safe with the Intel 286 processor. However, the processor was spec’d at 8 MHz. Seven bucks later, a quick pull and plug of a new clock crystal and I had a machine that was 33% faster. Zoom-zoom… never had a problem with it!
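
      A quick sanity check of that figure, assuming performance scales linearly with clock frequency (a simplification, but fair for a fixed 286 design):

      ```python
      # Speed-up from swapping the PC-AT clock crystal, assuming performance
      # scales linearly with clock frequency (an illustrative simplification).
      stock_mhz = 6.0    # the crystal IBM shipped to play it safe
      swapped_mhz = 8.0  # the 286's rated speed

      speedup = (swapped_mhz - stock_mhz) / stock_mhz
      print(f"Speed-up: {speedup:.0%}")  # -> Speed-up: 33%
      ```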

      May the Forces of good computing be with you!

      RG

      PowerShell & VBA Rule!

    • #1238745

      There’s even a decent chance that overclocking will reduce the performance of a system these days, because processors will throttle back quite readily at lower cutoff temps if the extra heat is not efficiently removed.

      I think the model is already quite limited, but it will survive because it’s the nature of a computer to be far faster than needed most of the time… but when it needs to be fast, it’s never fast enough, even with multi-processors smoothing out the workflow.

    • #1238763

      Personally, I think the author is on something as opposed to onto something. LOL

    • #1240184

      I disagree with the whole premise of the story. The industry is simply doing what all businesses do: deliver a product the consumer wants, in this case the fastest hardware possible. That will not hurt the desktop segment at all. If anything, the high price a very small segment is willing to pay for that extra bit of speed will eventually trickle down to the mainstream desktop, faster than in any other industry, I would be willing to guess. Granted, there are some components that probably will not, i.e. Extreme Editions of CPUs. I just don’t see how this hurts the desktop PC.

      Now, if the industry stopped making mainstream/entry desktop PCs to cater solely to the higher-paying enthusiasts, then you might have something. But they still make mainstream and entry PCs, and maybe just a bit cheaper (lower markup), because some enthusiasts are buying the higher-end stuff (higher markup).

    • #1240424

      Let’s face it, there are many, many more mainstream, entry-level or better consumer PCs being sold than those top-of-the-line “extreme” PCs. Manufacturers are not stupid; of course they will cater to the masses. If they stopped making anything, it would be those high-end PCs, although I don’t see this happening anytime soon either. I agree that those high-end PCs eventually do trickle down to the masses, at least parts of them. Manufacturers will always find ways to cut corners to save a buck. Let’s hope this trend continues, because we all benefit from it.

    • #1240455

      The enthusiast market for do-it-yourself PC builders right now is fantastic. Never before has there been so much choice out there for hard-core PC gamers, avid overclockers, and anyone just looking to build his or her own computer at home… and I’m all for choice. And there are probably too few people doing this to have any appreciable effect on the desktop computer market as a whole. I would say the market for laptop computers has had more of a detrimental effect on the desktop market than anything.

      If anything, overclocking has never been anything but good for the entire industry, and any view to the contrary is just plain wrong.

    • #1245553

      As much as I do respect Olin Coles, overclocking is not “killing the industry”.

      On the contrary, the term “overclocking” is so broad that it includes, for example, common features like SpeedStep and Turbo Boost.

      I had to clear the RTC RAM in my main workstation yesterday, and it was a piece o’ cake to enter the BIOS and enable a tried-and-true overclocking profile that takes a Q6600 to 3.0 GHz dynamically in conjunction with SpeedStep (9 x 333 MHz).
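
      As a rough sketch of what that profile does: on an FSB-era CPU the core clock is just multiplier x FSB. The 9 x 333 MHz figures come from the post above; the 6x idle multiplier is an assumed SpeedStep floor, used only for illustration:

      ```python
      # Core clock on a front-side-bus era CPU: multiplier x FSB.
      # 9 x 333 MHz is from the post above; the 6x idle multiplier is an
      # assumed SpeedStep floor, for illustration only.
      FSB_MHZ = 333

      def core_clock_ghz(multiplier, fsb_mhz=FSB_MHZ):
          return multiplier * fsb_mhz / 1000.0

      print(f"Loaded (9 x {FSB_MHZ} MHz): {core_clock_ghz(9):.2f} GHz")  # ~3.00 GHz
      print(f"Idle   (6 x {FSB_MHZ} MHz): {core_clock_ghz(6):.2f} GHz")  # ~2.00 GHz (assumed)
      ```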

      That is a STOCK SETTING on that workstation, not a “bleeding edge” experiment that needs liquid nitrogen to keep it from being toasted well done.

      If anything, the concept of “overclocking” needs to be applied to storage subsystems with as much enthusiasm as we find in the CPU and RAM sectors, e.g.:

      http://benchmarkreviews.com/index.php?option=com_content&task=view&id=11178&Itemid=21

      How long did it take for a hardware RAID controller to use all x16 PCI-Express lanes?

      Also, if you bother to investigate the specs emerging for PCI-Express 3.0 (“Gen3”), we see that each lane will run at 8 GT/s and use 128b/130b “jumbo frames” instead of the older 8b/10b serial encoding:

      http://benchmarkreviews.com/index.php?option=com_content&task=view&id=10291&Itemid=22
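
      A back-of-the-envelope comparison of per-lane throughput, assuming the standard 5 GT/s (Gen2, 8b/10b) and 8 GT/s (Gen3, 128b/130b) transfer rates and ignoring packet and protocol overhead:

      ```python
      # Effective per-lane throughput = raw transfer rate x encoding efficiency.
      # Ignores packet/protocol overhead; rates are the standard Gen2/Gen3 figures.
      def lane_mb_per_s(gtransfers_per_s, payload_bits, encoded_bits):
          bits_per_s = gtransfers_per_s * 1e9 * payload_bits / encoded_bits
          return bits_per_s / 8 / 1e6  # bits -> bytes -> megabytes

      gen2 = lane_mb_per_s(5, 8, 10)     # 8b/10b:    ~500 MB/s per lane
      gen3 = lane_mb_per_s(8, 128, 130)  # 128b/130b: ~985 MB/s per lane
      print(f"PCIe 2.0 lane: ~{gen2:.0f} MB/s; PCIe 3.0 lane: ~{gen3:.0f} MB/s")
      print(f"x16 Gen3 link: ~{16 * gen3 / 1000:.1f} GB/s aggregate")
      ```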

      Western Digital has now moved to its “Advanced Format” for the raw data structure on HDDs.

      So, a decision will soon be upon us: where will the conversion from 128-bit transfers to 4,096-byte sectors occur?

      At a minimum, advanced SATA channels should be transmitting 4K “jumbo frames” across the interface, thus eliminating the hardware and logic required to strip off one start bit and one stop bit on every byte transmitted.
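
      For a sense of what that per-byte framing costs, here is a rough sketch. SATA formally uses 8b/10b encoding, which carries the same two extra bits per byte as the start/stop framing described above:

      ```python
      # Rough cost of per-byte framing on a SATA/6G link: 2 extra bits for every
      # 8 data bits means 20% of the line rate carries no payload.
      line_rate_gbps = 6.0       # SATA/6G raw line rate
      payload_fraction = 8 / 10  # 8 useful bits out of every 10 on the wire

      payload_mb_s = line_rate_gbps * 1e9 * payload_fraction / 8 / 1e6
      print(f"Payload: ~{payload_mb_s:.0f} MB/s, framing overhead: {1 - payload_fraction:.0%}")
      # -> Payload: ~600 MB/s, framing overhead: 20%
      ```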

      This kind of thinking should also have the desirable effect of further improving the speeds of SATA/6G SSDs, assuming they will be available sometime during the next year (or two).

      (I get the feeling that the IT industry is holding back SATA/6G SSDs as a way of maximizing profit during our national economic Depression. I really cannot blame them, given the vast number of crooks on Wall Street; e.g. see Sony’s “Inside Job”, narrated by Matt Damon.)

      Can we imagine storage subsystems with jumper blocks allowing fast interface speeds to be selected at will? How about auto-negotiation of that speed? This is already happening with Ethernet switches and routers.

      From a scientific point of view, we as a community of users and designers will never advance our art if we don’t “push” subsystems to see if, when, and where they will break.

      MRFS
