Technology Collusion?
October 11, 2004 12:59 AM
I've been told the computer hardware industry is based on holding back the release of existing technologies so that people upgrade in small increments. Is this true? If so, wouldn't the first company to break the rule get a chance to dominate the market? Is there some kind of backroom deal going on?
One reason is that if, say, ASUS were to come out with a motherboard that supports PCI Express, but there are no PCI Express cards around, what's the point? Everyone has to wait for at least the major players to be ready to launch around the same time. I think a good example of this is DDR2 memory: it's supposed to be better than DDR, but unless I'm mistaken, there are no common chipset motherboards out there that support it.
posted by riffola at 1:22 AM on October 11, 2004
I think the question may be aimed at the clock speed situation in CPUs.
posted by Keyser Soze at 1:35 AM on October 11, 2004
It's not just clock speed, but also a question of bus width and bandwidth (there's a rough sketch of the bandwidth arithmetic after this comment).
Outside of graphics cards and the gaming market, computer hardware is designed more for the professional and enterprise markets, as those are the customers who'll spend the most on complete systems and support. Newer form factors that allow better heat dissipation and dataflow are due to become commonplace by the end of the decade. The rollout will be slow, so IT departments and small businesses can plan their budgets accordingly while making as graceful a transition as possible from whatever legacy products they own.
One reason consumers have a somewhat longer wait is peripherals. With so many competing standards, it's tricky to tell which products and manufacturers are likely to endure. Joe User and Grandma may not see the need for updating to 64-bit operating systems and software when their current stuff runs just fine. But as such systems and architectures become more commonplace, the pressure to move ahead will build, just as it did when NTFS began pushing older Windows setups out the door.
posted by Smart Dalek at 6:36 AM on October 11, 2004
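To put rough numbers on the bandwidth point above, here's a minimal Python sketch (mine, not from the thread) of peak theoretical memory bandwidth as bus width × clock × transfers per clock. The module figures are standard published specs, used purely as illustrations.

```python
def peak_bandwidth_gb_s(bus_width_bits, bus_clock_mhz, transfers_per_clock):
    """Peak theoretical bandwidth in GB/s: bytes per transfer x clock x transfer rate."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * bus_clock_mhz * 1e6 * transfers_per_clock / 1e9

# DDR-400 (PC3200): 64-bit bus, 200 MHz bus clock, two transfers per clock
print(peak_bandwidth_gb_s(64, 200, 2))   # 3.2 GB/s

# DDR2-667 (PC2-5300): same 64-bit bus, 333 MHz bus clock, two transfers per clock
print(peak_bandwidth_gb_s(64, 333, 2))   # ~5.3 GB/s
```

The point being that a wider or faster bus can raise throughput without any change in CPU clock speed.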
And just as OS X eclipsed 8.6/9.x as well.
posted by Smart Dalek at 6:37 AM on October 11, 2004
I don't think there's coordinated collusion to keep new technology out of the marketplace, for the very reasons you describe, but there are certainly plenty of cases where deliberate control of supply is used to inflate prices. RAM comes to mind. Various manufacturers also cripple products so they can compete at different price points (Intel; ATI with their 9800 PRO/XT cards; Canon with the Digital Rebel (300D)/10D; etc.)
posted by Civil_Disobedient at 7:27 AM on October 11, 2004
I thought demand was slowing for super-fast high-end computers. Ten years ago an "average PC" that could run current games and software cost nearly $2,000, but nowadays you can get by pretty well with a $700 system.
If there's less demand for super-fast stuff at the consumer level I can see manufacturers holding back.
posted by bobo123 at 8:42 AM on October 11, 2004
bobo123 - Maybe that's why monitors are still so expensive. As soon as LCDs hit the market, 20" CRTs dropped to sub-$300 levels. [old man voice] Back in my day, 20" monitors cost a grand, easy. [/old man] Now a crippled LCD of the same size costs as much as the old CRTs used to (the crippling being inherent in the design of LCDs: slow refresh rates, inaccurate color off-axis, no good-looking scaled resolutions; there's a quick sketch of the scaling problem after this comment). But it weighs less, so "Yay LCD."
And while system speed isn't as crucial as it used to be, top video cards still fetch a small fortune. The price of being on the cutting edge of gaming, I suppose.
You could also argue that, perhaps passively, the software and hardware companies are colluding by constantly churning out OSes that are bogged down in the mire of "value added" glut, requiring faster systems and more RAM just to boot.
posted by Civil_Disobedient at 9:47 AM on October 11, 2004
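On the "no good-looking scaled resolutions" point above: an LCD has a fixed native pixel grid, so any non-native resolution must be stretched by a non-integer factor, smearing logical pixels across physical ones. A small sketch (mine, with common resolutions chosen purely as examples) of the per-axis scale factors involved; a CRT, by contrast, has no fixed grid and draws these resolutions natively.

```python
def scale_factors(native, target):
    """Per-axis stretch factors when mapping a target resolution onto a fixed native grid."""
    return (native[0] / target[0], native[1] / target[1])

native = (1280, 1024)  # typical native grid of a 2004-era 17"/19" LCD
for target in [(1024, 768), (800, 600), (640, 480)]:
    sx, sy = scale_factors(native, target)
    print("%dx%d -> %.3f x %.3f" % (target[0], target[1], sx, sy))

# 1024x768 -> 1.250 x 1.333   (non-integer and unequal: blur plus distortion)
# 800x600  -> 1.600 x 1.707
# 640x480  -> 2.000 x 2.133
```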
When Intel developed the "C" processor, they first released it at nearly optimal speed. The 3.2c (3.0?) was very expensive and very fast, but manufacturing costs were high at first. After slow sales, Intel clocked the CPU down and released it at 2.4 and up.
Overclockers were quick to point out that if you bought high-grade RAM, you could hit 3.0 or higher from a 2.4c. Sales skyrocketed. AMD muscled in by selling high-quality CPUs that were simple to overclock. Overclocking has a positive effect on midrange computer equipment as a whole. This didn't really answer your question, but I thought you may like to know either way.
posted by Keyser Soze at 1:08 AM on October 11, 2004
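The arithmetic behind that overclock, as a minimal Python sketch (mine, not from the thread): these chips ran a locked multiplier against a 200 MHz base clock (an 800 MHz effective FSB), so raising the base clock raises the core speed, provided the RAM can keep up. The figures are illustrative, not a tuning guide.

```python
def core_speed_ghz(base_clock_mhz, multiplier):
    """Core clock in GHz = base (FSB) clock x locked multiplier."""
    return base_clock_mhz * multiplier / 1000

# Stock Pentium 4 2.4C: 12x multiplier at a 200 MHz base clock
print(core_speed_ghz(200, 12))  # 2.4 GHz

# Overclocked: push the base clock to 250 MHz (hence the high-grade RAM)
print(core_speed_ghz(250, 12))  # 3.0 GHz
```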
This thread is closed to new comments.