It may not yet be a widely known fact, but computer chips have stopped getting faster. To improve performance, manufacturers instead have to add more cores to each chip. And improving performance is imperative for chipmakers: most of the incentive to upgrade comes from the desire and need for that performance spike.
But what if the average user doesn’t need all that performance? To rephrase: what’s an office manager going to do with 12 cores on her or his desktop? Given that the tasks a typical user now does can snugly fit on a smartphone or netbook, have we hit a wall in desktop chip performance? If there’s no mass consumer demand, chipmakers may at some point forgo the desktop market entirely.
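One way to make this concrete is Amdahl’s law (not mentioned in the post, but the standard back-of-the-envelope tool here): if only a small fraction of a workload can run in parallel, piling on cores barely helps. A minimal sketch, with an assumed parallel fraction for a typical office workload:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Theoretical speedup from Amdahl's law.

    parallel_fraction: share of the work that can run on all cores (0..1)
    cores: number of cores available
    """
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# An embarrassingly parallel task scales with core count...
print(amdahl_speedup(1.0, 12))   # 12.0

# ...but a mostly serial one (say 20% parallelizable -- an
# assumption, not a measured figure) barely notices 12 cores:
print(round(amdahl_speedup(0.2, 12), 2))  # ~1.22
```

So unless everyday software finds substantial parallelism, the twelfth core is close to dead weight for that office manager.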
Grim meathook future for a girl in the high-performance computing business, whose financial feasibility relies on the crumbs from the consumer market. Gamers demand high performance for ever more realistic graphics, which drives the GPU market. If desktop chipmakers are going to stay afloat, there’s got to be a huge market for chips which scream.
So what’s going to drive the CPU market? The answer lies at the heart of computing for most modern casual and corporate users: the internet. The internet needs to get more, well, needy. 3D browsing metaphors may well do it, but the future will hopefully bring further surprises.
Thoughts on “The future of the internet”
I think the internet is needy enough as it is, with Adobe Flash making even reasonably powered computers seem sluggish. And we already have hi-def videos on YouTube that require a decent CPU.
Good points, but I believe even Flash’s bloat and more HD won’t require 12–24 cores.
Gamers are also driving the CPU market to some extent, and there are also tasks that require CPU ‘POWER’, like video editing, 3D modeling, etc. But beyond watching HD videos in a browser, I don’t see any need for multiple cores for the average user.
Yes, the Internet, but not in the sense that you’ve concluded. Behind the scenes, the Internet is just a huge bunch of computers connected to each other, and the computers that serve information to others are power-hungry. Look at the server market and you’ll see that it is somewhat different than it was, say, 10 years ago. Why is it different, or should I say, which component is the most relevant to a discussion about CPU cores? Virtualization technology. I’d say that the server market (in whatever form: supercomputers at universities, server farms at big companies, et cetera) is now driving the CPU market. And now that everything needs to be *in the cloud*, it’s more real than ever.
Actually, the server market seems to be following the consumer market (Google using off-the-shelf commodity hardware rather than specialized stuff, and the supercomputing industry bending GPUs to its needs rather than getting hardware designed for it). But you are right in that the future could instead hold the consumer industry needing to follow in the footsteps of the server industry.