NVIDIA trumpets the death of the CPU

The recent conflict between Intel and Nvidia has moved from the marketplace to the personal arena, with the two competitors trading insults at will. The latest hit comes from one of Nvidia's vice presidents, who claimed in a private e-mail message that the CPU is dead and has long since run out of steam.

The private message was obtained by tech website The Inquirer and contains Roy Tayler's opinions on Intel's central processors. The e-mail is dated April 10, but its final recipient is currently unknown.

"Basically the CPU is dead. Yes, that processor you see advertised everywhere from Intel. It’s run out of steam. The fact is that it no longer makes anything run faster. You don’t need a fast one anymore. This is why AMD is in trouble and it’s why Intel are panicking," Tayler claimed in the message.

"They are panicking so much that they have started attacking us. This is because you do still [need] one chip to get faster and faster – the GPU. That GeForce chip. Yes honestly. No I am not making this up. You are my friends and so I am not selling you. This s*** is just interesting as hell," he continued.

However, Nvidia claims that the above message does not reflect any official stance whatsoever. According to the company’s spokesman Brian Burke, the message is not a public statement and "the views in Roy Tayler's e-mail do not mirror the views of Nvidia."

It might be true that the e-mail message reflects only Tayler's own opinions; yet the company itself stated a while ago that "you need nothing beyond the most basic CPU" in order to get things done. In other words, Nvidia may not think the CPU is dead just yet, but it sees it as one step closer to its grave.

Intel, of course, completely disagrees with Nvidia's allegations. It could hardly be otherwise, given that the company is currently the biggest CPU manufacturer in the world and its CPU business accounts for the lion's share of its revenue.

"We believe that both a great CPU and great graphics are important in a PC. Any PC purchase - including the capability level of components inside it - is a decision that each user must make based on what they will be doing with that PC," said Intel spokesperson Dan Snyder

This was taken from Softpedia.com

Reply #1
There's a shock. Talk about an exercise in pointless rhetoric.

GPU manufacturer rubbishes CPU manufacturer.

CPU manufacturer rubbishes GPU manufacturer.

Ego is the scourge of this planet ....

Regards
Zy
Reply #2
So what? We don't really know if that e-mail is real or fake, and if it is real, there isn't much anyone can do about it. I will keep buying PCs and PC games as long as they are there. I don't own a PS3 or an Xbox; I just keep upgrading my PC until I can't upgrade anymore, then I will build or buy a new one.

I haven't read anything about a PS4 or a new version of the Xbox on the horizon, so they might be as dead as the PC. We will just have to wait and see where this all goes.
Reply #3
Hmmm, a guy at one company says some silly stuff to a guy at another company.

This is important...why?
Reply #4

The day we don't need good CPUs is the day we can all start using the original Pentium. Oh yeah, that's a grand idea. In fact, I'm gonna try that when I build my next computer. YOU HEAR THAT, TAYLOR?! I'LL SEND YOU THE BENCHMARK LOGS!!! You need a good CPU and GPU working in tandem to get a good system off the ground. I like nVidia GPUs. I like Intel CPUs. They are the two biggest sellers of their prime products. Why trash each other? I don't think nVidia SLI technology will ever replace the Intel Core 2, or even the Pentium 4. This article just brings to light the problem with corporate America: the absolute desire to trash your allies to get some more publicity for your products, when you could just as easily set up an economic alliance. Stupid corporate CEOs.

Reply #5
I can foresee a merger of the CPU and GPU (multi-core processors with both GPU-style and CPU-style cores), but the GPU will definitely not totally replace the CPU.

The GPU will help a lot when you have a lot of data to crunch, but relatively few instructions.

If you have very little data to crunch and lots of instructions, however, the GPU will be slower than the CPU. Much, much slower.

For multimedia stuff it's great, but not for something like Microsoft Word.
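To make that contrast concrete, here is a minimal CUDA sketch (purely illustrative; the kernel, names and sizes are invented for the example). The element-wise kernel gives each array element its own GPU thread - the "lots of data, few instructions" case - while the serial recurrence below it has a dependency chain that gains nothing from thousands of threads.

#include <cstdio>
#include <cuda_runtime.h>

// Data-parallel: every element is independent, so each GPU thread
// can handle one element. This is where a GPU shines.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

// Serial: each step needs the previous result, so the work cannot
// be spread across threads. A single fast CPU core wins here.
float serial_recurrence(float x, int steps) {
    for (int i = 0; i < steps; ++i)
        x = 0.5f * x + 1.0f;
    return x;
}

int main() {
    const int n = 1 << 20;                  // one million elements
    float *d;
    cudaMalloc(&d, n * sizeof(float));
    cudaMemset(d, 0, n * sizeof(float));
    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);   // GPU-friendly work
    cudaDeviceSynchronize();
    cudaFree(d);
    printf("%f\n", serial_recurrence(1.0f, 1000)); // CPU-friendly work
    return 0;
}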
Reply #6
I use Maple 11, a symbolic math calculator, Matlab and Mathematica. I can only say that my GPU will not help one bit with all three of these programs; hence, I need a good multi-core CPU.
Reply #7
hee hee... the VP doesn't care what the CPU is so long as the GPU is nVidia and it's fast as hell. He's quite happy to have AMD and Intel beating on each other; they'll put out faster CPUs. Just a nice way for nVidia to get a little press, and for some obscure VP to get a chance to apply to ATI :LOL:

I rather miss the old days when, if you wanted a bit more RAM on your video card, you could add standard memory modules to it. I'd tweak my GeForce 7950 GT 256MB up to 2GB.
Reply #8
For multimedia stuff it's great, but not for something like Microsoft Word.

MS Word?

Please, it is a word processor. It requires very little in either GPU or CPU.

GPUs are needed for intensive gaming or other real-time graphics needs.
CPUs are needed for intensive processing, like rendering with Lightwave or 3DS Max.

They are both needed in their respective fields. And neither can replace the other.
Reply #9
Well, he is technically right. A CPU is pretty weak alone; that's why a GPU is needed, right?
Reply #10
I have an old AMD 4200 CPU, which is way slower than most things sold today.
Yet because I have an Nvidia GF8800, I can run the latest games in full detail and have them run smoothly.
The only reason you need a really hefty CPU is for pushing Excel to its limits (I mean 500k rows doing lots of calculations), running a full 3D package, or something like that.
Most people browse the web and don't play games. You get ripped off by shops telling you you need the latest CPU to do this faster. It makes no difference.

The most important things in a PC are the GPU (for gaming) and the hard disk (the main cause of PC slowdown).
You don't even need fast RAM; just make sure you have enough (1GB at the moment, or 2GB for gaming) - quantity over quality.
Reply #11
Please, it is a word processor. It requires very little in either GPU or CPU.


DirectX 9 has a limit of 65k for the number of instructions on each stream processor. Even word processors blew away that boundary a long time ago.

DirectX 10 removes that limit, but it's still expensive to use the GPU to just do some processing without a lot of data, due to the way stream processors work.

Perhaps a virtual machine, or even just the OS itself, is a better example. The point is that not everything we do uses a lot of data with very few instructions.
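As a rough sketch of that cost (the kernel is a made-up, deliberately trivial example): every GPU invocation carries a fixed launch overhead, so when there is almost no data, that overhead dwarfs the actual computation.

#include <cstdio>
#include <cuda_runtime.h>

// A deliberately trivial kernel: one thread, one integer increment.
__global__ void tiny(int *x) { *x += 1; }

int main() {
    int *d;
    cudaMalloc(&d, sizeof(int));
    cudaMemset(d, 0, sizeof(int));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    // Time 1000 launches: the measured cost is almost entirely the
    // fixed per-launch overhead, not the single add each launch does.
    cudaEventRecord(start);
    for (int i = 0; i < 1000; ++i)
        tiny<<<1, 1>>>(d);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    printf("average launch overhead: %f ms\n", ms / 1000.0f);

    cudaFree(d);
    return 0;
}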

CPUs are needed for intensive processing, like rendering with Lightwave or 3DS Max.


Actually, rendering is very easy to parallelize, and some parts of ray tracing can be done on a GPU. In fact, I wouldn't be surprised if we start seeing some real-time single-bounce ray tracing done in games. We're practically at the point where it's possible with our shaders. I think a lot of today's shaders already borrow from ray tracing techniques.
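The reason it parallelizes so well: every primary ray is independent of every other. A minimal CUDA sketch (the scene, camera and sizes are invented for the example) that traces one ray per pixel against a single sphere:

#include <cstdio>
#include <cuda_runtime.h>

// One thread per pixel: each primary ray is fully independent,
// which is exactly why ray tracing maps so well onto a GPU.
__global__ void primaryRays(unsigned char *img, int w, int h) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    // Ray from the origin through this pixel on a virtual image plane.
    float dx = (x + 0.5f) / w - 0.5f;
    float dy = (y + 0.5f) / h - 0.5f;
    float dz = -1.0f;

    // Hit test against one sphere at (0, 0, -3), radius 1: solve
    // |t*d - c|^2 = r^2 for t and check the discriminant.
    float ocx = 0.0f, ocy = 0.0f, ocz = 3.0f;   // origin minus center
    float a = dx * dx + dy * dy + dz * dz;
    float b = 2.0f * (dx * ocx + dy * ocy + dz * ocz);
    float c = ocx * ocx + ocy * ocy + ocz * ocz - 1.0f;

    img[y * w + x] = (b * b - 4.0f * a * c >= 0.0f) ? 255 : 0;
}

int main() {
    const int w = 512, h = 512;
    unsigned char *d_img;
    cudaMalloc(&d_img, w * h);
    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    primaryRays<<<grid, block>>>(d_img, w, h);

    unsigned char center;
    cudaMemcpy(&center, d_img + (h / 2) * w + w / 2, 1,
               cudaMemcpyDeviceToHost);
    printf("center pixel: %d (255 = sphere hit)\n", center);
    cudaFree(d_img);
    return 0;
}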

They are both needed in their respective fields. And neither can replace the other.


I would agree! That was the point I was trying to make:

"If you have very little data to crunch and lots of instructions, however, the GPU will be slower than the CPU. Much, much slower."

Most people browse the web and don't play games. You get ripped off by shops telling you you need the latest CPU to do this faster. It makes no difference.


I totally agree. Web browsing never takes much power in any way. I find it amusing that it's boasted about so much whenever some new computer is released.

You don't even need fast RAM; just make sure you have enough (1GB at the moment, or 2GB for gaming) - quantity over quality.


Agreed. I find that more than anything else, having lots of RAM makes things a lot faster. I'd double your numbers for Vista, though.
Reply #12
One thing to bear in mind is that all the graphics card reviews now seem to focus on 1920x1200 and top-performance GPUs. Lower (and more relevant for most people) resolutions mean that CPU performance becomes a bigger factor.

Although I think for most gamers any modern dual core would be fine, and they won't need anything more for a while unless they want to play Supreme Commander.
Reply #13
I use Maple 11, a symbolic math calculator, Matlab and Mathematica. I can only say that my GPU will not help one bit with all three of these programs; hence, I need a good multi-core CPU.


WWW Link How about a supercomputer built using graphics cards?
Reply #14
WWW Link How about a supercomputer built using graphics cards?

From the article:
For specific applications that can be massively parallelized GPUs are much faster than CPUs

Keywords are "For specific applications".
Reply #15
Like, saying the CPU is dead has got to be more retarded than Nietzsche.
What does his R&D section have, a biological computer?

If anything, the market might start leaning towards embedded, less general-purpose CPUs over the plug-and-play types, but I seriously doubt that market will disappear, even with an energy crunch. Like playing FPS games is the only use for a CPU? Might as well get a next-gen console instead.
Reply #16
One thing to bear in mind is that all the graphics card reviews now seem to focus on 1920x1200 and top-performance GPUs. Lower (and more relevant for most people) resolutions mean that CPU performance becomes a bigger factor.


Depends on what you're using the video card for.

For videos, this may be true: A lot of video processing involves scaling, conversion, de-interlacing, decompression, decryption if you're using DRM, and a whole bunch of other stuff that varies based upon resolution.

For games, however, this is not true. Games are resolution independent: they're all mathematical representations of polygons until near the end. Rasterization is one of the last steps in the pipeline. By the time you're taking a performance hit based on resolution, everything is already on the GPU, and the vast majority of the difficult calculations have been done.

One thing that people should avoid is mixing up movie and game performance. They have very, very little in common.
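Rough numbers to illustrate the split (the vertex count is a made-up figure): the geometry work per frame is fixed, while the rasterizer's work grows with the pixel count.

#include <cstdio>

int main() {
    // Per-vertex work does not change with the display mode.
    const long vertices = 300000;        // hypothetical scene complexity

    // Per-pixel work scales directly with resolution.
    const long low  = 1024L * 768;       //   786,432 pixels
    const long high = 1920L * 1200;      // 2,304,000 pixels

    printf("vertex work: %ld vertices at any resolution\n", vertices);
    printf("pixel work grows %.2fx going from 1024x768 to 1920x1200\n",
           (double)high / (double)low);  // roughly 2.9x more rasterization
    return 0;
}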
Reply #17
I seem to recall an article in a PC Gamer or the defunct GFW interviewing Gabe Newell (I believe) about the death of the GPU. Something about why have two processors when they can now be combined into one multicore processor.

Why not have a multicore processor where 4 are CPUs and 2 are GPUs? One processor to rule them all.
Reply #18
Look at how big your high end graphics card is and consider just how much would need to be added.
Reply #19
I seem to recall an article in a PC Gamer or the defunct GFW interviewing Gabe Newell (I believe) about the death of the GPU. Something about why have two processors when they can now be combined into one multicore processor.

Why not have a multicore processor where 4 are CPUs and 2 are GPUs? One processor to rule them all.


I would agree - I do think it may be possible with multicore becoming the norm that some cores can be specialized. In fact, I think that's the path AMD, who now owns ATI, intends to take.
Reply #20
Isn't much of what comes on a graphics card there because it has to be duplicated for the GPU? Just place the GPU on the same chip as the CPU and suddenly you can remove most of the hardware on the graphics card.

Look at notebooks. The graphics chip is embedded into the motherboard already, and it doesn't add that much to it. Just slide the graphics processor into the CPU and it probably gets easier to make. System and graphics RAM are already merged together on notebooks. And system RAM is rather cheap and easy to upgrade. I'd add another 2 gigs to my system if it added to my graphics processing.

We just need software that understands which cores to access for what it needs.
Reply #21
Well, there are a few problems with that: graphics embedded in motherboards performs much worse than discrete graphics cards. Similarly, system RAM is much slower than graphics RAM, and the path from CPU to your system memory is still much less effective than that on a graphics card. Graphics relies very heavily on memory bandwidth.
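To put ballpark numbers on the bandwidth gap (the figures below are typical parts of the moment, quoted from memory, so treat them as assumptions): peak bandwidth is just bus width times effective transfer rate.

#include <cstdio>

// Peak bandwidth = (bus width in bits / 8) * effective transfer rate.
int main() {
    // Dual-channel DDR2-800 system memory: 128-bit bus, 800 MT/s.
    double sys = (128.0 / 8.0) * 800e6 / 1e9;    // about 12.8 GB/s

    // GeForce 8800 GTX graphics memory: 384-bit bus, 1800 MT/s GDDR3.
    double gpu = (384.0 / 8.0) * 1800e6 / 1e9;   // about 86.4 GB/s

    printf("system RAM: %.1f GB/s, graphics RAM: %.1f GB/s (%.1fx)\n",
           sys, gpu, gpu / sys);                 // roughly a 6.8x gap
    return 0;
}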

I like the idea of integrating a GPU into a multicore processor, but it's probably going to start out as just improved performance for the baseline. Getting it into enthusiast machines will likely take a bit more work.

Lastly, there's not much point in deliberately making a multicore GPU. GPUs are inherently parallel anyway. SLI and Crossfire and all that exist only as a sensible commercial alternative to making one incredibly powerful card that wouldn't have enough demand to be marketable.

Meanwhile, Cobra, are you sure about games being resolution independent? That would run contrary to just about every graphics card test I've ever seen. It would also run contrary to my results when I load up a game and change the resolution and watch the frames per second. Perhaps we've misunderstood each other somewhere here?
Reply #22
Well, there are a few problems with that: graphics embedded in motherboards performs much worse than discrete graphics cards.


The gap is closing, though. The latest laptops I've seen are starting to come out with pretty impressive graphics. A year ago, I would've completely agreed with you. Today, however, it appears there is a serious push for better graphics on notebooks.

Similarly, system RAM is much slower than graphics RAM, and the path from CPU to your system memory is still much less effective than that on a graphics card. Graphics relies very heavily on memory bandwidth.


Agreed. This is the case even in desktop machines.

Lastly, there's not much point in deliberately making a multicore GPU.


The idea they are discussing is to place the GPU onto a CPU.

Meanwhile, Cobra, are you sure about games being resolution independent?


Yes. Triangles don't get blocky as you scale to higher resolutions. They maintain a sharp edge at each of the three sides, although shading might hide the edge.

It would also run contrary to my results when I load up a game and change the resolution and watch the frames per second.


The frames per second should go down as the rasterizer has to draw more pixels.

My point is not that resolution won't affect your framerate. Indeed, resolution very much affects framerate!

My point is that rasterization happens on the GPU, not the CPU.
Reply #23
I would agree - I do think it may be possible with multicore becoming the norm that some cores can be specialized. In fact, I think that's the path AMD, who now owns ATI, intends to take.


Intends to take? They're going to ship one by the end of the year.

Isn't much of what comes on a graphics card there because it has to be duplicated for the GPU? Just place the GPU on the same chip as the CPU and suddenly you can remove most of the hardware on the graphics card.


No. That's not how it works.

A dedicated graphics card does have its own memory bus for its own memory pool. However, these are not duplicates of what the CPU has. Oh yes, the CPU has a memory bus and a memory pool. But the graphics card's memory architecture is designed for one thing: rendering. Blisteringly fast sequential access to memory. Most graphics cards use specialized memory, specialized memory controllers, and specialized memory caches to achieve monstrous performance.

Calling these duplicates of the CPU's version is simply wrong.

AMD's combined CPU/GPU has strengths and weaknesses. One of the biggest weaknesses with actually doing real work (non-graphics-related) on the GPU is that talking to the GPU is slow: transferring a texture from main memory to GPU memory and back takes a long time.

With the GPU on the CPU's die, and both of them using the same memory pool, transferring data back and forth is very fast. Dynamically generating textures and meshes on the CPU will be much more performance friendly than with a graphics card.

Of course, the biggest downside is that, without dedicated memory and a huge, high-speed memory bus, GPU graphics performance will be significantly lower than for a dedicated graphics card. The GPU-on-CPU die will need some very smart caching.
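A sketch of that transfer cost (the 64 MB "texture" size is an arbitrary example): the round trip over the bus is exactly the expense a shared CPU/GPU memory pool would eliminate.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

int main() {
    const size_t bytes = 64UL << 20;   // a 64 MB "texture", chosen arbitrarily
    float *host = (float *)malloc(bytes);
    float *dev;
    cudaMalloc(&dev, bytes);

    cudaEvent_t t0, t1;
    cudaEventCreate(&t0);
    cudaEventCreate(&t1);

    // Main memory -> GPU memory and back: this bus round trip is the
    // "talking to the GPU" cost; on a shared-memory CPU/GPU die it
    // would largely disappear.
    cudaEventRecord(t0);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
    cudaEventRecord(t1);
    cudaEventSynchronize(t1);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, t0, t1);
    printf("64 MB round trip: %.2f ms\n", ms);

    cudaFree(dev);
    free(host);
    return 0;
}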
Reply #24
Whoa whoa whoa, I see a lot of people saying CPUs will always need to be ridiculously fast, but think about this for a second......

You've seen all those random Linux boxes running on very little hardware but puttering along serving web pages forever, right? That's not very CPU-intensive; not many calculations needed, just simple file transfer really.....

WWW Link

"I expect the relationship between CPU and GPU to largely be a symbiotic one: they're good at different things. But I also expect quite a few computing problems to make the jump from CPU to GPU in the next 5 years. The potential order-of-magnitude performance improvements are just too large to ignore."

He talks about some Folding software that runs on both a GPU and a CPU, and the GPU completely smokes the CPU. Certain math runs much, much faster on GPUs than on CPUs.
Reply #25
The architectures behind a CPU and a GPU are radically different. I mean, you're talking about some pretty intense micro-circuitry design differences: GPUs are built to handle polygons, shaders, and textures, and CPUs to handle pure mathematical computation.

The guy from Nvidia is essentially right: there's really not a lot of growth left for CPUs, whereas GPUs have a lot of room to grow. Plus, if you add a dedicated physics processor, that's probably going to work a lot better the closer it is to the GPU.

I think what's going on is Intel screaming at Nvidia that they're about to seriously enter the graphics market, and Nvidia pretty much answering back: go ahead, we have years of R&D on you, and just like you're trying to incorporate the GPU and CPU, we can do the same thing on our end. Nvidia has the upper hand in terms of the technology, and Intel in terms of the market. It should be interesting to see how it plays out over the next 5-10 years.

I think whats going on is Intel screaming out to NVIDIA that there about to seriously enter the graphics market and NVIDIA pretty much answering back go ahead we have years of RnD on you, oh and just like your trying to incorporate the GPU and CPU we can do the same thing on our end. NVIDIA has the upper hand in terms of the technology and Intel in terms of the market should be interesting to see how it plays out over the next 5-10 years.