GPU/CPU Computing

What led to the transition from CPU-centric computing to heavily GPU-centric computing starting in the 2010s? Is there any necessity for CPU computing aside from the sheer standardized nature of CPU architectures? The advent of GPGPU computing might have one believe that the usefulness of CPUs is limited to very few tasks - and even that only because of programmers' inability to fully utilize GPU architectures. Am I missing something? What even is the point of a CPU anymore?

Attached: 1028GQ-TRT.png (2048x1271, 3.69M)

Computing is still CPU-centric.

The bitcoin fever surely had something to do with this trend. That and vidya, ofc.

Undoubtedly - that, and server-side rendering forwarded to what are effectively graphical dumb terminals. An interesting paradigm, but why even have a CPU? Seems like it's soon going to become nothing more than a fallback for when the GPU fails, particularly now that Parallel Studio exists and what have you.

> What led to the transition from CPU-centric computing to very heavily GPU-centric starting in the 2010s?
Floating point requirements, primarily.

> Is there a necessity for CPU computing whatsoever aside from the sheer standardized nature of CPU architectures?
Sure. GPUs are still ASICs, whereas CPUs are general-purpose.

> What even is the point of a CPU anymore?
Something still has to feed the GPUs. And there's I/O (memory, storage, network), as well as all the general-purpose stuff a computer requires.

Mining rigs are a good example of this, though: 6+ GPUs in a system with at most an i3 and 8 GB of RAM. The CPU doesn't do much, but it still runs the OS and handles I/O.
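To make "feeding the GPU" concrete, here's roughly what a minimal CUDA program looks like. Untested sketch, saxpy is just a stand-in kernel - the point is how much of it is plain CPU housekeeping.

#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

// Trivial kernel: each GPU thread handles one array element.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // All of this runs on the CPU: allocate, initialize, copy over.
    float *hx = (float *)malloc(bytes), *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Only this line actually runs on the GPU.
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);

    // And the CPU picks the result back up.
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]); // expect 5.0

    cudaFree(dx); cudaFree(dy); free(hx); free(hy);
    return 0;
}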

>All CPU computing
Running
>CPU + GPU
Driving a car
>GPU only
Launching a car at full speed towards the destination without being in the car

t. never written an OpenCL program

Some problems simply benefit from having a fuckload of low power, highly parallelised cores with a low amount of memory thrown at them.
GPUs happen to have a fuckload of low power, highly parallelised cores with a low amount of memory.
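What that looks like in code: a CUDA kernel is one tiny program stamped out across thousands of threads, each touching a small slice of the data. A sketch of the standard grid-stride loop pattern (names are mine, untested):

// Every resident thread grabs elements in a strided pattern, so the
// whole grid sweeps the array together. Thousands of these run at once,
// each with almost no private state.
__global__ void scale(float *data, int n, float k) {
    for (int i = blockIdx.x * blockDim.x + threadIdx.x;
         i < n;
         i += gridDim.x * blockDim.x) {
        data[i] *= k;
    }
}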

if you want to turn a plank into sawdust you do it at home with a blender in 3 minutes
if you want to turn a truckload of wood into sawdust you take an hour to drive to the lumbermill, get it done in 30 minutes, then drive back

you wouldn't drive there for one small plank
you wouldn't try to process a truckload of wood with a blender
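To put rough, purely illustrative numbers on the drive: PCIe 3.0 x16 moves on the order of 16 GB/s, so shipping 1 GB to the card and back eats something like 125 ms before the GPU has done any work, and a kernel launch alone costs a few microseconds. Save seconds of CPU time on a truckload of data and the trip pays for itself; for one small plank's worth, the trip costs more than just grinding it locally.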

GPU computing is very limited in use. CUDA is the only truly viable platform; OpenCL is a toy and a poor knockoff.

Because GPUs are dumb as bricks and you need a CPU to feed them data to chew.

Nice

You also wouldn't ask the lumber mill to make a smoothie for you; they're not set up to do that efficiently. :)

Because GPUs are fucking shit at branching and random memory access.

GPU cores are like trains.

One train can carry 64 wagons. All those 64 wagons are taking the same route.

CPU cores are like trucks.
If you have 64 trucks, each truck can take a different route.

>inb4 he counts ALUs as cores
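To make the trains concrete: on NVIDIA hardware a warp of 32 threads (64 in older AMD wavefronts) shares one instruction stream, so a data-dependent branch means the "train" drives both routes, one after the other, with the non-participating threads masked off. A hedged, untested sketch:

__global__ void divergent(const int *flags, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    // Per-thread data decides the branch, so a warp with mixed flags
    // executes BOTH paths serially, half its threads idle each time.
    if (flags[i] % 2 == 0)
        out[i] = sinf((float)i);
    else
        out[i] = cosf((float)i);
}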

>a car analogy

Attached: 1499908712618.jpg (228x216, 10K)

GPUs are suited to applications that can be expressed as floating-point arithmetic and matrix operations, because that is what they are designed to do.

Most things cannot be expressed that way easily, if at all.
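The canonical example of what does generalize: a naive matrix multiply maps one GPU thread to one output element. Untested sketch, without the shared-memory tiling real code would add:

// C = A * B for n x n row-major matrices, one thread per element of C.
__global__ void matmul(const float *A, const float *B, float *C, int n) {
    int row = blockIdx.y * blockDim.y + threadIdx.y;
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    if (row < n && col < n) {
        float acc = 0.0f;
        for (int k = 0; k < n; k++)
            acc += A[row * n + k] * B[k * n + col];
        C[row * n + col] = acc;
    }
}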

> hah! car analogy! //doesn't provide better analogy

B-but SIMD is love ... SIMD is life

in his defense it was a pretty top notch automotive analogy

Parallel workloads, primarily floating-point workloads. 4,000 RISC-V cores can chew through a 3D render or a deep learning workload orders of magnitude faster than a handful of x86 cores.

Look, a road with curves!
> CPU - walking on road
> CPU+GPU - drifting around curves
> GPU - doing donuts until out of gas

You're retarded, congratulations.

Many modern CPUs include SIMD units as well (SSE/AVX on x86, NEON on ARM).
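For comparison, this is what CPU-side SIMD looks like with AVX intrinsics: one instruction operating on 8 floats at a time, still inside a normal serial core. Host-side sketch, assumes an AVX-capable x86 chip and n divisible by 8:

#include <immintrin.h>

// Add two float arrays 8 lanes at a time. Real code would handle the
// leftover n % 8 elements with a scalar tail loop.
void add_avx(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        _mm256_storeu_ps(out + i, _mm256_add_ps(va, vb));
    }
}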

A GPU is like a class of special ed students. They can pump out hand turkeys like no tomorrow.

A CPU is like their wrangler. She can perform complex tasks, wrangle the GPU cores, and generally not be retarded.

Attached: 57309257.jpg (509x435, 37K)

>literally zero penetration in academia, R&D, cutting edge fields like ML
>only used as a back-up option in a handful of rarely used features in a handful of pop art programs

Very nice. OpenCL is so great, it's really useful and used everywhere. Well, except it's not.