This system is claimed to be capable of two petaflops.

So what will it actually be used for? Will it be creating Skynet?

Attached: dgx-2.png (1320x693, 488K)

>So what will it actually be used for
BIDGOINS :DDDDD

my guess would be cloud computing, hosting a shit ton of VMs so you only have to buy shitty laptops for your employees to connect with

playing minecraft

a computer to hold your consciousness

But can it run Crysis?

used to simulate ur mum ebin

>NVIDIOTS
>capable of anything
lol, amd is where my money at

That's a big GPU....

Attached: snob.gif (1200x1638, 791K)

The next Disney shitfest

for u

Attached: 1521220050335.gif (340x325, 45K)

sadly

it has 1.9 tensor petaflops
FP16: 480 TFLOPS
FP32: 240 TFLOPS
FP64: 120 TFLOPS

compare this to AMD's Project 47, with 1 petaflop of FP32 and 2 petaflops of FP16, at 60 gigaflops per watt... almost double Nvidia's, LOL
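For anyone checking the numbers, the headline figures are just per-GPU arithmetic times sixteen. A quick sketch (the per-GPU throughputs and the ~10 kW system power are assumptions from published V100/DGX-2 figures, not from this thread):

```python
# Back-of-envelope aggregate throughput for a 16-GPU box.
# Per-GPU numbers are assumed from published V100 specs.
GPUS = 16
TENSOR_TFLOPS_PER_GPU = 125.0   # mixed-precision tensor cores
FP32_TFLOPS_PER_GPU = 15.7
FP64_TFLOPS_PER_GPU = 7.8

def aggregate(per_gpu_tflops, gpus=GPUS):
    """Total TFLOPS assuming perfect scaling across all GPUs."""
    return per_gpu_tflops * gpus

print(aggregate(TENSOR_TFLOPS_PER_GPU) / 1000)  # → 2.0 (petaflops, tensor)
print(aggregate(FP32_TFLOPS_PER_GPU))           # ~251 TFLOPS FP32

# Perf-per-watt, assuming ~10 kW total system draw:
POWER_W = 10_000
print(aggregate(FP32_TFLOPS_PER_GPU) * 1000 / POWER_W)  # ≈ 25 GFLOPS/W at FP32
```

Perfect scaling is optimistic, but it shows where the "two petaflops" claim comes from.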

>Literally just cramming a bunch of useless shit together in an attempt to awe
It'll be about as useful as the pyramids. I'd imagine that thing costs more than what 90% of the world's population earns in a lifetime.

>This system is claimed to be capable of two petaflops.
[By whom?]

Attached: cosby.gif (480x270, 2.55M)

I have two of the first-generation ones in my office. My company informed me that I have one of these and two of the new GPUs on the way.

>Xeonist
>not EBYN
they had to fuck it up at the end

Simulating reality. Literally.

>only 1.5 TB of system memory

How about no?

But does anyone care about Project 47?

>Project 47
This meme. The GPUs aren't even on the same link.

Attached: AMD_EPYC_Radeon_Project_47_800.jpg (800x533, 73K)

i really like the aesthetics of Nvidia cases

The most realistic approximations of the "horsepower" needed for a decently capable AI system are ~20-25 petaflops, so no.

Normal company 2018:

- "Hmmm. What will we call our new product?"
- "Tesla"
- "Ok"

Cat picture recognition.
That's about the only application to AI.
It's just that the internet gets flooded with Petabytes of them every day.
So now we can classify them automatically.

Well, the title most likely had 'cat' in it, but disregard that.

AI is useless.
This will all come crashing down a few years from now, when investors start asking for results.
It's the 2000 bubble all over again.
Even cryptocurrencies are a better use of that GPU supply at current prices.

Why/how is AI useless?

Idk, can you think of any application besides muh car driving?
You can't.
Because that's the most automatable task humans do every day.
And they're even bad at that.

That was...underwhelming.

>Recreating wetware in software isn't a technological achievement we should strive for because it can't play games

Attached: 1518251690859.jpg (600x600, 57K)

>AI is useless
will this meme die already? since when did automating tasks become useless? AI is just automation on a whole other level.

Well, please tell me wtf AI in its current bacterial stage can do for us.
Automation?
It will just kill people and get cancelled.
Data classification?
Well yeah, who the fuck did that job anyway?

I like the idea, but we're 30 years too early for it to be of much use.

>wetware
Had to google it, but you're way off.
Just learn how real neurons work, and how many of them a brain holds (well, the whole fucking nervous system is neurons, so it's not only the brain).
What we're experimenting with is nothing like neurons.
They called it that because they were hyped on the 'results'.
A single real neuron is more complex than most AI shit right now.

every gen 10 has its own link, moron

RAYTRACING A SCENE BIGGER THAN 4x4m

i don't know, you tell me. they took RED under their wing to develop software solely for them

i think that's enough

>we need to build a skyscraper
>well shit, that will take us few years, better not build it

Attached: 1518691908408.png (578x566, 12K)

Well, to follow your analogy, we're building a skyscraper with the wrong plans, and using toothbrushes as bricks.

Mining obviously

?

Nvidia is making some big money from Tesla and their computing branch, also thanks to their shilled CUDA and the rise of neural networks. Am I right?

How is amd doing in this field instead?

you lack imagination. A.I. can stop credit card fraud, stop malware, or help you treat disease better.
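To make "stop credit card fraud" concrete: at its simplest it's anomaly detection over transaction histories. A minimal sketch in plain Python (made-up amounts, a crude standard-deviation rule, `flag_anomalies` is a hypothetical helper, nothing like a production fraud system):

```python
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=2.0):
    """Flag transactions more than `threshold` standard deviations
    from the mean spend -- a crude stand-in for fraud scoring."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - mu) > threshold * sigma]

# Hypothetical card history: small everyday purchases plus one outlier.
history = [12.0, 8.5, 15.0, 9.9, 11.2, 13.4, 10.1, 950.0]
print(flag_anomalies(history))  # → [950.0]
```

Real systems use learned models and far richer features, but the "does this look like the cardholder's usual pattern?" question is the same.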

You lack results.

Bump. Can someone answer me?

I'm sure automation is going to die out too
>any day now

Attached: 1522352009187.png (1178x661, 654K)

yeah but there are other applications too, we might also actually start to get processors dedicated just to ANNs. The Chinese already have ones for that. I can't remember the name off the top of my head, but you should find it easily

only on meme learning shit; on normal workloads it's a lot less

You are fucking retarded.

neural networks
nah, for crypto NVLink and fast CPUs and buses are useless, which means it's way cheaper to just use 1080 Tis hooked up to shitty motherboards.
nope, way off. this is for GPU computing
you think you're hot shit, huh? you aren't. most machine learning libraries use CUDA, which is incompatible with AMD. some frameworks can use OpenCL, but it's slower than Nvidia because AMD doesn't optimize their code as much, and there's no NVLink, which means nobody uses AMD for machine learning.
t. brainlets
machine learning (basically pattern recognition) is being used right now in all kinds of services and industrial processes
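"Basically pattern recognition" fits in a few lines. A toy nearest-centroid classifier (made-up 2-D feature vectors, hypothetical `cat`/`dog` labels, no framework) shows the core idea those services build on:

```python
import math

def centroid(points):
    """Mean point of a class's training examples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

def classify(x, centroids):
    """Assign x to the class whose centroid is nearest."""
    return min(centroids, key=lambda c: math.dist(x, centroids[c]))

# Hypothetical training data: two clusters of 2-D feature vectors.
train = {
    "cat": [(1.0, 1.2), (0.8, 1.0), (1.1, 0.9)],
    "dog": [(4.0, 4.1), (3.8, 4.3), (4.2, 3.9)],
}
centroids = {label: centroid(pts) for label, pts in train.items()}
print(classify((1.0, 1.1), centroids))  # → cat
```

Deep networks learn the features instead of taking them as given, but "which known pattern is this closest to?" is still the job.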

The system has 512 GB of coherent memory across the GPUs, which allows massive neural networks: next-gen deep learning applications, huge image processing, huge text analysis. All 16 GPUs scale as one single GPU, which makes programming easy.
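The 512 GB figure is just 16 GPUs times 32 GB of HBM2 each, presented as one pool over the switch fabric. A back-of-envelope sketch (the 32 GB per-GPU size and the 2-bytes-per-FP16-parameter figure are assumptions, not from this thread):

```python
GPUS = 16
HBM2_GB_PER_GPU = 32            # assumed 32 GB V100 variant

# Total coherent GPU memory pool.
pool_gb = GPUS * HBM2_GB_PER_GPU
print(pool_gb)                  # → 512

# Rough largest-model size at FP16 (2 bytes per parameter),
# ignoring activations, gradients, and optimizer state.
params_billion = pool_gb * 1e9 / 2 / 1e9
print(params_billion)           # → 256.0 (billion parameters)
```

In practice activations and optimizer state eat a large share of that pool, so usable model size is well below the ceiling.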

Underrated.