So today AMD revealed their new line of workstation GPUs, and something caught my eye:

"Support for Pixar's open source USD system. USD is becoming THE interchange format for animation software, and what most movie studios are building their pipeline around. By providing higher quality, raytraced rendering in the viewport, artists can gain a better visual of their scene, at a point where they may have previously been working with wireframes. See a short clip of this working here:"

Guess now we know why the Radeon Pro SSG was created...

Attached: 38814959_1359765794156927_4947570529902002176_n.jpg (843x960, 98K)
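
For context on what "building a pipeline around USD" actually looks like: below is a rough sketch using Pixar's open source USD Python bindings (the pxr module). The file name and prim paths are made up for illustration.

    # Minimal sketch: authoring a USD stage with Pixar's Python bindings (pxr).
    # Assumes the open source USD distribution is built with Python support;
    # the file name and prim paths below are placeholders.
    from pxr import Usd, UsdGeom

    # Create a new stage (.usda is the human-readable ASCII encoding).
    stage = Usd.Stage.CreateNew("scene.usda")

    # Author a simple transform/sphere hierarchy, the kind of scene description
    # a DCC app hands off to a renderer through the USD interchange layer.
    xform = UsdGeom.Xform.Define(stage, "/World")
    sphere = UsdGeom.Sphere.Define(stage, "/World/Ball")
    sphere.GetRadiusAttr().Set(2.0)

    # Save the layer to disk so another tool (or a raytraced viewport) can open it.
    stage.GetRootLayer().Save()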

instagram.com/fegalvao_/

Thanks. The only relevant info in this thread.

based

what does this mean for a brainlet

Graphics are for stupid fucking trannies who don't understand the math that does the work. I've had to describe shit like 4d numbers to you retards and it's a chore. I do sysdev and work with txt like a man, get your pussy shit out of here brainlet

not OP, but basically this gives AMD a big advantage in the render farm industry, since they have a product (the Radeon Pro SSG) that is basically unmatched

If you can't explain something as simple as quaternions, that says more about your inability to teach than the knowledge of your audience.
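
Since nobody here will actually explain it: a quaternion is just four numbers (w, x, y, z) that encode a 3D rotation, and rotating a vector is one multiply sandwich, q * v * q^-1. Rough plain-Python sketch, not tied to any library:

    # Rough sketch: quaternions as "4D numbers" (w, x, y, z) used for 3D rotation.
    # Plain Python, no libraries; not tied to anything mentioned in the thread.
    import math

    def qmul(a, b):
        # Hamilton product of two quaternions (w, x, y, z).
        aw, ax, ay, az = a
        bw, bx, by, bz = b
        return (aw*bw - ax*bx - ay*by - az*bz,
                aw*bx + ax*bw + ay*bz - az*by,
                aw*by - ax*bz + ay*bw + az*bx,
                aw*bz + ax*by - ay*bx + az*bw)

    def rotate(v, axis, angle):
        # Rotate vector v around a unit axis by angle (radians): q * v * q^-1.
        half = angle / 2.0
        s = math.sin(half)
        q = (math.cos(half), axis[0]*s, axis[1]*s, axis[2]*s)
        q_conj = (q[0], -q[1], -q[2], -q[3])           # inverse of a unit quaternion
        _, x, y, z = qmul(qmul(q, (0.0, *v)), q_conj)  # sandwich product
        return (x, y, z)

    # Rotating (1, 0, 0) by 90 degrees around the z-axis gives roughly (0, 1, 0).
    print(rotate((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), math.pi / 2))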

anandtech.com/show/12579/big-volta-comes-to-quadro-nvidia-announces-quadro-gv100

tweaktown.com/news/58669/nvidia-beats-amd-real-time-8k-video-editing/index.html

No advantage at all. The SSG is garbage and gets destroyed by the Quadro P6000 in performance, let alone the mighty Quadro GV100.

Tomorrow, Nvidia might announce the Quadro P6000 successor, embarrassing the SSG garbage even more.

I write OCaml spec. I don't explain quats because I don't have to. Suck my 4d dick and enjoy your commercial internet that leeches off the 400k/yr I make writing NetBSD, cuck. I didn't get a PhD to teach.

holy shit that girl is fucking disgusting

Attached: 400.jpg (489x400, 157K)

ya ok larper.

>posts 3dpd's instagram
What am I supposed to do with that user?

Attached: hinagiku red face.jpg (500x281, 26K)

Touch yourself. Or whatever. It's up to you.

>suggesting to touch myself while looking at a 3dpd
No thanks I'll pass

Attached: eww.jpg (408x510, 59K)

Attached: ooga-booga.jpg (592x366, 26K)

Leaks show that Nvidia is gonna release a new RTX series, R for Ray Tracing. Titan V with its tensor cores was just a taste of what's coming; over the next couple of generations, ray tracing in hardware is going to become a thing. First it's gonna be a feature for content creators to augment their workflow; at GDC there were already talks about embracing it and updating workflows around it, because that's the way we're heading. But within a couple of generations it's gonna come to consumers as well, because we now have denoising algorithms built with neural networks that are so effective at cleaning up low-sample ray tracing that it's feasible to start introducing it into games. I suspect they will make hybrid solutions first, only using it for reflections or shadows or similar, then gradually we'll see ray tracing taking over.

Also, this is /3/ stuff really.

Attached: 1372030067145.jpg (891x1048, 270K)
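
To put a number on the "low-sample ray tracing is noisy" point from the post above: a path tracer averages random samples per pixel, and the error only shrinks with the square root of the sample count, which is why a few samples per pixel looks like static and why denoisers matter. Toy sketch with a made-up integrand, not a real renderer:

    # Toy sketch of why low-sample ray tracing is noisy: a Monte Carlo estimate's
    # standard error falls off as 1/sqrt(samples), so going from 1 to 1024 samples
    # only buys roughly 32x less noise. Made-up integrand, not any real renderer.
    import random, statistics

    def noisy_sample(true_value=0.5, noise=0.4):
        # Stand-in for tracing one random light path through a pixel.
        return true_value + random.uniform(-noise, noise)

    for spp in (1, 4, 16, 256, 1024):
        # Estimate the same pixel many times at this sample count and measure spread.
        estimates = [statistics.mean(noisy_sample() for _ in range(spp))
                     for _ in range(200)]
        print(f"{spp:5d} spp -> pixel estimate stddev ~ {statistics.pstdev(estimates):.4f}")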

>400k/yr
I bet your mom has the nicest basement in the whole neighborhood.

>shits on 3d women
>posts Kizuna Ai who is technically 3d

Hardware and licensing costs.

Nvidia shareholders aren't too keen on waging a price war in this market.

Ray-tracing isn't taking over the gayming scene anytime soon unless they're willing to go back to 640x480@60FPS again.

Ray-tracing will remain firmly in the professional graphics arena.
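
Back-of-the-envelope on the resolution/framerate point above: even one ray per pixel at 1080p60 is well over a hundred million rays per second, before any shadow or bounce rays. The rays-per-pixel figures below are assumptions for illustration, not benchmarks.

    # Back-of-the-envelope ray budget: rays/s = width * height * fps * rays_per_pixel.
    # The rays-per-pixel numbers are assumptions for illustration, not benchmarks.
    def rays_per_second(width, height, fps, rays_per_pixel):
        return width * height * fps * rays_per_pixel

    # Primary rays only (1 per pixel) vs. a modest 1 bounce + 1 shadow ray (3 per pixel).
    for label, w, h in (("640x480", 640, 480), ("1920x1080", 1920, 1080)):
        for rpp in (1, 3):
            print(f"{label} @60fps, {rpp} ray(s)/pixel: "
                  f"{rays_per_second(w, h, 60, rpp) / 1e6:.0f} M rays/s")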

@2fps lmao

Traditionally you'd be right, but what Nvidia is doing is using the tensor cores in Volta to accelerate denoising. In the future it will most likely become a fixed-function feature, with hardware denoisers taking us a lot closer to real-time applications. Check out Nvidia OptiX. I think it's gonna come faster than we expect.

developer.nvidia.com/gameworks-ray-tracing
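
The actual OptiX denoiser is a trained neural network running on the tensor cores; as a crude stand-in for the general idea (trade a bit of detail for a lot less noise), here's a plain box filter over a fake one-sample-per-pixel image. Numpy only, made-up numbers.

    # Crude stand-in for a denoiser: average each pixel with its neighbours.
    # The real OptiX denoiser is a trained neural network running on tensor cores;
    # this box filter only illustrates trading detail for less noise.
    import numpy as np

    rng = np.random.default_rng(0)

    # Fake "1 sample per pixel" render: a smooth gradient plus heavy noise.
    clean = np.linspace(0.0, 1.0, 64).reshape(1, 64).repeat(64, axis=0)
    noisy = clean + rng.normal(0.0, 0.3, clean.shape)

    def box_denoise(img, radius=2):
        # Average a (2*radius+1)^2 neighbourhood around each pixel (edges clamped).
        h, w = img.shape
        padded = np.pad(img, radius, mode="edge")
        out = np.zeros_like(img)
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                out += padded[radius + dy: radius + dy + h,
                              radius + dx: radius + dx + w]
        return out / (2 * radius + 1) ** 2

    denoised = box_denoise(noisy)
    print("noisy RMSE:   ", float(np.sqrt(np.mean((noisy - clean) ** 2))))
    print("denoised RMSE:", float(np.sqrt(np.mean((denoised - clean) ** 2))))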

Believing tweaktown

Out of all the articles out there, only tweaktown reported this, therefore it must be real...

Yeah, it's the same, let's see:
one GPU with a massive 2 TB of onboard storage
vs.
one GPU, one Kingston 1000, 32 TB of RAM, and a beefy CPU
yeah, I can see the difference lol
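
For what it's worth, the reason that comparison isn't apples to apples is bandwidth: the SSG's pitch is keeping the footage on NVMe attached directly to the GPU, while a conventional rig streams every frame over PCIe from host memory. Rough arithmetic below; the bit depth, frame rate and PCIe figure are assumptions for illustration.

    # Rough arithmetic on why 8K scrubbing is a bandwidth problem: uncompressed
    # frame size * frame rate vs. the host-to-GPU link. Bit depth, frame rate and
    # the ~16 GB/s PCIe 3.0 x16 figure are assumptions for illustration.
    width, height = 7680, 4320          # 8K UHD
    bytes_per_pixel = 8                 # assume 16-bit RGBA
    fps = 30

    frame_bytes = width * height * bytes_per_pixel
    stream_gb_s = frame_bytes * fps / 1e9

    pcie3_x16_gb_s = 16                 # theoretical peak; real throughput is lower

    print(f"one 8K frame: {frame_bytes / 1e6:.0f} MB")
    print(f"{fps} fps stream: {stream_gb_s:.1f} GB/s "
          f"(~{100 * stream_gb_s / pcie3_x16_gb_s:.0f}% of a PCIe 3.0 x16 link)")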

Damn, can't wait to see shitty devs who can't even make a 2D game that runs at 60 fps in 1080p without a 5 GHz 8700K trying to do raytracing.

Ugh! I don't live with her, she lives with me!

>So much clown paint that she looks uncanny valley-like
So... this... is the power... of 3D pigs...

Attached: puke off a boat.jpg (705x556, 411K)

LOL

>hardware denoisers
You have no idea how any of this works.