I've just checked and GigaRay is not actually a unit of measurement

I've just checked and GigaRay is not actually a unit of measurement.

Attached: Image result for nvidia.png (960x499, 251K)

I've just checked and shitting in the streets is not actually a unit of civilization.

Why not?

CuckRays is.

I've just checked and OP is definitely not not a faggot.

Mentioning it in itself is fine, but the issue is that there is no frame of reference for the audience since it's that new, and during the presentation they failed to mention any other, more conventional units.

6 TIMES FASTER THAN THE TITAN XP
AMD BTFOREVER

>not actually a unit of measurement
>there is no frame of reference
Actually AMD has been promoting and developing raytracing for a while now, their GPUOpen Radeon-Rays page says FirePro W9100 manages 100-300 megarays per second.

Giga is higher than Mega

>billions of rays per second
>literally used by any real time ray tracer

I guess gigahashes and gigapixels aren't units of measurement either.

If in 1908 Ford had asked the public what type of transportation they wanted, they would have asked for faster horses. Kudos to Nvidia for risking their profits to advance graphics technology rather than taking the safe road like AMD.

>being a filthy Ramlet

Xp forever.

Attached: IMG_1871.jpg (3264x2448, 861K)

That analogy doesn't work; Nvidia is selling snake oil.

The Model T is snake oil. That newfangled mechanical malarkey will never replace horses.

>Actually AMD has been promoting and developing raytracing for a while now

Actually raytracing has existed for more than 300 years and the equations have been perfected since and are being used on a daily basis in physics.

Applying those formulas to 3D rendering has been a wet dream for many programmers since the early 90s, but the computing power needed to solve these equations (one per ray) is too much for real time rendering.

And when you look at the BF Vagina demo, the result is actually far from what you would expect from a static rendering with raytracing.

post THAT image

Attached: large.jpg (500x281, 13K)

no the other one

We got you senpai

Attached: 1534876290080.png (1198x772, 585K)

no the OTHER other one

Attached: manning.jpg (600x641, 37K)

Attached: 1534862711839.png (1828x1740, 999K)

still no!

nope

Nvidia jumped the shark on this one. They've lost me as an unpaid promotional poster.

what the fuck is this circlejerk?

The one you just joined. Whip it out

Airplanes were not a mode of transportation at one point

>6 TIMES FASTER THAN THE TITAN XP*

*In raytracing workloads. Other games will scale solely with the increased number of CUDA cores and some IPC improvements, which partially get eaten away by slightly lower clock speeds.

The 2070 has 20% more CUDA cores than the 1070.
The 2080 has 15% more CUDA cores than the 1080 but clocks the highest.
The 2080 Ti has 21% more CUDA cores than the 1080 Ti.

On average I'd expect an 18% bump in performance in regular games.
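
Rough sanity check on that 18% figure; a back-of-the-envelope sketch in Python, assuming the published core counts and that non-RTX performance scales roughly with CUDA core count (clocks and IPC ignored, which is why the real number could land a few points either side):

# Naive estimate: assume performance in non-RTX games scales with CUDA core count.
# Core counts are the published specs; everything else (clocks, IPC) is ignored here.
pairs = {
    "RTX 2070 vs GTX 1070": (2304, 1920),
    "RTX 2080 vs GTX 1080": (2944, 2560),
    "RTX 2080 Ti vs GTX 1080 Ti": (4352, 3584),
}

gains = []
for name, (new, old) in pairs.items():
    gain = new / old - 1.0
    gains.append(gain)
    print(f"{name}: +{gain:.0%} CUDA cores")

print(f"average gain: +{sum(gains) / len(gains):.0%}")  # ~+19% before clock/IPC effects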

Ray tracing produces lower quality images and as shown in demonstrations doesn't even provide a performance benefit. Game devs are going to push this because Nvidia will pressure them to do so, and in a few generations Nvidia will use this like they used PhysX to further solidify their monopoly.

Early graphics accelerators sometimes reduced performance and produced worse images. But look where GPU technology has gone now. We're in that painful introduction phase of realtime raytracing.

And no that doesn't make the products worth buying or using today. It just means this technology may well become a standard thing that produces better images in the future.

>Ray tracing produces lower quality

Brain-let.

Ever heard of SSAO? How about screen space reflections? Pretty much every new rendering technique/effect that has taken off and proliferated among AAA titles since Crysis has used some sort of ray marching hack. Ray marching is essentially ray tracing with significant limitations, but more GPU-friendly. You aren't going to see any more visual enhancements from rasterization with ray marching hacks sprinkled on top. If you want photorealism you need to incorporate ray tracing. Hybrid renderers today. Maybe pure unbiased methods in the future.
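
For anyone who hasn't seen ray marching before, here's a minimal sphere-tracing loop; a toy CPU sketch in Python, not how any shipping engine does it, with a hypothetical sdf_scene() standing in for the scene's signed-distance function:

import math

def sdf_scene(p):
    # Hypothetical scene: signed distance to a unit sphere at the origin.
    x, y, z = p
    return math.sqrt(x*x + y*y + z*z) - 1.0

def ray_march(origin, direction, max_steps=128, max_dist=100.0, eps=1e-4):
    # Sphere tracing: step along the ray by the distance to the nearest
    # surface until we hit something (distance < eps) or give up.
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        dist = sdf_scene(p)
        if dist < eps:
            return t          # hit: distance along the ray
        t += dist
        if t > max_dist:
            break
    return None               # miss

# Example: a ray from z = -5 aimed straight at the sphere hits at t ~= 4.
print(ray_march((0.0, 0.0, -5.0), (0.0, 0.0, 1.0)))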

>cuda cores
GAMES DON'T USE THAT SHIT, ONLY JIGGARAYS

Neither is teraflop when you look into it.

These are actual quotes from the Gamescom presentation, user.

Not with GPUs it hasn't. There's no frame of reference to other cards. So when people say the 20xx series is 10x faster than the 10xx, that may not be true for gaming, since there's more to gaming than shadows.

How much time do you have for midweek madness? Does all the good shit really last only for a minute?

If it was 10 billion rays per second, that's about 80 rays per pixel per frame at 1080p and 60fps.
If it did 80 rays per pixel each frame, there wouldn't be so many fucking artifacts in the heavy ray tracing demos.
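
The arithmetic behind that, assuming the quoted 10 GigaRays/s and 60fps from the post, with 1080p as an assumed resolution (not Nvidia's number):

rays_per_second = 10e9        # the "10 GigaRays/s" marketing figure
fps = 60
pixels = 1920 * 1080          # assuming 1080p

rays_per_frame = rays_per_second / fps        # ~167 million rays per frame
rays_per_pixel = rays_per_frame / pixels      # ~80 rays per pixel
print(f"~{rays_per_pixel:.0f} rays per pixel per frame")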

checked

>On average I'd expect an 18% bump in performance in regular games.
Note that the Titan V had about 43% more CUDA cores than the 1080 Ti and only about a 20% increase in performance in games.
This new arch is pretty much just Volta with a raytracing accelerator ASIC.
The perf increase could be as low as 10% or as high as 30% for the 2080 Ti compared to the 1080 Ti.

You named some ray tracing techniques which look nice and run on current hardware.
Nvidia showed proprietary garbage which looks and runs shit even with their new overpriced hardware.
You just checkmated yourself hard there.

>If it did 80 rays per pixel each frame there wouldn't be so many fucking artifacts in the heavy ray tracing demos.

Are you referring to a particular demo? One particular implementation of RTX ray tracing doesn't say much about it. That's like saying "hey look I found this video on Youtube of an OpenGL test that looks like shit. OpenGL must suck."

>Nvidia showed proprietary garbage which looks and runs shit even with their new overpriced hardware.

Again, it sounds like you don't have all the facts and are jumping to conclusions. FYI Nvidia has proposed to Khronos a set of Vulkan extensions for ray tracing. AMD and Intel will be able to use these if they want. Hardly "proprietary garbage."

"Gigga Nigga RTX Ray" is the proper unit

Attached: Snug.jpg (590x582, 48K)

>Are you referring to a particular demo
I said which demo I'm referencing: the UE4 one, the Star Wars one. It looked like shit on the single 2080 Ti. It was very noisy.
On the other hand, it looked great on 4 Titan Vs.
Yet they claimed one 2080 Ti was better than the 4 Titan Vs.

Which page should you refresh on for midweek madness? The countdown page, the B-stock page, the 10XX page, or the specific model page? Need to cop that $300 1080.

>star wars demo

youtube.com/watch?v=jY28N0kv7Pk

Can't see it. There's nothing there besides the typical Youtube compression artifacts. It looks cinema quality to me. Maybe you should get your eyes checked.

Why didn't you link to the actual point?
youtu.be/jY28N0kv7Pk?t=19m29s

Like he says, THAT IS THE ONE RUNNING ON 4 V100s, YOU FUCKING MORON. YEAH, I SAID THAT ONE WAS FINE. He fucking says it. The machine with 4 V100s is right there in the fucking background behind him.
They showed it at the gaymer conference running on Turing and it looks like shit.
And you know what? There are still some artifacts in this one that I said is much nicer. They're just not so distracting and bad, but they're there, and you should have noticed them.

Get your head checked, moron.