NVIDIA GeForce RTX 2080 3DMark TimeSpy result leaks out

>Unfortunately, we have no information on how many cores were used (and that might be important considering RTX series feature Tensor, RT and Shader cores)

videocardz.com/77763/nvidia-geforce-rtx-2080-3dmark-timespy-result-leaks-out

>inb4 titanxp is high
That's the Star Wars OC edition

Attached: Screenshot_20180828-054843.jpg (1080x1189, 298K)

>NVIDIA’s RTX 2080 can beat out a GTX 1080 Ti without using 2/3rds of its cores.

Attached: Screenshot_20180828-055343.jpg (1080x986, 170K)

Cores are an arbitrary unit in GPUs; die size is more consistent, and going by the TU102 die size, TU104 should end up similar in size to GP102

You can't just use the other cores to increase performance. They're there to do ray tracing.

No, it's using all 4352 CUDA cores but none of the Tensor cores (544 of them), so it's using 8/9ths of the total GPU.
Tensor cores can do more than ray tracing and can even be used for gaming - they aren't any slower than CUDA cores for fp32, though they use ~8x the power per core, clock for clock (at least on Volta they did; maybe Turing gets the power delta down. It'd have to, or those 544 Tensor cores will use almost as much power as the 4352 CUDA cores, which will nuke performance* due to card power limits).
The 'RT Cores' handle the actual ray traversal and intersection work; it's the Tensor cores that run nVidia's noise-filtering special sauce.

*This is possibly one reason why the Tomb Raider demo sucked at 30fps with RT on, we'll find out when real cards hit the wild and OC'ers kill the power limits.
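
Back-of-envelope check on that power claim (a rough sketch; the ~8x per-core figure is the Volta-era assumption above, not a confirmed Turing number):

# Power-budget sketch using the core counts above, in CUDA-core power units.
cuda_cores = 4352
tensor_cores = 544
tensor_power_ratio = 8  # assumed per-core power multiplier, clock for clock

cuda_load = cuda_cores                           # the whole CUDA array
tensor_load = tensor_cores * tensor_power_ratio  # 544 * 8 = 4352

print(tensor_load / cuda_load)  # 1.0 - the Tensor block alone would match the entire CUDA array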

>only 6% faster than a 1080Ti
>But muh low quality DLSS upscaled textures
I seriously hope you guys haven't pre-ordered this shit

>6%
It's 30% between the 2080 and 1080
It's 6% between the 2080 and 1080Ti.
i.e. the 1080Ti just dropped a price tier AND the new generation only managed an Intel-level perf boost - that's without considering raytracing at all.
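
Quick sanity check on those two numbers (assuming both percentages are against the same 2080 result):

# If the 2080 is 30% over a 1080 and 6% over a 1080Ti,
# the implied 1080Ti-over-1080 gap falls out of the ratio.
r_2080_vs_1080 = 1.30
r_2080_vs_1080ti = 1.06

implied_ti_gap = r_2080_vs_1080 / r_2080_vs_1080ti
print(f"{(implied_ti_gap - 1) * 100:.1f}%")  # ~22.6%, in the right ballpark for the known Ti lead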

I pre-ordered the GTX 1080 almost 3 years ago; now I'll do the same shit with the RTX 2080.

Fuck the waitfags.

Are you actually ignoring that the 1080 Ti is cheaper than the 2080, or are you just a bad shill?

>I-IT'S GOING TO BE SLOW
>F-FLOP
>W-WAIT FOR AMD
OH NO NO NO NO

Attached: 4563548343253.png (653x726, 34K)

DLSS is literally a low-quality upscale of the rendered frame to improve performance.
This is like Nvidia fucking up colors to get ahead in performance
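
For scale, here's why upscaling buys performance at all (napkin math assuming DLSS renders internally at 1440p and outputs 4K; the actual internal resolution isn't confirmed):

# Pixel-count savings from rendering at a lower internal resolution.
internal = 2560 * 1440  # assumed internal render resolution
target = 3840 * 2160    # output resolution

print(f"{(1 - internal / target) * 100:.0f}% fewer pixels shaded per frame")  # ~56%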

>raster
Now imagine games where the other 2/3rds of the GPU get used.

>Only 6% faster than a cheaper card that is going to be even cheaper now that they're releasing a new series
You have to be an idiot to get the 2080

>Synthetic test
>No details on settings
>Twice as expensive
>But it's faster goys!
Wait for Hardware Unboxed to benchmark it, dolts.

Not a shill; I also didn't look at prices. They mean nothing until I see shit on the shelves, as pre-announced prices are almost always wrong.

>tfw you'll see the day Nvidia cards are getting 60-80fps while AMD are in the single digits once RT benchmarks come out

Attached: 1510593104325.jpg (646x687, 69K)

You mean the other 9th of the GPU?
2080: 2944 + 368 = 3312 cores; 3312 / 368 = 9
2080 Ti: 4352 + 544 = 4896 cores; 4896 / 544 = 9
Tensor cores make up 1/9th of the GPU by core count on both.
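
Same arithmetic as a function, if anyone wants to check other SKUs:

# Fraction of the total core count that is Tensor cores.
def tensor_fraction(cuda: int, tensor: int) -> float:
    return tensor / (cuda + tensor)

print(tensor_fraction(2944, 368))  # 2080:   0.111... = 1/9
print(tensor_fraction(4352, 544))  # 2080Ti: 0.111... = 1/9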

>tfw Vega is actually better at raytracing since it can use any of its 4096 cores for Rapid Packed Math where nVidia can only use their special Tensor cores, which number only 544 on the 2080Ti and only 368 on the 2080.

;)

Can the RT and Tensor cores on the 20xx series perform the same type of rendering tasks as a traditional shader on top of their new functions, or can they only perform the specific task they were built to accelerate?

I'm slightly confused by this: if a game wasn't created with RT in mind, does that mean a portion of the GPU will basically be dead weight? I can already imagine driver or compatibility issues where RT and/or Tensor cores prevent you from playing non-RT/Tensor-optimized games.

>twice
>not even an order of magnitude

Pathetic.

Far more than AMD :)))

Yeah. AMD is too busy sodomizing Intel.

>I'm slightly confused by this, if a game wasn't created with RT in mind, does that mean a portion of the GPU will basically be dead weight?

Yes.

Twice is an order of magnitude in base 2 ;-)

Ti has +3GB VRAM and is still killed by this card. It probably uses more power too.

Waiting for AMD desu

>Waiting for AMD desu

Attached: 1801841.jpg (499x363, 154K)

Same, I'm not paying these bullshit Nvidia prices, and with the news that AMD is using TSMC for Navi and Ryzen 3000, we shouldn't be waiting too long. If you bought these Nvidia cards at these prices, you're a chump, plain and simple.

Oh nooo, AMD won't have the lead in the epic high end ultimate $10000 USD gaming cards owned by 0.000000001% of gamers. Whatever will they do?

Keep waiting since nothing is worth upgrading from a 290X 8GB yet.

I wish I bought the Fury Nano when I could.

>AMD is using TSMC
AMD has used TSMC for every GPU since they bought ATI

You really don't; 4GB of VRAM is a killer now.
It's better than a 290X 4GB, but in a lot of games a 290X 8GB will have far better average and 1% framerates, since it won't be running out of VRAM.

Wrong, they've been using GloFo for the past few years for desktop GPUs; they've used TSMC to make the Xbox/PlayStation chips

Aside from the APUs since Llano, Polaris was GF 14LPP.

Oh, Vega10 was GF 14LPP too; forgot about it existing for a minute.

Nvidia won.

You can't win when there's no competition

If the benches are true, the 2080 will be cheaper than the 1080Ti

I'd rather wait 2 years than have to deal with nVidia's shit drivers on Linux.

>gayming
>linux

This stupid meme should fucking stop.

>GPUs are only for gaming
>you can't game on Linux
You're dumb

>mfw still 980Ti
Spent $850 on my 980Ti Classified at launch. It'll do 1630MHz on the core and an additional 950MHz on the memory if I enable the LN2 BIOS and decouple the power target from the temperature target. As long as this card gives me 60FPS on high at 1440p, I'll keep it.

Attached: react.gif (190x200, 1.16M)

> video cards are only for gayming durr

what a fucking faggot

>buy a $799 gpu for your meme os
>not use the igpu
>not use a professional gpu

It's a 2080, mouth breather, not a 2080Ti, so it's 2944 CUDA cores, not 4352.

1080 was 16nm
2080 is 12nm
1/3 of the 2080 isn't used in the test.
12nm is faster than 16nm.
I fail to see the problem here.

1/9th.

>Please do heed the rumor tag and take this with a grain of salt - we do not know if the test was done using final drivers (which in Turing's case can make a world of difference) and the run is almost certainly not using the Tensor Cores - which constitute almost 1/3 of the die space.

Or removing dithering
Or lowering shadow resolution on max settings on their cards

> everyone has 5000 dollars to spend on a professional gpu

>buy a $799 gpu
Who ever said that? And why not?

1/9th the core count.

Wrong, the new AI cores can get you more fps as well

I don’t care what its performance is, I refuse to pay that much for a GPU when I barely even play modern games anymore.

>If the benches are true, the 2080 will be cheaper than the 1080Ti
No it won't; the die is bigger, so the margins would be lower, and Nvidia won't do that. 479mm^2 vs 529mm^2 - the 2080 uses a more expensive chip.

Attached: Nvidia GPUs.png (1183x2381, 209K)
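
Rough wafer math backs this up (a sketch using the standard dies-per-wafer approximation on a 300mm wafer; defects ignored, so these are upper bounds):

from math import pi, sqrt

# Standard approximation: usable wafer area minus an edge-loss term.
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    d = wafer_diameter_mm
    return int(pi * (d / 2) ** 2 / die_area_mm2 - pi * d / sqrt(2 * die_area_mm2))

print(dies_per_wafer(479))  # GP102 (1080Ti): ~117 candidate dies
print(dies_per_wafer(529))  # TU104 (2080):   ~104 candidate dies

~12% fewer candidate dies per wafer before yield losses even start, so the bigger chip is the more expensive one to make.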

I didn't mention cores. I mentioned physical space. It's using half the space, and getting more performance.

it's not using half the space.

All this has convinced me to get a 1080 Ti, thanks Jow Forums

Does this mean I can finally buy a 1060Ti 6GB since the price is cheaper now?

Attached: 1516904578975.jpg (300x300, 25K)

Heh, there's no 1060 Ti, kid

>30% faster in Timespy for twice the cost
Wow, that's my favorite game! Sweet, I can't wait, already pre-ordered so I can play Timespy all day!!

I think you mean 1070

Oops, yeah I meant 1070 Ti.

The 6GB 1060 should have the Ti suffix since it has more CUDA cores than the 3GB version

I literally only give a fuck how the 2070 performs, too bad it launches later

GeForce cards are

>DX12 benchmarks

WHERE ARE THE DX11 BENCHMARKS, AKA THE ACTUAL BENCHMARKS THAT CURRENTLY MATTER IN 2018

true

Gotta sell those 300,000 returned Pascal GPUs somehow.

>paying more than $250 for a single pc component
Imagine being this retarded.

I ran out of patience waiting for the 2080 and got a 1080 Ti a week ago. Seems like I did OK if that benchmark is to be believed. I was still stupid not to wait for confirmation that the 2080 wouldn't offer much more, though.

>ok
Obviously better. You're missing the part where the 2080 here is overclocked about 25%.
Even with a 25% OC, it's only 6% ahead of the 1080Ti.

Or sorry, a 13% OC. Still, the point stands, since the 2080 is $750 and the 1080Ti is around $600.
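
Perf-per-dollar napkin math on those numbers (using the 6% lead and the $750 / $600 prices above):

# Relative performance per dollar, 2080 vs 1080Ti.
perf_2080, price_2080 = 1.06, 750
perf_1080ti, price_1080ti = 1.00, 600

ratio = (perf_2080 / price_2080) / (perf_1080ti / price_1080ti)
print(f"2080 delivers {(1 - ratio) * 100:.0f}% less perf per dollar")  # ~15%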