NVIDIA RTX 2080 Time Spy preliminary benchmark leaked – 37% faster than a GTX 1080 and 6% faster than a GTX 1080 Ti without using AI core (DLSS)

wccftech.com/nvidia-geforce-rtx-2080-3dmark-timespy-score-leaked-clocked-at-2ghz-and-beats-a-gtx-1080-ti-without-ai-cores/amp/
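
The two headline percentages imply a specific 1080-to-1080 Ti gap, which is a quick way to sanity-check the leak. A minimal sketch, using a made-up baseline score (the real Time Spy numbers aren't in the thread):

```python
# If the 2080 is 37% faster than a 1080 and 6% faster than a 1080 Ti,
# the implied 1080 -> 1080 Ti gap falls out of the ratio of the two factors.
gtx_1080 = 7000.0                 # hypothetical Time Spy graphics score
rtx_2080 = gtx_1080 * 1.37        # "37% faster than a GTX 1080"
gtx_1080_ti = rtx_2080 / 1.06     # "6% faster than a GTX 1080 Ti"

implied_ti_lead = gtx_1080_ti / gtx_1080 - 1
print(f"implied 1080 Ti lead over the 1080: {implied_ti_lead:.1%}")
# → implied 1080 Ti lead over the 1080: 29.2%
```

The ~29% it implies is roughly the real-world 1080-to-1080 Ti gap, so the two claims are at least internally consistent.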

Attached: geforce-rtx-2080-gallery-a-2060x2060-1-1030x600.jpg (1030x600, 28K)

Yup, it's a flop.

>6% faster
>14% more expensive
progress

xx80 barely beats xx80 Ti

even an xx70 card like the GTX 1070 could beat a 980 Ti. disappointing

>clocked at 2 GHz

I will buy a 1080 Ti instead of this bullshit $799 RTX 2080.

This is confirmed as fake. It's a shitpost.

lol, biggest GPU performance jump ever. 1080 Ti owners are going to be sad as fuck.

most GPU gen jumps are 20%, but 9-to-10 was already shit, only like 5% if you had overclocked. now it's 40%. 1080 Ti owners BTFO.

a 980 Ti with a BIOS-mod overclock is only 5% behind a 1080 Ti. you can't BIOS-mod a 1080 Ti, and you have to do complicated micro-soldering to overclock it properly.

basically GPU improvements have been 20% per gen for the last 10 years, 9-to-10 is 5% if you overclock, and now 10-to-20 is 40%. buying the 10 series was literally the worst time to buy.

>curry tech
still posted on Jow Forums

>it's a flop because I won't buy it
Quite literally a sign of autism.
Nvidia is guaranteed to book record profits again simply because AMD has no answer to the 10 series, let alone the 20 series. Just because it's an overpriced and overhyped product doesn't mean tons of retards won't eat that shit up.

If legit that's a fairly high jump compared to recent years (1000-series being the outlier).

Attached: 1531004808218.png (1280x699, 128K)

>pay more
>get less

Sounds like just the card for my new Intel rig!

>2GHZ
OH NO NO NO NO NO NO

>only just beats last gen in benchmarks that don't utilize any of the new features
DOA.

>GTX 670 edging out GTX 580
>GTX 770 = GTX 680
>GTX 970 edging out 780ti
>GTX 1070 barely edging out titan and 980ti
Now
>RTX 2080 barely edging out 1080ti

I'm going to fucking shoot lisa in the head if AMD doesn't dropkick Nvidia with a strong competitive well-priced GPU release this year. Fuck Shillvidia. 4 years for a bitcoin ripoff with barely any performance improvement.

But that top-end segment is for idiots. The real money is in the cards that AMD is making now, let nvidya have their pissing contest it's all bullshit anyway.

>6% faster than a STOCK 1080 Ti whilst overclocked
What a disaster.

Attached: soygasm.jpg (283x324, 20K)

>a 980ti with a bios mod overclock is only 5% behind 1080ti
Whaaat for real? Keeping my 980ti for a while longer then.

>6% faster than a STOCK 1080 Ti whilst overclocked
>What a disaster.
37% faster than a 1080, that's great. I'll wait a few more months before buying a laptop.

No, that's complete bullshit, unless you're running on LN2 or something. My 980 Ti Classified did 1524MHz and was about 5% short of a stock regular 1080, and they don't go much, if any further than that without exotic cooling. There's no way you're getting anywhere near a 1080 Ti.
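
The claim above can be roughed out with the thread's own numbers. A sketch assuming the ~29% 1080-to-1080 Ti gap implied by the OP's figures (1.37 / 1.06 ≈ 1.29); the percentages are round estimates, not measured scores:

```python
# An OC'd 980 Ti sitting ~5% behind a stock 1080, with the 1080 Ti
# roughly 29% ahead of a 1080, puts the Ti far more than 5% ahead.
oc_980ti_vs_1080 = 0.95   # OC'd 980 Ti ≈ 5% short of a stock 1080
ti_vs_1080 = 1.29         # 1080 Ti ≈ 29% ahead of a 1080

gap = ti_vs_1080 / oc_980ti_vs_1080 - 1
print(f"1080 Ti lead over an OC'd 980 Ti: {gap:.0%}")
# → 1080 Ti lead over an OC'd 980 Ti: 36%
```

So even on these rough numbers the 1080 Ti is ~36% ahead of a heavily overclocked 980 Ti, not 5%.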

I wonder how much of this post is fantasy

I want a 6-core processor too, so it makes sense to wait a little longer to get the latest from both Intel and Nvidia.

competitive price? no more... haven't you heard? GloFo doesn't make 7nm for AMD GPUs anymore; TSMC will be the sole fab. monopoly at work.

you didn't BIOS-mod it, you idiot. even with the stock cooler, a good 3rd-party card can sit 5% behind a 1080 Ti and well ahead of a 1080. and if you don't like the temps, put a $14 CPU cooler on it and use a $3 PCIe riser cable to sit it upside down on the bottom of the case if it doesn't fit.

you own a 1080ti now and are butthurt.

the only way to OC a 1080 Ti that much is by soldering tiny resistor chips, which is super hard.

ohnonononono

Attached: 6XOUrKT.png (1710x963, 752K)

>2GHz

whomst gets GPUs early so we can get real benchmarks? Holy fuck I'm so overdue for a GPU, but I may just buy a cheap AMD and wait some more.

>6%
OH NO NO NO

I am too. just keep boycotting Nvidia until that greedy chink lowers his prices

may buy those bullshit Korean/Chinese cafe cards

>muh numbers

If you don't understand that the point of the RTX series is to encourage devs to implement real-time ray tracing, you need to stay in your Thinkpad thread with all the other poors.

Yeah. Fuck consumers. We raytrace now.

can't wait for those used Nvidia 1060 5GB cards from Chinese cafes at 1050 Ti prices.

>encourage devs to implement real-time ray tracing

if it's anything like multicore support, it'll pay off in a decade or so

why does the UI look like fucking shit
also
>button prompts
uninstall.exe

Also, why aren't shadow and reflection quality maxed in a demo? Guess it's not as well optimized yet.

> he REALLY believes in fake reviews

oh boy, why do you hate Nvidia so much that you're deluding yourself with fake reviews? what did Nvidia do to you?

nah, it'll be more like Nvidia GameWorks, since nobody really does anything to help devs multithread shit.
Nvidia has a financial interest in making this tech a success; they couldn't give a fuck about multithreaded support.

probably not all that optimized yet, but showcases almost always use controllers even on PC, because they often let "journalists" play, and those fucking braindead morons can't understand how m+kb controls work, because they really just wanted to push some agenda when they started and don't even really like games

Wow, this post is actual complete fantasy.

That was on "high" ?

Attached: itjustworks.png (1920x1080, 984K)

Imagine what low looks like!

those are not raytraced shadows. also, this game is at least a year from being completed, so what's your point?

You can't fix so many things in only one year.
Even without raytracing we can make absolutely stunning shadows; this looks like a low setting from 2012.

DON'T READ THIS
DON'T LOOK AT IT

JUST PREORDER NOW YOU STUPID FUCKING GOYIM
NOW NOW NOW

Attached: 1535040993500.jpg (324x278, 10K)

> You can't fix so many things in only one year.
LOL. 200 developers and 1 year, and you're saying they cannot fix so many things? you know nothing, kid

6+ years to get to this result and you tell me they can fix everything in a year?

Guess I'll hold onto my 980ti til the 2180

>raytracing off
Is that a joke?!

It's a lot of work building a game or a movie, or writing a book.
These are global shadows; yes, this will be fixed.
It just wasn't a focus of the gameplay being shown.

TW3 got very high and ultra settings.
they actually changed the shadows in TW3, not just their resolution

Why would I be sad?
This shit is more expensive than a 1080Ti was.

>RTX 2080 barely edging out 1080ti*

*for more moneyz

>Believe in fake reviews
Literally who.
Even OP's link isn't a review.

6+ years is all the time it takes to write a story and create quests, dialogue, characters, world design, modeling, animations.

Changing how the shadows or the lighting look is not time-consuming, especially if you are using dynamic lights. Same reason a remake is easier to do than the original creation.

Do you work for the Nvidia marketing department?

Most of that is just an adaptation from the game CP2020 tho. Making the engine was the most difficult part.

since eth is tanking.
i'm not gonna buy one.
waiting for navi 7nm instead.
still mining on my single 1060 just in case.

Honestly, witnessing the madness in GPU and CPU prices these last few months, I'm starting to think consoles are actually the reasonable option.

I have a 980ti Hybrid. Please provide a link to the mod you use

> if someone likes Nvidia it's because they're shills

are you a leftard by chance?

> 30fps cinematic experience
> movie-games

yeah nah, I'll stick with glorious PC and 60+ fps.

Well, most AAA games are shit nowadays, so there's that.
But these days, for the price of one of these you barely get a semblance of a graphics card.

who is forcing you to buy the latest card? I don't understand.

why do people complain about prices if you can buy a 1080 or 1080 Ti and be happy for at least 3 more years? do you realize all companies are going to target 1080s for a long-ass time?

some people are very entitled; they always want the best hardware and the latest tech without having to pay a premium for it. I'd say 2080 Tis are for researchers and early adopters (those who like 4K, 5K and 8K gaming) who have the cash to afford them.

We don't actually know whether this is a 2070 or a 2080, though. All we know is that it isn't a 2080 Ti, because the memory doesn't match. If anyone actually bothers to look past the title: the actual model of the card being tested isn't specified in the benchmark result, and the 2070 and 2080 aren't distinguishable by their memory configuration; it's the same.

And that's when you realize that was their plan all along

second this, people feel too entitled to shit they don't need

I mean, I do have a 1080Ti, but that was way over-budget for what it does. Looking forward to CP2077 to make it work somewhat.
I mean, it was nice doing benchmarks for a few hours, but then it's back to Kerbal Space Program and Factorio, so not exactly useful.

The problem with these new cards is that they actually changed the segmentation.
xx70 is last gen xx80. xx80 is last gen xx80Ti, and xx80Ti is last gen Titan.
That pricing is insane. You can buy all consoles for the price of a 2080Ti. Just let it sink in.

then it had better bring down the price of a 1080 Ti. But fuck, we deserve a better gfx card after all these years.

> That pricing is insane. You can buy all consoles for the price of a 2080Ti. Just let it sink in.

they can charge all they want for the new cards; again, no company will be targeting 2080 Ti-tier cards for many years, and people get mad simply because they want to have the most powerful GPU all the time.

have you seen Steam stats? the majority are on 1070-and-below tier cards.

you have a 1080ti so you don't have to worry about anything until 2020-2021.

Now we know you will be buying something more powerful next year, because that's how people work: they want the best shit even if they're only going to be playing Kerbal Space Program the majority of the time.

I initially bought the Ti because 4k gaming.
But it's not much better than my crossfire RX 480s in the games I actually play that require GPU muscle (RotTR, Deus Ex: MD).
And I lost Freesync in the process.
Well at least, it's better than Vega.
Then there are outliers like KCD that barely achieve 60fps even on medium settings.
From what I've gathered, the CP2077 demo was on a 1080Ti, but the video was 4K, so there's a chance I won't have to upgrade for it.

>no company will be targeting 2080ti tier cards for many years
What companies 'target' isn't necessarily relevant. One company might target 720p 30FPS, while another targets 1080p 60FPS, a 3rd instead targets 1440p and a 4th wants their game to run well at 4K. There is no single standard to target on PC, it depends on what hardware each player has and what image quality and performance each player wants.

If a company targets 1080p 30FPS on a mid-range card, that mid-range card won't make you very happy if you want 1440p 60FPS because that's what monitor you have.

>From what I've gathered, the CP2077 demo was on a 1080Ti, but the video was 4K, so there's a chance I won't have to upgrade for it.
If you want 30FPS, maybe. Otherwise you can banish that thought entirely, a 1080Ti can't even hit a consistent 60FPS in Witcher 3 at 4K so I doubt CDPR's next game is going to be less demanding than the one they released in 2015.

>a 1080Ti can't even hit a consistent 60FPS in Witcher 3 at 4K
Well that's a lie, because my crossfire RX 480s did, granted with gimpworks off.

Yes user, if you lower settings the game runs faster, wonderful revelation you've got there. Maybe a 1080Ti will run Cyberpunk at 4K 60FPS with low/medium settings, that would be possible.

Never gonna buy it, but man that card looks sleek as fuck.

Didn't try it on the 1080Ti, but it should be better with gimpworks.
Also, about CP2077, remember: this thing still has to run on consoles.

THANK YOU BASED NVIDIA

FAKE NEWS IT COULD BE A 2070 AS WELL

Attached: 1529770238923.jpg (1494x2048, 628K)

Also, here's my big problem with the 20xx series:
if I spend as much as I did on the 10xx, I'll get the exact same performance.
That's not how generational updates work, and Nvidia is going to lose a lot of money for not realizing this.
I mean sure, they'll sell to brainless fanboys. But how much of the population is that?

> There is no single standard to target on PC,

there are targets. the target for the vast majority of games these days is 60fps at 1080p. you want 4K or 144Hz? you'd better be ready to pay a premium for something better.

> If a company targets 1080p 30FPS on a mid-range card, that mid-range card won't make you very happy if you want 1440p 60FPS because that's what monitor you have.

So? that doesn't mean the company or the card maker is at fault; it's your fault, because you want the best of the best all the time without having to pay for it.

Also, why on earth do people think a card should be able to play all games at the exact same resolution and fps?

If game A performs great on a 1080 Ti but game B is shit on the same card, does that mean the card is shit? in any case, the ones you should be blaming are the companies who create such shitty software.

Witcher 3 also runs on consoles but just barely misses a 60FPS average at 4K with HairWorks entirely disabled. Not only does this mean that it WILL dip significantly under 60FPS anyway, but if you actually enabled HairWorks at max quality it wouldn't even come close to 60 even in an average with even worse dips.

Consoles are going to have the typical console experience with dynamic resolution turning everything to mush in demanding scenes and a cinematic 20-30FPS. On PC you won't want randomly dropping resolution and you will want (at least) 60FPS, getting that at 4K won't be easy.

Attached: witcher3_3840_2160.png (500x330, 17K)

>If I spend as much as I did on 10xx, I'll get the exact same performance.

where did you read that? are you believing the fake reviews? do you really think you'll get the exact same performance on a 2080 Ti?

nobody has published official benchmarks so far, so take everything people say with a huge grain of salt.

Witcher 3 wasn't even made with 4K in mind.

Well, did you look at OP's pic?
I spent €775 on a 1080 Ti.
The 2080 is "from" €850 for somewhat more 3DMarks, but not much.

I'm also inclined to believe it because of the 2GHz clock and the CUDA core counts.

It wasn't demoed with a 1080Ti either.

one minute, please

the card was running at 2GHz and still only beat the 1080 Ti by 6%?

kek JUST BUY IT GOYS I-ITS NOT A FLOP I PROMISE

This is pretty good if it's a 2070, actually.

Have to assume it's ultra-high settings.
Gimpworks gimping its own brand.

>6% faster than a GTX 1080Ti

OH NONONO

That would be pretty good if the price wasn't retarded.

Wait, I fucked up. see: the 2080 is like last gen's 1070.
The only problem is, the pricing didn't change.

It's max with HairWorks entirely disabled:
>In the interest of neutrality, we ran the game with HairWorks disabled.
If they had enabled HairWorks at max quality, it would probably lose 10FPS if not more.

It's a turd

Nope, maybe there were some patches in there, but I never saw it dip below 60 on AMD hardware, mind you.

Oh wait, max settings most likely means 8xAA, which is useless at 4K.

Are you seriously claiming you ran absolutely maxed Witcher 3, including HairWorks, on an AMD card and never dropped under 60FPS at 4K? You'll have to provide some proof because that is not plausible in the slightest as AMD doesn't even have a card as fast as 1080Ti, not to mention one even faster.

a 2080 (NOT 2080ti) being faster than a 1080ti is bad.

lol go back to your basements.

see

I ran maxed Witcher 3 (except Gimpworks) at 4K 60fps on crossfire RX 480s, with 2x FXAA.