1080 Ti vs 2080 Ti

imgur.com/a/s3VBq3h

docs.google.com/spreadsheets/d/10XMjq1rjjOvr1g-DeMiRlWGOWaw9k3ElbTUfWJploKg/edit#gid=0

youtube.com/watch?v=tJLzlmLHORY

Attached: benchings.png (2188x1052, 133K)


Literally who

Looks like games that are more compute-bound gain the most; graphics-bound titles gain much less.

What are the numbers on the left? Temps?

looks like they designed it to hit 60 FPS average at 4K ultra

Gigarays.

DELETE THIS
DO THE NEEDFUL SIR

Attached: 1518256744102.png (653x726, 35K)

Fake and Gay

Yeah niggers but how do they compare in render times? EEVEE, Octane, VRay, I want to see some numbers that are actually meaningful.

Skeptical this is real: zero shots of the GPUs, and it would be an NDA violation. This would be easy to fake; in other videos he shows off the hardware, but in this one he doesn't. He could be making fake bench graphs for views and have no partnership with NVIDIA at all.

On the other hand, the channel does have some hardware reviews, 100k-view videos in some cases, and 90k subs (which could all be faked).

Also, how can he have Battlefield V benchmarks? The closed beta is over and the open beta hasn't begun yet, correct?

If he was able to benchmark Battlefield V, why not Tomb Raider? Could be fake.

You're talking to random /v/ermin shitters who don't understand that FPS is a meaningless metric for anyone who actually cares about performance.

Render times in ms are all that matter.

Why are we getting graphs for shit titles? I would rather see how well it could run Nier Automata with the AA maxed out.

You right now

Attached: mount stupid.png (613x481, 33K)

No, render times in ms tell you far more of the story. FPS is completely useless because it scales non-linearly, and its only useful property is that it's easier for an absolute brainlet to understand.

The moment you touch a graphics engine even once you'll quickly see why FPS is such a useless metric.

They're literally just inverses of the same metric. The only difference is in the ambiguity of whether you're seeing instantaneous or sliding average framerate, while render times are all instantaneous. Saying you want consistency in frametimes is exactly the same as saying you want consistency in framerate.
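
To make the instantaneous-vs-sliding-average ambiguity concrete, here's a minimal Python sketch (the frame times are invented for illustration, not measured): the same log gives noticeably different numbers depending on which "framerate" you compute.

# Sketch: one frame-time log read as instantaneous FPS vs. a 30-frame sliding average.
# A single 50 ms hitch is obvious in the former and smeared out in the latter.
frame_times_ms = [16.7] * 60 + [50.0] + [16.7] * 60

def inst_fps(ft_ms):
    return 1000.0 / ft_ms

def sliding_avg_fps(times, window=30):
    # Average FPS over the last `window` frames at each point in the log.
    out = []
    for i in range(len(times)):
        chunk = times[max(0, i - window + 1):i + 1]
        out.append(1000.0 * len(chunk) / sum(chunk))
    return out

print(min(inst_fps(t) for t in frame_times_ms))  # 20.0 -- the hitch, plain as day
print(min(sliding_avg_fps(frame_times_ms)))      # ~56 -- the same hitch, averaged away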

over double the price, for a 20% gain.

I'll skip this one.

>No, render times in ms tell you far more of the story

lol no it doesn't

FPS is used for performance comparisons like these because it's a unit that is relevant to users

lol people trying to sound smart.

your 1080 Ti sucks compared to a card worth $100 more. deal with it.

>get 100 frames in the first 20ms
>101st frame takes almost a second to render
M-m-muh 100fps!
Frame latency is king.

Attached: 1401165966291.jpg (250x250, 33K)
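
The greentext above is extreme, but its arithmetic checks out. A quick sketch using the post's own numbers (illustrative, not measured):

# 100 near-instant frames followed by one ~1 s stall: the average looks fine,
# the worst frame tells the real story.
frame_times_ms = [0.2] * 100 + [980.0]

total_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_s
worst_ms = max(frame_times_ms)

print(f"avg: {avg_fps:.0f} FPS")    # avg: 101 FPS -- "M-m-muh 100fps!"
print(f"worst: {worst_ms:.0f} ms")  # worst: 980 ms -- a visible one-second hang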

>roachtech

>being technically correct but unable to accurately describe the concept he's talking about
the very definition of mount stupid

Attached: perfection.gif (397x360, 55K)

Yes, they are, but the difference is that FPS isn't linear and is massively misleading.

If I am considering two graphics cards and one averages 145 FPS while the other averages 155 FPS, then the average frame time difference is only ~0.4ms.

Now, if you later see a benchmark with the same avg. frame time delta between two cards down around 60 FPS, the faster card averages ~61 FPS while the "slower" one averages 60 FPS, which is absolutely negligible. The problem is that some moron is going to say "it's 10 FPS slower in x game," completely ignoring the fact that FPS scales non-linearly.

>unit that is relevant to users
Except when it isn't, right? Like outlined above.

The only "relevance" it has is that you want a card that averages 60 FPS on your 60 Hz display at its native resolution in the games you play.
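
For reference, the arithmetic from that post as a tiny Python sketch (same numbers as above, nothing new assumed):

# A fixed frame-time delta maps to wildly different FPS deltas depending on
# where you sit on the 1000/ms curve.
def ms(fps):
    return 1000.0 / fps

delta_ms = ms(145) - ms(155)               # the "10 FPS slower" gap up high
print(f"{delta_ms:.2f} ms")                # 0.44 ms

faster_fps = 1000.0 / (ms(60) - delta_ms)  # the same ~0.44 ms gap down at 60 FPS
print(f"{faster_fps:.1f} vs 60 FPS")       # 61.6 vs 60 -- same delta, looks tiny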

>Twice the die size also with three extra special cores
>still can't 60FPS 4K
>futureproof like a DeLorean

Nvidia, what happened. You were supposed to deliver us greatness.
You feed us gimmicks to upsell 7nm.
You're going to raise next-gen GPU prices even higher.

>+$100 a year keeps the gaymers queer

>Oh fuck, my frametime is above 16.66ms!
Said nobody

Actually, that was a terrible example and I feel bad.

The point I'm making is exactly that a tiny frame time difference is borderline irrelevant, yet looks absolutely huge once you get to really high framerates, despite being the same tiny frame time difference.

You're sort of just agreeing with the point I'm making here.

you're both dumb and arguing over incongruous frivolities while Nvidia is preparing us for an ever larger dildo with even less lube.

>Nvidia, what happened.

AMD threw in the towel.

>you're both dumb and arguing over incongruous frivolities
I don't consider it that, because all the time I see stupid fucking brainlets talk about "20 FPS slower" as a huge deficit when both cards are already over 150 FPS.

Yes, FPS is most definitely misleading.

without gaytracing it's literally 60 FPS, you turd. Look at the OP.

This is FAKE, all the results are fabricated.

> No footage of a 2080 Ti shown in the video.
> BFV could not have been tested with a 2080 Ti: the closed alpha ended before the RTX announcement and the open beta has not begun, so it is impossible to run BFV right now.
> The recent BFV footage posted by YouTubers/reviewers was played and recorded at Gamescom, where they were allowed to keep it. It had no benchmarks and was running ray tracing.
> Going against the NDA would be serious business.
> The BFV "benchmark" shows BF1 gameplay in the video's preview panel.
> The benchmarks seem suspicious and do not correlate with what is known about past 1080 Ti FE benchmarks.
> This channel never had any Gamescom coverage; it's kind of odd they have a 2080 Ti this fast with only 100K subs, when only much larger channels are typically given such access.

Here's the thing I will never understand about these graphs: every time a new GPU is released, the comparisons between the previous and the new flagship always seem so skewed.
Like, at what graphics settings do you need to run PUBG on a 1080 Ti to not even hit 45 FPS?
I feel like every time I see these graphs they want me to believe that my 1080 Ti is somehow insufficient versus the new top-of-the-line GPUs, when I run my games at 1080p and max out pretty much every game at 60-100 FPS.

>implying an nvidia graph would start from 0
fake and gay
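
That suspicion is easy to demonstrate. A quick matplotlib sketch (the two bar values are just numbers in the ballpark of the OP's chart, not real measurements):

# Same data, two y-axes: starting from zero vs. truncated marketing-style.
import matplotlib.pyplot as plt

cards = ["1080 Ti", "2080 Ti"]
fps = [60, 74]  # ~23% apart

fig, (honest, marketing) = plt.subplots(1, 2)
honest.bar(cards, fps)
honest.set_ylim(0, 80)      # from zero: the gap looks like what it is
honest.set_title("from 0")

marketing.bar(cards, fps)
marketing.set_ylim(55, 75)  # truncated: the same gap looks roughly 4x bigger
marketing.set_title("truncated")

plt.show()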

Fake and gay. Actually wait for benchmarks in September and not these rumored leaked (((benchmarks))).
My 1080 Ti will be okay for the next decade or so unless Nvidia gimps the shit out of it with their drivers.

This "review" is so bogus it hurts my feelings to even have to explain it. 4K at ultra? 70something FPS in BFV? With RTX turned on you say?

hothardware.com/news/battlefield-v-ray-tracing-hitting-sub-30fps-4k-geforce-rtx-2080-ti

The 2000 series is a showboat failure. Even by this generous fake review, the 2080 Ti can't maintain 60 FPS at 4K/max across the board in old games. The 2000 series is setting up a price hike for the 7nm generation. RTX is a new, unoptimized mess which essentially adds a blurred fill-in filter to compensate for the supreme lack of rendering power.

This shit is obsolete from day 1; it's a gap filler. It can't stand the test of time because it can't perform even on old games, god forbid the newer bleeding-edge titles coming through.
If you buy a 2080 or 2080 Ti, you either have more money than sense or you're setting yourself up for a massive dose of buyer's remorse in 12 months.

Get this weak shit out of here.
Why would all the major review sites not have a single review article/video the day the cards can be bought? Why is the only reporting from a "hands-on" review event set up and controlled by Nvidia?

Could it be that RTX is a frame-killing gimmick undeserving of anyone's attention?

Looks believable and about what I expected. Meh if true.

However, we don't have any confirmation this is real, since the cards are not out yet.

So, uh, how did they get one? Cards haven't even been sent out to testers yet, unless Nvidia gave them closed-room demos and let them benchmark?

They probably did what I did for my "benchmark": pulled some numbers out of their ass and put them into Excel.

Attached: Capture.png (495x446, 7K)

twitter.com/VideoCardz/status/1035246270548504577

>People need to realize there are many *disappointed* tech-journalists/influencers who did not get, and will not get a Turing sample. Such *leaks* are often made to annoy NVIDIA, rather than share something informative to discuss.

>Well, the rule of the thumb: if someone is not showing you the cards, *whispers* they probably never had them in the first place.

Keep on posting the fake benchmarks, FUDers

You're not stopping Turing from selling out.

>100% more expensive
>only 37% more performance on average
lol
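
Taking that post's numbers at face value (the two ratios below are its claims, not confirmed figures), the perf-per-dollar math is one line:

# "100% more expensive" and "37% more performance on average", per the post above.
price_ratio = 2.00
perf_ratio = 1.37

print(f"{perf_ratio / price_ratio:.2f}x perf per dollar")  # 0.69x -- a regression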

> Screencap'd
I just recently downloaded "real time" ray tracing suites and SDKs. When you actually come to understand how this works and performs, you understand how much bullshit has been laid on top, including the performance numbers. Imagine how beta everything is when they only gave game developers two weeks with these cards, such that they didn't even implement the ray tracing pipeline in the right order.

Attached: 1532286432649.gif (300x186, 446K)
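
For anyone unsure what "pipeline in the right order" means here: a rough conceptual sketch of one hybrid-rendered frame. The stage names and ordering below are a generic illustration, not any specific engine and not the actual DXR API.

# Conceptual only: where ray tracing typically slots into a raster frame.
def rasterize_gbuffer(scene):    return {"depth": ..., "normals": ...}
def trace_rays(scene, gbuffer):  return {"noisy_reflections": ...}
def denoise(rt_output):          return {"clean_reflections": ...}
def shade(gbuffer, rt_clean):    return "lit frame"
def post_process(frame):         return frame + " + TAA/tonemap"

def render_frame(scene):
    g = rasterize_gbuffer(scene)  # raster first: the rays want depth/normals
    rt = trace_rays(scene, g)     # sparse, noisy ray-traced effects
    clean = denoise(rt)           # denoise BEFORE shading, not as an afterthought
    return post_process(shade(g, clean))

print(render_frame("scene"))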

The rule of thumb when launching a product and putting it up for sale is to clearly detail its capabilities and performance ...
> SO
people don't have a chance to shit on your product. That is, if it actually is as amazing as you say and not dogshit like the FUD claims it to be.

We hit the 14/12nm brick wall.

with all of the fuckups occurring beyond 14nm and the overstated gains, it seems we sure did. 7nm was dropped by GloFo; TSMC is the only one going ahead, and there's no info on yields. Intel's 10nm is a disaster, and Micron is saying their smaller-process NAND flash is getting 50% yields. Time for MCM and a larger summed footprint. Enough with the monolithic dies and the stupid costs/prices therein. Split this shit out into separate chips and connect them with a PHY (pick your poison: Infinity Fabric, NVLink, team blue's link), whatever. No one cares.

>falling for the meme again

Attached: 970.jpg (625x352, 72K)

What other options do you have at this point? AMD is a fucking joke. I wish they weren't because I love competition in the marketplace, and have freely swung between ATI(AMD)/Nvidia and AMD/Intel for years, simply opting for the best performer at the time. Now, unfortunately, there's no real alternative to Nvidia if you give any amount of shit about performance.

Official drivers are not out until the NDA lifts. Anything you see is mostly FUD.

If you're happy with midrange though, the RX 580 is a very nice card.

Get a GTX or an RX. RTX is 80% more money for 20% more performance; why would you do that?

How the hell is this FUD if it shows that the card is good?

The FUD comes from the fact that none of it is verified and could just as easily be someone making up numbers.

hey bro, those red bars are higher than those blue bars
wow bro

based and redbarred