Nvidia RTX = bigger flop than VEGA

I just watched the HardwareUnboxed review.

Paying double the money for 10 more frames in the "Average FPS" category while having the same 1% minimums as the 1080Ti is a big yikes for me

also the 2080 is exactly the same performance as the 1080Ti rofl, but you're paying like 200 bucks more for the 2080

and Shadow of the Tomb Raider can't even hit a 60FPS average on the 2080Ti (50FPS on 1% minimums) (AND THIS IS BEFORE THE RTX PATCH THAT'S COMING IN NOVEMBER AHAHAHAHAHA. IMAGINE HOW MUCH FPS YOU GET WITH NVIDIA RTX ENABLED IN THAT GAME, LIKE 20FPS AT 4K ROFL)

Attached: 2018-09-19 16_46_04.png (1335x956, 432K)

Are people looking at the same reviews? All I see are stunning improvements over the previous generation.

Stop cherrypicking """"Bad"""" results.

>being this asshurt that you wasted 1400 bucks for 10% more performance over 1080ti which costs half the price of 2080ti

stay butthurt Nvidiot

>20-30% performance increase now
>will be 50+% with DLSS
What's the problem here?

So you want me to buy Vega?

at 70% of the price?
either way it still can't hit 60fps consistently

Attached: 4859b0aaebe01286c4ac00be07b484ec.jpg (564x767, 70K)

don't forget it uses 100 more watts as well.

Power consumption and money

Holy SHIT

Fucking HOUSEFIRE up in this BITCH

>350 watts (that's 100 more watts than the 1080Ti's 250W, for 10% more FPS)

and this is WITHOUT THE RT CORES BEING USED. With ray tracing and DLSS involved (if the game supports it) power consumption will probably be like 480 watts rofl

HOUSE
FIRE

I want to ask all the "lol vega housefire" cunts to share their thoughts about the RTX nuclear bombs KEK

Attached: 2018-09-19 17_16_08.jpg (1298x848, 135K)
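
Napkin math on that claim, using only numbers from this thread (250W board power for the 1080Ti, the 350W peak claimed above, +10% average FPS; these are the thread's figures, not measurements):

baseline_power = 250   # W, GTX 1080 Ti board power (its official TDP)
claimed_power = 350    # W, the peak figure claimed in this thread
fps_gain = 1.10        # +10% average FPS, also from this thread

# Relative performance per watt: FPS ratio divided by power ratio.
perf_per_watt = fps_gain / (claimed_power / baseline_power)
print(f"relative perf/W vs 1080 Ti: {perf_per_watt:.2f}")  # ~0.79

If those peak numbers were typical, perf/W would be ~21% worse; TechPowerUp's measured averages (quoted further down) say the opposite, so treat 350W as a worst case.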

So that's why Nvidia told Shadow of the Tomb Raider developers to delay the RTX patch to November.

How much die area do the RT cores take up? Like 40%? That's gonna add a fuckhuge amount of power draw when RTX is ON :DD

If you have a 1080TI keep it, if you don't and need to upgrade buy a 1080TI preferably from some retard who "upgraded" to a 20xx card

DLSS is looking like almost as big a meme as primitive shaders

No

Number of games taking advantage of DLSS: 0

>stunning improvements
The last two gens from Nvidia had the 70-series card match the prior 80 Ti, with the new 80 Ti besting the prior gen's 80 Ti by about 50-60%.

This gen, the 80 only BARELY matches the prior gen's 80 Ti in SOME cases, all while sporting 3GB less RAM and underperforming at 4K. Meanwhile the new 80 Ti only beats the prior gen's 80 Ti by about 30%.

All with a huge premium. The 2080 is 100 more than the 1080Ti, for less 4K performance and slightly better 1440p performance. Meanwhile the 2080Ti is 700 more than the prior gen 1080Ti. Nvidia claims this is due to RT, and that this is an early adopter tax of sorts. However, RT is vaporware, with zero games enabling it at launch, even titles that NVIDIA demoed RT with.

you have to be one HUGE team green fanboy to swallow this load of shit.
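
Putting that post's own figures into numbers (the 50-60% historical jump and the ~30% Turing actually delivered; both are the post's claims, not mine):

prior_uplift = (1.5, 1.6)   # historical 80 Ti -> next 80 Ti gain, per the post
turing_uplift = 1.3         # what reviews measured for the 2080 Ti vs 1080 Ti

low, high = prior_uplift
shortfall = [(e - turing_uplift) / e for e in (low, high)]
print([f"{s:.0%}" for s in shortfall])  # ['13%', '19%'] short of the trend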

Vega is a housefire and performs like shit, so who's going to be convinced by this?

>20-30% performance
>80% price increase
>0% of games support DLSS
Oy vey, stop kvetching and BUY, goyim!

Attached: Nvidia logo.gif (600x338, 3.41M)

Oof

Attached: Screenshot_20180919-164251.png (1440x720, 371K)

techpowerup.com/reviews/NVIDIA/GeForce_RTX_2080_Ti_Founders_Edition/38.html

>NVIDIA has once more made significant improvements in power efficiency with their Turing architecture, which has roughly 10-15% better performance per Watt compared to Pascal. Compared to AMD, NVIDIA is now almost twice as power efficient, and twice as fast, at the same time! The red team has some catching up to do, as power, which generates heat, which requires fan noise to get rid of, is now the number one limiting factor in graphics card design.

Don't believe the lies of AYYMDPOORFAGS trying desperately to save their dying, nearly bankrupt company

AMD have no answer. Navi is exclusively for Sony's use. Vega is such a turd that AMD's next generation consumer GPUs are rumoured to be derived from Polaris, and those will barely, BARELY, compete with the 2030 or 2050 GPU.

Maybe Intel is our last hope for any competition.

>"nearly bankrupt company"

I've got a green team card, but please, they're in the middle of bankrupting Intel atm with a 300% stock surge.

The answer is to stop chasing the dragon, you retard. Even an RX 580 or GTX 1060 is enough to thoroughly enjoy every single game currently on the market. Stop lusting after having every pointless, performance-sapping option maxed out and providing so little difference in image quality that you can't notice it without a side-by-side comparison.

youtube.com/watch?v=A8VrFUi79yo

>just WAIT for wonder driver

Attached: 6467474764.png (216x432, 163K)

>vega 56 faster than 1070
truly fine wine technology

So after reading/watching reviews: they're decent upgrades, but reviewers only had very short windows for heavy testing? And the value for money is questionable?

>14W more at idle than 14nm Vega, despite being on 12nm
JUST FUCKING LOL

Learn English pajeet

>4k
Now witness nvidiots spam 4k Benchmarks because it makes their shitty card look better

Why are modern games so fucking hard to run?
The graphics haven't gone up by that much compared to 5 years ago but somehow even a 1080 Ti can't run these games at 60fps anymore.
Are game devs just getting lazy and assuming everyone is running multiple 1080 Ti's these days?

At this rate we'll never be able to run games at 8K

Attached: deku.png (900x720, 287K)

Shitter here. Any rumors on the 2050 yet? Do you think they'll even throw a bone to us entry-levelets?

+100 watts +cost. KEK

>he's too poor to own a nice 4K screen

Attached: 9265445.jpg (716x716, 110K)

Optimization. Devs don't have to make sure hard copies work by themselves anymore.

>paying for pixels you can’t even see

Me and my RTX friends can enable RTX and DLSS features to get a real life-like experience while you guys are stuck in the past with MUH MILLION FPS and fake shadows and reflections... C'mon, it's time to move on! No one is even going to watch your non-RTX stream anymore because the new standard is already here and now. Don't be stubborn, support gaming and technological progress with Turing!

What, did you genuinely expect nVidia to crank out miracles?
Well, too bad, have a GeForce3 2.0.

H1 2019.
Might as well wait™ for 7nm shit then.

>B-but everyone knows Vega is a housefire!!!
>W-who's going to be convinced by this???
Nvidiots can't even face the truth

We're talking about monitors and not TVs, idiot. You actually use a monitor close up.

cope harder poorfag

Not close enough to where 4K matters.

I'd rather have a high framerate with my 160hz monitor. You'd get smashed with your 30fps RTX meme buddyroo.

Phones get to that point with 400+ DPI at 1080p on a 5.5" screen. A 4K 27" display comes out to a whopping 163 DPI. Distance from the device is pretty similar, you're full of shit.
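
Those DPI numbers check out, assuming standard 16:9 panels (diagonal pixel count divided by diagonal inches):

from math import hypot

def ppi(w_px, h_px, diag_in):
    # Pixels per inch = diagonal resolution / diagonal size.
    return hypot(w_px, h_px) / diag_in

print(ppi(1920, 1080, 5.5))   # ~400.5, the 5.5" 1080p phone
print(ppi(3840, 2160, 27.0))  # ~163.2, the 27" 4K monitor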

My face-to-screen distance is three feet; 4K wouldn't be differentiable for me. I'd need to get my face closer than is comfortable for there to be a difference. Then there's the issue of Windows 10 having shit scaling at 4K, which means I'd have to run the monitor at 1080p when not gaming.
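
For what it's worth, the angular math backs this up (163 PPI from the post above, the 36-inch distance from this one; ~60 pixels per degree is the commonly cited 20/20 acuity figure):

from math import tan, radians

def pixels_per_degree(ppi, distance_in):
    # One degree of visual angle spans 2*d*tan(0.5 deg) inches at distance d.
    return ppi * 2 * distance_in * tan(radians(0.5))

print(pixels_per_degree(163.2, 36))  # ~103 px/degree, well past the
                                     # ~60 px/degree a 20/20 eye resolves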

>Three feet
That's not normal

I’m 6’6 so I need to be further back for my elbows and legs to be comfortable. And I run two monitors so the extra distance lets me see both of em.

You don't have to be an AMD fanboy to recognize that the prices of the RTX cards are horse shit.

Nvidia needs to release a way to enable DLSS in any game and let users train their own NN.

>2018
>be adult
>care about Video games

Rip your life

The only sane people at Nvidia are the developers and other IT guys.

Real talk: can ray tracing be utilised for Minecraft shaders?

Am I the only 1080p-fag here?

tsmc 12nm is just 16nm++++++

Literally no performance improvement whatsoever.

Which would be fine if they sold them in the correct price tiers.

So if you were paying $500 for the 2080 (1080Ti performance)
and $700 for a 2080Ti, then this would be an OK upgrade generation.

A 2070 at $350 would be the top seller.

But they are MORE expensive than previous generations that had the same performance.

They are doing this to force you to buy up old 1000 series stocks

after that they will lower the prices to the same price that 1000 series was on release.

Its a massive rip off and ray tracing is absolute bullshit.

AMD can easily implement ray tracing on their cards; all they need is extra compute power to push it alongside normal rendering in-game, which is easy to do as long as they move to an MCM design with Navi.

The lowest-tier Navi is at GTX 1080 performance in consoles, which means it's heavily heat- and power-restricted, so a desktop variant of that same chip clocked up will be closer to the 1080Ti, and it will be MUCH cheaper.

Attached: radeon vegana.jpg (1280x450, 26K)

Navi is not an MCM design.

Rumors are all over the place about Navi. Even if the initial designs for AMD's next gen are just a 4096-core Vega with tweaks slapped onto 7nm, that will still be a 1080Ti-equivalent card, which we now know is identical to the 2080.

>Rumors are all over the place about Navi.
Their fucking CTO said that MCM is still far away.
>Even if the initial designs for AMD's next gen are just a 4096-core Vega with tweaks slapped onto 7nm
It's not.
Gfx10 = another new uArch.

'course you can't see them if you don't own a 4K monitor

Who are the bigger Jews here:
Nvidia
Cryptominers

OPEN

Nvidia for sure. Crypto and decentralized currencies literally give (((them))) nightmares. Could you imagine if banks became obsolete?

Ahh. Great to see a civilized discussion happening.

So where are the ray tracing benchmarks, so everyone can test your much-touted feature, NVIDickyafags?

The human eye can't see more than 720p.

Guess I'm not human then.

Attached: 1536912626725.jpg (222x227, 5K)

Or you're just deceiving yourself.

I can't believe it's come to the point where people make videos telling you to lower your settings to make the most of your hardware, something that was common sense previously.

Attached: 1522839673821.jpg (447x589, 47K)

They never even got to use the Turing cores. This number will go up.

Attached: 1536980018197.jpg (960x540, 56K)

yeah nothing could possibly go wrong in this scenario

Someone do an image of a house (rtx off) then next image (rtx on) with the house on fire.

/Thread

Under this logic, the following entertainment mediums are also unacceptable for adults:
>music
>art
>cinema
>fictional literature
after all, video games are nothing more than a modern synergy of various traditional art forms.

Attached: 1511705730037.jpg (1416x668, 163K)

ATI GPU drivers on Linux were absolutely atrocious back in the day, whereas now they're fairly open source and in the mainline kernel, while Nvidia now has atrocious Linux support.

Unironically intel iGPUs are the most freedom respecting.

it's a shame how cyclical things are

But stallman still recommends older nvidia gpus like kepler iirc because they can be used with nouveau and firmware that gets loaded by the kernel I think.

Bro, you're fucking blind, get some glasses. 4K even at 3' on an 18" screen still has giant pixels; you can see all three separate subpixels. 27" is so huge you don't even need to set DPI scaling.

Is this true? I remember Nvidia having the better drivers for Linux and ATI being absolute dogshit.
>i haven't used linux in around 15 years though

>and firmware that gets loaded by the kernel I think
firmware that doesn't get loaded by the kernel*

Until very recently Wayland wouldn't even start up with an Nvidia GPU. Nvidia was insisting that devs support their special snowflake rendering APIs instead of the community developed APIs, but eventually the community caved. Support is still shit though.

AMD has completely open source drivers

Looks like my 1080Ti purchase was worth it. I'm set for a while.

*7nm arrives for gpus*
nothing personnel kid

AMD have had a road cleared for them to take over.

Watch them fuck it up.

Does it need a separately trained NN for each game? Or can a generic one be made that would work on all games? Edges are edges, and the result doesn't need to be absolutely perfect, just better than FXAA.

Nvidia has a strong history of doing things that fuck over AMD, but now they've released Turing and it's less of a leap forward than it should have been, the ball is in AMD's court.
Sadly AMD also has a strong history of doing things that fuck over AMD, so it would be no surprise if they utterly botch this.

These cards really are disappointing. The pricing is ridiculous for the performance offered. Even if they sold at the same price as the Pascal equivalents, they would still be underwhelming and nothing to write home about; as they are right now, it's just ridiculous.

To top it all off, raytracing is nowhere to be seen at this point and DLSS is basically a fucking scam. DLSS literally renders the game at a lower resolution and then upscales it into a blurry mess. They are trying to pass their shitty image upscaler off as some novel AA solution, when the result is just blurry shit with all fine detail in the image obliterated.

AMD BTFO

Attached: RTX.png (1070x648, 9K)

Hardware Unboxed was shilling it before it even released by saying "they have seen things you haven't", etc. Not giving them a view.

Attached: ff.jpg (1080x1080, 167K)

>Insane price increase
>nearly 2.5 years later
>equal performance

GTX 1080 launched at $549 but generally sold at $499 its whole life
GTX 1080 Ti was $699 but sold around $729 its whole life
RTX 2080 is $799 but is selling for $849?
RTX 2080 Ti is $999 but selling for $1249?

We expect 30-50% performance gains per generation for same-priced products.
The 2080 should be $599 at most.
The 2080 Ti should be $799 at most.
The worst part is these will sell and be sold out for the next 6 months

Attached: 25501876294_91f6466627_h.jpg (1600x968, 705K)
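
Relative performance per dollar at the street prices quoted in that post (1080Ti as the 1.00 baseline, 2080 about equal to it, 2080Ti about +30%; all the thread's figures, not a benchmark):

# Performance per $1000, using this thread's street prices and uplifts.
cards = {
    "GTX 1080 Ti": (1.00, 729),
    "RTX 2080":    (1.00, 849),
    "RTX 2080 Ti": (1.30, 1249),
}
for name, (perf, price) in cards.items():
    print(f"{name}: {1000 * perf / price:.2f} perf per $1000")
# 1080 Ti ~1.37, 2080 ~1.18, 2080 Ti ~1.04: every Turing tier is a
# worse deal per dollar than last gen's flagship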

Attached: 1535133199016.jpg (562x530, 30K)

I like how they make single-digit differences look so big.

Attached: DneIEsLVsAA_Ipa.jpg (1199x769, 182K)

>DLSS isn't injectable
It's useless, and having an RTX 2080Ti means I can throw everything at 4K maxed-out settings with just FXAA on.

>GTX 1080 launched at $549 but generally sold at $499 its whole life
>GTX 1080 Ti was $699 but sold around $729 its whole life
What rock have you been living under? Not long ago you were lucky to find a 1060 for those prices.

Do you even need DLSS at 4k?

cuckpin

Has ANY REVIEWER made a comment on this? Digital Foundry maybe?

DLSS renders the image at a lower resolution (50% of the pixel count) and then upscales it to native screen resolution. It's literally an image upscaler. The result is blurry and lacks fine detail. The performance increase comes from rendering at a much lower resolution; there's no magic, and the image quality suffers.

DLSS 2X actually renders at native resolution and then attempts to produce an image similar to 64x SSAA. This is a true AA solution, but it will not come with a performance boost, just less aliasing. There is also a significant disadvantage to both DLSS variants: they need to be implemented in each game individually, and you need to download neural-net data through GeForce Experience in order to use it.
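
If the 50% pixel-count figure above is right (Nvidia hasn't published an exact ratio, so treat it as this thread's assumption), the internal render resolution for a "4K" DLSS frame works out like this:

from math import sqrt

def internal_res(w, h, pixel_fraction=0.5):
    # Scale each axis by sqrt(fraction) so the total pixel count hits the target.
    s = sqrt(pixel_fraction)
    return round(w * s), round(h * s)

print(internal_res(3840, 2160))  # (2715, 1527): what the GPU actually renders
                                 # before the network upscales it to 3840x2160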