
youtube.com/watch?v=s86bsnYbh8I
youtube.com/watch?v=M4uD7tpPSKs

>inb4 you have to compare the RTX 2080 with a 1080 Ti because they're priced similarly

Attached: 1447174425222.jpg (796x805, 119K)

wow it's another boring-ass 20% faster upgrade, only this time it's not the same price.

more like 30%, even 50% in some titles
the "just wait" fags rejoice, we actually won this time
idiots who bought a cheap two-year-old GTX card a few weeks before the RTX release are absolutely SEETHING right now

I wonder how much of the price is AMD providing no competition at the high end, and how much of it is how XBOX HUEG the Turing GPU die is.

Attached: GeForce-RTX-2080-TI-Graphics-Card_2.jpg (700x467, 128K)

his numbers have some retarded variance, his testing methodology is absolutely flawed. Wait for the real benchmarks.

A 40-50% increase in performance for an 80% price increase? Nah, thanks.
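For what it's worth, the napkin math on those numbers (taking the post's claimed ~45% perf uplift and ~80% markup at face value; both figures are the poster's, not benchmarks):

# perf-per-dollar sketch, using the claimed figures above (not measured data)
old_perf, old_price = 1.00, 1.00   # 1080 Ti, normalized baseline
new_perf, new_price = 1.45, 1.80   # claimed ~45% uplift, ~80% markup
value = (new_perf / new_price) / (old_perf / old_price)
print(f"{value:.2f}x perf per dollar")   # ~0.81x, i.e. ~19% worse value

So even on the optimistic numbers, the new card is a worse deal per dollar.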

thing is, things won't change until mid-ish 2020
AMD's GPU lineup for 2019 will basically be refreshed 7nm Vega chips
2020 is when they're set to release their REAL next gen of GPUs

Le 4K meme

Even though I doubt it, I hope they'll drop in price.
~$650 seems reasonable during a time when there is literally no competition.

I think it has mostly to do with the select titles. If you check his other comparisons you'll notice a similar trend between RTX and other GPUs.

>$650 seems reasonable
nvidia cucked you so hard you actually fell for it

Wow, it's like you only read half of my post.
I'm fully aware that these prices are still ludicrous, but it's kind of AMD's fault for shitting the bed with their GPU department and letting Nvidia get away with this.
If you're an enthusiast and want top of the line hardware there is no way around getting bent over by Nvidia and accepting the anal fissure.

4K bottlenecks do not tell the entire story. Wait for 1080p and 1440p on an OC 8700K.

I mean, if they take the process-node savings and reinvest them into Vega's bottlenecks with more ROPs and CUs, couldn't that make them competitive? Especially considering how much of the RTX die is dedicated to double precision, machine learning, etc., which Nvidia is trying to shoehorn into gaming applications with DLSS and ray-traced reflections and soft shadows.

The fact that game developers are leaping onto Nvidia's dick, with raytracing and DLSS coming out soon after the hardware is available, shows how much game devs don't give a shit about AMD. Nvidia dominates absolutely.

>~$650
It's about 900 in my country
>1 USD = 4 Romanian lei

Attached: Screenshot_20180919-130725.png (1080x1920, 297K)

seriously
who gives a shit about 4K performance

I'm saying they are kind of reasonable AFTER the drop to like ~$650, which is probably not going to happen anytime soon.

Normies don't give a shit about 1440p at higher refresh rates. All they want is 4K/60 already.

What games/developers are you referring to? As far as I can remember, all they were showing was Metro and BFV, which are nothing but instruments for marketing their new products. I don't think raytracing will be the next big thing.

Literally <1% of people have 4K monitors.
Most people run at 1080p and probably 10% at 1440p. Like I said, the benchmarks will show how the performance really is.

It will. Just not for a long time.

Nvidia are taking advantage of the crypto bubble burst and no competition. Nothing more.

if you think the prices are
>ludicrous
why did you post
>seems reasonable
you shouldn't give Nvidia any credit

Just like HairWorks, right?

Reasonable in the context of not having competition. What do you expect? Nvidia is a business and they're gonna profit off of every goy stupid enough to buy their products.
What I'm trying to say is, IF you really wanna shell out that much cash, at least wait until they drop in price significantly.

The bubble burst though (for the second time).
I've heard theories, though, that they've priced the RTX 2000 series high relative to its performance so that it won't cannibalize the old stockpiled GTX 1000 cards, keeping AIB vendors happy now that demand has cratered (see the glut of used cards on eBay).

yeah, and PhysX

You're still trying to justify it. Stop talking positively about that cancer of a company.

>capitalism dictates maximization of profits
>nvidia follows suit
>somehow it's AMD's fault for not providing competition, which lets Nvidia get away with these ridiculous prices

If you had a product and could literally name any price imaginable and people would still buy it, I guaran-fucking-tee you, you absolutely would.
I'm not condoning this kind of business practice but it's just the way things are.

Good thing is that these types of companies become lazy, and as soon as a new competitor enters the market and gives them a run for their money, they desperately try to damage control while Jow Forums just watches them crash and burn. See AMD vs. Intel.

never buy the first gen of a new technology or series; let the beta testers do it

This shit is fake because it is based on Nvidia's leaked benchmarks.
Trust me OP, it is much, much worse than that.

The amount of TIME and MONEY it would take to get anywhere near Nvidia's performance would be insane, so I doubt anyone bar AMD could topple Nvidia. Unless some billionaire company feels like giving it a shot for some reason.

>30-60% perf increase
>and that's without any RT stuff
Damn happy I didn't listen to all the AMD shills screeching about 5% increases and bought a 2080 Ti.

inb4 ray tracing is a meme (until AMD announces their own ray tracing stuff, then it will be the future).

Attached: 4563548343253.png (653x726, 34K)

What time is the embargo up? Reviews drop today right?

Why do people buy these cards if they're not playing 4K?
Just buy a poor card, suited for your poor display panel.

>In before they dare to compare the RTX 2080 with a top AMD card.
I-I-I-I CAN'T HANDLE IT. PLEASE STOP MAKING IT HAPPEN!

Attached: ayy.png (623x808, 418K)

>poor display panel.
>says the 4klet
Enjoying that ghosting, 60Hz, 8-bit, TN, or any combination of those? I bought a 2000-series card for the ray tracing, not the 4K performance. I couldn't care less about 4K; I'm using 1440p and not switching for a few years still.

Yeah, I'm fine.
Enjoying your CPU bottlenecking your graphics card?

to brute force every bad optimization possible.

lol do you really play games at 60Hz?

Dude, it's almost 2019; 144Hz has been mainstream for a few years.

Yes, I do, but I don't play FPS, so I don't care.
I'm more about image quality than muh response time, because you know, every 144Hz gaymer is a 'pro-gamer'.
Also, kill yourself. How long are you gonna shill for NoVideo and Intel on this board? Is it your fucking job or something?
For your information, 144Hz only happened because 3D failed and they needed to unload their mountains of panels. Turns out you idiots all bought into it.

I'm locked into Nvidia because I have 2 G-Sync monitors.
Was going to get a new computer with a 2080 but I'll wait for the price drop after they sell all the 1080s.

>4klet ACTUALLY plays games at 60Hz
OH NO NO NO NO

Attached: 1452514484422.jpg (1462x1462, 189K)

Well, 60Hz is fine. When I play FPS games, I don't plan on becoming a Twitch whore or 'making it' to the big scene.
It's alien to most of you, but I do it for fun.

>144Hz just happened because 3D failed and they needed to unload their mountains of panels
I hope you're only pretending to be retarded; I can't go back to 60Hz after experiencing 144Hz.

And I can't go back to 320x2...1080p after playing 4k.

>mfw an actual herzlet talking shit about gaming hardware
enjoy your console tier performance LMAO

He buys AMD and plays at 60Hz.

True hero.

going from 144Hz to 60Hz is harder than going from 4K to 1080p
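The gap the high-refresh anons keep describing is easy to put in numbers; a quick sketch of the raw refresh interval per rate (ignoring sync and frame pacing):

# ms per refresh at common rates; the 60 -> 144 jump cuts ~10ms per frame
for hz in (60, 120, 144, 165):
    print(f"{hz:>3}Hz -> {1000 / hz:.2f}ms per frame")
# 60Hz -> 16.67ms, 120Hz -> 8.33ms, 144Hz -> 6.94ms, 165Hz -> 6.06ms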

From my experience, people don't know what 4K is until they experience it.
Hopefully all of you will realize what you were missing out on in a few years.

lol how big is ur monitor kid?

Retard.
4K is impossible on AMD video cards.

I did experience it. 144Hz > 4K.

How big is your dick, brainlet?
It's 28". And you know what, I have a 27" 1080p panel next to it for comparison, so I can tell you it stays turned off all the time.

Well, you did it wrong, or didn't play a game that benefited from it.
Now, here are the games I play most, and you tell me why I should buy a 1080p 144Hz monitor:
Factorio
Kerbal Space Program

Your display does 9420 pixels per inch per second.

Mine does 19580.

Your monitor is garbage, I guess, is what I'm trying to say.

>1080p 144Hz
Step it up and get a 1440p 144Hz, you fucking poorfag.

My math was off: the screen is at 165Hz, not 160Hz, so it's 20192.
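That "pixels per inch per second" figure works out to diagonal PPI times refresh rate. A quick sketch reproducing both numbers; the 28" 4K/60Hz panel is stated above, while the 24" size for the 1440p/165Hz screen is inferred from the posted figures, not stated:

from math import hypot

def ppi_per_second(w_px, h_px, diag_in, hz):
    # diagonal pixels-per-inch multiplied by refresh rate
    return hypot(w_px, h_px) / diag_in * hz

print(ppi_per_second(3840, 2160, 28, 60))    # ~9441, the "9420" above
print(ppi_per_second(2560, 1440, 24, 165))   # ~20193, the "20192" above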

Buyer's remorse is making you believe only your configuration is correct.
I also have a TV, so I can output more pixels per inch per second if you want.
Or I could play on a 15" panel and increase that metric as well.
You're not very smart, are you?

But why?

<60% perf increase
>100% price increase

Attached: 1446618369833.jpg (419x480, 58K)

>165Hz
Yeah, you're such a victim of marketing.
I guess Quake 3 is a good game.

You are not very smart.

about 8%, wow, it's bollocks

Attached: Screenshot from 2018-09-19 14-16-49.png (1920x1080, 1.11M)

>500W TDP
burn baby burn

>paying $400 for slightly better shadows and reflections
>at 1080p
>45fps in Metro
>60s with dips in BFV
>silky smooth 35fps in Shadow of the Tomb Raider

We're at least two years too early for this shit.
Also, for fuck's sake, it's a cut-down Quadro that Nvidia is forcing on its consumers because they believe there are a lot of retards who will lap it up.

PoE actually.

Attached: penis.jpg (2560x1440, 1.27M)

Here's the thing: NoVideo left the door wide open for AMD to make a comeback in non-ray-tracing workloads.
It's up to them to take it, but the recent 7nm Vega announcement looks like exactly what it is.
Nvidia could have absolutely destroyed AMD, but because all that ray-tracing real estate is useless on their GPU, AMD might be able to compete.

How does 165Hz help you if you're not playing FPS games?

Looks and feels much nicer.

There is an uncanny-valley-type threshold that you notice when you go higher, and there is no going back to 60Hz ever.

But Steve over at GN DID test that. The 2080 still performs within 5fps of the 1080 Ti.

Says the guy who dropped a grand on a tech that's only available in a couple of games and will drop performance down to that of a card half the price.

You realize animations aren't sampled at that rate, don't you?

Even 120Hz is very different than 60Hz

THIS IS AMD'S FAULT

Animations aren't static.

OH NO NO NO

Attached: 1537364151506.jpg (1896x738, 183K)

get with the times
people used this kind of reasoning years ago when high refresh wasn't nearly as popular as it is now. High refresh is nice in ANY game.

>that would be $799 + tips

Attached: rtx-2080-fe-bench-far-cry-5-4k-high.png (810x650, 51K)

Well, I'm of the opinion that we should have had 1080p at 75-100Hz, but because 3D failed, they just shoved this down your anus, and you all took it.
Having to delid and watercool and all that shit is a direct result of this.
Then there are retards who will fucking buy a 10000Hz display just because it exists.

Dude you sound schizo.

the titles and the numbers are literally the same as in the "leaked" Nvidia benchmark showcase...

What game do you achieve a constant 165fps in?
I'm pretty sure even fucking Crysis doesn't pass that.

competitive games like CS:GO, Overwatch, etc. can all do very high frame rates

>buying from pcgarage

>>and that's without any RT stuff
and you think **RT stuff** is gonna increase the performance? or choke down AMD? cause you know AMD already has its own async version in every engine but UE4, right?...

(not to mention that you won't see any ***RT STUFF*** for a long time, cause the DXR path is still in the experimental pipeline without even a release date LOL)

Yeah, that's a good use of higher framerate.
I don't play those games, so why the fuck are you shitting all over my choices?

should I just get a 1060 as a poorfag? it'll be a while before the 2060, which'll probably be expensive anyway, and before the AMD equivalents.
I play at 1080p.

Also, 'competitive'. Keep dreaming.

somebody call nine wan wan

Attached: power_peak.png (500x890, 42K)

The RX 580 is a godsend for poorfags.

A 1060/RX 580 is perfectly fine for 1080p 60fps in just about everything out there.
It also comes in at less than half the price.

Looking at a 2080 review right now.
Why the fuck didn't Nvidia just do a 1080 Ti refresh without the +50% price nonsense?
They're basically the same fucking card for the same price.

you do know that ray tracing MUST be async, right?
plus, if you bothered to read the whitepaper, you'd know that Nvidia only does ray tracing on a single triangle on screen and waits for it to intersect with a ray
meanwhile AMD does things quite a lot differently...
blogs.unity3d.com/2018/03/29/amd-radeon-rays-integrated-into-unitys-gpu-progressive-lightmapper/
but you know, they actually use features they can USE NOW instead of waiting for DXR to come out of the experimental path
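For anyone wondering what "intersect a triangle with a ray" actually means here: the primitive both RT cores and Radeon Rays build on is ray-triangle intersection. A minimal Möller-Trumbore sketch for illustration; this is the textbook algorithm, not either vendor's actual implementation:

import numpy as np

def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    # returns hit distance t along the ray, or None on a miss
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:                  # ray parallel to triangle plane
        return None
    inv_det = 1.0 / det
    s = origin - v0
    u = np.dot(s, p) * inv_det          # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv_det  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv_det         # distance along the ray
    return t if t > eps else None

The hardware's job is running billions of these tests per second against a BVH instead of one triangle at a time.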

Well, if the performance is there, it's still more efficient than anything out there.

Wait.
Are you hoping to do raytracing at 1440p 165Hz?
How much did you blow on that display?

>entire presentation about ray tracing
>entire marketing about ray tracing
>2/3rds of the GPU about ray tracing
Anyone who is surprised by this is a retard; it's been marketed and presented as a ray-tracing card from the start. If you aren't doing ray tracing, why would you buy this? Hell, they even enabled 10/12-bit output on RTX cards; these things are obviously not aimed at video games, even if that was Nvidia's intention.

There's a very good chance the 2080 IS a 1080 Ti with added RTX stuff. That 1/3rd dedicated to raster? I bet my ass it's a 1080 Ti with some modifications.

Is everyone just going to ignore ray tracing and DLSS? Literal game-changer features?

who on earth even told you that?