Is the rtx 2080ti really worth it?

Attached: 6291646cv1d.jpg (1616x1518, 173K)

sure, why not.

can you even afford it

If you have money to throw in the trash, there's no other GPU that does the 4K meme better for gaming.

not now but sometime next month i will

No. 7nm GPUs are coming soon (tm). There's little point dropping $1.2k+ on a 12nm GPU at this time.

Yes, if you're a multi-billionaire with a private jet and a big mansion in the Hollywood Hills

Buy amd GPU please.

is there even an AMD gpu that's equivalent to an rtx 2080ti?

It is the only option if you want to go 4K. I would still not recommend paying so much for a GPU, it is beyond stupid.

if you're buying one might as well buy 2 desu

WAIT FOR NAVI 20
A
I
T

F
O
R

N
A
V
I

2
0

what about sli 1080tis?
that would cost as much as a 2080ti

it's not worth it then, there's much more important stuff to buy.
start small (rx 580s are dirt cheap and perform like a 1060 6gb), then work up

Define your criteria for equivalence.
Answer might be a solid maybe.

SLI is meme tech
2080ti > 2x 1080ti

Just buy a 2080

I'd say that my vega 56 is twice as good as a 2080ti at gulping power

2080 is literally just a slightly faster 1080ti
is NVLink any better?

>is NVLink any better?
Sure, it's technically better, but the root of the issue is lack of developer support in most newer games. It's only actually worth it if you're doing real work on software that uses more than one GPU.
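If you're curious whether your own pair of cards can actually see each other over NVLink/PCIe peer-to-peer, here's a minimal sketch using the standard CUDA runtime API (illustration only, error checking omitted):

#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int n = 0;
    cudaGetDeviceCount(&n);
    printf("found %d CUDA device(s)\n", n);
    // peer access is directional, so check every ordered pair
    for (int a = 0; a < n; ++a) {
        for (int b = 0; b < n; ++b) {
            if (a == b) continue;
            int can = 0;
            cudaDeviceCanAccessPeer(&can, a, b);
            printf("GPU %d -> GPU %d peer access: %s\n", a, b, can ? "yes" : "no");
            // if supported: cudaSetDevice(a); cudaDeviceEnablePeerAccess(b, 0);
        }
    }
    return 0;
}

Point being: compute code has to opt in to multi-GPU explicitly like this, which is exactly why most games never bother.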

if RTX turns out to actually be useful you're SOL, because a 1080 ti trying to run rtx isn't even comparable to a 2060.

Though I think RTX is mostly a meme, if you can find a 1080 ti for $400, go for it

the upcoming playstation and xbox consoles plan to utilize it, so it might be a good investment
idk really

Please sir do the needful and wait for Navi

Attached: be978a985e440c3a9680f0e4b5094f09da65ce52577626caf7f11f7b23b3501c.jpg (800x411, 152K)

If that's true then go for the 2080, tech trends go where normie money flows

No.

Radeon VII is their top end card, same price and performance as the 2080. It is slower in some areas than the 2080, though.

For GAYMEN definitely go with NVIDIA.

I went with nvidia because I don't game.

>hahaha look guys I posted the India meme again xdddd
Fucking forcing so hard, my man.

You using CUDA?

NVIDIA OpenCL perf has always been dogshit.

I just got a Radeon VII and it's a pretty damn good GPU, especially water cooled. Otherwise it runs hot like my asshole after Mexican food.

It's a beast computationally (my daily work requires this) and I GAYME (forgive me Jow Forums) at night sometimes after work and it runs it fine at high frame rates.

God damn poojeets, log off please.

>You using CUDA?
Mostly. It's pretty much the reason why I switched brands. I have found myself less irritated with everything, and somewhat zenlike.

CUDA is very mature, but NVIDIA is a cunt for crippling OpenCL performance, basically trying to kill it since it's an open standard.
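For anyone wondering why compute people keep defaulting to CUDA: the hello-world really is this short. A minimal SAXPY sketch (unified memory so there's no manual copying; error checks left out to keep it readable):

#include <cstdio>
#include <cuda_runtime.h>

// y = a*x + y, one GPU thread per element
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));   // visible to both CPU and GPU
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();                    // wait for the kernel before reading
    printf("y[0] = %.1f (expect 4.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}

The OpenCL equivalent needs platform/device/context/queue/program/kernel boilerplate before a single line of kernel code runs, which is a big part of why it lost mindshare.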

Historically AMD made cards that were better computationally, but OpenCL hasn't improved in a while.

Hopefully Vulkan (which is supposed to replace OpenGL and OpenCL) will take over and get better development.

All these issues aren't my problem, and I'm not willing to fight a fight that isn't mine. I just buy what works.

RTX Titan

nm has never meant shit in terms of better gpu pricing. Moving to a smaller node just gives you better performance per watt.
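For what it's worth, the standard reasoning: dynamic switching power goes roughly as

\[ P_{\text{dyn}} \approx \alpha\, C\, V^{2} f \]

A node shrink cuts capacitance C and lets you shave voltage V at the same clock f, and since V enters squared, that's where the perf/W gain comes from. Nothing in that relation makes the chip cheaper, and cost per transistor has arguably stopped improving on leading-edge nodes anyway.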

>water cooled radeon VII
how do you feel about wasting your fucking money on a custom gpu block for a card with hardly any options

>Tfw RTX Titan is only TU102 at $3k, not a TU100 with HBM2

Seeing the change from Maxwell 28nm to Pascal 16nm, I would skip Turing if possible

I doubt the jump will be as good as we had with 9xx to 10xx. Turing is already down to 12nm, but the price to performance has been ruined by costs outside of the silicon efficiency. Pipeline changes/gay tracing/tensor cores/etc. seem to be where the performance gains for the 20xx series came from, and that's what ruined the pricing.

>12nm
That's just a tweaked 14nm with a larger reticle limit, no? At the very least, next gen's cards should be quite a leap in perf/W and absolute perf.
I expect NV's insane pricing to persist since it werks for them. That is, unless AMD can release something people will buy.

Breh, I was just telling you how it is.

Glad shit works for you, though.

The fuck you even saying? I like the card and it works fine.

I'm on a dual boot (Wangblows/macOS) Hackintosh (iMac Pro clone) so I have to stick with AMD.

So far so good. Upgraded from a Vega Frontier Edition and already seeing a big jump (20-25%).

Also feels good not supporting GAYVIDIA, I hate their fucking business practices. Fuck Jensen.

It's not just the die shrinking, it's also architectural performance improvement.

For example, Pascal had better memory compression than Maxwell; that's why you saw such a big jump in perf.
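NVIDIA's actual delta color compression is proprietary hardware, but the idea is simple enough to sketch: where neighbouring pixels are close in value, store one raw pixel plus small deltas and push fewer bytes over the memory bus. A toy version (my own illustration, not NVIDIA's scheme):

#include <cstdint>
#include <cstdio>

// Toy delta compression for an 8-pixel row of 8-bit values:
// keep pixel 0 raw, then pack each neighbour-to-neighbour delta
// into a signed 4-bit nibble. 8 bytes in -> 5 bytes out when it fits.
bool compress_tile(const uint8_t in[8], uint8_t out[5]) {
    out[0] = in[0];
    out[1] = out[2] = out[3] = out[4] = 0;
    for (int i = 1; i < 8; ++i) {
        int d = (int)in[i] - (int)in[i - 1];
        if (d < -8 || d > 7) return false;  // gradient too steep: store the tile raw
        uint8_t nib = (uint8_t)(d & 0xF);
        out[1 + (i - 1) / 2] |= ((i - 1) % 2) ? (uint8_t)(nib << 4) : nib;
    }
    return true;
}

int main() {
    const uint8_t smooth[8] = {100, 102, 103, 103, 105, 106, 108, 109};
    uint8_t packed[5];
    printf("tile compressed: %s\n", compress_tile(smooth, packed) ? "yes" : "no");
    return 0;
}

The win is effective bandwidth: framebuffer traffic shrinks for free on smooth gradients, which is exactly the kind of gain Pascal banked on top of the die shrink.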

RTX is a meme for now, but it's definitely the future.

When GAYMEN on consoles (ie the upcoming PS5) start supporting realtime ray tracing in 2020, they will use AMD GPUs, which will most likely have an open source dev toolkit. No console company is going to use NVIDIA anyway.

my point was you paid extra to cool a card just to get it to stop melting and being noisy, thus defeating any potential price/performance gain you had

>Also feels good not supporting GAYVIDIA, I hate their fucking business practices. Fuck Jensen.
Seems like you choose your hardware based on identity politics instead of performance. To each their own I guess.

>RTX is a meme for now, but it's definitely the future
With the exponential difficulty of node shrinks, we're fucked in this respect. Games today are only doing 1 sample per pixel (spp) with ML denoising, and to truly achieve lifelike quality we'll need 100-1000 spp.
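Rough math on why the spp numbers blow up: Monte Carlo noise falls off with the square root of the sample count,

\[ \text{RMS noise} \propto \frac{1}{\sqrt{N}} \quad\Rightarrow\quad \text{100 spp is only } \sqrt{100} = 10\times \text{ cleaner than 1 spp} \]

so brute-forcing clean frames is quadratically expensive in image quality, which is the whole reason the ML denoiser exists.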

~$140 for the waterblock from EKWB, no big deal. Are you that poor you can't afford it?

There's a $99 one on AliExpress from Bykski or whatever the fuck it's called, it's a good one too.

The gain is lack of noise, because I hate fans. My current system is a 9980XE with 64GB RAM and it's a workstation in a silent environment; I don't want to hear those shitty fans.

And the fact that it stays below 105C on the hotspot and 70-75C on the GPU core at full load is a plus for me.

Yeah we all know AMD GCN architecture is shittily optimized and runs like hot garbage, but who cares. It's a great card for what I need it for.

Not really, like I said I am forced to use AMD due to it being a dual boot system (no Pascal support for macOS).

I would have probably gone with a 2080Ti... if I didn't need to do any of this.

For the record, not supporting NVIDIA is a plus in my book, and I vote with my wallet. NVIDIA has disgusting business practices, and if I had the option to not support Intel either, I would take it in a heartbeat.

It'll come eventually. RTX shit is definitely a marketing scheme for now, and who the fuck wants to run games at 1080p and blow it up to 4k? That's retarded.

Realtime raytracing has been in the works for so many years. I'm glad it's coming to us soon, but not this year. Maybe 2021 will be a good year for it.

>MacOS
>Not a blue haired faggot

>paying 140 dollars for a water block
did you try just getting a better card

>Not supporting nvidia supposedly, because of nvidia's practices
>Uses macOS, and most likely owns apple devices
Lel. You seem like an asshat, but keep fighting the good fight user.

Maybe I was in my teens. U mad?

Like what? There’s only one type of Radeon VII.

You seem angry and jealous.

Attached: BE534C03-2E89-4C0D-9E44-F5E4B0E8A5F9.jpg (3492x4656, 2.25M)

It just exists for the rich, in my opinion

>is the rtx 2080ti really worth it?
I mean, why ask for advice if you can afford the best GPU money can buy? What's the point?

Why yes, of course it is!

Attached: 6VgZUGp.jpg (1080x1920, 239K)

I remember "wait for Vega"

If you absolutely have to play the latest games at 4K 60fps, sure.
For anything less, go with a 2070 or Radeon VII

Used 1080tis go for $500-600

port royal 4k 120fps gpu when

Attached: sad_pepe.jpg (980x625, 37K)

youtube.com/watch?v=Ml8ome5d4Ig
You're asking for a gpu that's 10-15x faster than a 2080ti. Come back in 20 years.
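Back-of-the-envelope on that multiplier (the baseline is my guess, not a measured benchmark): if a 2080ti manages somewhere around 8 to 12 fps in Port Royal at 4k max settings, then

\[ \frac{120\ \text{fps}}{8\ \text{to}\ 12\ \text{fps}} \approx 10\ \text{to}\ 15\times \]

and at a generous ~30% gain per GPU generation you'd need \(1.3^{n} \ge 10 \Rightarrow n \approx 9\) generations, i.e. roughly two decades at a 2-year cadence.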

When did this happen, where half of all news articles are just blatant ads? Did they all get together and agree that the average person is a retard and it's pointless hiding it? Or do journalists have that little respect for us? Or did the quality of journalism drop so sharply that half of all journalists are now simply incapable of hiding the ads in their articles? I can't pinpoint when this happened, but it didn't used to be this way

It happened when hardware companies got too large and, with that, gained the capital to buy mediocre journalists. Good journalists either quit to pursue other interests or get scouted by said large companies for their technical knowledge to become full-time employees.

Bro the ps5 will be able to do that when it comes out next year. Pc gamers literally never stop getting btfo. I hope you aren’t right that it will take pc mustard race over a decade to catch up to the next console because the crying will get really old

this user asked for a 4k 120fps 3dmark chip, not a 4k30 one. 3dmark is also designed to be so graphically intensive at the time of release that no current-gen card can run it smoothly at max settings.

That may explain tech news somewhat, but it's everywhere now. 99 percent of news articles are either shilling politics somehow (for or against a politician, for example, or pro- or anti-liberal, anti-communist, pro gun rights, etc.) or shilling a product. It's like actual news stopped existing when I wasn't looking.

I bought 2 2080tis when they first came out. Why would I wait? When the 2080s came out I just threw my 1080s in the trash. Why shouldn't I just throw the 2080s in the trash when the 7nm comes out?

I'll be getting 2 as well

>When did this happen where half of all news articles are just blatantly ads?
So much this. Half of product reviews and news are blatant shills for the products. The last 4 years have seen a horrible monopolisation of media, especially when only a few companies actually control your searches (Google, etc).

Dude didn’t you see the leaks? The ps5 is gonna do 8k with above 100fps and ray tracing. Literally gonna btfo every gayming pc out there. Did you miss all the crying? Sorry to be the one to tell you your rig is now obsolete, there’s no way anyone’s pc is gonna be able to play games made for that beast. And if Xbox manages something similar pc gaymers are totally screwed. No more modern games for you. Better start saving up for a PlayStation

No I swear search engines are silencing competitors that don't pay google to boost their results.

This.

For reals, if you can afford it, why not. As long as you don't have like some shitty 60hz monitor or a Microsoft intellimouse, go for it.

>no hbm
>two digit process
no

no

I really doubt the ps5 is gonna do better than 60fps at 8k

You might want to get that autism checked since he's clearly being sarcastic.

What? The developer of the ps5 literally said that shit. It’s not like he’s pulling it out of his ass. Obviously that dude is exaggerating some but it’s pretty clear the ps5 is gonna btfo every current pc.

Me too, sadly. But even the AMD shill god AdoredTV said it was shit long before it released; navi can't really be a disappointment since our expectations aren't that high. As long as they don't use hbm, those tards can sell it for low prices and nobody will cry about the flaming dumpster fire that is GCN.

HDMI 2.1 supports upscaled 8k output. That does not mean it will render at 8k

8k@60fps output resolution maybe, but no way for an actual game with decent graphics.
AMD would need to pull off the architecture-based performance gains of Maxwell and Pascal combined (higher clock speeds not only from the process) to hit the necessary performance. A chip that capable with only 64 CUs is rather unlikely, so GCN would have to be majorly redesigned, not to mention that more CUs need more die area and increase cost significantly. The power budget would also have to be huge even with incredible architecture advances. Just not happening.

But please, convince us and send the link to your source.
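The raw pixel math alone makes the point (assuming shading cost scales roughly linearly with pixel count, which is already optimistic for the console):

\[ \frac{7680 \times 4320}{3840 \times 2160} = 4\times \text{ the pixels of 4k}, \qquad 4 \times \frac{100}{30} \approx 13\times \text{ a 4k30 workload for the claimed 8k 100+fps} \]

No plausible 2020 console silicon closes a 13x gap natively; upscaled output over HDMI 2.1 is the only way those numbers happen.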