RTX super cards

Are you getting any of the RTX Super series GPUs this week, lads? Got me an AORUS RTX 2070 Super 8GB coming in the mail, hoping it'll be a good upgrade from my 1060 3GB.

Attached: download (1).jpg (624x624, 52K)

why not the 2080 one? You aren't poor, are you?

I'm not getting a 4K monitor, so I thought it would be a waste of money to get the 2080; I'm running 1080p.

Could've just gotten a 5700 XT or a used 1080 Ti for about the same performance, minus the meme raytracing, at a much lower price depending on your location. Where you live, a used 1080 Ti is arguably the best value.

2080 super is garbage since it's the same overpriced chip. The 2070 and 2060 Supers actually got the chip from the next level up.

A 2070 Super for 1080p is already a waste of money. You should be at 2K 144Hz minimum. I have a 1080 Ti, which is similar in performance, and I run 2K 165Hz (of course not all games saturate my max fps). If you want to play 1080p 144Hz, a GTX 1080 or a 2060 should be the absolute max, and even then a 980 Ti, 1070 or 1070 Ti would be fine too.

I'm waiting for custom Navi cards and deciding between 2060S and RX 5700 most likely.

Have sex

144Hz G-Sync monitors are quite expensive, my guy

off my site, unvirgin

You'd honestly have to be mentally handicapped to go with nvidia right now. AIB 5700XTs are coming out in like what, 2 weeks?

Attached: Screenshot_20190720011802_Firefox.jpg (1080x1423, 460K)

You're gonna have to wait a while for third-party coolers there, bud; after that you're only 80 bucks or so away from the RTX equivalent.

The adults are talking, user. Zoom along now.

Not him, but he isn't wrong, kek

>"Custom Radeon RX 5700 series graphics cards will arrive next month

>AMD Radeon chief Scott Herkelman said they'll arrive mid-August"

pcworld.com/article/3407768/custom-amd-radeon-rx-5700-series-graphics-cards-will-arrive-next-month.html

Also, once you hit 2GHz on these things you're essentially outperforming a 2070 Super.

No, because then I'd be signalling to Nvidia that they can get away with anything and I'll just be a good, docile paypig. Nvidia needs to feel the burn this generation. Fortunately most consumers aren't as retarded as you and know restraint.

He is, though. A 2.1GHz 5700 XT already outperforms the 2070 Super in a shitty DX11 title. AIB models should be able to get pretty close to that.

Attached: RE2.png (1335x1212, 51K)
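
Back-of-the-envelope on that claim, assuming fps scales linearly with core clock (it doesn't quite; memory bandwidth and power limits eat into it, so treat this as an upper bound). The clocks and baseline fps below are assumptions for illustration, not numbers from the screenshot:

```python
# Rough linear-scaling estimate: fps ~ core clock.
# All numbers here are assumed for illustration, not measured.
stock_clock_mhz = 1905   # assumed 5700 XT out-of-the-box boost clock
oc_clock_mhz = 2100      # the OC discussed above
baseline_fps = 100.0     # assumed fps at stock

est_fps = baseline_fps * (oc_clock_mhz / stock_clock_mhz)
print(f"~{est_fps:.0f} fps, ~{est_fps / baseline_fps - 1:.0%} uplift at best")
# Real scaling is sublinear, so the actual gain lands below this.
```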

>generates 70% more heat than the 2070 super

>need to pay for third party cooler

>no DLSS

>terrible driver support for any games released after 2018, causing massive compatibility errors.

No thanks.

Is there a list of titles that use ray tracing?

Assetto Corsa Competizione
Atomic Heart
Battlefield V
Call of Duty: Modern Warfare (2019)
Control
Cyberpunk 2077
DOOM Eternal
Enlisted
Justice
JX3
MechWarrior 5: Mercenaries
Metro Exodus
ProjectDH
Quake II
Shadow of the Tomb Raider
Vampire: The Masquerade – Bloodlines 2
Watch Dogs Legion
Wolfenstein: Youngblood

yeah lol, a 30% power increase for 5% more fps. Don't overclock your RX 5700 XT to 2.1GHz with a stupid +70% power limit and an overvolt. Just keep it at 2GHz with an undervolt and enjoy a good, stable, efficient card. Also, the RTX 2070 Super can overclock too. Why does it need to beat the 2070 Super anyway? It's cheaper.
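
That lopsided trade falls straight out of CMOS dynamic power, which scales roughly with voltage squared times frequency; squeezing out the last few MHz needs a big voltage bump. A minimal sketch, with made-up illustrative voltages (not real Navi 10 values):

```python
# Dynamic power ~ C * V^2 * f: higher clocks usually demand higher
# voltage, so power grows much faster than frequency does.
# The voltage/clock pairs below are illustrative assumptions only.
def rel_power(v: float, f: float, base_v: float, base_f: float) -> float:
    return (v / base_v) ** 2 * (f / base_f)

base_v, base_f = 1.00, 2000   # assumed undervolted 2GHz operating point
oc_v, oc_f = 1.20, 2100       # assumed overvolted 2.1GHz operating point

print(f"clock: +{oc_f / base_f - 1:.0%}, "
      f"power: +{rel_power(oc_v, oc_f, base_v, base_f) - 1:.0%}")
# -> clock: +5%, power: +51% -- a few percent fps for a huge power bill
```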

>DLSS
Honest question: Do people really enable this dogshit smear filter?

Attached: Untitled.jpg (2560x1440, 1.69M)

Oh so just BFV and "literally who?" games. That's disappointing desu senpai.

>Shadow of the Tomb Raider
post the FPS man

>buys an expensive 4K monitor to get a super clear, detailed image
>turns on a blur filter for more fps
Just buy a 1440p monitor if you want the fps.

a sharpened image looks almost as good as (sometimes even better than) native 4K tho
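
The sharpening half of that pipeline is conceptually simple. A minimal numpy sketch of render-low, upscale, then sharpen; a plain unsharp mask stands in here for Radeon Image Sharpening's contrast-adaptive filter, which is smarter about avoiding halos:

```python
import numpy as np
from scipy import ndimage

def upscale_and_sharpen(frame: np.ndarray, scale: float = 1.5,
                        amount: float = 0.6) -> np.ndarray:
    """Upscale a (H, W) grayscale frame, then apply an unsharp mask.

    Toy stand-in for CAS/RIS, which adapts sharpening strength to
    local contrast instead of using a fixed `amount`.
    """
    up = ndimage.zoom(frame, scale, order=1)          # bilinear upscale
    blurred = ndimage.gaussian_filter(up, sigma=1.0)  # low-pass copy
    sharpened = up + amount * (up - blurred)          # boost high frequencies
    return np.clip(sharpened, 0.0, 1.0)

# e.g. render at 1440p and upscale toward 4K:
frame = np.random.rand(1440, 2560)  # placeholder for a rendered frame
out = upscale_and_sharpen(frame)
```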

I second this. I'm running a 2060 and it performs more than comfortably; I'd say anything beyond that at 1080p is wasting your money. That said, the 2060 is pretty solid bang for the buck at its price point.

DLSS: a technology that decreases image quality but increases fps. Wow, I have never seen that before. Less quality gives you more fps?? Nvidia really outdid themselves this time.

He needs to do the mental gymnastics to think that AMD is better, because he's a fanboy who only cares about his team vs. the other big team. Sad, really.

The 5700 XT isn't worth buying over a 2070S until AIB models arrive, and even then chances are it'll still be slower overall, as there's not much OC headroom now compared to the golden age when you could get +20% performance with a few sliders.

I was talking about dlss, not the radeon sharpener.

>2080 super is garbage since it's the same overpriced chip.
It's the same chip, but has a few additional CUDA cores unlocked on top of the OC. The vanilla 2080 isn't a full TU104.
>Complaining about value at the high-end
You want the best, you pay for it.

Lol it isn't the same chip, do you even know what the fuck you're talking about?

Nvidia supports freesync

>Lol it isn't the same chip
Both 2080 and 2080S are TU104

In a very limited selection of monitors

This. You might have to enable it manually, but FreeSync support from Nvidia is getting much better.

It's a god card for the money. You can get about 5% more fps for free if you undervolt, and a little more if you also raise the power limit by 10% or so, but beyond that the performance gains are small and the power increase very high.

In a few limited models that are just as expensive as the G-Sync ones

Is ray tracing the new physx?

Hello brainlet

i'm not gay

Is that a yes or a no?

It just makes the scene look different. No one who is actually just playing the game looks at where light sources are coming from and whether they make sense.

PhysX did well, but ray tracing already has so much development behind it, spanning decades, that it was recognized as the future of rendering even before Turing. It's much, much bigger than a physics SDK.

ray tracing has always been the holy grail of computer graphics. the retards who have been shitting on it know fuck all about the industry.

The 2060 Super is the best 1080p card: although it's better suited to 1440p in normal use, if raytracing takes off (and it looks like it may, given that AMD is following suit with its next-gen console chips), that extra performance will be needed. The performance hit RT causes will be absorbed by the card's buffer of excess performance.

Confirm my reasoning for me please Jow Forums
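
The arithmetic behind that buffer idea, with assumed numbers (the ~40% RT hit is a ballpark for early hybrid titles, not a measured figure):

```python
# All numbers assumed for illustration.
raster_fps = 110.0   # assumed 2060 Super fps at 1080p with RT off
rt_cost = 0.40       # assumed ~40% frame-rate hit with hybrid RT on

rt_fps = raster_fps * (1 - rt_cost)
print(f"RT on: ~{rt_fps:.0f} fps")  # ~66 fps: the excess buffer absorbed the hit
```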

>In computer graphics, ray tracing is a rendering technique for generating an image by tracing the path of light as pixels in an image plane and simulating the effects of its encounters with virtual objects. The technique is capable of producing a very high degree of visual realism, usually higher than that of typical scanline rendering methods, but at a greater computational cost. This makes ray tracing best suited for applications where taking a relatively long time to render a frame can be tolerated, such as in still images and film and television visual effects, and more poorly suited for real-time applications such as video games where speed is critical.
This ray tracing thing doesn't seem like a good thing for vidya so why are they forcing it into all the gaming cards?
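
For what the quoted definition looks like in practice, a minimal sketch of the core loop: one primary ray per pixel, a single hard-coded sphere and light, Lambertian shading. Everything here is an illustrative toy, nowhere near a production renderer:

```python
import numpy as np

# One ray per pixel from a camera at the origin; one sphere, one light.
W, H = 320, 240
center, radius = np.array([0.0, 0.0, -3.0]), 1.0
light_dir = np.array([1.0, 1.0, -0.5])
light_dir /= np.linalg.norm(light_dir)

img = np.zeros((H, W))
for y in range(H):
    for x in range(W):
        # Direction through this pixel on an image plane at z = -1.
        d = np.array([(x - W / 2) / H, -(y - H / 2) / H, -1.0])
        d /= np.linalg.norm(d)
        # Ray-sphere intersection: solve |t*d - center|^2 = radius^2.
        b = np.dot(d, center)
        disc = b * b - np.dot(center, center) + radius * radius
        if disc >= 0:
            t = b - np.sqrt(disc)   # nearest hit distance
            if t > 0:
                n = (t * d - center) / radius               # surface normal
                img[y, x] = max(np.dot(n, light_dir), 0.0)  # Lambert term
```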

because we are beginning to reach the point where we can do it *just* fast enough for it to be viable in games

With a lack of competition, Nvidia wanted to create some marketing hype with gimmicky new tech and jacked prices up. Ray tracing will probably become mainstream at some point as GPUs get better, so the performance hit is kinda relative.

The issue is the performance penalty, not the image, which is superior in fidelity and easier to manipulate.
In theory, we could build a fairly cheap RT-only card and produce RT-only games, and perhaps in a few years make games with all-around better renders that look about as good as or better than most of the rasterized solutions we have today.
The issue is that rasterization is the standard we've settled on. So we're slowly weaning ourselves off it with hybrid rendering, combining rasterization and ray tracing: we ray-trace the elements of the game that look worst when rasterized, like reflections, which tend to look inauthentic and sometimes glitchy when not ray-traced.
There's still a performance penalty today, but as we perfect hybrid rendering and throw more hardware at the task, we'll see better and better results.
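
A hedged structural sketch of that hybrid loop; the function names and bodies here are made up, and real engines do this with a G-buffer pass plus DXR/Vulkan ray queries, but the split is the point:

```python
import numpy as np

H, W = 1080, 1920  # target resolution

def rasterize_gbuffer():
    """Fast raster pass: per-pixel color plus a mask of reflective surfaces."""
    color = np.zeros((H, W, 3))
    reflective = np.zeros((H, W), dtype=bool)
    reflective[700:, :] = True  # pretend the lower screen is a mirror floor
    return color, reflective

def trace_reflections(mask):
    """Expensive ray-traced pass, run only for the masked pixels."""
    refl = np.zeros((H, W, 3))
    refl[mask] = 0.5  # placeholder for real bounce results
    return refl

def render_frame():
    color, reflective = rasterize_gbuffer()  # rasterize everything cheaply
    color += trace_reflections(reflective)   # ray-trace only what raster does badly
    return np.clip(color, 0.0, 1.0)

frame = render_frame()
```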

So is it better to wait a few generations and let others beta test the tech before spending big bucks on a ray tracing card?

Should have gone with MSI instead; they make some of the best custom cards. The Aorus looks shit with that design; it looks like it'll impede airflow. It looked much better on the 1080 Ti.

If it's within the same price range as the 2060, then absolutely yes. Also, is it 8GB?

If I were you, I would consider cards based on their rasterization performance. I wouldn't buy a Turing card just for the raytracing, but hey, having the option to turn on the feature is a plus.

>paying top dollar for a 1080 ti
>my pc isn't strong enough to play quake II rtx :(

I'm fine with my 1660.

Yes, I already got the RTX 2060 Super; a decent upgrade from my previous GTX 1070, I believe.

Already got a 2080 Ti, so no.
Don't buy Aorus models, by the way; they have bad thermals.

indeed, 1080p is deprecated

>already have 2 2080ti's. so no

wow user your mom let you have 2 2080tis?

I bought a 2080 Super. I went from 80 to 140 fps in Overwatch. I don't know if I really gained much elsewhere. My 1950X seems to have latency issues and may be a CPU bottleneck.

Attached: 1540765068371.gif (270x188, 1.78M)

No, it was your mom.

my condolences

MSI fucked up by skipping VirtualLink; anybody even slightly considering a meme helmet someday has no reason to buy from them.

WE MEMED ANOTHER ONE BOYS

Attached: chrome_ZNBZ0mpJ2W.png (1266x705, 397K)

>buying 12nm cards when we know for sure nvidia have access to 7nm
how retarded do you have to be to do this?

Attached: ISHYGDidn't Ask For This.jpg (250x250, 24K)

amd are so shit that they need 7nm to compete with nvidia's 12nm

Nanomemes don't mean shit when the performance is the same.

If they'd brought out a Super 2080 Ti I'd probably have gotten one, but now I'm gonna wait until next year and see what comes out just before Cyberpunk 2077 releases.

>cost per frame
lmao

Just got the RX 5700 as an in-between GPU.
Will get whatever RTX is on the market when Cyberpunk releases.

FreeSync is supported. Have it working just fine on my 1080 Ti @ 1440p 144Hz.

Wow cyberpunk is literally all I'm interested in from this shit list.
RTX is a fucking meme

>higher resolutions are a meme
>DVD is a meme
>MP3 players are a meme
>smartphones are a meme
>more than 640KB is a meme

Arma 3 has ray tracing I think.

Higher resolutions have started to give diminishing returns in terms of visual quality. It isn't just about finding a gimmick; they're trying to find a new way to improve visuals that normal people can actually notice.

It's not a meme so much as it is thoroughly not worth it yet. There's a long way to go before it becomes commonplace. As is, the hardware is expensive, the performance penalty is high, and the supported software is scarce. It'll be interesting to see where it is in five years.

I got an RTX 2080 cheaper than a 2070 Super, suck my dick.

the shitty inno3d one?

What monitor?

RTX Titan Super when?

I got the 2070 Super. It's pretty great. Getting bottlenecked hard by my i5 7600K in more intense games though.

You mean, is ray tracing going to become the backbone lighting method of the most popular game engines around, just like PhysX is? Yes, undoubtedly. It's only a matter of time.

Gigabyte Windforce. Not the greatest card but I can live with it for the price.

ray tracing will be a fad that dies out in a gen or two. it's the curved screen of vga cards lol.

Had a 2080 Ti since April; the drivers suck on Windows 1903+ too.

Yeah, I had mouse corruption. Thought it was my new GPU, but after trying older drivers it worked fine.

>paying for paperweights that are about to become obsolete
You can't justify RTX for futureproofing right now.

Maybe not for running ray tracing (though I'm certain optimizations will gradually make it more and more viable), but as "standard" cards they're easily an investment for future-proofing unless you're aiming for 4K.

>Are you getting any of the RTX Super series GPUs this week, lads?
I'm probably getting them some week in 2024.

t.poorfag

>Maybe not for running ray tracing (though I'm certain optimizations will gradually make it more and more viable)

The fact that both Microsoft and Sony basically said they'll develop games only with whatever tech AMD brings to RDNA 2.0 pretty much makes those first-gen RTX cards obsolete already. From the little we know, AMD will use cores separate from the main core to handle the RT process asynchronously (basically a compute path) while the rest of the graphics-only pipeline works on everything else. It's fundamentally different from what Nvidia is doing.

I still have my trusty Radeon VII, and I'll ride this sinking AMD "high-end" ship until I die, boys. The captain always goes down with his ship (and also this is only an editing workstation computer anyway, I don't play anything besides shitty esports titles)

>mention nvidia
>at least 7 desperate amd shills immediately spring to the offensive
Why is it always like this?

The worst part of this is that whoever buys an AIB card first will be lucky: Litecoin devs actually working on the algo say the card is already above Vega.

>he's not interested in Doom Eternal

Imagine having to cope this hard.

>tfw entire lineup of t-h-i-c-c EVGA XC Ultra cards to choose from
>cooler designed for 2080 Ti
>2060 Super XC Ultra for 480€
>2070 XC Ultra for 500€
>2070 Super XC Ultra for 600€
Wat do? Willing to pay a premium over other cards with the same chips so I don't get a wimpy two-slot cooler that's hot and noisy. On the other hand, I'm not entirely sure I even need a new computer yet.

Attached: card3.jpg (900x600, 102K)