I compared RTX 2060 with RTX 2070 and apparently I would pay 46% more for 2070 but only get 10% more performance.

How important is the RAM of the graphics card for gaming? 2060 has 6 GB and 2070 has 8 GB so sure, 2 GB more RAM, but will I notice anything? My current old card only has 2 GB so I would still get three times more if I get RTX 2060.

Attached: gfx.png (800x368, 209K)
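One way to check that kind of claim is to compute the deltas directly. A minimal sketch in Python, with placeholder prices and benchmark scores (the figures below are illustrative assumptions, not numbers taken from the thread):

# Price vs. performance delta check; all prices/scores are illustrative placeholders.
cards = {
    "RTX 2060": {"price": 349.0, "score": 100.0},  # hypothetical price ($) and benchmark score
    "RTX 2070": {"price": 509.0, "score": 110.0},
}

base, other = cards["RTX 2060"], cards["RTX 2070"]
price_delta = (other["price"] / base["price"] - 1) * 100
perf_delta = (other["score"] / base["score"] - 1) * 100

print(f"Price premium:    {price_delta:.0f}%")   # ~46% with these placeholder prices
print(f"Performance gain: {perf_delta:.0f}%")    # ~10% with these placeholder scores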

Other urls found in this thread:

newegg.com/Product/Product.aspx?Item=N82E16814930006
videocardbenchmark.net/high_end_gpus.html

Go be a good goyim somewhere else.

huh?

Attached: wtf.gif (352x200, 881K)

>knows math for gpu
>Doesn't know math for cpu

The 2060 is based

don't fall for those tricks, get a used 1070 or 1080

huh?

seems like it, but i thought i'd ask the experts, ergo this thread. but i get a bunch of nonsense replies.

huh?

Attached: george jerk.jpg (298x427, 35K)

Get the 2060, if you go for the 2070 don't go for it, go all the way and get a 2080 ti. 2070-2080ti shitty value.

>2080ti
no. i wouldn't do that. 208% price increase and only 29% performance increase. not paying three times as much for 29% more performance (and 11 GB ram instead of 6 GB, which i STILL haven't gotten advice on how much it matters)

Attached: incorrect15.jpg (599x800, 70K)
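The same arithmetic extends to a rough performance-per-dollar table, which is what makes the 2080 Ti look bad here. A minimal sketch, again with placeholder prices and scores chosen only to match the percentages mentioned above (not real market data):

# Performance-per-dollar comparison; prices and scores are illustrative placeholders.
cards = [
    ("RTX 2060",     349.0, 100.0),
    ("RTX 2070",     509.0, 110.0),
    ("RTX 2080 Ti", 1075.0, 129.0),
]

base_price, base_score = cards[0][1], cards[0][2]
for name, price, score in cards:
    price_up = (price / base_price - 1) * 100   # % more expensive than the 2060
    perf_up = (score / base_score - 1) * 100    # % faster than the 2060
    value = score / price                       # benchmark points per dollar
    print(f"{name:12s} +{price_up:3.0f}% price  +{perf_up:3.0f}% perf  {value:.3f} pts/$")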

>if you go for the 2070 don't go for it, go all the way and get a 2080 ti
>Spend literally three times as much because, hey, might as well!
Get back in the oven, Shlomo.

It's an amd shill board, that's why all the nonsense

i see, does AMD provide a raytracing graphics card?

Attached: greenweirdman.jpg (600x654, 32K)

>looking at 2060/2070
>thinking either are usable with RTX ON

not a single card at the moment supports real raytracing
and the 2060 sucks dicks at rtx

Every time I start to forget how stupid this board is, someone suggests buying used electronics. Unbelievable

No they don't. Look at these seething amdshill replies btw

you wouldn't tell lies, now would you user?

Attached: 4chumblr2.jpg (700x454, 261K)

At 1440p 60fps, running most games on ultra settings with RTX turned off, I typically sit around 5 GB of VRAM usage on my 2070; very occasionally I have seen around 6.6 GB of usage at the absolute most

For 1080p the RTX 2060 is basically unbeatable; at 1440p it's very, very good, but the 2070 may be worth it if you care more about actually using RTX when some games with it actually come out, or about maxing out very graphically demanding games

Games will definitely touch 6 GB, Cyberpunk most likely will, so it's best to spend more so you don't have to replace it within a year or 2.
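If you'd rather measure than guess, you can watch VRAM usage while a game is running. A minimal sketch that polls nvidia-smi once a second (assumes an NVIDIA GPU with nvidia-smi on the PATH; stop it with Ctrl+C):

# Poll GPU memory usage once per second via nvidia-smi; Ctrl+C to stop.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used, total = (int(x) for x in out.split(","))
    print(f"VRAM: {used} / {total} MiB ({used / total:.0%})")
    time.sleep(1)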

nice, some actual information! thanks dude. i only have a 1440p screen and am not planning on getting 4K anytime soon. if i do, i could probably live with still playing in 1440p, or in 4K with high instead of ultra (or even medium).

noticed that there's a short 2060 card available, maybe i should get that one and try to find as small a case as possible

it's not like it's a hard limit anyway as far as i know, textures can still be loaded from ssd when needed so there will just be some more pop-ins if all VRAM gets filled

Attached: 5026141.jpg (600x600, 42K)

GTX/RTX *60 cards are generally a sweet spot these days. Stepping up to them from the tier below, you pay not too much more and get a lot more. Stepping above them, you pay a lot more and get only a little bit more.

Did Radeon mess up their architecture or something? RX Vega 64 seems to be more expensive than the RTX 2060 but only has 90% of the performance... Though Radeon has 8 GB RAM instead of 6 GB.

Is "more RAM" the only way Radeon can compete?

Attached: thinking3.png (512x384, 171K)

Radeon really doesn't have anything to compete with Turing.

What's wrong with used electronics? They can't really get fucked up and still work. They either work or they don't.

What about a vega 56? That has 8GB of vRAM and is faster than the RTX 2070 in most games now. You can find them for $300 on newegg.

newegg.com/Product/Product.aspx?Item=N82E16814930006

Attached: Screenshot_2019-04-04-14-49-51.png (1280x720, 321K)

Depends on the game; the brute FP32 advantage of Vega shaders IS THERE, but games have to take advantage of it. That said, there are a growing number of games doing just that.

Attached: DiRT (1).png (1328x1225, 61K)

i'd trust this list the most, not individual reports of it being better in some games:
videocardbenchmark.net/high_end_gpus.html

Attached: living the dream.jpg (650x487, 76K)

Dirt is a shit game and the biggest outlier reviewers could find

>synthetic
>literally 0.000000000% correlation with real world performance
Oh so you're just mentally and physically disabled? Got it.

Thing is, that list is slowly but steadily growing, especially with driver updates. 18.0-19.0 saw 10-20% fps uplifts alone.

>56 scoring better than 64
Just go to cpuuserbenchmark, don't use that shit you posted

Why do you think JEWVIDYA killed *60 series SLI? Do you think it was an accident? Some sort of mistake?
The *60 cards were always the better deal. I ran two 760s at 1080p for over 5 years. Welcome to reality OP.

Attached: Capture.png (815x388, 33K)

You can still use DifferentSLIAuto

>150 dollars more for 2 more GB of VRAM and a 10% overclock
Turing cards are a scam
avoid like the plague
Nvidia is doing this to force planned obsolescence on uninformed consumers