New release

New release.
New opportunity to laugh at people who wasted hundreds of dollars extra on an RTX 2070 over a much cheaper card which performs the same.

Attached: vivaldi_2019-02-06_16-07-11.png (1338x1309, 157K)

Other urls found in this thread:

phoronix.com/scan.php?page=article&item=rocm-20-linux50&num=1

Aren't you that guy who got eternally btfo on pcbg and stopped posting there?

It's not only about performance. It's about quality. It's about knowing you have good driver support. It's about knowing your card isn't a cheap piece of shit made by a poo in loo indian. It's about the brand.
AyyMD poorfags will NEVER know the feel of having the best gaming cards on the market. Cope.

No. Never been BTFO here. I'm almost always right.

Yeah I agree. The better image quality on an AMD GPU is also an important win.

>mfw I undervolt and OC my Vega 56 and beat a stock GTX 1080.

Attached: 1990s-uk-desperate-dan-comic-cartoon-plate-EXPM5K.jpg (1079x1390, 241K)

I can't read these martian runes. What is this chart showing?

Nvidia has deeper pixels

AMD drivers are better than Nvidia drivers and have been for years.

At least on Linux.

2080s are literally dying due to faulty memory, brand cuck.

why is 2060 matching 1070? I thought people were saying it was slightly faster than a 1070 ti and like 2 fps below a 1080?

Nvidiafags coping this hard

Attached: 1270320363609.gif (602x323, 374K)

Massive kek
Vega 56 matching a 2070 lmao.
11/10 damn I wish I had your job.

in cherrypicked games at certain resolutions, they aren't lying.

the real question is where is the 1060 3gb on this list :^)
is it under all of those because the 3gb limits it more than these older shittier 4gb cards?

Holy shit the 1050Ti did not age well at all.

Every time I see these Russian graphs it's always some obnoxious fag saying something controversial.

It's almost like you shouldn't take them at face value...

>tfw getting a cheap vega 64 on black friday

I love this thing

NOOOOOOOOOOOO I RETURNED A 1070 Ti TO PURCHASE A 2060! WHAT THE FUCK AAAAAAAAAHHHHHHHHHHHHHHHHHHHH

Attached: 1528934935276.jpg (220x229, 9K)

It's ok, you can still play RTX games at 15fps

That's what I get for saving $80 leaf bux by buying the cheaper card.

Thinking a used 1080ti is the best option for me. Used 1080tis are listed at $700 CAD; going to try and jew someone down to $550 or $600. Granted, for that price it will be a low-tier MSI or Gigabyte board, though I have a spare H100 so I should be able to cool it with a Kraken G12 kit and hit 2GHz easily.

The only other option would be to wait for Nvidia to lower their prices and/or release 7nm GPUs because of Navi, which is probably a year away (maybe 8 months)

>gimpworks free game
>suddenly a $350 vega56 is just as good as a $500 2070

Reminder Vega cards are actually amazingly good and they only look bad because Nvidia forces their gimpworks shit on developers.

Vega will age just as well as the 290 did.

sidegrade kek

Waiting for the RTX 3000 series, on a 1080. AMD can't into GPU efficiency

>and/or
Wrong, their biggest competitor is next-gen consoles. We are reliving 2005 again and Nvidia just released the "7800 GTX". After that they will have to btfo the consoles with an "8800 GTX."

Hell, RTX was made in the first place to help PC stay ahead of the consoles. This is why I don't understand why you knuckleheads here keep talking down on ray tracing, like it's not the greatest thing for PC since Crysis.

Bro I love ray tracing, I just wish the cards weren't so fucking expensive

early adopters always pay the premium

7nm wafer supply is tight, so I could see Nvidia competing via price cuts until 2020. I don't doubt that the 7nm 3080ti will be a monster, though I do doubt we will see a 7800GTX to 8800GTX jump.

7800GTX was 333mm^2 on 110nm
8800GTX was 484mm^2 on 90nm
90nm is 1.5x as dense as 110nm, and the 8800GTX has more than 2x the transistors of the 7800GTX (681 million vs 302 million)

The 2080ti die is so big that it is at the limits of what the foundry can do for reasonable prices. Most likely, most of the 12nm->7nm improvements will go towards reducing die sizes to something more economical. I'm not expecting more than a 1.4x improvement for most games, though ray tracing will probably show a bigger improvement.
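A rough back-of-the-envelope version of that density math, as a sketch in Python using the figures quoted above (approximate public specs, not anything from the chart in the OP):

# Density math for the 7800 GTX -> 8800 GTX jump, using the numbers quoted upthread.
g70 = {"area_mm2": 333, "transistors_m": 302}   # 7800 GTX, 110nm
g80 = {"area_mm2": 484, "transistors_m": 681}   # 8800 GTX, 90nm

area_ratio = g80["area_mm2"] / g70["area_mm2"]                   # ~1.45x bigger die
transistor_ratio = g80["transistors_m"] / g70["transistors_m"]   # ~2.25x the transistors
density_ratio = transistor_ratio / area_ratio                    # ~1.55x denser, i.e. the ~1.5x node scaling

print(f"area {area_ratio:.2f}x, transistors {transistor_ratio:.2f}x, density {density_ratio:.2f}x")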

Vega's performance per watt is still terrible tho.

thinking about grabbing a 2080ti for $999 while I can still get $550 for my 1080ti. They're both inflated so it sort of balances out.

>vega 56 literally on the same level as 2070
lmao what the fuck is nvidia doing, how can a $600 card not beat an over-a-year-old $300 card?

RTX is nothing special - you can do the same calculations entirely on the compute cores, and GCN has oodles of compute horsepower and asynchronous workloads don't bog it down. RTX is Nvidia's attempt to have some sort of secret sauce to push for DX12, because they have fought tooth and nail against DX12 and Vulkan adoption until they can gain some sort of critical edge. They really, really don't want DXR to take off, precisely because the previously mentioned compute power of GCN means AMD can do the same thing without any extra dedicated hardware.

This is the result of Nvidia's (pretty good, to be fair, at the time) decision to split their architecture into a compute-focused one for the datacentre and a gaming-focused one for the desktop - Maxwell was the first generation of this split, and Nvidia is fighting to keep that split.

It's not showing CUDA, and every time I see an amdfag post I'm reminded that they aren't content creators or scientists at fucking all.

I'm not shilling for RTX in particular.

Nigh on all the people skeptical about ray tracing don't like ray tracing in any form, like it's some HairWorks or PhysX gimmick we can live without while we keep using cheap tricks to produce our lighting forever.

They are modern-day PC Luddites who most likely never lived through the '90s and saw the yearly improvements. To me, embracing ray tracing is just another day in the PC world.

>you can entirely do the same calculations on the compute cores and GCN
while being 6x slower than 2080ti in DXR

maybe because AMD doesn't have cuda? Which card do you use? I am gonna buy a 1060 6GB for machine learning

>AMD can do the same thing without any extra dedicated hardware
HAHAHAHAHAHAHA
The amdelusion is off the charts. 2080ti is 600-700% faster than Titan V in DXR scenarios.
Titan V is faster than any vega in compute.

50 series never do.
same, I got one for $399
p good for that price.

>I am gonna buy a 1060 6GB for machine learning
Nothing wrong with that, especially if you're just a hobbyist. Even the low-end GTX 1050 Ti outperforms the most powerful AMD card at waifu2x and the like with CUDA enabled, at least according to Phoronix.

you know the RTX 2080 is like 1.6x the die size of a radeon VII
imagine if they were the same size lol? ayymd btfo novideo AGEN

This but unironically.
>what is openCL
>being a scientist with a $200 low-mid range 1060
kek get nothing done again lol
blender'd

pixel shader 2.0 actually mattered
rtx does not matter in the slightest

>imagine if AMD could magically have better technology
that's like imagining intel could compete in the graphics department ever

Attached: 1545744821991.jpg (514x459, 40K)

>what is openCL
Damage control abortion by apple after nvidia introduced cuda 10 years ago. Also deprecated.

I give credit where credit is due. Nvidia was the first to bring real time ray tracing to video games.

I suppose there isn't some standalone application out there where I can select a 1080p .mkv and it generates a 4k version, is there?

>maybe because AMD doesn't have cuda?
Yea. AMD really fucks their buyers over when it comes to software. Maybe they should learn how to kode.
>I am gonna buy a 1060 6GB for machine learning
That's what I started off with actually.
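If you do go the 1060 route, here's a quick sanity check that CUDA is actually being picked up - the thread never names a framework, so this is just a sketch assuming PyTorch:

# Minimal CUDA sanity check, assuming PyTorch; works the same on a 1060 6GB or a 1050 Ti.
import torch

print(torch.cuda.is_available())           # False means you're about to train on the CPU
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))   # e.g. "GeForce GTX 1060 6GB"
    x = torch.randn(4096, 4096, device="cuda")
    print((x @ x).device)                  # cuda:0 -> the matmul actually ran on the GPU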

They both have nearly the same number of transistors (13.6 vs 13.8 billion) and the 2080 is far more power efficient
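Rough numbers behind the die size argument, as a sketch: transistor counts are the ones quoted in this thread, and the die areas (~545mm^2 for the 2080's TU104, ~331mm^2 for the Radeon VII) are the commonly cited figures, so treat all of it as approximate.

# Die size vs transistor count, using the figures above; the ratios are what matter.
tu104  = {"area_mm2": 545, "transistors_b": 13.6}   # RTX 2080, 12nm
vega20 = {"area_mm2": 331, "transistors_b": 13.8}   # Radeon VII, 7nm

size_ratio = tu104["area_mm2"] / vega20["area_mm2"]                    # ~1.6x, the number quoted upthread
density_tu104  = tu104["transistors_b"] * 1000 / tu104["area_mm2"]     # ~25 Mtransistors/mm^2
density_vega20 = vega20["transistors_b"] * 1000 / vega20["area_mm2"]   # ~42 Mtransistors/mm^2

print(f"die size ratio {size_ratio:.2f}x, density {density_tu104:.0f} vs {density_vega20:.0f} Mtr/mm^2")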

Reminder that these Ngreedia shills are working overtime, trying to astroturf ahead of the Radeon VII benchmarks in the morning.

Iirc a waifu2x fork, and maybe SVP.
It was pretty taxing on my AMD hardware when I tried to mess with it a couple years ago, and I never tried it again.
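For the anon asking about the .mkv: there's no one-click "select file, get 4k" app I know of, but the usual workflow is split to frames, upscale each frame, re-mux. A rough sketch with ffmpeg - the file names, frame rate and the upscaler step are placeholders, not something from this thread:

# Rough video-upscale pipeline sketch: ffmpeg splits frames, a per-frame
# upscaler (waifu2x fork, etc.) processes them, ffmpeg re-encodes and copies audio.
import os
import subprocess

src = "input_1080p.mkv"          # placeholder filename
os.makedirs("frames", exist_ok=True)
os.makedirs("upscaled", exist_ok=True)

# 1. dump every frame as a PNG
subprocess.run(["ffmpeg", "-i", src, "frames/%06d.png"], check=True)

# 2. run your upscaler of choice over frames/ into upscaled/ (tool-specific, left out here)

# 3. reassemble at the source frame rate (placeholder value) and copy the original audio track
subprocess.run([
    "ffmpeg", "-framerate", "24000/1001", "-i", "upscaled/%06d.png",
    "-i", src, "-map", "0:v", "-map", "1:a",
    "-c:v", "libx264", "-crf", "18", "-pix_fmt", "yuv420p", "-c:a", "copy",
    "output_4k.mkv",
], check=True)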

>openCL
It's shit. A GTX 1050 Ti running waifu2x with CUDA embarrasses the fuck out of any AMD card using waifu2x OpenCL.
>being a scientist with a $200 low-mid range 1060
Why spend $200 when you can spend $100 on a GTX 1050 Ti if you just want to dabble? Nvidia GPUs are actually useful, so I can see why you would be dumbfounded at the fact that even the low-end cards can take advantage of CUDA.
I guess it doesn't matter to you since you only care about vidya benchmarks with AA switched off.

Reminder:
CUDA.
Now shut the fuck up and leave the thread, you pieces of shit amd shills.
You have nothing now.

It's sad that amdrones always falsely claim that nvidia uses the same architecture and every new gpu is just a die shrink, while amd has been producing GCN garbage for a good part of the decade.

Attached: nagi.png (480x480, 239K)

Will the 2080 Ti ever drop below 1K USD? How long do I have to wait to be able to afford 4K@60Hz?

I liked my hd7970 that I bought around 2012. First gen GCN was the best gen.

Mine still runs fine. My Nvidia cards from the same gen and newer all died or couldn't stay relevant. I have multiple gaming rigs at home and the GCN cards generally last longer than my Nvidia ones. Driver support and longevity of the architecture is no joke.

1080ti still showing everything else up. truly the most based card.

phoronix.com/scan.php?page=article&item=rocm-20-linux50&num=1
>what is openCL
It's shit
>being a scientist with a $200 low-mid range 1060
Based high budget guy. Happy for you.

What's the name of that website?