NONONONONONONO KNOW

NONONONONONONO KNOW

LITERALLY TWO GENERATIONS BEHIND

Attached: Screenshot_20190210_130804_com.google.android.youtube.jpg (2240x1080, 979K)

Using TSAA at high resolution

>vidya

Delete this

Attached: 1548519850753.png (720x717, 565K)

>games
gross

>linux

Attached: 1541679039409.jpg (250x240, 4K)

>Using a buggy game as a benchmark.
Color me surprised. It's the Valve Source engine; if you tweak the config you'll get an immediate boost.

It's kinda sad.

What about tasks that use the large memory and high bandwidth? If AMD marketed it better and fixed the fucking drivers, I could see it becoming a cheaper workstation card.

Oh I guess this is a buggy game also ;)

Attached: Screenshot_20190210_130835_com.google.android.youtube.jpg (2240x1080, 1005K)

Look at Vega 64 beating the GTX 1080. Fine wine at work. Wait a little for the Radeon VII to get its drivers fixed. It will beat the RTX 2080 handily.

1080 Ti faster than the 2080, lmao at novideo

Go back, frogposter.

>Fine wine at work.
They never fine-wined a decent CUDA alternative, though.

The performance isn't that bad for the price. The really bad thing about it is the fucking cooling solution sounding like a rocket engine.

Wine works with games?

Yes. Driver updates as well as game patches change performance over time. Sometimes better, sometimes worse, but usually better. If they can unfuck undervolting etc. as well, we could see a marked improvement.

Always been a Blue/Green guy myself (more money than brains, sweet, sweet CUDA cores, etc.), but if they get those driver issues sorted, you bet I'd very seriously consider AMD.

>A card meant for shader optimized games at 4K does poorly on unoptimized titles at 720p
whoah

techpowerup.com/reviews/AMD/Radeon_VII/8.html
techpowerup.com/reviews/AMD/Radeon_VII/11.html
techpowerup.com/reviews/AMD/Radeon_VII/15.html
techpowerup.com/reviews/AMD/Radeon_VII/16.html
techpowerup.com/reviews/AMD/Radeon_VII/17.html
techpowerup.com/reviews/AMD/Radeon_VII/18.html
techpowerup.com/reviews/AMD/Radeon_VII/22.html
techpowerup.com/reviews/AMD/Radeon_VII/24.html
techpowerup.com/reviews/AMD/Radeon_VII/28.html

>Wine works with games
Depends on the game/software.

>you bet I'd very seriously consider AMD
So would everybody else.

>Unfuck undervolting
I’m out of the loop, can it not be undervolted? How bad are the drivers?

Bit shit right now. Unstable. Undervolting and overclocking are broken.

If you take out all the games released before 2018, the AMD card wins hands down. Going forward, more and more games are moving away from Gimpworks and single-threading, and with developers having to target Turing's better compute and shader performance, AMD gets a boost as well, since GCN has always been better at compute.

They are marketing it as a cheap workstation card with 1/4-rate FP64 and ECC on the memory (but not the caches), with no Instinct drivers; it can do gaming, just not that well, and it's hot and hungry, but hey, maybe it can clean your house too.
Because this was a slapdash Instinct rebrand, it ends up being bad at everything except cheap, kinda-slow double precision and lots of VRAM. And then you encounter the joy that is OpenCL support.
AMD barely produced any for these reasons. Not even board partners are bothering. It's a bizarre product with a tiny actual market.

>tfw you got a 1080ti for $500.

>If you take out all the games released before 2018 the AMD card wins hands down.
AMD also wins if you don't count all the things the GPU can't do, like temporal noise reduction and CUDA, and if you don't mention the post-2018 games that AMD GPUs just don't perform as well in.
If you do that, then AMD GPUs are truly fucking perfect.

>imagine bragging about buying a mined to hell 1080Ti that underperforms

At least anon didn't buy an AMD GPU.

I'm sure it does LMAO

Attached: Screenshot_20190210_143424_com.google.android.youtube.jpg (2240x1080, 783K)

WAIT, I WAS TOLD THE 16GB OF HBM2 WOULD BE BETTER AT HIGH AA/RES ?!!!?!!!!

Attached: download_5.jpg (119x144, 2K)

>drivers
they have the best ones for Linux

> 1080 TI faster than 2080

That there is the real story lmao

lmao that fucking chart

It's a bit of a wash, since the minimums are higher.

>linux
Lmao

AMD cards are notoriously overvolted; this isn't anything new. AMD chose to bin most of their chips at higher voltages to maintain higher volume. 100-200 mV undervolts are pretty common on Vega and Polaris.
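For anyone wanting to try it on Linux, a minimal Python sketch of how an undervolt can be applied through amdgpu's OverDrive sysfs file. It assumes the kernel exposes pp_od_clk_voltage for your Vega (OverDrive enabled via amdgpu.ppfeaturemask, run as root); the card index, P-state, clock and voltage are purely illustrative, not recommended values.

# Hedged sketch: undervolting a Vega card via amdgpu's pp_od_clk_voltage.
# Assumes OverDrive is enabled (amdgpu.ppfeaturemask) and the script runs as root.
# Card index, P-state, clock and voltage below are illustrative only.

OD_FILE = "/sys/class/drm/card0/device/pp_od_clk_voltage"

def write_od(cmd):
    with open(OD_FILE, "w") as f:
        f.write(cmd + "\n")

print(open(OD_FILE).read())   # dump the current OD_SCLK / OD_MCLK tables

write_od("s 7 1630 1100")     # sclk P-state 7: keep 1630 MHz, request 1100 mV
write_od("c")                 # commit the new table ("r" resets to defaults)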

Yeah, let's pretend you need AA at fucking 4K res.

>AMD cards are notoriously overvolted; this isn't anything new.
So what? AMD cards are also notoriously shit, and only good for low-end, featureless gaming.

Runs better than expected.

GPU space needs competitors

his videos are the best

Oh it is. At 8K. With 8x MSAA.
AMD is so forward thinking, they released a card a decade before its time!
Look at those sweet frames. All 4 of them.

Attached: radeon-vii-firestrike-8k.png (812x399, 15K)
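Rough back-of-envelope on why 8K with 8x MSAA leans so hard on the 16 GB of HBM2. A hedged Python sketch with the render-target formats assumed (RGBA8 color plus a 32-bit depth buffer), so treat the numbers as ballpark only.

# Sketch: memory footprint of an 8K render target with 8x MSAA.
# Formats assumed (RGBA8 color, 32-bit depth); real engines add G-buffers etc.
width, height = 7680, 4320
samples = 8

color = width * height * 4 * samples          # bytes
depth = width * height * 4 * samples

print(f"Color + depth alone: ~{(color + depth) / 2**30:.1f} GiB")   # ~2.0 GiB
print(f"Traffic just to touch that once per frame at 30 fps: ~{(color + depth) * 30 / 1e9:.0f} GB/s")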

It's not the performance you're paying for; it's the ~$320 of HBM2 inside that card. Almost half of its price. DAMN

But the point is that overvolting does two things:

1.) Noise + high temps
2.) Reduced performance

A simple undervolt to 1,100 mV on Vega 64 gives you a 10% boost in performance over stock settings AND reduces power draw by about 30%. Why do you think 56s/64s sold out within hours during the mining craze?
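A minimal sketch of why that works on a power-limited card, assuming the usual first-order model that dynamic power scales with frequency times voltage squared. Only the 1,100 mV figure comes from the post above; the stock clock and voltage are illustrative.

# Sketch: first-order effect of an undervolt on a power-limited GPU.
# Dynamic power ~ f * V^2. Stock values below are illustrative.
def dyn_power(freq_mhz, volt_mv):
    return freq_mhz * (volt_mv / 1000.0) ** 2   # arbitrary units

stock = dyn_power(1536, 1200)   # hypothetical stock clock/voltage
uv    = dyn_power(1536, 1100)   # same clock at 1,100 mV

print(f"Power at the same clock: {uv / stock:.2f}x stock")   # ~0.84x
# On a card pinned at its power limit, that headroom goes into boost clocks
# instead of throttling, which is where the fps gain comes from.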

The point is that AMD GPUs don't have CUDA, so they are worthless. You can under- or overvolt them as much as you want, but they still won't "lossless" upscale a video as fast as a GTX 1050 Ti with CUDA enabled.

>If you take out all the games released before 2018 the AMD card wins hands down.
I was curious about this, so I decided to see whether it was the case. TPU's bench of 21 games puts the VII on par with the 1080 Ti. 11 of these games were released in 2018/2019, 10 earlier.

I took the 11 2018/2019 games at 4K and calculated the difference between the 1080 Ti and the VII. The result? The VII comes out just barely 1% ahead. That is a negligible difference, something that could be flipped by the addition or removal of a single game. For example, if I were to remove Strange Brigade from the bench, the 1080 Ti would be 1% ahead.

As such, there is no evidence that AMD would perform better in newer games based on this. I suggest using actual data when forming your opinions next time.

Attached: 1522854242376.png (500x770, 89K)
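For anyone who wants to reproduce that kind of comparison, a sketch of the method: average the per-game percentage differences, then drop one outlier and watch the sign flip. The fps values below are placeholders, not TPU's actual data.

# Sketch: average per-game % difference between two cards, with and without
# one outlier title. The fps numbers are placeholders, NOT TechPowerUp data.
games_4k = {
    "Game A": (55, 57),                      # (VII fps, 1080 Ti fps)
    "Game B": (62, 60),
    "Game C": (48, 50),
    "Outlier (Strange Brigade-like)": (80, 68),
}

def avg_diff(results):
    diffs = [(vii - ti) / ti * 100 for vii, ti in results.values()]
    return sum(diffs) / len(diffs)

print(f"All games:       VII {avg_diff(games_4k):+.1f}% vs 1080 Ti")
trimmed = {k: v for k, v in games_4k.items() if not k.startswith("Outlier")}
print(f"Outlier removed: VII {avg_diff(trimmed):+.1f}% vs 1080 Ti")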

Not good vs Quadro
Not good vs GeForce
Ridiculously high price due to too much HBMeme

Why the hell was this aberration born?

Nobody fucking cares about CUDA on cut-down gaymen cards. Even the RTX 2080 has about the same double-precision compute as a $170 R7 1700 8-core processor doing AVX.

Only Quadro cards, which have like 40x the double-precision compute, get used for serious CUDA work. Also, stop pretending that a bunch of /v/tards are going to be doing anything productive; those RTX cards are going to be used to play fartnight and cowadoody and you know it.
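For context, the peak-FP64 arithmetic behind that kind of claim, using published shader counts and FP64 ratios; the clock speeds are approximate boost clocks, so treat the outputs as ballpark figures only.

# Ballpark peak FP64: shaders * 2 (FMA) * clock * FP64 ratio. Clocks approximate.
def gpu_fp64_tflops(shaders, clock_ghz, ratio):
    return shaders * 2 * clock_ghz * ratio / 1000

print(f"Radeon VII (1/4 rate):  {gpu_fp64_tflops(3840, 1.75, 1/4):.2f} TFLOPS")
print(f"RTX 2080  (1/32 rate):  {gpu_fp64_tflops(2944, 1.71, 1/32):.2f} TFLOPS")

# Zen 1 CPU side: 8 DP FLOPs per core per cycle (2x 128-bit FMA units).
print(f"R7 1700 (AVX2 FMA):     {8 * 3.2 * 8 / 1000:.2f} TFLOPS")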

>Nobody fucking cares about CUDA on cut down gaymen cards.
You're retarded. You can literally use CUDA on cut-down gaming cards for useful everyday stuff, and you can't do that on any AMD GPU.
>MFW I just upscaled my old 480p anime to fit on a high res display with a GTX 1060, and it looks fucking great.

You kinda proved my point though. They match each other in newer games.

>Also stop pretending that a bunch of /v/tards are going to be doing anything productive, those RTX cards are going to be used to play fartnight and cowadoody and you know it
Explain why amdfags only post gaming and vidya benches.
It's because AMD GPUs are worthless at anything else. They don't even have CUDA, for fuck's sake, and VCE is legitimately trash.

You kind of proved user right though.
>Vidya
Because it can't into anything else.

>1440p
>TSAA
Who plays at that kind of retard setup?

>Buh-but you need super expensive Quadros to do that
lel, keep telling yourself that, you lying retard.

very few people are defending the radeon vii as a gaming card. it was obviously something thrown together because navi wasn't ready yet.

>i am just seething

>Yeah lets pretend you need AA at fucking 4K res.
You still need some AA, even at 4k, to help reduce shimmering or if you're on a larger display.

And they also match each other in older games. The performance gap in older games is hardly any different from the gap in newer games. If your hypothesis flips the other way around with the removal of a single sample from your set, then it's not a very good hypothesis. For work I have to try to turn statistically insignificant results into something relevant, because null results are much harder to get published, and even then this hypothesis couldn't be entertained.

If you're gonna shill shitty gaymen CUDA performance, then do it on dual-core laptops where it actually matters. If Zen 2 is all it's hyped up to be, it's going to make gaymen CUDA irrelevant.

Please be not of the moving of the goalposts please.

>falling for the benchmark meme

the only reason I use Nvidia is CUDA

t. data scientist

>The Witcher 3 and GTA V run better on Nvidia GPUs
How is that not a difference in older games?

4K monitors exceed 150 PPI, you fucking mong. In fact, the only thing AA will do in this case is cause BLUR.

>inb4 "bah eye gayme on a 4K TV"
ok retard
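The PPI figure is just geometry; a quick Python check, with the panel sizes below chosen as arbitrary examples (they're not from the thread):

# Pixel density of a 3840x2160 panel at a few example sizes.
from math import hypot

def ppi(w_px, h_px, diagonal_in):
    return hypot(w_px, h_px) / diagonal_in

for diag in (24, 27, 32, 43):
    print(f'{diag}" 4K: {ppi(3840, 2160, diag):.0f} PPI')
# Note: the ">150 PPI" claim only holds for panels up to roughly 29 inches.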

>I made a comment with no evidence backing up my claim whatsoever, and I'm interpreting an investigation that shows I'm wrong as my being right, because I'm just that dumb
You made me reply. Congratulations.

>If you're gonna shill shitty gaymen CUDA performance
I never mentioned gaymen; I'm not a worthless amdbitch.
>Please be not of the moving of the goalposts please.
Goalposts weren't ever moved by yours truly.
Reread my posts and make a convincing argument that AMD GPUs are actually useful outside of vidya.

It is just an aggressively OC'd Vega 56/64 as far as gaming is concerned. If you thought that 16 GiB of HBM2 would have made any difference, you are completely delusional.

Radeon VII = MI50 Instinct/FirePro reject with 1/4 DP performance.

>the only reason i use nvidia is because CUDA
>t.data scientist
Same, but I'm just the average end user.

>Deus Ex: Mankind Divided and Rainbow Six Siege run better on VII
Because of that: you can find older games where AMD performs better, just like you can find older games where Nvidia performs better.

You can do the math yourself if you want. There is no statistically significant difference between the performance differences in old games in comparison to new games.

>"MUH CUDA MUH CUDA MUH CUDA"
Isn't it amusing when /v/tards start pretending they're suddenly professional data scientists or some shit when they realize how much they got ripped off with RTX?

Also, I really doubt 90% of the people who bought the VII will use it for gaymen. Actual professionals would 100x prefer to deal with OpenCL on a 3,000+ GFLOPS FP64 $799 card than buy a gaymen RTX card with MUH CUDA that has like 10% of that DP compute at best.

>Isn't it amusing when /v/tards start pretending they're suddenly professional data scientists or some shit when they realize how much they got ripped off with RTX?
It's amusing that you think you need to be a data scientist to enjoy using CUDA, and that you keep accusing Jow Forums posters of being /v/ posters when all you amdhomos ever post about is gaymen. Just review the thread.
It's also interesting to see how just the mention of CUDA destroyed you.

>"buh mah cuda, cuda cuda cuda"
Yeah whatever you say, professional data scientist.

>CUDA
Shoo shoo amdfag. Abandon thread CUDA has been mentioned.
This thread is no longer designated.

Neither Lisa Su nor AMD's investors give a shit about consumer graphics; it's the absolute lowest item on the priority list. That's why we keep getting these piece-of-shit GCN cards that are only good for workstation compute tasks.

>"buh mah cuda, cuda cuda cuda"
>Yeah whatever you say, professional data scientist.
You're just sad.

>was just playing that dogshit at 119 fps on a 1080 with the graphics maxed out

Is this a benchmark with 30 Chrome tabs open and four YouTube players running behind it in borderless windowed mode?

We CUDA did it CUDA reddit, CUDA amd CUDA btfo.

CUDA, CUDA, CUDA, CUDA, hurrrr

CUDA!!!

This is actually pretty cool; it shows that the crazy-high memory bandwidth has real utility in suitably insane scenarios.

Good old Shillware Unboxed: once again, yet another reviewer can't seem to undervolt and overclock (even with the new, now-stable drivers, so what was the fucking point of the benching?).

youtube.com/watch?v=tHxXgOTMVLc

Still the only reviewer that has undervolted and overclocked the VII to show what it's really capable of (it will be interesting to see if a waterblock gets made for these cards, and how they behave with undervolts and lower temps).

See
>Reread my posts and make a convincing argument that AMD GPUs are actually useful outside of vidya.
Do this instead of spamming low-quality asshurt posts.

>"muh cuda"
Is this the new desu now?

>No argument
Did CUDA destroy the amdfag that much?
Did you want to post more misleading /v/-tier benchmarks on Jow Forums, or did CUDA block your path?

They are selling off Instinct/FirePro rejects that ISVs/enterprise customers didn't bite on, remarketing them to GPGPU hobbyists at small margins instead of having to recycle them at a total loss.
Nvidia does the same bloody thing with its Titan SKUs (that's how the Titan brand was born in the first place).

Yep. Because Navi wasn't ready for CES, and the market's pricing left room for AMD to resell its rejects at still-respectable profit margins instead of razor-thin ones.

AMD RTG assumed Nvidia would keep 10-series pricing on its RTX SKUs instead of hiking prices.

I'll wait for reviews with 0.1% and 1% low fps numbers.

>0.1% and 1% fps
What is this supposed to mean?

>Source game requires a $1400 card to hit 144 fps in 2019

I love Respawn, but what the fuck?

AMD is a fucking joke and Nvidia is expensive as fuck.
The GPU market needs a third player.

Your eyes perceive the worst frames more than the best ones.

Aka dips to 10 fps are a lot more noticeable than an average of 100 or 150.

Attached: it me your uncle.png (696x580, 313K)
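For reference, one common way the "1% / 0.1% low" numbers are computed: take the slowest 1% (or 0.1%) of frames from a frame-time capture and report the average fps over just that slice. A hedged sketch; the frame-time list below is made up.

# Sketch: 1% / 0.1% low fps from per-frame times in milliseconds.
# The frametimes list is synthetic; real captures come from tools like OCAT.
def low_fps(frametimes_ms, percent):
    worst = sorted(frametimes_ms, reverse=True)       # slowest frames first
    n = max(1, int(len(worst) * percent / 100))
    return 1000.0 / (sum(worst[:n]) / n)

frametimes = [8.3] * 980 + [25.0] * 20                # mostly ~120 fps, a few 40 fps dips

print(f"Average fps: {1000 * len(frametimes) / sum(frametimes):.0f}")   # ~116
print(f"1% low:      {low_fps(frametimes, 1):.0f}")                     # 40
print(f"0.1% low:    {low_fps(frametimes, 0.1):.0f}")                   # 40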

Win10 is littered with bugs.
Use Win7 until EOL, then switch to Linux, BSD, or macOS.

Attached: win10timeline.webm (854x480, 879K)

I hate the new Timeline; it's so much slower on my Intel integrated graphics. I can see the frames, bro.

>buy garbage laptop
>f-fucking windows 10!

wash your greasy hands

>"lossless" upscale a video
Wtf are we reading here on this goddamn board from those novidiots.
This shithole is dumber than wccftech's comment section.

So what you're saying is that a Vega 56 / GTX 1070 is sufficient? Gotcha.

I remember when AMD fanboys used to post currytech articles almost exclusively when they were shilling Hawaii as hard as they could. My, how times have changed.

Who cares about Radeon VII... we're just waiting on Navi... July soon, bros...

Attached: just had enough.png (342x356, 136K)

Sufficient if you're a 60 Hz pleb, sure. Why stop there? Nothing wrong with 30 fps and motion blur, right? As long as you can crank those epyc graphixx.

if (!nvidia) goto runlikeshit;

Uhh sweetie

Attached: Screenshot_20190210_130908_com.google.android.youtube.jpg (2240x1080, 996K)

>Reviews product at stock
>Omg he's a shill guys

Are you fucking kidding me?