The absolute madman

techradar.com/news/amds-radeon-vii-graphics-card-dismissed-as-underwhelming-by-nvidia

Attached: 3S4p8oUURXjM9V9YVHyT2a-970-80.jpg (970x545, 51K)

Other urls found in this thread:

twitter.com/RyanSmithAT/status/1083959608371175424

NOOOOOOOOOOOOOOOOOOOOOOOO

WTF I HATE AMD NOW

No rebuttal, AMDykes? Thought so.

kek what the fuck? Are we literally going to have to choose between a douche and a turd sandwich when it comes to processors?

he is right, radeon 7 is a disappointment. it's not a bad card, but it's too expensive for what it is. amd at this point simply cannot compete against nvidia; rtx might not be ready for mainstream, but it's there and amd has nothing

>Rants for an hour
>A CEO
More like a whiny bitch

do you think his niece hates him
i bet family reunions are awkward

Should I upgrade my 970 to a 2060 for 1080p 144hz?

>*unzips Navi*

Attached: 1536886957476.jpg (720x676, 131K)

If it's over 250w no one is going to buy it. No one likes hot, loud cards

Attached: 1538929233291.jpg (633x758, 156K)

The sad part is AMD is forcing their customers to choose Nvidia. Instead of undercutting Nvidia and offering a slightly more affordable card without RTX but delivering on the most important part, performance, they've price-matched their competitor, offering nothing to the gamer and demanding the same price.
Even if no one cared about RTX before, it is suddenly now the main selling point of the 2080 vs the Radeon VII. Do you pick the inferior card with nothing, for the sake of brand loyalty, or do you pick the better card with newer features at no cost to you?

Had AMD released the Radeon VII with 10-12GB of VRAM for $600 instead, it would have been a much more enticing sale to gamers who don't care about RTX and are only after performance for as cheap as possible. Which is basically everyone, considering how lackluster RTX turned out.

>And if we turn on DLSS we’ll crush it.
>And if we turn on ray tracing we’ll crush it.
fucking based leather jacket man

Attached: 1516295748096.png (1228x1502, 1.07M)

Is his left arm nano carbon augmented with RTX ?

No the real sad part is that AMD barely matches geforce 20 performance with a 7nm die shrink while nvidia is still on 12nm. If nvidia wanted to they could do nothing except 2 die shrinks and crush AMD for the next couple of years with zero effort.

Raytracing is the new Hairworks

Wireframe graphics is all you need. Textures and lighting are marketing memes.

>Titan V - MSRP $2999 - FP64 7450 GFLOPS
>Radeon VII - MSRP $699 - FP64 6912 GFLOPS

Enjoy your volta sales dry up, leather jacket man.

twitter.com/RyanSmithAT/status/1083959608371175424
>FP64 is not among the couple of features they dialed back for the consumer card. The only things AMD has turned down for Radeon VII are Instinct drivers (obviously), PCIe 4.0 support, and the external Infinity Fabric link. All other Vega 20 features are intact
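For anyone who wants to check the value claim, here's a quick back-of-the-envelope sketch using only the MSRP and FP64 GFLOPS numbers quoted above:

```python
# FP64 throughput per dollar, from the MSRP and GFLOPS figures quoted above.
cards = {
    "Titan V":    {"msrp_usd": 2999, "fp64_gflops": 7450},
    "Radeon VII": {"msrp_usd": 699,  "fp64_gflops": 6912},
}

for name, c in cards.items():
    per_dollar = c["fp64_gflops"] / c["msrp_usd"]
    print(f"{name}: {per_dollar:.2f} FP64 GFLOPS per dollar")
```

Roughly a 4x difference in double precision per dollar in the Radeon VII's favor, which is the whole point about Volta sales drying up.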

So he is literally saying the RTX 2080 is underwhelming?

What kind of fucking Asian is called Jensen

> if we turn on DLSS we’ll crush it.
You mean that thing that doesn't run or look any better than normal AA and has shimmering issues?
>And if we turn on ray tracing we’ll crush it.
You mean that thing that doesn't even run at acceptable FPS and looks like utter shit in motion?

Attached: 1e8.jpg (165x115, 5K)

This. Time to play some Tempest.

>this
I was basically already sold on AMD's new gpu before CES, I thought "hey it's going to be 7nm, have 12ish gb of HBM2, gain maybe 10% on the vega64, maybe sort of keep up with the 2080 and cost $500-$550". And then they released this overpriced garbage. Navi is vaporware at this point, just like Intel 10nm. I'll still stick to ryzen but AMD can fuck off I'm not buying another gpu from them until they get their shit together. I'll just buy a 2080 and send my hard earned money straight to Tel Aviv, I hope they're fucking happy.

As much I wanted AMD to do well with GPUs again, there's just no logic behind putting out a card at the 2080's price without the features that are exclusive to it (gay tracing, DLSS if you care for it)

There isn't a single thing the R7 has that the 2080 doesn't, and a slight performance lead (and I seriously mean slight) isn't going to justify it.

Both Nvidia's and AMD's cards are underwhelming. There is no point in upgrading if you have last gen cards. Unless you want AMD's extra VRAM that no one needs yet, or Nvidia's DLSS and raytracing that no one needs yet. Why are they both relying on gimmicks now?

It is, though. If you have Pascal there is zero incentive to upgrade to anything in the latest generation given the big-ass price hike.

>And if we turn on ray tracing we’ll crush it.
Funny guy

>without the features that are exclusive to it (gay tracing, DLSS if you care for it)
>Features
The ability to tank your fps or make textures shimmer is not a fucking feature.
Also, considering these goymicks are run through proprietary HARDWARE how the fuck is AMD supposed to implement them anyway?

AMD did the right thing.

They know only huge fanboys still buy their cards, and they won't care if their cards are expensive.

All the rest only want competitive AMD GPUs to force Nvidia to lower their prices, then they'll say "Thanks AMD!" and buy Nvidia.

And yet Radeon Linux support runs circles around Nvidia.

>resorting to ad hominem-tier buttmad rants because your competitor was first to a 7nm GPU

rofl. Jensen Huang if you are reading this: KYS. I haven't bought an Nvidiot GPU since GeForce 9800GTX+ get fucked

Absolutely based. R7 is 9yo housefires GCN trash.

LEATHERMAN BAD

Attached: 1497642804318.png (802x799, 49K)

>All the rest only want competitive AMD GPUs to force Nvidia to lower their prices, then they'll say "Thanks AMD!" and buy Nvidia.
this, even if amd released a card 100% better than the rtx 2080ti for $200, people would still buy nvidia because it's nvidia.

His real name is Huang Jen-Hsun

I meant NVIDIA

Amd can implement faster checkerboard from ps4

but they keep you warm at night

Behold the Reichdeon VII

Attached: Reichdeon.png (1200x798, 1.07M)

AI upscaling memes aside, can you enable RTX on non-RTX cards?
are there any benchmarks? I heard even a 1080ti can keep up with a 2070 with RTX enabled, wonder what the R7 would be like

>No one likes hot loud cards
it feels so weird when 2009 memes still alive today, also 2080 is 250w card if you didn't know

Just like people who buy Apple because it's Apple. The same would go for AMD if it were as large as Nvidia or Intel, so I am not fond of any of Nvidia, AMD, or Intel.

The GPU makers are being so anti consumer right now.

SR-IOV confirmed

Attached: 1541532544584.png (422x537, 173K)

upgrading every gen is for richfags or for bottom scrapers who got the bottom tier card. value for money is still messed up after the coinshit craze, but if you need a card now you don't have much of a choice. i agree that it's not ideal to be an early adopter of raytracing and DLSS, but they look promising for when future gen cards come out

it's called DXR, it's not exclusive to nvidia, they marketed it like some crazy magical thing but it's just a directx feature of course

>not a single thing the R7 has that the 2080 doesn't

what about 16GB of VRAM?

AMDBros?? WHATS OUR EXCUSE NOW

He is right. How does a fucking 7nm card with 2080 horsepower draw... 300 watts...

post screenshot please

Attached: snapshot_21.47.png (1920x1080, 3.35M)

ikr
>2019
>not having as much VRAM as system RAM

hey man you dropped these

Attached: wood screws.jpg (2430x997, 620K)

>2019
>some vidyas, e.g. Deus Ex MD, already consume >6GB of VRAM for Ultra textures @1080p. 8GB will become a necessity for 1080p in 2020 once the new consoles go live; put that way, a 16GB requirement for 4K is not out of the question.

Is this real? If it has almost 7k gflops then who the fuck cares about features? That shit can be coded into Radeon Rays later on, and that's compatible with all GPUs.

VRAM usage itself isn't a very good indicator of how much VRAM you actually need, because you can't tell how necessary the textures that get loaded there are. A good example of this was one of the CoDs a few years back. If you gave the game a card with a ton of VRAM (like Titan X with 12 GB), it would keep loading shit to there and never remove it. Technically there's nothing wrong with this, as unused ram is wasted ram, but it makes some people draw wrong conclusions from it. Does the game need +7 GB VRAM? Nope. It ran without issues on 2 GB cards.

The only way to tell how much VRAM a game requires is to test it with cards with decreasing amount of VRAM, and checking when you run into performance issues. Usage by itself won't tell you much.
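That CoD behavior can be illustrated with a toy model (purely illustrative; the cache class, texture names, and sizes are all made up): an LRU texture cache given lots of VRAM keeps old assets resident forever, so reported "usage" balloons even though a smaller card streams the same frames with exactly the same number of texture loads.

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU texture cache: loads textures on demand and only evicts
    the least recently used ones when capacity forces it to."""
    def __init__(self, capacity_gb):
        self.capacity = capacity_gb
        self.used = 0.0
        self.misses = 0
        self.cache = OrderedDict()  # texture id -> size in GB

    def fetch(self, tex_id, size_gb):
        if tex_id in self.cache:
            self.cache.move_to_end(tex_id)  # mark as recently used
            return
        self.misses += 1
        while self.used + size_gb > self.capacity:  # evict LRU entries
            _, evicted_size = self.cache.popitem(last=False)
            self.used -= evicted_size
        self.cache[tex_id] = size_gb
        self.used += size_gb

# Five "levels", each with its own 2 GB working set (8 textures x 0.25 GB),
# fetched over and over while the level is active.
frames = []
for level in range(5):
    frames += [f"L{level}_tex{i}" for i in range(8)] * 50

results = {}
for cap_gb in (12, 2):
    cache = TextureCache(cap_gb)
    for tex in frames:
        cache.fetch(tex, 0.25)
    results[cap_gb] = (cache.used, cache.misses)
    print(f"{cap_gb} GB card: {cache.used:.2f} GB resident, {cache.misses} misses")
```

Both runs miss exactly 40 times, once per distinct texture; the extra ~8 GB "in use" on the big card is just never-evicted leftovers, which is why raw usage numbers overstate what a game actually needs.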

Attached: 1531333633299.png (400x717, 61K)

It's confirmed to be 300w lol

>they've priced-matched their competitor instead by offering nothing to the gamer and demanding the same price.
What do you mean, nothing? Do you think when you go buy a Vega II you hand over your money and walk back out of the store empty-handed?
Even if you meant "nothing new" it's wrong; which nvidia card has this much RAM, or RAM this fast?

I wish mommy Lisa would piss on my face

Refined horse shit is still horse shit.

No, he's saying that the Radeon VII is underwhelming because it is currently the absolute best card from AMD, and it can barely keep up with the 2080 while having fewer features.

nvidiots are the biggest npcs out there

>Navi
>"you'll hear more about it in 2019"

Way to inspire confidence in an MIA product there Lisa, I wonder how you guys fucked it up this time

nice GFLOPS, would be really nice if you had software that could use it, lol.

>Both Nvidia's and AMD's cards are underwhelming. There is no point of upgrading if you have the last gen cards.
My first Nvidia card was a GeForce 2; I am a "fanboy" by all definitions. However, after my Pentium 2, I switched to AMD for obvious reasons. I work on an 8-10 year cycle for all hardware, and recently upgraded from a GTX 660ti/Phenom II; the only thing I could see lasting another 6-8 years was an Intel chip and Nvidia. I found a brand new 980ti, and I was totally floored by what it can do. For the cost of new cards, it is hard to beat spending less than $200 on the previous generation's hardware.

All GPU companies are guilty of rebranding shit. Nvidia has been doing it for years, and I figure ATI/AMD has as well. I don't think the future is in x86 processors, and I know this will be my last powerful PC. Gaming is nearly dead to me, and the only reason I got the 980ti was the guarantee that everything I could throw at it would run without issues, most of it on max settings. I know my opinion is not based on the cutting edge either, but I am more than content. For the purposes I use a PC for (gaming mainly), they really haven't even impressed me with multi-core processing. Most of the games I play don't implement it in a way that makes more than a dual core necessary, and the ones that try don't make a difference when you are running a 4.6GHz i5.

I'm sorry you need to be led around by a leash.

On the one SJW game that uses it? The fuck

Why does he shit so much on his niece?

more games are gonna implement it
>just wait for zen
>just wait for poolaris
>just wait for zen 2
>just wait for 7nm
>just wait for navi
>wait for DXR? lool nvidia has no gayms xd

>Competitor says competitor's product is bad, what a surprise.
Nvidrones will eat this shit up

>DLSS will beat it
dlss is absolutely shit and barely works
>RTX will beat it
yeah i love halving my fps

>SR-IOV confirmed
Are you fucking kidding me? This just made vega7 the go to card for linux host/windows vm if true, based AMD.
I fully expect it to be reconfirmed later that SR-IOV is not included though.

>how the fuck is AMD supposed to implement them anyway?
Here's an idea...AMD could invent its own proprietary technologies. Damn, life sucks when you can't just take what you want, doesn't it?

>AMDrones have entered bargaining phase

Attached: 1496098972845.png (882x758, 316K)

Why is Jensen even acknowledging AMD's existence? It feels so weird considering how smug Nvidia always is. Are they feeling pressure from possible Zen2 APUs next year? If they can fit 8 cores with a 570 tier GPU on it, we would be looking at a big change in the low to midrange sector.
On the Vega VII, I understand some of the disappointment, but why the doom and gloom? The fuck were you expecting? It's still shitty Vega, just on a smaller node. I bet these are just bottom of the barrel MI50s to calm things down until Navi. The price is bad, but at least it competes with Nvidia's 2080.

I pick the one with more memory because it will almost definitely age better over the next two to three years.

the gpu would look good to clueless boomer investors and kids because new gpu and 7nm, so nvidia taking a big steaming shit on it was the right move imo, if you google amd gpu it's still one of the top news

>things the R7 has that the 2080 doesn't
* 16 GB VRAM vs. 8 GB
* 7 TFLOPs (64-bit) vs. 0.33 TFLOPs (hey, buy a Titan or Quadro card)
* no dongle needed for CrossFire vs. nvidia's RTX NVLink bridge needed for SLI (for $90?)
oh, and working Linux drivers

>Working Linux Driver
While I hate the state of nVidia on Linux (especially when it comes to Optimus), you can hardly say that the nVidia X11 drivers aren't at least as good as the Windows versions.

found a use case scenario for radeon 7

Attached: 20190113215609_1.jpg (1920x1080, 206K)

I don't care about tdp if I can get the card with water cooling

The need for that level of RAM is extremely niche at this point. 99% of games in 2019 won't need that much. By 2020, Navi will be out and/or 11 series will be out and/or Intel gpus will be on the way
It's the same shit as RTX, but AMD didn't shill it so hard: 1TB/s and 16GB VRAM is the future, but because it's cutting edge, you have to pay extra for it and it won't be nearly as useful this gen as it will be next gen.
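The 1TB/s figure isn't marketing magic; it falls straight out of the HBM2 configuration. A quick sketch, assuming the commonly cited specs (four 1024-bit stacks at 2.0 Gbps effective per pin):

```python
# Radeon VII memory bandwidth from its HBM2 configuration.
# Assumed specs: four 1024-bit HBM2 stacks at 2.0 Gbps effective per pin.
bus_width_bits = 4 * 1024   # 4096-bit aggregate bus
data_rate_gbps = 2.0        # transfer rate per pin

bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8  # bits -> bytes
print(f"{bandwidth_gb_s:.0f} GB/s")  # 1024 GB/s, i.e. the quoted ~1 TB/s
```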

This is only because there's literally a setting in that game that lets you select how much VRAM you want to allocate to textures (up to 8gb). I doubt it even makes a difference after 1-2gb, the assets in the game aren't that impressive.

VII FP64 performance will BTFO novideo

holy goddamn motherfucking shit YES

nvidia's linux drivers have been wonky since the gf2 mx era. Even on my current 1080 ti with multiple monitors, it fucks up and can't seem to distinguish my monitors from my oculus's hdmi connection. Sometimes it just fails to load entirely.

You should buy amd for the sake of not having an Nvidia-only GPU market in the near future. You retards

NVidia doesn't have an implementation of GBM, so their proprietary driver is useless for the majority of Wayland compositors. Also the open source driver has been shit since the 900 series because NVidia requires signed firmware blobs for basic bitch things like hardware video decoding and power management.

is there anything better than watching Nviditards dig their own grave?

Attached: 5475f07b98fe34f609a8e5f46ead6b85--the-frog-frogs.jpg (236x249, 10K)

Nvidia may be shitty about things, but the performance I've gotten from this card has made the price I paid not hurt nearly as bad as I expected it to.

When I paid $600 for a vega 64 to upgrade from a 1070, THAT FUCKING HURT. And after analyzing the vega 20's performance I'm not convinced it's all that far beyond how far a vega 64 will overclock anyway.

Attached: photo_2018-10-23_13-03-11.jpg (1280x960, 131K)

No, the real sad part is this: even though the Radeon VII matches the 2080, it is a housefire. Pathetic. It's not worth the money they sell it for, because you'll be paying more with Radeon in the future than with even an rtx2080ti...

Yeah, GPUs are pretty weak, but intel got btfo and always will. i care more about the cpus

you didn't research your product then, if you bought a vega 64 coming from a 1070 and expected a huge upgrade. that's your own fault

i laughed for ages at the DLSS reveal

waifu2x implemented in hardware

Enjoy paying $2k for the 3080Ti you fucking retard.

Thanks user. Let me say thank you by inviting you to have some pork shoulders cooked on Nvidia’s Fermi(tm) grill with me for dinner.

Attached: 15A60C94-4492-41D9-B96A-04DB68ECF865.jpg (500x500, 32K)

as the guy who posted that pic, that makes sense. I was toying with the texture settings and looking at the difference in the preview thing below the settings, and couldn't notice a difference after 2gb. Hell, there weren't any major improvements after 0.5gb

The only thing I can see that being used for is "hey look guise, i can max out this game"

Nah... It will cost $1488

>Why are they both relying on gimmicks now?
Really gets the noggin joggin

Vega 2 =/= Navi