AMD is powering another of the biggest gaming platforms in the world as it forms the hardware behind the new Google...

>AMD is powering another of the biggest gaming platforms in the world as it forms the hardware behind the new Google Stadia system.
What the fuck bros!?!?!? Why did based Google go with AMD and not us?

Attached: novideo_btfo.jpg (796x805, 84K)

Other urls found in this thread:

killedbygoogle.com/
twitter.com/NSFWRedditImage

Why do people play games?

Cause nvidia doesn't want to make drivers for linux, also amd is cheaper and the hardware doesn't matter as much when you optimize for it

>tfw amd has to sell their gpus to google for a 1 cent profit margin

because when given the choice nobody would choose to be in business with nvidia
they are absolute cancer

This
microsoft went with nvidia for the xbox and they got absolutely fucked by nvidia for it

Consoles have all been using ATI/AMD ever since, except for the Switch

AMD always had better offers and more stable drivers. Imagine the pain of dealing with shitty novideo drivers on a large scale project like Stadia.

>hardware doesn't matter as much when you optimize for it
Except you can't really expect all the games to be optimized specifically for AMD.

Nobody works with Nvidia twice if they can help it. Nvidia thinks itself invincible which is why recently its stock value was basically cut in half.

In addition to all the above, when was the last time a successful console used Nvidia? That's right. Never.

LoL. Watch as Google pays upwards of half a million to optimize the games for their hardware to get 3x performance, saving them 3x in running costs.

What about the plays-. Or the xb-.

Hmm

....Switch?

Nintendo Switch you retarded amdrone

and in comes r/pcmasterrace

>Nvidia thinks itself invincible which is why recently its stock value was basically cut in half.

No it's not, you retard. It's because the market rallied over the meme mining bullshit between 2017 and 2018. The same thing happened to AMD, but earlier than Nvidia: AMD went up to $32 per share, plummeted down to like $17, and then climbed back up because Intel keeps fucking up their 14nm++ production, which is currently fucking up supply in datacenter and embedded, which is also why RAM is so cheap right now.

Nvidia is showing the same pattern, big dip followed by a strong recovery. It's just the market overcorrecting after being bamboozled and recovering.

I think HBM is finally paying off for AMD as it allows them to make a denser solution for datacenters. Having open drivers on linux and SR-IOV helped too.

I imagine the margins are much higher than on their console chips, since Google is buying server chips.

Attached: AMD-Radeon-Pro-V340-Virtualized-Graphics.jpg (1260x709, 338K)

I should hope it's higher margins. There's a good chance all the consumer Vegas with HBM are close to not profitable at all. Besides Vega being a big die on its own, HBM's pricing structure likely hasn't changed a whole lot even with Samsung joining production, and placing dies onto the interposer was supposedly expensive enough that Intel considered it but backed out, opting to develop a cheaper method first. The VII probably sells at a straight-up loss.

ChromeOS and Android use the Linux kernel, and as we well know, nvidia is the single worst yada yada

Attached: 1515947348524.gif (240x240, 482K)

>open drivers on linux and SR-IOV

/thread. Currently only AMD has SR-IOV, and while it probably isn't strictly mandatory, it is the sanest way forward for virtualized accelerators.
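For anyone wondering what SR-IOV actually buys you here: the host carves one physical GPU into virtual functions (VFs) that get passed to guest VMs. A rough sketch of how a host would do that through the generic Linux PCI sysfs interface (the `sriov_totalvfs`/`sriov_numvfs` files are the kernel's standard knobs; the PCI address is a made-up placeholder, and whether a given GPU driver actually exposes them depends on the vendor's host driver):

```python
# Sketch: enabling SR-IOV virtual functions on a PCI device via sysfs.
# Assumes the standard kernel SR-IOV interface; the device address below
# is hypothetical.

def vfs_to_request(total_vfs: int, wanted: int) -> int:
    """Clamp the requested VF count to what the device says it supports."""
    if total_vfs <= 0:
        raise RuntimeError("device does not expose SR-IOV")
    return min(wanted, total_vfs)

def enable_sriov(pci_addr: str, wanted: int) -> None:
    base = f"/sys/bus/pci/devices/{pci_addr}"
    # sriov_totalvfs: read-only, max VFs the hardware supports
    with open(f"{base}/sriov_totalvfs") as f:
        total = int(f.read())
    # sriov_numvfs: writing N instructs the kernel to create N VFs
    with open(f"{base}/sriov_numvfs", "w") as f:
        f.write(str(vfs_to_request(total, wanted)))

# enable_sriov("0000:3b:00.0", 16)  # hypothetical GPU address, needs root
```

Each resulting VF shows up as its own PCI device that can be handed to a VM with VFIO, which is exactly the kind of plumbing a streaming datacentre wants.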

It's also hopeful that future Video Core Next ASICs will get a long overdue upgrade, at least VP9 encode support. Even better, AV1. Google claimed you currently need 25Mbps for 1080p60, but only a modest increase to 30Mbps in the future for 4K60.
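Back-of-the-envelope check on what those quoted bitrates mean for a user's data cap (the Mbps figures are Google's claims from the post above; the per-hour numbers are just arithmetic, not measurements):

```python
# Convert a stream bitrate in megabits/second to gigabytes downloaded
# per hour of play (SI units: 1 GB = 8000 Mb).

def gb_per_hour(mbps: float) -> float:
    return mbps * 3600 / 8 / 1000

print(gb_per_hour(25))  # 1080p60 at 25 Mbps -> 11.25 GB/hour
print(gb_per_hour(30))  # 4K60 at 30 Mbps   -> 13.5 GB/hour
```

So a few hours a day of 4K60 would chew through a typical monthly data cap fast, which is why the codec efficiency (VP9/AV1 vs H.264) matters so much here.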

AMD have been focusing on semi-custom solutions for years, which is why they've been able to secure almost every contract requiring a graphics solution that's not just an off the shelf part. Nvidia have essentially no presence in that area. Even the one games console contract they secured for the Switch was simply repurposing existing Tegra chips.

Great, so people will associate AMD with input lag.

for the same reason why people read books, watch movies, play soccer, watch tv, travel, spend all night drinking and talking, browsing a korean basket weaving forum, etc.
if you're a little slow, then let me spell it out for you: Entertainment. To kill time.

Are you one of the efficiency experts out there?

really? i didn't know that about the switch.

Who cares? Stadia looks like shit.

Switch uses a tegra

because amd doesnt charge you up the ass for shitty and vram limited gpus

nobody wants to work with nvidia because they're a shitty company
only nintendo is worse than them which is why they worked well together

Games are already being optimized for AMD. Consoles make heavy use of shader intrinsics and async compute.

AMD's a good deal! Not only is it decent hardware, it heats your house too!

Attached: 1553370285824.gif (815x815, 818K)

This isn't the FX era. AMD CPUs aren't hot anymore; if anything they're cooler than Intel CPUs, and the only clown here is Intel, giving us the same amount of cores and threads for 9 years straight.

>This isnt the FX era
No, but it is the GCN era, which is what Stadia uses.

>not wanting an excuse to say "it's hot in here" to your grill so you take off her clothes
t. incelvidiot

Yeah, GCN is superior to Nvidia's architecture because it's both a gaming and a compute architecture, so it can actually do ray tracing. Not to mention that GCN GPUs are known to always have more VRAM than Nvidia GPUs, and they also tend to age like wine.

What are we supposed to do between fucking lots of women?

>Yeah gcn is superior to nvidia's architecture
They are hotter, slower in most games, and more expensive to produce.
>gcn gpus are known to always have more vram
Which becomes a factor long after the core itself has become useless.
>and they also tend to age like wine
Well memed.

Also the Stadia platform is based on Linux which Nvidia is infamous for not supporting.

>he has a refractory period

Lol, get rekt noob

ok nvtard

>reddit spacing
>saying "noob" in 2019
kys

The only gcn lineup which is hot is the vega one and even the vega gpus are better than their nvidia counterparts (1070/1080)

the HD 7970 had more VRAM than the GTX 680, and because of that it was more future-proof and could actually play games years later without stuttering, unlike the 680. Are you trying to tell me that the 1060 3GB and 2060 6GB had enough VRAM at the time of their release?

if they don't age like wine then how did the R9 290 compete with the 770 at launch but manage to beat the 780 Ti years later? That means AMD GPUs age like wine, or Nvidia gimps their GPUs with driver updates; either way, AMD GPUs tend to age better and last longer than their Nvidia counterparts

You have exactly zero idea why GCN is the way it is and why Nvidia's more recent architectures share a lot of the same design goals as AMD had when designing GCN nearly a decade ago.

Attached: 1452717560680.jpg (593x452, 35K)

>if they dont age like wine then how did the r9 290 compete with the 770
The 290 CRUSHED the 770 on release. The 290 was roughly on par with a 780, with the 290X edging out both the 780 and the Titan, which is why Nvidia released the 780 Ti and Titan Black to retake the performance crown.

normalfags don't care about that kind of stuff

on that topic normalfags dont know that amd gpus exist either

killedbygoogle.com/

Stadia will be killed in 3-5 years and no one will care about it

If it's profitable it won't be. Why do you think Google is shutting down Google+? Because not many people use it and it doesn't make money for Google.

It will not be profitable at all for Google, period

>The only gcn lineup which is hot is the vega one and even the vega gpus are better than their nvidia counterparts (1070/1080)
All of GCN is power inefficient compared to its Nvidia counterparts without exception, both Polaris and Vega. Vega performs worse in games than Pascal and Turing, and is more expensive to produce than Pascal (and possibly Turing, we don't know).
Very few people actually used Vega in those compute scenarios where it did well. It sucked in gaming, which is what most people reading this post bought one for.

>You have exactly zero idea
>You don't really understand *why* AMD products APPARENTLY do poorly! You didn't even read the white paper!
Brings me back to the heady days of the Vega launch. Lots of spin. No product worth buying.
Still waiting for Vega to really spread its wings as was foretold by rabid fanboys. Any day now. But I've always looked for the proof in the pudding, rather than hopes and dreams that devs were going to make optimized diamonds out of Radeon coal.

>and why Nvidia's more recent architectures share a lot of the same design goals
With the exception of not sucking. That they've got down pat.

Attached: 1371523000831.jpg (1214x1239, 834K)

lets see nvidia's track record
>mobile SoCs
total shit to the point that nobody even cares about nvidia anymore
>motherboard chipsets
tried to lower amd's performance on their chipsets, got discovered, and got kicked out by both intel and amd
>desktop
proprietary shit because there's no alternative
>linux
HAHAHAHAHAHAHAH
>consoles
yeah cancer

gee i wonder why

Please show me benchmarks where Pascal beats Polaris and Vega. Also, Nvidia's Turing architecture sucks ass; the 2080 can't even ray trace properly at 1080p.

nVidia is an utterly shit and corrupt company. Nobody sane would partner with them. They're worse than Intel.

They shut it down because G+ had a critical security issue which let anyone fetch all private user data and they didn't want to deal with the shitstorm. And yes, also because 99% of gmail users didn't know what the fuck G+ is or that it exists.

>buys 2019 intel in winter
>buys 2019 amd in summer

Their high-end chips for server and AI are used by everyone; AMD doesn't even have standing in that space.

SR-IOV would let google lean on all their existing expertise in streaming and virtualisation/container shit

Exactly. Nvidia is the only one that manages to outsell their competitors' products 10:1 while having shittier performance and worse prices. For example, look at the 1050 Ti vs the RX 570: the 570 has almost double the performance of the 1050 Ti and sells for way fucking less, yet it sells a lot worse.

Look son, i'm generally on AMD's side, but STD must die.

You do realise that GCN is designed, primarily, to be a compute throughput monster, right? It is meh at gaming because it was never designed to push pixels the way Maxwell (and all newer Nvidia architectures save for Volta) was. AMD gets its gaming performance through sheer brute-force clock speed.

Kepler, like GCN, was all about raw data crunching, and it too drank electricity like Americans eat burgers, as it equally was a one-size-fits-all design. Maxwell stripped a HUGE amount of compute-focused parts out, which not only let it clock much higher than Kepler, it also got power draw way down, as you don't have huge arrays of compute cores effectively sitting idle eating away at your board power limits. Increase ROPs and cache and (in simplistic terms) you have a chip that will excel at vidya but suck horribly in a server or datacentre.

In recent years the demand for compute throughput has increased faster than the hardware has; the point of Tensor cores is to act as additional accelerators for AI-focused software and tasks. To this day Nvidia has to throw bigger dies at the problem to match GCN's compute power; on paper a 290x is a match for a 980ti, for example. Even today the Radeon VII will crush a 2080 in raw number crunching. There are obviously additional factors, as it's a complex subject (Nvidia's ecosystem, for example, is a massive factor in their favour). The mining craze was all about AMD precisely because if you want to do 1+1 and get a result, nothing is more efficient than GCN until you start building custom hardware, which is exactly what happened.
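For the lurkers: the "on paper" 290x vs 980ti comparison is just peak single-precision throughput, shader count x 2 ops per FMA x clock. A quick sketch using public spec-sheet numbers (2816 shaders each; ~1.0 GHz for the 290X, ~1.075 GHz boost for the 980 Ti); real game performance diverges for exactly the architectural reasons discussed above:

```python
# Peak FP32 throughput in TFLOPS: each shader retires one fused
# multiply-add (2 floating-point ops) per clock at peak.

def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000

r9_290x = tflops(2816, 1.000)    # R9 290X: 2816 stream processors @ 1.0 GHz
gtx_980ti = tflops(2816, 1.075)  # GTX 980 Ti: 2816 CUDA cores @ ~1.075 GHz boost

print(f"290X: {r9_290x:.2f} TFLOPS, 980 Ti: {gtx_980ti:.2f} TFLOPS")
```

Two chips from different years landing within ~8% of each other on paper, while the newer one wins comfortably in games, is the whole "compute monster vs pixel pusher" argument in one number.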

The point here is you are an idiot and your understanding of why Nvidia and AMD do what they do starts and stops at /v/.

Attached: 1448917281249.jpg (297x296, 39K)

The 1050 Ti doesn't even require a power connector and runs off the PCIe slot. It runs Witcher 3 at max and is the king of GPUs in laptops.
The 570, on the other hand, wasn't even affordable until last year and still sucks more power than the 1060, so you can't even use it in laptops.

>vram

Attached: 6GB.webm (1280x720, 2.92M)

>Please show me benchmarks where pascal beats polaris and vega
Buddy, if you need me to show you benches of cards that ran efficiently, and were cheap to produce, beating cards that were the opposite, you've been asleep for the past few years.
>also nvidia's turing architecture sucks ass
lol
I can't think of a solitary reason to buy any GCN card when compared to its Turing counterpart.
Maybe the free games? But I heard Nvidia is also offering free games on the RTX lineup now. All AMD can do is lower prices. One wonders if they're losing money on Polaris at a certain point.
Anyways, AMD GPUs are trash to make and buy, as is reflected in their marketshare and bottom line. Stadia as a product sounds terrible, but we'll see. As far as AMD is concerned regarding Stadia, they're competing mostly against themselves, with the exception of the Switch. With the expected thin margins. But that's the hole they've been inside for a while.

tl;dr AMD built chips that weren't good enough to cross Nvidia's tech moat and burned through electricity like nobody's business, and were garbage in videogames. Their only use was in crypto, which is now dead. Again.
>The point here is you are an idiot
Could be. It wouldn't make Polaris or Vega any less crap though.

not yet 2020 will be the year amd will take back some share there too

>AMD is powering every way to kill PC
Based

Only with rome, their GPU side is still shit.
Just see how they are dumping Mi20 as Vega 7.

Yeah, Witcher 3 at max settings at 30 to 40fps. Sorry, but no, the 1050 Ti is horseshit and it ages very badly, just like all the other x50 cards do. The only reason the 1050 is the king of laptops is that AMD doesn't have any mobile GPUs to compete with, aside from the Vega iGPUs. The 570 sucks almost as much power as the 1060, but that doesn't matter, as the 570 has almost 1060 performance and costs less than the 1050 Ti. Oh, and btw, so what if your card is hot and draws a shit ton of power? The 2080 draws as much power as a Vega 64, yet it's featured in laptops.

>football
fixed for you ;)

If you base your opinions on market share then you're retarded. Do you think the 1050 Ti is superior to the 570 just because it has more market share? Pascal cards were cheap to produce, yet Nvidia sold them for a premium.

Google probably went with AMD because they are pushing their Vulkan porting tools and they will be using proprietary software, so they need the implementation to work well with it. Nvidia's implementation is closed source, so they obviously can't use it, period.

Probably will result in more Linux ports of games but not better AMD support on Linux, just better AMD support for whatever Google is doing.

> burned through electricity like nobody's business
Except that they don't when you compare the big boy cards against each other, like the old FirePros vs Quadros and AMD's newer Instinct lineup.

>Could be. It wouldn't make Polaris or Vega any less crap though.
Polaris isn't crap; it is just average. It is a chibi chip pushed harder and harder as there has been no replacement for it. Vega is a goddamn datacentre design that was meant to scale down; its power efficiency is rather impressive when not clocked balls to the wall for the gaming lineup.

Again, it all boils down to the fact Nvidia split their architecture up: one for number crunching and one for pixel pushing. AMD didn't make the split, as it's hideously expensive to do so. RTX, and Turing in general, is Nvidia pushing their number-crunching defect chips onto the vidya crowd and doing a whole lot of nothing with it.

Tensor cores are used as accelerators, and it's nothing AMD couldn't do on GCN, precisely because it has a massive array of compute cores unused for vidya, as GCN generally gets throttled hard on the ROP side of things; it simply doesn't have the pixel throughput outside of insane brute-force clocking.

The conclusion you are drawing is based on an incomplete picture of the overall GPU landscape, which stretches far beyond vidya. For the sole focus of vidya? Sure, GCN is only competitive by being clocked through the roof (as a rule AMD has fewer internal clock domains than Nvidia, so the higher you clock the core, the more you overclock basically everything else inside it. Nvidia has some internal clocks that you can't touch, which dramatically helps with the race to idle and overall power draw).

Attached: 1546116684568.jpg (480x372, 14K)

>570 almost has 1060 performance
No it doesn't.

Linux is the future of games; gaymers that spent thousands of dollars on their LED-filled overpriced gayming PC on suicide watch. A suicide hotline for gaymers will be released alongside Stadia.
Gaymers are fucking tools; they'll use their thousand-dollar PCs to play alongside pajeets using Xiaomi Pocophones.

Yeah, OK, it's 10% worse than the 1060 in older titles, but that doesn't mean it's not on par with the 1060.

When AMD is 10% behind Nvidia they are being utterly crushed. When the 1660ti is 10% behind a V56 its suddenly too close to call, wait for drivers etc etc.

Funny that.

>When the 1660ti is 10% behind a V56 its suddenly too close to call, wait for drivers etc etc.

Literally no one says that. People can just overclock and get better performance, see RX 580 -> RX 590 / GTX 1060 -> GTX 1660 (non-Ti). I'm an AMD fan and I would buy the 1660 Ti in a heartbeat if it had 8GB of VRAM, or if it was at 220 euros with its current VRAM.

>Literally no one says that,
Except, y'know, Jow Forums does.

Jow Forums is also filled with nvidia, intel and amd fanboys who talk bullshit about the other brands and hide anything positive about them

im talking about their new uarch on the gpu side