AMD WINS!
CUSTOM GPU FOR GOOGLE

Attached: 1537352730622.png (1183x991, 601K)

Other urls found in this thread:

youtu.be/nUih5C5rOrA
twitter.com/TUM_APISAK/status/1107995323022991362
onlive.com/

>white youtube theme
off self

you'd understand if you weren't watching youtube in a basement

>Runs on Linux

Did I catch that correctly?

>google becoming even more monopolistic and controlling

Will avoid

Vulkan?

>AMD
>Linux
>Vulkan
I don't know who the good guys are anymore

What did you expect from Google?

They've been working on Android and Fuchsia for years

can't wait for the complaints about molten consoles to start coming in

what?

Navi must be decent.

10.7 teraflops.

I saw that they were going to integrate the service with YouTube. Would be cool if, when someone is streaming a game, others could pick up the save the streamer is playing and work through that same part of the game their own way. Obviously it would just copy the streamer's save over to a new file rather than screwing up the original.

youtu.be/wrmu_8LH88U
Reminder

link:
youtu.be/nUih5C5rOrA

>psu isn't even on
hehe, but the exploding athlon one with the air compressor was funnier

Vega 20 (Radeon VII) is 13.8 teraflops

the one in google's shit is custom made.

It's Vega 56 brainlet

then it would not be custom. they said it's custom.

show proof

>AMD's semi-custom business
>off the shelf consumer parts
Pick one, kid.

Attached: 03-19-19_14h52m48s_firefox.png (252x314, 10K)

so 10.96 teraflops

Adding on: the only thing that might be custom would be HBM used as both CPU and GPU memory, maybe a 7nm die shrink. Also, only the CPU was listed as custom.

Attached: Screenshot_49.png (2560x1440, 2.08M)

he said custom gpu

and he's a fucking brainlet like everyone else saying this isn't Vega 56.

and of course it's fucking "custom". it's not going to be built like standard RX/WX PCIE cards.

Fuck, posted early. They're guaranteed not to use an off-the-shelf PCIE-slotted Vega 56. They could make it "custom" by making it MXM, an APU, or some other shit, but the core GPU part is still Vega 56.
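
For what it's worth, the 10.7 number does line up with Vega 56's shader count at a plausible clock. Back-of-envelope only; the exact clock here is my assumption:

# theoretical FP32 throughput = shaders * 2 ops/clock (FMA) * clock
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000  # result in TFLOPS

print(tflops(3584, 1.49))  # Vega 56 shader count at ~1.49 GHz -> ~10.7 (the Stadia figure)
print(tflops(3840, 1.80))  # Vega 20 / Radeon VII at 1.8 GHz  -> ~13.8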

>10 TFLOP custom GPU
>Outputting in beautiful yuv420p compressed inefficiently and full of artifacts.

Attached: GyVXqSq.jpg (1080x920, 205K)

Remember it's a DRM service. Open source doesn't mean shit when games are protected by (((DRM))) and (((Copyright))) and you don't own most of your software.

>Google unveils Stadia
>cloud gaming service
Into the trash it goes

Attached: 626761d7695c95fea864b208a6486947ada126c745cdef193ae01f2700c389dd.jpg (225x225, 8K)

AMD is not built for streaming, this is a dumb move.
The latency is going to be high compared to Nvidia's game streaming, which has been out for like 6 years already.

I look at it this way. So many companies have or are working on game streaming that Google decided "why not?" and in the process decided to boost both AMD and Linux gaming, since both of those are dead/dying.

Parsec is miles ahead of Gamestream/Poolight

>Meanwhile, 2080ti 14 tflops
So it's not even flagship / long-term hardware for the promised 4K gaming.
Oh boy

>2080ti 14 tflops
that's at half you retard

It's going to be the youtube of games. You won't be able to avoid it.

>10.7 Teraflops of GPU
>Gets compressed down to some low bitrate veryfast x264 mush
What a waste.

Attached: 1327218420518.png (340x230, 74K)

>latency
>forced online
>high bandwidth requirements
There are already multiple cloud gaming services available and they haven't taken over the world.

Attached: 1553034150879.png (439x360, 67K)

Vega 48 faggot: twitter.com/TUM_APISAK/status/1107995323022991362

Oof

Attached: JR6bM0d_d.jpg (640x360, 22K)

>x264
you living in the past user?

based

>t. retard who didn't watch the announcement.

>2000ms input lag
And it's DOA.

Attached: 1553020693676.webm (752x398, 1.44M)

that jump input delay is disgusting.

Is this real life?

Attached: Silicon-Valley-Wikia_infobox-Gavin-Belson_01.jpg (250x330, 41K)

You living in some kind of future where anything better than x264 can be hardware decoded? HEVC can't be decoded in hardware on chromebooks or most desktop/laptop computers. If they want to avoid latency they'll need to use a codec that can be hardware decoded.
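
If you want to check what your own box can actually hardware-decode, here's a rough sketch (Linux + VA-API only, assumes vainfo is installed; the profile strings are the standard VA-API names):

import subprocess

# vainfo lists the decode/encode profiles the GPU driver exposes;
# a line containing VAEntrypointVLD means hardware decode is available.
out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout
for name, tag in [("H.264", "VAProfileH264"),
                  ("HEVC", "VAProfileHEVC"),
                  ("VP9", "VAProfileVP9")]:
    ok = any(tag in line and "VAEntrypointVLD" in line for line in out.splitlines())
    print(name, "hardware decode:", ok)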

Have you never played the game? Sometimes it doesn't register that you want to climb.

More razor thin margins.
Just what they needed.

did someone say razer?

Attached: 1545679412579.jpg (2448x3264, 1.84M)

Now those are some margins.

>484 GB/s transfer speed

What's in that obviously shopped out area in the lower left?

It'll actually be APUs.

This is so 2011 it hurts

>Game streaming
I don't have an ounce of faith in this technology taking over from local hardware anytime soon.

underrated

Unless someone finds a way to transfer data faster than light, streaming would be a meme for anything but turn-based shit.

t. retard who has never tried it

pretty sure it's win 10
it's for GAMES

10.7 terafakes by hbmeme Vega 56
I just wonder how much the electrical bill will be and how much cooling it will require

Hey, at least AMD found someone to buy these hot turds

>Let's degrade the video quality of my entire game library and add significant input lag and unavoidable always online DRM for good measure!

Attached: brainlet 3.png (645x729, 32K)

BRUH

Attached: 1553033299788.webm (1280x720, 2.77M)

Assassin's Creed games always have the character continue with the momentum if you stop running suddenly.

Google just didn't use the required low-latency network

Attached: 1528639156507.jpg (512x548, 20K)

False shit.
On my 2080 Ti / 8700K it's an instant reaction, nothing like this hot mess

>instant reaction
Stop running and the character carries through the momentum. That is how the game's animation is made.

fuschia isn't linux

This. For 99% of people, internet latency is already okay.
Anyway, Google has already developed a library that predicts user input in order to achieve apparently zero lag; it's just up to game developers to include it.
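
Nobody outside Google has published that library, so take this as a generic sketch of the usual trick (client-side prediction plus reconciliation); every name here is made up:

# Toy client-side prediction: apply inputs locally right away, then replay the
# unacknowledged ones on top of whatever authoritative state the server sends.
pending = []           # (sequence number, input) pairs not yet acked by the server
state = {"x": 0.0}     # locally predicted game state

def apply(s, inp):
    s = dict(s)
    s["x"] += inp["dx"]
    return s

def on_local_input(inp, seq):
    global state
    pending.append((seq, inp))
    state = apply(state, inp)      # no round trip: predict immediately

def on_server_update(server_state, last_acked_seq):
    global state, pending
    pending = [(s, i) for s, i in pending if s > last_acked_seq]
    state = server_state           # snap to the authoritative state...
    for _, inp in pending:
        state = apply(state, inp)  # ...then replay what the server hasn't seen yet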

>google already developed a library that predicts user input
If true, another awful idea.
It's just a cascade of dumb from these people.

*fuchsia

>google already developed a library that predicts user input
yeah so what happens if your user makes an unpredicted input?

I might as well watch a streamer for free on YT at this point.

You get killed in game. Or your cursor moves irregularly when it needs finer input. Or you get annoyed, throw the controller across the room, and walk away because you've had enough of the endless compromises it takes to make a product that doesn't quite come together work.

Attached: 1407350301216.png (1600x900, 906K)

it doesn't. human reaction times are limited and predictable, snowflake

This. What's it doing that Streamed or PS Now aren't? More memes? Easier to access from YouTube? Fuck that.

Why have a controller then?
Let the software do all the magic and give us actual zero-lag input; we'll just watch as the machine calculates all the inputs I want to make

>Press A 2000 times
>Computer thinks you will press it 2001 times
>You press B instead
>"Google Stadia has detected a problem and will now close"

Remember, this is about convenience.

>input delay

>google shills already at full damage control
lmao

Attached: 1416275044669.jpg (1600x900, 656K)

is that fast enough for mpv?

Imagine how poorly optimized games will become when they start running on data centers

watch the fucking keynote you retard

Yeah but google

OnLive did it with great success literally 7 years ago, I'm sure Google can do it now

Imagine being this obnoxious

>Onlive did it with great success literally 7 years ago I'm sure Google can do it now
>with great success
>onlive.com/
>'After five years of uninterrupted service, the OnLive Game Service comes to an end.'

Attached: kek sopranos.gif (322x178, 2.56M)

Google is fully capable of creating failures. Remember Google Glass? No one does, because it failed so hard. And if you think it can't fail because they're hyping it up, remember Google+? They even tried to force it on people and it still failed hard. In fact, Google+ made the same mistake this is making, which is thinking they can just show up and capture the entire market, because we're Google, motherfucker. Being Google-integrated isn't enough to make something a success.

Well aware, but if there's a company that'll cause widespread adoption of a technology it'll be google

HOW WILL NVIDIOTS EVER RECOVER?

The 2080 Ti is more like 15 to 15.5 TFLOPS because of GPU Boost, which automatically applies an overclock.

Nvidia has had the compute advantage since Pascal. GP102 did the same thing, and the 1080 Ti was only a cut-down model. Turing widened it further: not only much better throughput, but also dedicated half- and quarter-precision cores that can be used in parallel with the CUDA cores. Yes, that's right, tensor cores are just a big block of low-precision cores that adds 8x the throughput for low precision.
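
Same shaders x 2 x clock math as the Vega numbers above; the ~1.75 GHz sustained boost clock is an assumption, actual cards vary:

print(4352 * 2 * 1.545 / 1000)  # 2080 Ti at the reference boost clock   -> ~13.4 TFLOPS
print(4352 * 2 * 1.75 / 1000)   # at a typical sustained GPU Boost clock -> ~15.2 TFLOPS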

Well, it's confirmed that AMD got HBM working for both CPU and GPU then. Even quad-channel DDR4 would be barely over 100 GB/s.

Attached: .png (217x161, 6K)

And now you know why it would be "custom": it's Vega 56 with Vega 64 memory bandwidth, and probably some slight modifications to clocks. Not a WX 8200, since that's 512 GB/s.

Attached: .png (1074x733, 96K)
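
The bandwidth math is consistent with that, at least as a back-of-envelope (stack counts and pin speeds assumed from the shipping cards):

# bandwidth (GB/s) = bus width in bits * transfer rate in GT/s / 8
def gbs(bus_bits, gtps):
    return bus_bits * gtps / 8

print(gbs(2048, 1.89))   # two HBM2 stacks at Vega 64 pin speed -> ~484 GB/s
print(gbs(2048, 1.60))   # same stacks at stock Vega 56 speed   -> ~410 GB/s
print(gbs(2048, 2.00))   # WX 8200                              -> 512 GB/s
print(gbs(4 * 64, 3.2))  # quad-channel DDR4-3200               -> ~102 GB/s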

Attached: 1548874668767.gif (650x705, 38K)

>here's all this nice and flashy shit without a single explanation of how it's going to work or who is going to finance it

but youtube is already the youtube of games...