N A V I

Stop believing the fucking rumors and get ready for the first 20 BILLION Transistor GPU.

Attached: tismtismdtimismitmistmisiashdghasdg.png (1098x716, 20K)

can't wait for 500W power draw

Who cares? Electricity is cheap

>500Watt

??????????????

Attached: muhwatts.jpg (702x258, 20K)

Heat death can't come soon enough.

>THIS time we will get GCN right

Attached: 1491240894652.jpg (285x316, 7K)

>Burning furnace is nothing
>needing 10000000 watt gold rated PSU is nothing.
>AYYYYYYmd is the best.
Just...Just stop dude.

>Just...Just
Very soi sentence.

JUST WAIT

No one is implying they will get "GCN" right; GCN is by design a compute-first architecture. The difference this time is that Navi is going to be on "7nm" vs Turing at "12nm".

PS: Would you say no to RTX 2080 performance for ~$200 less because of a 20-25 watt difference?

Says the AMD fanboy.

Hopefully it'll be decent. I'd like to upgrade my display, and this RX570 4GB isn't going to cut it.

Nvidia a shit, only gayman windows faggots use nvidia.

>372 watts from the gpu only
That's like 5 euros for 20 minutes of gameplay at full load here

If you weren't a retarded newfag, you'd notice that's my first post in the thread and it had nothing to do with AMD. Are your parents this stupid? Is that why you turned out like this?

Are you a glow in the dark nigger or just a regular low IQ nigger? It says total system power you daft cunt.

Attached: 1556983706258.png (500x522, 126K)

Not holding out hope for it being good, but holding out hope for its price/performance.

holy fuckin shit
my entire system runs on less power

Yep, accusing others of being newfags is such a newfag move.
>XD XD XD XD I insulted your intelligence I am so witty and funny I watch Rick and Morty.
Such a sad life you lead, user.

I'm glad you've realized your error, son of morons.

The PS5 pulls 150W under full load, and the IGP component has performance on par with Vega 64. They have a ~125W design delivering that perf. It is ahead of the VII in perf at equal clocks by a marginal, single-digit amount.

That's all you're getting.

I am glad I did, Holder of higher intelligence and watcher of Rick and Morty.

>no u!
cringe

stop replying to yourself

Ok.

Attached: Annotation 2019-05-09 024144.jpg (621x176, 24K)

I don't understand why everybody believes adoredtv about navi but not zen 2. If he's wrong about one why wouldn't he be wrong about the other?

yeah, it's crazy how much the 2080 consumes, imagine if it was a 2080 Ti, woah don't wanna think about that one.

People here are fucking retarded. Adored has no sources, he just throws out blanket speculation while pretending his asspulls are sourced.

What did he say about Zentwo? TL;DW

>PS Would you say no to RTX 2080 performance for 200$~ less because of 20-25 Watts difference?
Who would? But considering the lackluster Radeon VII, I won't hold my breath.

lmfao riiight. Get fucked

only poor 3rd world people care about electricity price

AYYMD cards currently have less than 50% of Nvidia's perf/W, and considering the 300W disaster the R7 was, I personally have no expectation of Navi being any less power hungry.

Understandable, but Vega GPUs came out almost 2 years ago, and Vega VII is a 60CU part (3840 SPs / 240 TMUs / 64 ROPs) that wasn't designed for 7nm in the first place.

refer to if you can read

I don't get where the Navi hype comes from. Everybody, even AMD shills, has been saying for years that Navi will just be OK.

Well, I think the biggest reason people are somewhat excited about Navi is that, for the first time in a very long time, AMD's GPUs will be on a better node. I don't personally think AMD is going to leapfrog Nvidia in gaming performance or perf/watt, since GCN just wasn't built for that, but I do think that for the first time in a long while they will actually have competitive products.

Not in my shithole cunt.

For a couple hours a day ....

wot are your specs

Attached: 107062.png (678x400, 26K)

Fake things

I swear to God I see the same shit posted here about every AMD device.

Clearly this fanboi loves the soi

*offers better performance per watt than intel*

Attached: 11157-ryzen-7-pib-left-facing-1260x709.png (1260x709, 393K)

I have an 1800X, a 1700X, and a 2600K. All good. I'm just saying the fucking shit posted here is bullshit and repetitive.

There's a reason why rtx2000s are so thicc
>performance per watt
Nobody gives a shit about this metric outside of data centers

RTG marketing team is a one trick pony

>>>performance per watt
>Nobody gives a shit about this metric outside of data centers
Then why do people even mention AMD graphics card power usage?

Let's assume 0.50 EUR per kWh because you live in Neu-Weimar, Germany. That means about 50 cents per three hours of full load, which can only be achieved with a compute task. Normal people pay 0.05-0.10 EUR per kWh, which is 5-10 cents for three hours.
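The math above is trivial to sanity-check yourself. A quick sketch (the 372W figure is the total-system number from the screenshot earlier in the thread; the tariffs are the illustrative ones from this post):

```python
def energy_cost_eur(power_watts, hours, price_per_kwh):
    """Cost of running a load for a given time at a given tariff."""
    kwh = power_watts / 1000 * hours
    return kwh * price_per_kwh

# 372 W system at full load for 3 hours:
worst_case = energy_cost_eur(372, 3, 0.50)  # pessimistic 0.50 EUR/kWh tariff
typical = energy_cost_eur(372, 3, 0.10)     # more typical 0.10 EUR/kWh tariff

print(f"{worst_case:.2f} EUR")  # ~0.56 EUR
print(f"{typical:.2f} EUR")     # ~0.11 EUR
```

Even at meme German electricity prices, a 10-20W difference between cards amounts to fractions of a cent per gaming session.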

>GTX 480 never happened

I'm more surprised the 2080 is only 10W lower; there is literally no point in buying a 2080 then.

memezen 1500x, rx470, one 1TB SSD

Vega at 7nm wasn't much of an improvement, so I expect RX 590 + 10 at 250 watts.

Cheap doesn't mean jack shit when there is a finite supply of it. Electric bills will skyrocket over the next few decades if we don't get our energy usage under control.

at this point I don't even know what i would use more power for.

I sold my 1080 Ti and bought a 2060 because I don't play anything that needs the horsepower, even at 3440x1440. I just play Rocket League and Counter-Strike. When are good games coming out?

>bought two expensive graphics cards because games don't require them
Good goy! The more you buy the more you save.

Why did you sell a good gpu for a bad one user?

Vega is a 2-year-old uArch designed for 14nm.

Because GCN can't scale past 56 CUs ;)

The limit is 64CUs

That's not entirely true; there are a lot of issues with GCN's architecture, like how the last-level cache is implemented and how the ROPs are linked to the rasterizers, plus GCN's low pixel fillrate vs pretty much everything else. You can calculate it as ROPs * core clock, which is one of the reasons AMD worked pretty hard to increase clock rates on Vega.
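The ROPs * core clock estimate above can be sketched like this. The ROP counts and clock speeds below are ballpark reference boost figures used purely for illustration; real sustained clocks vary with cooling and board power:

```python
def peak_pixel_fillrate_gps(rops, core_clock_mhz):
    """Theoretical peak pixel fillrate in gigapixels/s: ROPs * clock."""
    return rops * core_clock_mhz / 1000

# Illustrative reference boost clocks (MHz), not sustained in-game clocks.
cards = {
    "Vega 64   (64 ROPs @ ~1546 MHz)": (64, 1546),
    "Radeon VII (64 ROPs @ ~1750 MHz)": (64, 1750),
    "RTX 2080  (64 ROPs @ ~1710 MHz)": (64, 1710),
}
for name, (rops, mhz) in cards.items():
    print(f"{name}: {peak_pixel_fillrate_gps(rops, mhz):.1f} GP/s")
```

This is why AMD chased clocks so hard on Vega: with the ROP count capped at 64, frequency is the only lever left on pixel fillrate.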

Not true; they can increase the number of CUs quite easily. However, they'd still be limited to 64 ROPs, which is not good for gaming.

>50c per kwh in germany
nigger its less than 30c everywhere not sure what kind of justed place you live at

>The difference this time is Navi is going to be on "7nm" vs Turing at "14/16nm".
What are you smoking? R VII already shows what GCN can do at 7nm. It's rather underwhelming.

just undervolt tm.

thanks jim

Just...Just kill yourself.

> who cares?
your PSU and mobo

If a game is built with shit tons of shaders where the R7 can stretch its legs, the card works like a fucking dream. The problem with the R7 isn't GCN, it's the "universal" approach to its design; a one-model-fits-all solution is expensive and not necessary.

Calling the 2080 and R7 underwhelming is pushing it a little too far. Especially the R7: it's 15% cheaper for 3% less performance than the 2080.
For me the R7 is the only affordable 4K card right now; if I were forced to buy a 4K monitor, I'd get it.

they had to drop those failed Instinct chips somewhere
some money is better than no money.

All AMD soft, fat nerds
see
>pic related when AMD fans get triggered

Attached: 99AA7FF4-FE65-4F17-B3B4-44C963CE8EC2.gif (276x260, 1.08M)

Doubt the GCN approach is going to change at all with Navi though; GCN will be GCN. And by underwhelming... well, yeah, objectively it's not a bad GPU, but it's on 7nm and yet it still eats more power while being slower than novideo. GCN is awfully inefficient, and that matters more than some AMD fanboys seem to think.

And? No matter how much the engineering team tries, the software side will bring it down as usual.

>wattman has just crashed, again

are you a real person?

>the first 20 BILLION Transistor GPU.
Isn't V100 ~21B xtors?

>Doubt GCN approach is gonna change anyhow with Navi though
Going from single-issue to something completely different (and probably completely bogus) would be exactly that kind of change.

People noticed Kepler+-style register reuse stuff in recent LLVM patches, but I’ve heard nothing about multi-issue, DIMD, etc. yet.

Neither gives a shit if the PSU is efficient, as long as it can handle the output, you dumb cunt.

>when are good games coming out
sekiro says hello

anandtech.com/show/11367/nvidia-volta-unveiled-gv100-gpu-and-tesla-v100-accelerator-announced

>May 10, 2017
>NVIDIA is genuinely building the biggest GPU they can get away with: 21.1 billion transistors

Nvidia did it 2 years ago

>Who cares? Electricity is cheap
climate change and my ears are not

That's Maxwell-style.

>offers better performance
>per watt
Intel only needs the first 3 words.

>climate change
Yes, that 10W difference will doom us all.

Not an argument.

>it's yet another JUST WAIT™ AMD GPU product
It's all so tiresome. I say this as a Ryzen owner. Nvidia is pricey as all hell, but they're the ones running the show; the 2080 Ti is untouched by any AMD offering.

He was replying to a post about power consumption, you dipshit.

The 2080 Ti is just a scam: a cut-down TU102 priced like previous Titans.

HHHHHAAAAAAAAAAAAAAAAAAAAAAAAAAAA yeah okay buddy, performance is good no doubt but the price is just retarded.

>2080Ti is untouched
that's the correct word here, because nobody is buying it.

Are Zentoo and Navi going to be good? I mean, Zen is already pretty good, but will Zen 2 poo poo on Intel?

Only weak environment friendly alarmist sois like you care about power draw, nvidiot. Stop being cheap and buy a proper PSU.

>mentions the 2080ti scam
Ngreedia drones, everyone. Keep bootlicking.

60 fps lock

x700,000,000 people is 7GW m8

Nvidia hasn't released a housefire since Fermi. Change my mind.

>the IGP component has performance on par with Vega64
[citation needed]
I wouldn't be surprised if it's equal to a Vega 56 at this rate.

Will the RX 580 beat the 2070?

Nvidia hasn't released a card with 3.5GB since one gen ago.

>there are 700mil active pc gamers
if only

inb4 they do and it sparks another coinfag bubble because GCN was never about gaming