How about you make it 8gb and sell it for $550

How about you make it 8GB and sell it for $550.
What were they thinking, selling it for $699 with 16GB?

Attached: 15314.jpg (678x402, 33K)


What's the point? You're gonna need those 16GB one day. Might as well buy it now. You aren't poor, are you?

Because MOAR RAMZ AT DA SAME PWICE is the only thing AMD has ever had over Nvidia.

What's your excuse for not owning a 30 TB SSD? Do you live in the third world or something?

They want to corner the market of content creators who occasionally game.

It's not a consumer card. I think it's silly they presented it as one.

it's the same meme as the Fury X

don't fall for it

>I-IT'S FUTURE PROOF GOYS

Attached: 1497642804319.png (802x799, 41K)

I was thinking the same. It'd be a lot more competitive if they reduced the VRAM and the price while keeping 2080 performance.

Did they think they could get away with selling a card at $850 with 32GB??!

wait for whatever comes next from AMD

>next time
every god damn time

yeah
it's me, I'm the same guy
been doing that forever

Why? 200GE + RX 570 is more than enough

WHERE IS MY NAVI GOD DAMMIT WHERE

7
H
U
N
D
R
E
D

Attached: ff.webm (1920x800, 3M)

Same, but seriously

sure, keep that and be happy with 1080p/60

Shills are memesayers.

Don't wait. Ever. If you need an upgrade, buy the best in your budget. If you wait on AMD to catch up, you'll be waiting well past Navi for anything worthwhile. That's how badly Raja fucked up these GPUs.

And he's just going to try and do the same thing at Intel. While Nvidia sails past.

wait forever

Attached: 1518935744330.jpg (600x600, 59K)

you will need that much RAM someday, I don't care

my AMD HD 6970 2GB still runs BF1/BFV at 80+ FPS fine; wish it had more RAM.

ISV drivers or get the fuck out.

All of Vega's bottlenecks still exist.
This is just a die shrink with MASSIVE VRAM throughput for some fucking reason. Great setup with Threadripper if I'm doing virtual special effects. That's about it.

>market is begging for cards that aren't priced out the ass
>AMD in the midst of this comes out with a $700 fucking card

By the time we actually need 16GB, this card will be useless.

flagships tend to generate hype better
that's of course assuming the midrange comes later

>poorfag general
>spending $700 every 3 - 5 years is outrageous
>poorfags
you people really are the trash of this planet

>why don't they make the top end card into a middle tier card and sell it cheaper
What do you think they'll do? Duh

>flagships tend to generate hype better
Their flagship only matches the 2080, not the Ti or the Titan. The only way they can force Nvidia to really step it up is to outperform its bleeding-edge cards for a fraction of the cost, just like they did to Intel in the HEDT market.

Making it 12GB and selling it for $599 would've been the sweet spot.

This, but unironically.
Like Vega, Radeon VII will age like wine for 4K.
But too bad there is no 7nm option for 1080p 144+Hz.
Will wait for summer, I guess.
Also, this card could be amazing if ROCm wasn't shit.

that's marketing to the consumer
while you're right, it's not about Nvidia

AMD APES BTFO

Attached: 30A6AF4A-7027-4DD5-9A63-2ADD24F1ECC3.jpg (566x700, 106K)

putin would never side with nvidia

Based
Cringe

I thought the Fury X was amongst the finest of wines?

What's silly is not making an 8 GB model that would be a genuine consumer competitor.

>AMD
>thinking
they are as dumb as their consumers

Not everything is for you gamer kiddos. This is a fine workstation card.

Lol, no Navi, no GDDR6.
It's Vega Frontier all over again.

So the "VII" is this new AMD series name right?
What the other version will be called?
Radeon "VII 64", Radeon "VII 56"?

Because that would halve the memory bandwidth and make it behave exactly like the Vega 56, which already exists and goes for sub-$400.

This.
Where is Navi? Sick of these $500-1000 flagships that are basically double the price for 30% more perf, if that.
Now AMD is going full retard on HBM again instead of making something affordable.

they should have slapped a blue cover on it and flogged it as Radeon Pro desu

the extra memory makes more sense there, and people will kvetch less over game performance

>7nm

why is it 300000000nm long?

Navi is still in the lab
youtube.com/watch?v=rf_3pn7rOfc

PS5/Xbox Two launch in a year or so, so that's probably why.
Big Navi might not even come out anyway, like big Polaris (the Xbox One X GPU).

The absolute worst part of AMD is that they keep changing their fucking naming conventions.
JUST PICK ONE AND STICK WITH IT

that probably hurt their marketing, but I'm pretty sure HBM parts weren't supposed to be normal GPU generations, which would be the reason

>What were they thinking, selling it for $699 with 16GB?
That it's $100 cheaper than the 2080 while completely BTFO'ing it on both performance and efficiency?

b-b-b-but muh raytracing

>cheaper
2080 is $700 and lower
>BTFO'ing it out of the water both on performance and efficiency
Barely edges it in AMD-cherry-picked games, at 300W

Why do you drones so blatantly lie?

Why are nviditards so poor they can't afford the power for a 300W card?

There won't be just a single version, for sure. I expect some defective chips with 8GB and 2070 performance for $550.

will they stop supporting it with drivers like they did with the 6990?

and the showcased card looks like a full aluminum premium model similar to the limited edition of Vega 64
I could see a plastic shroud variant with a blower cooler and extruded aluminum heatsink coming out

Hey remember when I (and others) said that we were in the hype part of the AMD cycle? Remember how rabid AMDrones were over a rumour, hoping to finally for once have something of value?

Well well well, looks like the cycle is yet unbroken. This has been going on for what, 20 years now?

Attached: 1447322207974.png (364x322, 14K)

Kys. I have a Fury X and it's still holding up today. Factory liquid-cooled as well

koduri is based poojeet

60 CUs for $699. Maybe 50 CUs for $500?
It has 128 ROPs though; performance in games might be just fine, since Vega (GCN) has been quite bottlenecked in that department.

Attached: vega.png (662x633, 41K)
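
Napkin math on those CU counts, assuming GCN's usual 64 shaders per CU and a guessed ~1.8GHz boost clock (the clock and the 50 CU SKU are assumptions, nothing confirmed):

# rough FP32 throughput for a GCN part; clock is a guess
def tflops(cus, clock_ghz, shaders_per_cu=64, flops_per_shader=2):  # 2 = FMA
    return cus * shaders_per_cu * flops_per_shader * clock_ghz / 1000

print(tflops(60, 1.8))  # ~13.8 TFLOPS, the 60 CU part
print(tflops(64, 1.8))  # ~14.7 TFLOPS, a full 64 CU die
print(tflops(50, 1.8))  # ~11.5 TFLOPS, a hypothetical 50 CU cut

The full die is only ~7% more raw compute at the same clock, and a 50 CU cut only gives up ~17%, so the pricing gaps would be bigger than the compute gaps.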

>300 WATT

Vega was severely bottlenecked by its own memory bandwidth. Vega would have been great had AMD designed a better compression algorithm.

There is literally nothing wrong with the Fury X.

Because those idiots in the AMD marketing department are trying to sell an enterprise/professional card to normal consumers.

>AMD marketing department
Once again the based marketing department will give us a great product, like Bulldozer

I'm going to bet you'll be able to get it to 150-200W and keep it at about stock levels of performance, which would be the usual behavior of HBM GCN cards.

because it has 4 HBM stacks instead of 2, for double the memory bandwidth
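
For reference, the stack math, using HBM2's known 1024-bit interface per stack (pin rates below are the advertised Vega 64 and Radeon VII figures):

# HBM2 bandwidth = stacks * 1024 bits/stack * pin rate (Gbps) / 8 bits per byte
def hbm2_bw_gbs(stacks, pin_gbps):
    return stacks * 1024 * pin_gbps / 8

print(hbm2_bw_gbs(2, 1.89))  # ~484 GB/s, Vega 64
print(hbm2_bw_gbs(4, 2.0))   # 1024 GB/s, Radeon VII

And since each stack is 4GB, four stacks means 16GB whether you want it or not; you can't get the 1TB/s without also buying the VRAM.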

Wasn't this debunked?
As far as I remember, going past stock HBM clocks on the 64 did not give much increased performance. GCN is bottlenecked by its hard limit of 4 Shader Engines and its low ROP count. The new Vega has double the ROPs and double the memory bandwidth, which should help out, but 4 SEs is a big limit.
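
If the doubled-ROP figure is right, the fillrate math looks like this (the 128 ROP count is still a rumor, and the clocks are rough):

# peak pixel fillrate = ROPs * core clock (GHz), in Gpixels/s
def gpix(rops, clock_ghz):
    return rops * clock_ghz

print(gpix(64, 1.55))  # ~99 Gpix/s, Vega 64
print(gpix(128, 1.8))  # ~230 Gpix/s, if the rumor holds

That would more than double pixel throughput, which is exactly where GCN chokes at 4K, but it does nothing about the 4 SE geometry limit.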

gee, I wonder why AMD positioned the card at content creators and gamers...

tomshardware.com/reviews/amd-radeon-vega-frontier-edition-16gb,5128.html

hmmm

>Not even 64CU
>AYYMD gives you a 60 CU, 375W, 2x 8-pin HOUSEFIRES throwaway garbage that they couldn't sell as Instinct

TOP KEK, AYYMDPOORFAGS

64 CUs literally don't do anything for your gaming purposes

that's not the flagship, user; the R VII has only 60 CUs, which is basically an MI50 card
we know AMD has an MI60 card that gives 4 more CUs and around 9 to 11% more performance, which is well within 2080 Ti territory

Not even close, 2080Ti is much faster than 2080

on average it's 11 to 13%, depending on the reviewer
techpowerup.com/reviews/EVGA/GeForce_RTX_2080_Ti_XC_Ultra/33.html

It's a 4K card, not a 1440p card

now now, don't go that far...
it can barely keep up 60 FPS in MANY games at 4K

I know you MAD that AYYMD is charging $699 for an underwhelming, poor-performance card with no real-time raytracing, but don't try to post fake news

RTX 2080 Ti is the 4K card, period

AYYMD HOUSEFIRES has nothing to compete with Turing

>literally double vram
>b-but aliasing is all you need. Why bother with actually rendering moar pixels?
>waah, this is completely cherry picked! Unlike the games nviddia showcases that are specifically optimised for their cards

I hope it's called Navi-whatever; I really like the branding with the star names: Polaris, Vega, Navi

>(((4k))) with dlss
>vs actual 4k
Thank leather jacket man! Don't forget to turn off rtx for moar fps!

Geometry bottleneck. My 780 Ti from 2013 has about the same geometry throughput per clock as Vega.

You think there won't be a cut-down card eventually?

the 2060 happened, there will be a cheaper 7nm card too

It doesn't make it easy to tell what fits which segment, which is the main issue

I'd much rather have different names for every real generation than stupid number increments that don't really mean anything and then suddenly make stupid jumps (980 - 1080 - 2080?)

it never will be. a card that can barely hit 60 FPS at 4K in many games isn't a 4K card
a card that uses temporal upscaling to boost performance while making the textures shit isn't a 4K card

you just mad cause you are out of arguments

Nvidiafags don't care about actual performance. They only want the numbers other people tell them are necessary. Place a 4K monitor and a 1440p monitor side by side and tell him both are running at 4K with the help of a Titan V, and you'll see him gleefully "wow"-ing at monitors that are actually 1080p with moar sharpening and anti-aliasing.

This.
GCN is bottlenecked anyway; they could do 128 CU monsters, but they'd be useless for gaming because of sub-2GHz core clocks.

This, why don't retards realize this

if you mean games, then probably
if you mean productivity, just look at Vega FE benchmarks and judge for yourself; 16GB is already the limit

They will only make as many as they have defective down-binned Vega 20 enterprise chips, which is probably not a lot, so there's no point in selling a cheaper version.

There's no pressure on Intel to deliver on their GPUs. They can call Arctic Sound a tech demo and no one will give them flak since they aren't a GPU company.

There is pressure on them to deliver 10nm desktop and server parts. If they only cough up laptop parts in Q4 2019, they're pretty much fucked.

It's already the case; I use 7GB+ in Shadow of the Tomb Raider at 1080p with ultra settings.
Metro will be the same, and Doom 2 will 100% need at least 10GB on ultra with 1080p textures, and maybe even more at 4K.
10-12GB will be the absolute minimum to play 1080p games on ultra in 2019.
HBM2 has higher memory bandwidth than GDDR6; you can do a full memory sweep in one frame (1TB/s / 16GB ≈ 64Hz), so you should expect lower frame latency too.
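
That last figure checks out arithmetically, assuming the full 1TB/s peak were actually sustainable (real workloads won't get close):

# how many times per second the entire VRAM could theoretically be swept
peak_bw_gbs = 1024  # Radeon VII peak bandwidth, GB/s
vram_gb = 16
print(peak_bw_gbs / vram_gb)  # 64.0 full-VRAM sweeps per second

Flip side: at the same bandwidth an 8GB card sweeps its whole VRAM 128 times a second, so doubling VRAM at fixed bandwidth halves that number.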

How much would 8GB vs 16GB actually reduce the price, though? People are reporting it costs $750 to build this card and they're selling it at a loss.

It makes too much sense to use 8GB and sell it for $550, if it would actually reduce the price by at least $50. I would expect it to reduce the cost by that much or more, considering memory prices in general. So there must be a reason. It would have been much better received.

We must be missing something. How could they do something as obviously stupid as this?

HBM2 and 7nm may cost a lot to produce

When/where will this be available to pre-order? I bet they sell out in a day and are never made available again.

that doesn't really explain why they didn't make the obvious move of 8GB for $650; it would have been well received

> How about you make it 8GB and sell it for $550.
But you already have the Vega 64.

I'm already topping out the 8GB of my 1070 in Squad. And while 16GB is a bit excessive and I can't see it being fully utilized by any game in the next ~3 years, we do need more than 8. Preferably 10-12, but I don't know what sizes the HBM modules come in.

>4096 bit
holy shit who would actually need that

if you think Intel will release any GPU for games, you are mistaken

You think new games won't need 16GB for 1080p in the next five years?

>with MASSIVE VRAM throughput for some fucking reason
But user, didn't people say Polaris suffered from memory bandwidth starvation? Vega is the same GCN with bells and whistles.

Because it's just an Instinct MI50 with a new name.
The question should be: why did they release it at all, and what does that mean for Navi?