Where are the leaks?!?!!?!?

Where are the leaks?!?!!?!?
I can't take it anymore!

Attached: 8892_05_amd-radeon-vii-unboxed-tease.jpg (1920x1082, 223K)

Attached: 613.jpg (657x808, 123K)

Attached: 1652521.jpg (875x1095, 132K)

>2080 beats it in almost every game for less than $100 more
AMDrones BTFO

The 2080 and the VII are the same price, what are you talking about?

100% more (and faster) RAM
$150 less
ez buy

These are AMD's own benchmarks, performed on a 7700K. Wait for the NDA to lift.

also lol power consumption most likely

Interesting BF5 numbers, although they make no sense compared to BF1, which is practically the same engine.

3% for $150 more
Not worth it.

Attached: 1e8.jpg (165x115, 5K)

NOOOOOOO

Attached: 1548610770161.png (396x408, 165K)

>Nvidia card winning in DOOM and Wolfenstein

Attached: 1499579407221.png (561x431, 53K)

then put a diaper on your /v/faggot

t.diy shitposter

Nvidia's new GPUs are pretty much the same as AMD's now when it comes to doing compute-heavy stuff.

Radeon VII looks like it will be a good upgrade from my GTX 1060 6GB. Almost bought the RTX 2060, but I've had enough of Nvidia's jewery.

>same price as 2080
>same performance as 2080

zzz

>more money from scalping AMD plebs
ez buy

>Double the VRAM
>Much better compute performance if you do any "prosumer" workloads
>Better drivers/software suite
>Almost guaranteed to beat it by 10% in a year after real drivers come out
>Heavy undervolting potential, like the Vega 56/64

Attached: 5.png (1280x720, 1.06M)

The whole basis of AMD's "fine wine" drivers is the fact they've been using GCN for the better part of a decade. It's going to be funny once they finally come out with a new architecture and start releasing broken messes again.

reviews going up in 1 hour?

Nah, maybe 8 EST

No it's not. Engines change massively in 2 years: BF5 jitters the distance behind its TAA while BF1 is just static, and BF5 uses the Battlefront engine while BF1 uses a different fork. They are totally different.
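
(For anyone lost: "jitters" here means TAA offsets the camera projection by a sub-pixel amount every frame, usually drawn from a low-discrepancy sequence like Halton, then blends the history. A rough sketch of what that offset generation looks like; not claiming Frostbite does exactly this:

[code]
/* sub-pixel TAA jitter from the Halton sequence (bases 2 and 3).
   Illustrative only; real engines differ in cycle length and scaling. */
#include <stdio.h>

static double halton(int index, int base) {
    /* radical inverse of `index` in `base`, result in [0, 1) */
    double f = 1.0, r = 0.0;
    while (index > 0) {
        f /= base;
        r += f * (index % base);
        index /= base;
    }
    return r;
}

int main(void) {
    for (int frame = 0; frame < 8; frame++) {
        int i = frame % 8 + 1;            /* 8-sample cycle, 1-based */
        double jx = halton(i, 2) - 0.5;   /* offset in pixels */
        double jy = halton(i, 3) - 0.5;
        /* an engine would scale these by 2/width and 2/height and add
           them to the projection matrix, so each frame samples the
           scene at a slightly different sub-pixel position */
        printf("frame %d: jitter (%+.4f, %+.4f) px\n", frame, jx, jy);
    }
    return 0;
}
[/code]

Under a scheme like that a static shot converges to a supersampled result over a few frames; with no jitter, as the post above says BF1 does, you get neither that benefit nor the smearing.)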

Though AMD using a 7700K (4 cores) is weird; I expect that on a 6/8-core the Nvidia card races ahead.

I unironically want a 7350K (2 cores) or 9350K (4 cores), so I don't care; multicore is a waste of heat and money.

GAS THE KIKES RACRE WAR NOW

Attached: images (41).jpg (454x324, 12K)

>get hyped for new gpu release
>remember gaming is dead and the last AAA video game even remotely worth playing came out almost 4 years ago

it's all so tiresome

Attached: 1423898111473.jpg (324x322, 41K)

>nvidia slightly faster for $100 more
>AMD card has Linux support out of the box
>cheap high-performance FreeSync monitors
>AMD CPUs not affected by Meltdown patch bullshit
WHAT EVER WILL I DO

Will it age better than the 2080 due to the massive memory size and bandwidth?

It's a better card: better thermals and design; it's about buying peak-performance hardware. Consider that gaymd is on 7nm and still doing this badly. They are incompetent as shit in the GPU department.

What a boring nigger. Off yourself, you didn't even play Doom.

i played doom when you were in your dad's balls zoomer

Beautiful. Just like the Vega 56 was a better idea than a 1080 (at least a few months after release), this will be a better idea than a 2080 in a year's time too.

Attached: 1507592883839.gif (250x252, 2.96M)

Explain why Vega took so long to get good then?

it didn't, it just had a fucked-up launch; due to the mining craze the cards were overpriced as fuck

took a year before it got to msrp

All you had to do was wait
Just wait

Attached: Happy Sugar Life 11 - 00.22.04.jpg (1920x1080, 134K)

No, Vega performance in benchmarks has gone way up since its launch. Nothing to do with the mining craze or MSRP, it's the drivers.

>took a year before it got to msrp
It was MSRP here at the start. The mining craze was already dying out here at its launch. You can get a (good 3rd-party) 56 for €300 used now, or a reference-PCB one for €380 new.

gib SR-IOV pls

These are AMD's own benchmarks, like they did with the Fury X; it will be worse. Pic related

Attached: AMD-Radeon-R9-Fury-X-vs-GTX-980-Ti-4K-1181x1200.png (1181x1200, 315K)

all I am doing with amd is waiting.
seriously, I want to buy amd stuff, I really do; nvidia needs to lose their monopoly to dampen this overpriced craziness.

The thing is, amd still can't recover from their image of shitty drivers, shitty performance and bad stability.

That's all we can do. I mean, we're already waiting for Navi, then we'll have to wait for a high-performance Navi; the wait continues.
I have retardedly high hopes for Vega / Radeon VII though, that Fine Wine™ will keep it relevant for a few years.

Attached: Happy Sugar Life 04 - 00.11.53.jpg (1920x1080, 162K)

This but unironically.
No games worth playing. And by 4 years you mean like 10 years lol.
(Mirror's Edge)

what exactly do you need?
an rx580 is beyond stable, has better drivers than nvidia and better performance for a lower price, is easily the king of 1080p, and can even hit 60fps in plenty of 1440p titles
95% of people don't have a 4k or 1440p monitor. so.

Oh god my body is ready for the AMDrone damage control parties.

Attached: pepepe - Copy.gif (499x499, 46K)

> better performance
compared to their "mid" range, maybe.
I want to hit a consistent 100+ fps at 1440p (high-ultra) and still be able to use my desktop for work (coding/compiling/concurrent workloads); that would be the sweet spot for me. Even with the new Radeon VII that would cost an easy 2k euro right now.
But if I'm spending that much money it had better handle future games as well, not barely hit 100 fps in current-gen games.

tbfh the RX 580 and GTX 1060 are within margin of error in performance, and the RX 580 consumes a lot more power

Attached: gpuperformance122018.png (1143x1005, 520K)

>For this test, we measure the power consumption of only the graphics card via the PCI-Express power connector(s) and PCI-Express bus slot. A Keithley Integra 2700 digital multimeter with 6.5-digit resolution is used for all measurements. Again, these values only reflect the card's power consumption as measured at its DC inputs, not that of the whole system.
Find a performance-per-watt chart that uses total system power instead, as Nvidia cards don't have hardware schedulers and offload scheduling to the CPU, causing it to draw more power.
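
If you want to sanity-check the CPU-side cost yourself on Linux, the package energy counters are right there in sysfs (Intel RAPL). Crude sketch, assuming /sys/class/powercap/intel-rapl:0/energy_uj exists and is readable on your box (newer kernels may want root):

[code]
/* average CPU package power over a short window via Intel RAPL sysfs.
   Run once idle and once while gaming; the delta is the CPU-side cost. */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

static long long read_energy_uj(void) {
    FILE *f = fopen("/sys/class/powercap/intel-rapl:0/energy_uj", "r");
    long long uj = -1;
    if (!f || fscanf(f, "%lld", &uj) != 1) { perror("rapl"); exit(1); }
    fclose(f);
    return uj;
}

int main(void) {
    long long before = read_energy_uj();
    sleep(10);                  /* keep the window short: the counter wraps */
    long long after = read_energy_uj();
    /* energy_uj is microjoules, so uJ / s / 1e6 = watts */
    printf("avg package power: %.2f W\n", (after - before) / 10.0 / 1e6);
    return 0;
}
[/code]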

>muh power consumption
Unless you live in a third world shithole, that is entirely irrelevant. At least say heat production if you meant that.

>At least say heat production if you meant that.

This is what increased power consumption implies. Nobody thinks about power consumption in terms of wasted electricity money lmao.

How do you think a GPU uses all that power? Kinetic motion?

moving the fans duh

Like this?
youtube.com/watch?v=u5YJsMaT_AE

Attached: AdoredTV.png (1306x839, 800K)

rx580 is usually much cheaper

the whole power thing is a fucking meme; there's a reason companies charge extra to let you pay something off over time. people on a budget would far rather buy something cheaper now, even if it means paying more in the long run, simply because they don't have the extra money now. usually the difference is not significant enough to even cry about. oh nooooo, $50 extra over 3-4 years
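
The "$50 extra over 3-4 years" is easy to sanity-check. Back-of-the-envelope sketch; the 60W delta, 4h/day and the tariffs are made-up illustrative numbers, plug in your own:

[code]
/* rough lifetime cost difference of a hungrier GPU; all inputs assumed */
#include <stdio.h>

int main(void) {
    double delta_w   = 60.0;  /* extra draw under load, watts (assumed)    */
    double hours_day = 4.0;   /* load hours per day (assumed)              */
    double per_kwh   = 0.30;  /* EUR/kWh, German-ish; ~0.12 USD in the US  */
    double kwh_year  = delta_w * hours_day * 365.0 / 1000.0;
    printf("%.1f kWh/year -> %.2f/year, %.2f over 4 years\n",
           kwh_year, kwh_year * per_kwh, 4.0 * kwh_year * per_kwh);
    return 0;
}
[/code]

At German prices that's roughly 105 over 4 years, at US prices roughly 40, so whether it's "a meme" genuinely depends on where you live, which is the whole argument here.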

kek'd and checked

>POWER CONSUMPTION DOESNT MATTER AAH JUST BUY AMD AND 1000W PSU GOOD GOYS

>good goys
>shilling for nvidia
you can't make this shit up lads

Attached: 44145444.jpg (474x535, 16K)

>it's okay when AyyMD does it
AMDjeet my son...

Germany ... highest cost per kWh in all of the EU ... you're a retard; the better off the country, the higher the cost. Not caring about cost regardless of your financial status is plain retarded.

Attached: 3.jpg (400x400, 30K)

kek'd, exactly - it highly depends on usage, but it's safe to assume the average degen here is 24/7 gayming and/or pegged at 100% when mining (without his knowledge - malware)

In what world is a Vega 56 a better GPU than a 1080? You are talking out of your ass.

>multicore is a waste of heat and money
2005 called. They want their Pentium 4 back.

you dumb fucking muppet

>fine wine
That just means AMD developers are drunk, it's not a good thing.

Older cards like the RX570 still don't work right.

Just look at pic related. What did they mean by this?

Attached: kernel-5.0.0-rc5-amdgpu-mystery-message-b-keumjo.png (2078x1392, 618K)

Reminder there's no point in upgrading until Cyberpunk 2077 comes out

Reminder there's no point in upgrading even after Cyberpunk 2077 comes out

Reminder the next-gen consoles will have a Vega 56-equivalent GPU, which means the poorly optimized shitports on PC will need at least a 2080 Ti to be playable.

You're better off just skipping this gen if you want an upgrade that will last into the next console cycle.

reminder: cyberpunk is a pile of feces

dude there are like 5 multicore games and only 1 of them was released in the last 3 years, and it's the biggest commercial failure of last year, BFV

>source: my anus

dude, having more than even 2 cores lowers your fps by 1-2fps; multicore is literally a waste of power for gaming. get the lowest they still sell and clock it massively high youtube.com/watch?v=pBrDKnJ6vjQ

enjoy your awful 0.1% lows and stuttering

false!

LOL!

slub means it's a RAM error

Looks like a use-after-free; that's not something that happens because the RAM is defective, it's something that happens when developers are drunk after drinking "fine" wine. It was introduced somewhere in the early 4.19 RCs. The trace is a bit different on the 4.19 and 4.20 kernels than it is on 5, but the root cause is probably the same.
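
For anyone who doesn't know the bug class: a use-after-free is touching memory after it's been freed. Minimal userspace version below; in the kernel it's the same thing with kmalloc/kfree, and the "slub" lines in the trace just mean the SLUB allocator's debugging caught the freed object being poked, not that your RAM is bad:

[code]
/* minimal use-after-free; build with `cc -fsanitize=address uaf.c`
   and ASan pinpoints the bad access (KASAN does the same in-kernel) */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    char *p = malloc(32);
    if (!p) return 1;
    strcpy(p, "fine wine");
    free(p);              /* object is dead from here on               */
    printf("%s\n", p);    /* ...but we read it anyway: use-after-free  */
    return 0;
}
[/code]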

Attached: kernel-4.20.3-mysterious-amdgpu-message-fs8.png (2983x1997, 1.25M)

>1000 watt PSU
you don't pair Vega with an i9
Retard

how the fuck has nothing leaked? i'm sure some faggot has already got his hands on one from a store

>Cyberpunk 2077
God I hope this shitpiece flops hard.
All the shills don't even realize how awful W1 was, and W2 is at most decent.

SO when the FUCK does the embargo lift?
At what time

>W1
>awful

lmao get some taste pleb

With up-to-date drivers it scores better in some games, worse in others. With a decent UV (which will actually make it faster than stock voltages allow) it's only about 30W more power-hungry under load than a non-OC'd 1080 (non-Ti), but it's a lot cheaper: 200 bucks cheaper bought new, half the price of a 1080 if used.
So I wouldn't say it's better in terms of bang per watt, but it is in terms of bang per buck, since they're on par in performance.

2 bongs and 30 bings

>sip monster energy potion
>have to cycle through 30sec cutscene
>no skip button
>sip next monster
Don't get me started on the combat mechanics

Are you sure this is not something retarded that you are doing?

You must live in a magical place then

They were double the "retail" price almost everywhere.

the cheaper cards were €600 in autumn of 2017, I just looked it up
not retail price, but not double either; not that anon though

Same 0.1% lows though

You're just pissed you're running 7 or 5 literally useless cores when you game; shut them off and build a 7-inch-square small-form-factor PC

This isn't fucking /v/, nobody gives a shit about gaming. Does it support SR-IOV or not?

How could you tell if it's working? I'm too dumb to figure this out and it's not documented, but source diving shows that the amdgpu driver itself seems to have all the bits and bobs in place for this
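
One way to check from userspace: if the device actually exposes SR-IOV, its PCI sysfs node gets sriov_totalvfs/sriov_numvfs attributes. Rough sketch; the PCI address is a placeholder, grab yours from lspci:

[code]
/* report SR-IOV capability of a PCI device via sysfs (Linux).
   0000:0b:00.0 is a placeholder address; substitute your GPU's. */
#include <stdio.h>

int main(void) {
    const char *path =
        "/sys/bus/pci/devices/0000:0b:00.0/sriov_totalvfs";
    FILE *f = fopen(path, "r");
    int total = 0;
    if (!f) {
        puts("no sriov_totalvfs node: no SR-IOV exposed on this device");
        return 1;
    }
    if (fscanf(f, "%d", &total) == 1)
        printf("SR-IOV capable: up to %d virtual functions\n", total);
    fclose(f);
    return 0;
}
[/code]

Writing a count into sriov_numvfs is what actually spawns the VFs, but on consumer Radeons that's reportedly fused off in firmware no matter what plumbing amdgpu has.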

Please be real, I will buy it despite having no reason to upgrade from a 1070.

Attached: 1479876654713.png (328x408, 83K)

>the better off the country the higher the cost
Totally burning coal just to spite you greenfags with a few freedom cents. Africa must be great since electricity is more expensive there

Attached: depositphotos_94867220-stock-photo-asian-woman-laughing-and-pointing.jpg (1023x682, 44K)