Can it reach RTX 2080 performance with its fine wine potential?

Attached: AMD-Radeon-RX-5700-XT.png (2088x1171, 1.3M)

Other urls found in this thread:

youtube.com/watch?v=215TJIZpFlQ
youtube.com/watch?v=pcGWIJW3iNU
techpowerup.com/review/nvidia-geforce-rtx-2080-ti-founders-edition/31.html
youtube.com/watch?v=sBQPGr87w6Q
youtube.com/watch?v=1nqhkDm2_Tw
research.nvidia.com/sites/default/files/pubs/2011-09_Interactive-Indirect-Illumination/GIVoxels-pg2011-authors.pdf

FineWine doesn't exist for RDNA; it was only a thing for GCN because GCN actually needed optimization.

turing will age better than rdna

Old AMD had potential due to insanely high FLOPS and memory bandwidth for the price of the cards. Navi is meh-tier compute, and handicapped by its low memory bandwidth, giving it no theoretical advantage over nVidia.

AMD has no chance to beat nVidia in a fair match-up like this when nVidia has so much control over how games are being developed.

RDNA is basically GCN with its bottlenecks removed and no driver optimization. This mother fucker right here will age like the HD7970/280X.

But I wouldn't mind a 5800XT or a 5900XT, provided they have more than 8GB of ram. Even more so, if they go for HBM/HBM2 for lower TDP as well.

Is the 5700 XT a bad buy? There's still a price gap with the 2070 Super, but the custom cards and sparse supply are shrinking the difference enough to make you consider going for Nvidia instead.

The 2070 Super is over one hundred dollars more expensive for a ~6% performance bump, on average. It's not worth considering nvidia.

To follow up on my comment

youtube.com/watch?v=215TJIZpFlQ
youtube.com/watch?v=pcGWIJW3iNU

That's assuming you can get them for MSRP. Quick look at Amazon shows the Pulse card is going for $485.

Get the Red Devil now for relatively cheap (€500) or wait for the Nitro+? It's not in stock right now and I fear the price will hike to over €500 once it's back in stock, like other stores already have it.

>Amazon shows the Pulse card is going for
Unless it's being sold by Amazon themselves, it's just a rip-off third-party seller looking to price-gouge new cards.
Just have to wait for the Amazon supply chain to fill up with stock.
No reason to even bother looking at third-party AZ sellers.
At least not when Newegg has a few companies' XTs for between $409-$449.

That's what big(ger) navi is for

In Eastern Europe 5700 XT of AIB partners that actually matter like Sapphire Pulse are at best $20 less expensive than the vast majority of AIB 2070 Super which makes 5700 XT completely pointless. Not to mention AIB 5700 XT are usually out of stock constantly.

It's a bad buy because they barely fucking work right now. Tons of people are still reporting BSODs and extreme stuttering even after the last drivers that alleviated a great deal of the issues. I would try back in like 3 months. Also, AMD is never going to fix its OpenGL drivers on Windows, so in my book that's an instant fuck them.

I wonder what they can achieve with an 80 CU Navi, but the power usage scares me.

but HBM is hot as fuck anyway

>turing
>age better
AHAHAHAHAHAHAHAHAHAHAHAHAHAHAHHAHA
*WHEEZE*
AHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

If it scales linearly and they don't use HBM it should be around 400W~ :^)
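
Napkin math, assuming the 5700 XT's 225W board power splits into roughly 180W of core and 45W of memory/board overhead (my guess, not an official breakdown), and that only the core part scales with CU count:

```python
# Rough linear-scaling estimate for a hypothetical 80 CU Navi part.
# Assumed split (not official numbers): 5700 XT board power ~225 W,
# roughly 180 W of GPU core plus ~45 W of memory/VRM/fan overhead.
CORE_W = 180       # assumed core power at 40 CUs
BOARD_W = 45       # assumed overhead that doesn't scale with CU count

def linear_power(cus, base_cus=40):
    """Naively scale core power with CU count; overhead stays flat."""
    return CORE_W * (cus / base_cus) + BOARD_W

print(linear_power(80))   # 405.0 -> right around the ~400 W guess
```

Point being, pure linear scaling already lands you around 400W before you even touch clocks or voltage.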

Attached: power-gaming-average.png (500x970, 54K)

Just gonna leave this right here...

Attached: untitled-1.png (709x909, 52K)

what does that have to do with the temperature of HBM?

Upgraded to a 5700 XT from an Nvidia GTX 770. Experienced several crashes until I did a complete windows registry sweep of all Nvidia entries, after which it has run smoothly. Even after uninstalling Nvidia's drivers, there were tons of registry settings that were left behind.

Attached: a225a8be9e1578aba41fd50d1a70a688496c29886985e3ef43a9e451c93f6bcf.png (780x825, 27K)

Doesn't DDU remove all that shit automatically?

Don't know, I uninstalled with Nvidia's own uninstaller.

>I uninstalled with Nvidia's own uninstaller.
You're not a very smart man, are you?

Attached: president-trump-dorian-map-ap-jef-190904_hpMain_12x5_992.jpg (992x415, 65K)

Power shouldn't be an issue. Navi 10 is clocked well beyond its sweet spot, like desktop Zen 1 and 2 are. If they add more cores they can just reduce the frequency and save a lot of power, like EPYC basically.

Of course that also applies to Nvidia, but it seems like Nvidia has no problem getting the 2080 Ti to hit the same Boost 4.0 clocks as the 2080 despite only using like 50W more.
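
Rough sketch of why wide-and-slow wins on perf/W, assuming the usual dynamic-power rule of thumb (power ~ f*V^2 with voltage roughly tracking frequency near the top of the curve, so core power ~ f^3). This is the principle, not a measured Navi curve:

```python
# Toy model of why "more CUs at a lower clock" wins on perf/W.
def rel_power(freq_ratio):
    return freq_ratio ** 3          # assumed: core power per CU ~ f^3

def rel_throughput(cu_ratio, freq_ratio):
    return cu_ratio * freq_ratio    # shader throughput ~ CUs * clock

# 2x the CUs at 80% of the clock:
print(rel_throughput(2.0, 0.8))     # 1.6x the throughput...
print(2.0 * rel_power(0.8))         # ...for ~1.02x the core power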

Also, why are you looking at 1080p benches on a 2019 GPU?

The whole point of these new GPUs is better 4K performance, besides some DX12 horseshit.

It is optimized GCN. RDNA will get the benefit of the driver optimizations made for all next-gen RDNA cards, just like GCN did.

These GPUs are not meant for 4k performance at all goofy

>only using like 50W more
More. The 2080 Ti can go up to 350W.

GCN only had those improvements because it was a brand new ISA and uarch. RDNA is just a modified GCN. RDNA will have very little room for improvement

We need 1000fps at 1080p before we even think about 1440p

No they can't, not unless it's specifically an AIB custom card designed to break Nvidia's TDP specification. But AIBs can do that on virtually any GPU because it's non-standard

techpowerup.com/review/nvidia-geforce-rtx-2080-ti-founders-edition/31.html

You don't get the point.
RDNA is the same ISA as GCN rearranged, true.
It means the driver optimizations for any future RDNA card will benefit all RDNA cards, the same way GCN's did.
That is the whole point of "FineWine".

What you're suggesting didn't happen. The Fury X and Vegas are getting left in the dumpster in terms of driver improvements. The 7950 stopped being relevant after the first big end-of-year drivers. Most of the improvements for a particular generation happened within a year of its launch.

Then why the fuck do some of them have 500+gbps of bandwidth?

Oh ok, let's just bottleneck the CPU

fucking retard

Yep. The 5700 has 448GB/s of memory bandwidth and this asshole thinks it's meant for 1080p.
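
For anyone wondering where the 448GB/s figure comes from, it's just the bus math (256-bit GDDR6 at 14Gbps per pin):

```python
# 5700 / 5700 XT memory bandwidth: 256-bit bus at 14 Gbps effective per pin.
bus_width_bits = 256
gbps_per_pin = 14

bandwidth_gb_s = bus_width_bits / 8 * gbps_per_pin
print(bandwidth_gb_s)   # 448.0 GB/s
```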

See what happens when you utilize bandwidth?

Go suck a turd

Attached: forza3-1.jpg (711x530, 49K)

In my country, an AIB 5700 XT is 489, a 2070 Super 499. Why not pay 10 more for raytracing and machine learning? With machine learning, nvidia will be the fine wine this gen. They have farms of GPUs running every conceivable game, and once everything they learn is compiled into a driver update, performance gains will be unbelievable.

Suck my dick, pussies. Maybe 2 or 3 of these GPUs MAX are meant for 4K gaming. The vast majority are for 1440p and 1080p gaming.

RTX fine wine

Attached: 3C91536D-0362-4DD2-9258-2DDD74BC4CEE.png (3244x1703, 134K)

help

Nice cherry

go ahead and avoid the bench dipshit

>calling that cherry picking
>ignoring

>driver update intended to increase performance in specific games
>chart shows the increases in those games across nvidia’s line-up
>this is somehow cherry picking

I think you just have sour grapes.

Why the fuck are you parroting bullshit made up by GN?

This. Fuck the nvidiots

Oh lord, you just skip the content and say what you want to say, right?

Go ahead and be a red/green fag then.

Most gaming cards can't handle 4K in modern games at 60 fps.

Having a spec or two that helps in that area isn't relevant if the end product still doesn't give the results.

No, the performance went from inexcusably dogshit to decent, all of those games had terrible performance on nvidia but they release these charts to trick retards like you

Nice nvidia marketing slides nigger

It’s really pretty simple what you do. 2070 super for around $460, you buy it. I would be willing to pay maybe $40 more for the 2070s over the best 5700 xt (red devil, nitro). If the super prices don’t come down a little more, then fuck em

I just wish AMD would release an RX 680/690.

Possibly, but definitely not on game/engine utilizing nvidia tech. Aka Gameworks/UnrealEngine

One of the reasons Vega wasn't AS hot as it should have been (but was quite expensive to compensate) was HBM2. I don't know about HBM1, though. HBM2 doesn't run as hot as GDDR5. Or GDDR6, apparently.

With better binning, perhaps a 56CU 5800XT could be possible, with equal TDP to a 5700XT, because of better silicon and HBM2? It could work. Maybe coming early 2020?

It's funny how 1080p is suddenly relevant to AMDrones when we're comparing GPUs and not CPUs.

Well, in that case, you go for what you can afford best. The Ray Tracing part is not going to see much adoption, I think you've figured that one out.

>They have farms of GPUs running every conceivable game and once all that is learned is compiled in to a driver update, performance gains will be unbelievable.
Uh ... sure.

That driver is a mixed bag: it improves a certain few games, others remain within margin of error, others regress.

youtube.com/watch?v=sBQPGr87w6Q

So, depends on your game.

Not just the HBM but the controller runs cooler

Well, there you go.

GCN is an ISA.
Southern Islands, Sea Islands, Volcanic Islands, etc are all different hardware implementations of the ISA.
RDNA is still GCN. It is not a whole new ISA.

>The Ray Tracing part is not going to see much adoption, I think you've figured that one out
Mentally ill. The raytracing part is an industry standard, DXR. Any RT core hardware acceleration happens on the driver side so it's transparent to developers. MS already confirmed their next gen Xbox will support hardware raytracing so you can bet tons of AAA next gen games will support DXR. Navi is DOA, might as well just toss it out in 1 year

FineWine was never about something "improving", it was about fixing. It's just that at the release AMD drivers are always unbelievably shit and any improvements come from fixing that shit up over ridiculously long period of time.

I'm talking about nvidia's specific brand. Not RT as a whole. AMD is working on their own. Here's a tech demo Crytek made of their own RT implementation working on a vega 56

youtube.com/watch?v=1nqhkDm2_Tw

Nvidia doesn't have a specific brand of ray tracing. All they did was make a driver front end for Microsoft's ray tracing shit that's part of DirectX. It's PR spin.

>1080p doesn't matter when it is comparing cpus
>1080p all of a sudden matters when it is comparing gpus
Drones in a nutshell.

Nvidia's implementation IS DXR. The RTX part is nvidia accelerating the DXR tasks given to it by the game on RT cores, hardware accelerated blocks attached to the SMs. Pascal GPUs also support DXR but without the RTX acceleration it runs like shit. AMD doesn't support DXR at all on any GPU.

>own RT implementation
They're literally using a variant of a technique Nvidia invented nearly a decade ago called VXGI, which uses voxel cone tracing to effectively do low-fidelity raytraced global illumination.

research.nvidia.com/sites/default/files/pubs/2011-09_Interactive-Indirect-Illumination/GIVoxels-pg2011-authors.pdf

They released the public Apollo 11 tech demo in 2014. It's supreme irony that AMD piggybacks off nvidia's work yet again
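
If anyone actually wants to know what that paper does, the cone-tracing part boils down to roughly this; a bare-bones sketch (the helper names and grid layout are mine, not from the paper or CryEngine): voxelize and prefilter the scene into mip levels, then march cones through progressively coarser levels instead of shooting thousands of rays.

```python
import numpy as np

def sample_voxel(mip, pos):
    """Nearest-neighbour sample of (radiance, opacity) from one mip level."""
    rad, opa = mip
    idx = tuple(np.clip(pos, 0, np.array(rad.shape) - 1).astype(int))
    return rad[idx], opa[idx]

def trace_cone(mips, origin, direction, aperture, max_dist, base_step=1.0):
    """March along the cone axis; as the footprint widens, read coarser mips
    of the prefiltered voxel grid, compositing radiance front-to-back."""
    radiance, alpha = 0.0, 0.0
    dist = base_step
    while dist < max_dist and alpha < 0.95:
        diameter = max(base_step, 2.0 * aperture * dist)   # cone width here
        level = min(int(np.log2(diameter / base_step)), len(mips) - 1)
        pos = (origin + dist * direction) / (2 ** level)   # finest coords -> mip coords
        r, a = sample_voxel(mips[level], pos)
        radiance += (1.0 - alpha) * a * r                  # front-to-back alpha blend
        alpha += (1.0 - alpha) * a
        dist += 0.5 * diameter                             # step grows with the cone
    return radiance

# Tiny demo: two mip levels of a uniform grid, one axis-aligned cone.
fine = (np.full((8, 8, 8), 1.0), np.full((8, 8, 8), 0.1))
coarse = (np.full((4, 4, 4), 1.0), np.full((4, 4, 4), 0.3))
print(trace_cone([fine, coarse], np.zeros(3), np.array([1.0, 0.0, 0.0]),
                 aperture=0.5, max_dist=8.0))
```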

Are you saying that nvidia's RT API is open to all AMD cards?

>driver front end for Microsoft's
You're a retard. Drivers are the backend. The API is literally the front end; it's an Application Programming INTERFACE. It's just a definition that someone else fills out with actual functionality. It's like getting a list of requirements for a car, like "must go faster than 120mph", "must accelerate 0-60 in under 8s", etc., and it's up to Nvidia and AMD to make something that fits those requirements so game developers can use it, assuming it fulfills the needs laid out in the requirements.

In reality nvidia is doing 99% of the technical work while MS makes sure everyone is on board with their idea so the industry doesn't have any issues, aka relations. I would say AMD is too but they haven't talked about any support for DXR at all.
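
The car-requirements analogy in code, if it helps; a toy sketch with made-up class names, nothing to do with the actual D3D12 interfaces: the API is only the contract, and a driver is whatever fills it in.

```python
from abc import ABC, abstractmethod

class RaytracingAPI(ABC):
    """The 'interface' half: a contract the spec owner (think MS with DXR)
    defines. It says nothing about HOW the rays actually get traced."""
    @abstractmethod
    def dispatch_rays(self, width: int, height: int) -> None: ...

class VendorADriver(RaytracingAPI):
    """One vendor fills out the contract with fixed-function RT hardware."""
    def dispatch_rays(self, width, height):
        print(f"tracing {width}x{height} rays on dedicated RT units")

class VendorBDriver(RaytracingAPI):
    """Another vendor meets the same contract with a compute-shader path."""
    def dispatch_rays(self, width, height):
        print(f"tracing {width}x{height} rays in compute shaders")

def render_frame(driver: RaytracingAPI):
    # The game only ever talks to the interface, never to a specific vendor.
    driver.dispatch_rays(1920, 1080)

render_frame(VendorADriver())
render_frame(VendorBDriver())
```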

tell that to tessellation

>MS already confirmed their next gen Xbox will support hardware raytracing so you can bet tons of AAA next gen games will support DXR. Navi is DOA, might as well just toss it out in 1 year
You ...you do understand the next XBOX is Ryzen+Navi, right?

>nvidia's RT API
There's no NVIDIA RT API it's just a proprietary wrapper for Dx 12

It's not Nvidia's API.

DXR is Microsoft's API and yes it's open to AMD and has always been open to AMD, and AMD needed to have been on the team helping MS design it so they always knew about it. They're just pretending like Nvidia is keeping something locked down because they're grifters who rely on BS brand image.

Tessellation is fucking everywhere

What I've been trying to tell kids these days all along is:

If you want a car that goes 120mph and does 0-60 in under 8 sec, all you need to do is look at the curb weight, torque/rpm and horsepower, kind of like looking at a GPU and deciding: oh, well, it has 64 ROPs, 2048 parallel shaders, and a 256-bit memory interface at 4000MHz...

Sound about right?

You understand that Microsoft is the writer and maintainer of DX12-DXR right? And it's in their best interest to make sure everyone uses DXR when making games. It doesn't matter if the xbox is all AMD, developers will be using a MS API

Hold up, you are saying that nvidia's RTX, which wasn't even open, at launch, to the GTX 10xx series, can be activated for AMD cards? As it is in the Gameworks implementation of the technology?

>You understand that Microsoft is the writer and maintainer of DX12-DXR right?
Yes, and AMD does support DXR. But RTX is not available to AMD. Meaning RTX won't see a console implementation.

and accordingly shits on hardware

>But RTX is not available to AMD
Doesn't matter, when developers make DXR games RTX will be able to use it. RTX exists on the GPU hardware-driver stack, developers shouldn't need to make it RTX specific unless they want to.

When a game calls the DXR instruction "doRaytrace()" it goes to the GPU drivers, at which point the drivers will use the RT cores to accelerate it. RTX is just a brand name for their acceleration feature. If AMD wrote DXR drivers, the same instructions would go to AMD's drivers to do whatever they want.
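
Conceptually it's just call routing; a toy sketch with hypothetical names (the real runtime obviously isn't Python): the game issues one call, whichever driver is registered decides how to execute it, and with no driver the feature simply doesn't exist.

```python
# Toy runtime dispatch, hypothetical names only (not the real D3D12 runtime).
installed_drivers = {}   # vendor -> callable that performs the ray dispatch

def register_driver(vendor, dispatch_fn):
    installed_drivers[vendor] = dispatch_fn

def do_raytrace(vendor, scene):
    """Game-facing call: the same instruction no matter whose GPU is in."""
    driver = installed_drivers.get(vendor)
    if driver is None:
        raise RuntimeError(f"{vendor}: no DXR driver installed")
    return driver(scene)

register_driver("turing", lambda scene: f"{scene}: accelerated on RT cores")
register_driver("pascal", lambda scene: f"{scene}: compute fallback, slow")

print(do_raytrace("turing", "frame 1"))
print(do_raytrace("pascal", "frame 1"))
try:
    print(do_raytrace("navi", "frame 1"))   # no driver written yet
except RuntimeError as err:
    print(err)
```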

>Hold up, you are saying that nvidia's RTX, which wasn't even open, at launch, to the GTX 10xx series, can be activated for AMD cards? As it is in the Gameworks implementation of the technology?
Again, RTX is just a brand name for their acceleration. It's like if Nvidia started calling their tessellation engines Nvidia TessellationX(tm). It's still called by DirectX instructions.

>can be activated for AMD cards
No, AMD needs to write drivers for it.

>As it is in the Gameworks implementation of the technology
RTX isn't entirely part of gameworks, although there are probably some gamework APIs that are specific to Turing/RT cores, idk, but the primary mode of RTX we see in games is DXR.

>not reinstalling os after gpu change

>RTX is just a brand name for their acceleration feature
Which nvidia has closed off the competition from implementing. This is what I am talking about.

>No, AMD needs to write drivers for it.
They cannot write drivers for RTX.

>RTX isn't entirely part of gameworks
Because it is based on DXR, true, but its implementation is not open to AMD and there cannot be any driver optimization for that. It's the same thing for DLSS and PhysX.

oh. my. god.

fucking Jow Forums

>It's the same thing for DLSS and PhysX
Are you retarded? PhysX and DLSS are closed entirely, including the interface. AMD literally cannot access it even if they blindly guessed it, because it's protected by a license.

>They cannot write drivers for RTX.
Because RTX is hardware-specific, just like GCN.

>Which nvidia has closed off the competition from implementing. This is what I am talking about.
No, they've closed off AMD from copying their hardware. Guess what AMD closes off from everyone else? GCN, their x86 (32-bit) patent suite, their x86_64/x64 patents, a shit ton of HBM controller implementation patents, the original TrueAudio DSP block; even XGP is still proprietary at this very moment because instead of just opening it up before tossing it in the garbage they killed it quietly. They've already applied for a patent for their version of hardware raytracing accelerators.

It's a world apart from CUDA, which AMD could conceivably have implemented if they had a license for it (and one was offered for pennies per unit when CUDA was new, but AMD spat on GPGPU and turned their nose up at it). AMD doesn't open-source or open up any of their hardware.

AMDrones have finally come to the final mentally ill conclusion that everyone else just has to give up their hardware for free while AMD continues to maintain duopolies with Intel and Nvidia.

>Yes, and AMD does support DXR
They don't support DXR. That's why you can't actually run raytracing on any DXR games. There was a moment when some retards thought AMD could do reflections in BFV but it was just defaulting to screenspace reflections. 0 support, 0 statement of support, complete disregard for industry standards.