Rip nvdicucks

rip nvdicucks

Attached: 1565953313839.jpg (1171x1021, 178K)

Other urls found in this thread:

youtu.be/fCTieiaSMYA?t=205
guru3d.com/articles_pages/amd_radeon_rx_5700_and_5700_xt_review,13.html
techpowerup.com/gpu-specs/nvidia-tu104.g854
techpowerup.com/gpu-specs/amd-navi-10.g861
techpowerup.com/258340/assetto-corsa-competizione-dumps-nvidia-rtx
youtu.be/AY2UHSYMVyo?t=1m29s

no ray-tracing, no thank you

No -80% FPS, thank you based AMD

>He missed the threads where devs were abandoning ray tracing

Ohnonononono

*25%
*at FPS counts that are already far in excess of 60

>60% drops for 1080p DXR Medium/High/Ultra
>64FPS at 1080p on 2080 Ti
>far in excess of 60

Attached: 1080.png (500x1170, 85K)

What's it like being bad at your only hobby?

Attached: 31LTeRW.jpg (544x841, 48K)

I missed it too; can you post a link?

I appreciate AMD and what they're doing in the midrange market, but they still won't win me over unless they can bring out something comparable to pic related.

Attached: 1080Ti_Chad.png (293x165, 19K)

Attached: bf5.png (713x909, 54K)

Good for the people who play Battlefield V(agina)

>1080p
>60fps
>2019
are you poor?

Sorry I don't enjoy 30 fps gaming.

A 2060 or better can do ray tracing at 1080p 60fps

>2070 barely does 60.4 at Low

>2080 Super faster than the 2080 Ti
Nice shill site you've got there.

It just shows that Navi really excels in titles optimized for modern APIs and architectures. I'm surprised most titles aren't better optimized for Radeon, considering two-thirds of the console market is based on Polaris and the last third is six-year-old Maxwell hardware.

Yes, but can I go back to not paying over $1000 for a single computer component?

That's not even the point. Nvidia's current RT implementation is purely cosmetic and doesn't noticeably improve graphics. It's simply not worth the price increase and FPS drop.
Ray tracing is a good thing, but right now the hardware can barely handle even its most basic implementation.

????
>youtu.be/fCTieiaSMYA?t=205

I will buy a 2070 Super soon. Fuck AMD fascists

>Linus Tech Shills

The 2070 S is the best bang-for-buck GPU right now. Just lol at buying anything else. Nvidia is tried and tested; they haven't made housefires since Fermi, and that was ages ago. AMD Radeon can't even get the basics right LOL

>non arguments

based retard

No, you retard, it shows that the game is CPU-constrained at 1080p with anything more powerful than a 5700 XT, hurrr

No, look at the 1440p results.

>Bang for the buck
Imagine thinking 600 dollars is the affordable option.

post them here, bud

Let's assume his results are correct. It's still a 50-60% FPS loss despite the chip having dedicated hardware for RTX. Now imagine that meme hardware were instead turned into normal hardware. In non-RTX games, half the chip does nothing.

Imagine thinking 600 dollars for something that will last you at least two years is unaffordable

Just take out a loan

AMDrone anuses will be so stretched when Nvidia decides to go 7nm.

Best bang for your buck is the Vega 56, priced lower than the GTX 1660 Ti.

You can thank AMD for not having to pay $1500 for a 750 Ti. Just look at what nVidia did with prices in the markets they dominate: Quadro and Tesla.
AMD is keeping both Intel's and nVidia's prices in check, which is good for us.

It's outperforming GPUs twice its die size. The one flaw is that Navi has high power consumption, but that's not particularly important on desktop.
guru3d.com/articles_pages/amd_radeon_rx_5700_and_5700_xt_review,13.html

techpowerup.com/gpu-specs/nvidia-tu104.g854
techpowerup.com/gpu-specs/amd-navi-10.g861

Imagine giving Nvidia money for their premature RTX experiment, which is totally broken and unusable

nvidiots truly are the biggest goys

Attached: 1564637061388.jpg (2880x1562, 546K)

I'd rather have working drivers.

>Now imagine that meme hardware was instead turned into normal hardware.
I like how you think, goy; we shouldn't innovate and push boundaries, we should just keep milking people by reselling the same old tech over and over again!

Housefire trash

Yes, I'm glad AMD is doing okay for that reason, but it's really cringy when you see posts like OP's that completely disregard the fact that AMD needed the jump to 7nm to catch up with NV's 12/14nm. Navi at 14nm would still be fast, of course, but it would be a housefire incarnate.

sure if you don't care about noise and heat :)

>at least 2 years
>just take out a loan
We're reaching bait levels that shouldn't be possible

Attached: 1564814880618.jpg (583x657, 100K)

ahh, yikes

Attached: 4ks.png (705x946, 53K)

am I wrong?

Innovating more ways to cheat in benchmarks with drivers, or pushing the boundaries of bullshit like in
Going away from GPGPU toward proprietary fixed-function units is not innovation; it's regression.

Is DLSS supposed to be a sales pitch?
>rivets in tank are completely gone.

They got innovated away!

The middle one is Radeon Image Sharpening, which has a 1-2% performance cost, uses an open-source algorithm, and works with most games without needing to be nVidia-approved like DLSS.
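
For reference, a rough sketch of the contrast-adaptive sharpening idea (AMD publishes the real FidelityFX CAS shader; this toy numpy version, its weights, and its neighborhood are illustrative assumptions, not AMD's actual code):

import numpy as np

def adaptive_sharpen(img, strength=0.2):
    # Toy contrast-adaptive sharpen: the per-pixel sharpening weight
    # shrinks where local contrast is already high, so edges don't get
    # over-sharpened the way a fixed unsharp mask would.
    # img: float32 array, HxWx3, values in [0, 1].
    pad = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    n, s = pad[:-2, 1:-1], pad[2:, 1:-1]   # up/down neighbors
    w, e = pad[1:-1, :-2], pad[1:-1, 2:]   # left/right neighbors
    contrast = (np.maximum.reduce([img, n, s, w, e])
                - np.minimum.reduce([img, n, s, w, e]))
    amount = strength * (1.0 - contrast)    # less sharpening on busy areas
    blur = (n + s + w + e) / 4.0            # cheap local average
    return np.clip(img + amount * (img - blur), 0.0, 1.0)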

It also introduces artifacts, just in other ways.

nice cope

Any image modification has a chance of causing artifacts. The DLSS impact on quality is way bigger.

No anon, it’s going to become like the 90s again

nice cope

techpowerup.com/258340/assetto-corsa-competizione-dumps-nvidia-rtx
>ray-tracing, no thank you

PREASE BUY WATERCOROR STUPID AMELICAN

the rx570 is so fuckin based
literally fine wine

The idea that Vega 56 is a housefire is a meme. With an undervolt and an overclock you can make it 15% faster than a stock V56 while using as much power as a 1660 Ti. Undervolting and overclocking is brainlet-proof; even you could do it.
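
If you're on Linux, the amdgpu driver exposes this through sysfs. A rough sketch of what an undervolt looks like there; the card path, state indices, clocks, and millivolt values below are made-up examples, so read your card's actual table from pp_od_clk_voltage before writing anything:

# Hypothetical example values for a Vega 56 undervolt via amdgpu sysfs.
# Check the current table first:
#   cat /sys/class/drm/card0/device/pp_od_clk_voltage
DEV = "/sys/class/drm/card0/device"

def write(path, value):
    with open(path, "w") as f:  # needs root
        f.write(value)

# The overdrive table only accepts writes in manual mode.
write(f"{DEV}/power_dpm_force_performance_level", "manual")

write(f"{DEV}/pp_od_clk_voltage", "s 7 1590 1020")  # "s <state> <MHz> <mV>": top core state
write(f"{DEV}/pp_od_clk_voltage", "m 3 930 950")    # top memory state
write(f"{DEV}/pp_od_clk_voltage", "c")              # commit the new table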

This thing is a beast.

They did everything right this time.
>Thermals
>Noise
>Build Quality
>10 Phase VRM
>Fixed the Red Devil Aesthetics
>Added RGB lighting
>440 USD

Attached: Cyberdemon status.jpg (620x496, 22K)

Why does the EU get price-gouged every time?

Vega 56 at release was more expensive than the 1080.

Now aftermarket 5700 XTs are ~500 EUR, while 2070 Supers are 550-600.

Attached: 1488757143285.png (400x400, 192K)

VAT.

Okay great, where the fuck do I buy one? You'd think that after everyone sat around waiting over a month for AIBs, they'd have more stock.

Laughs in 1080 Ti SLI that paid for itself during the mining craze
->rule one: never listen to Jow Forums<-
They shill AMD and are too poor. Should have mined, poorfags

Nice CPU bottlenecked graph.

>out of stock

The mere fact that it's relatively loud and hot makes it an absolute failure to me, since I got spoiled by my OC'd 1070 never going above 70 degrees while staying quiet in my dusty case. Then add the retarded discrepancy in performance between games depending on API and developer, which can put the 5700 XT anywhere from below a 2060S to above a 2080; honestly a huge fucking pain in the ass, since you never know if the game you're looking forward to will actually run well.
AND TO TOP ALL OF THIS OFF, here in Europe it's another infamous AMD paper launch where cards are always out of stock and the limited availability makes them cost 100 euros/pounds more than MSRP.

only the awful reference card is loud/hot

No. The Sapphire 5700 XT Pulse goes to 75+ degrees under load in most reviews, while in comparison a 2070/2060 with a good cooler never goes above 70 even OC'd.

>The mere fact that it's loud and hot relatively speaking
29 dBA under load is loud?

>this is what amdrone actually believes

Are you referring to core temps or junction temps?

>Then add the retarded discrepancy in performance between games depending on API and developer which can put 5700 xt anywhere below 2060s and above 2080 which honestly is a huge fucking pain in the ass as you never know if the game that you're looking forward to will actually run well.

That's my biggest fucking gripe with AMD GPUs. Drivers are still all over the place; you can be beating a 2080 in Vulkan games just to fall behind a 2060 Super in DX11/12 titles.

Seems to do great in DX12 games like Hitman and Forza.

2060 Super? Nah.
Extremely few titles favor Nvidia enough for it to beat this card.
Navi fixed some of the inconsistency, but yeah, AMD drivers need some work.
OpenGL performance needs to improve.
DX9, 11, and 12 should be fine.

>RTX
>Actual ray tracing
One of these is not like the other.

I think my b450/3700X was enough paid beta testing for AMD, kek.

Ah, you are a B450 victim

>tfw 5700XT is right behind a 2070 super

imagine how fast the 5800 and 5900 will be

>RTX is announced
>everyone laughs at nvidia's efforts to push a completely unfinished technology
>fast forward to 2019
>"no raytracing, no buy"
Really makes me thonk

Attached: 1490519139530.jpg (480x360, 30K)

>he pays more for inferior NVIDIA image quality in 2019
What is the point of getting 200 frames a second if the game looks like diarrhea because of NVIDIA's corner cutting?

Attached: 1565193381159.jpg (471x388, 94K)

Any other Christians hesitant to buy this because of the name? Not even joking

Attached: WBC_protest.jpg (187x244, 19K)

Is it twice as good as my 1080 and less than the $400 I paid for it? No? Then fuck off. The last time I fell for the AMD meme was the 290X, and you can fuck right off if you think I'm going to fall for it ever again. I'm happy they're kicking Nvidia hard enough in the balls to reconsider the 2000 series pricing, though. Maybe the next GPUs will be reasonably priced and not retarded like the SUPERs

Attached: smug.png (293x270, 41K)

Define "twice as good". Radeon cards render much better looking graphics compared to whatever Nvidia puts out. Nvidia only cares about the number of frames rendered per second, and the image quality seriously suffers as a result. You don't get that with AMD.

>Radeon cards render much better looking graphics compared to whatever Nvidia puts out.
Can you give me some evidence of this? I didn't notice any difference other than performance going from ~45 FPS to ~80 to ~120 as I went from the 290X to a GTX 980 and then a GTX 1080, all of which on an FX-9370. FPS doubled when I went to a 2700X, but I never once noticed the picture quality changing, because I've had the same monitor since 2009.

I'm hesitant to buy even though I'm not religious. They put all those weird pentagrams everywhere on their devil products and it's uncomfortable to me.

>AMD RX 5700 XT GPUs reaching 110°C “expected and within spec” for gaming
BRUH
OH NO NO NO NO NO

>another brainlet who doesn't understand the difference between package temperature and edge temperature

That is unironically normal, though. Nvidia GPUs report edge temp; that 110°C figure is junction temp.
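
If you want to see both on Linux, the amdgpu hwmon interface labels the sensors separately. A small sketch; the hwmon discovery is simplified, and which labels show up (edge, junction, mem) depends on the card and driver version:

import glob

# amdgpu exposes several temperature sensors; "edge" is the number
# people usually quote, "junction" is the hotspot that hits 110C
# "within spec". Values are reported in millidegrees Celsius.
for label_path in glob.glob("/sys/class/hwmon/hwmon*/temp*_label"):
    with open(label_path) as f:
        label = f.read().strip()
    with open(label_path.replace("_label", "_input")) as f:
        millideg = int(f.read().strip())
    print(f"{label}: {millideg / 1000:.1f} C")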

I don't get why people spend so much money on GPUs and high-end desktop parts when a cheap PS4 Slim plus a cheap 50-inch VA TV plays the same games at stable framerates for a fraction of the price.

It's not the same experience duh

I'm not even Christian and it makes me uncomfortable. Pre-ordered the sapphire pulse.

Once you experience 1440p 144Hz gaming, it's very hard to go back to console framerates.
Also, that's a closed platform; I want the versatility of PC gaming.

Playstation doesn't allow you to run your desktop applications. I guess it'd be viable for the /v/ plebs who treat their computers like game consoles, but your average Jow Forums user isn't gonna get what they want out of a PS4.

Yeah, but are those extra FPS and that resolution worth the price?

>$439 + Tax
Nah, probably will just snag the Pulse instead.

CRAZY.

Attached: JUST CRAZY.png (905x811, 372K)

You make up the money in game prices and not paying for online. Plus, the experience is just outright better; even on potato PCs with lower specs than a PS4, the experience is better overall.

Yeah, that's the real problem. THIS GPU IS FROM A DARK PLACE

The fact is that VERY little in the scenes even uses ray tracing; the games are BARELY ray traced. And even when it is used... tell me why this
youtu.be/AY2UHSYMVyo?t=1m29s
is not doable, and in fact preferable to ray tracing, if only for cost vs. performance?

The fact of the matter is that until ray tracing is actually doable (which will take either 3D-stacked GPUs at low frequencies on chiplets, or some way to accelerate it by fucking with the math into a bastardization that looks good enough in real time), it is better to have the hardware serve a dual purpose. If Nvidia wants to run ray tracing through tensor cores, fine, but figure out a way to run normal GPU operations through them too; just make the whole GPU a tensor core, so you're not pissing die space away when you do the work. What is it... 50% of the GPU is machine learning/tensor optimized? The fuck am I paying for 50% of a barely functional GPU that will be outdated and useless for said feature by the time anything works?

Thirding the request for the thread where they were abandoning it; I really want to see it myself.

Thank You.
I will buy now.