WEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEW

techpowerup.com/249557/battlefield-v-with-rtx-initial-tests-performance-halved

T R A S H
R
A
S
H

Cannot-fucking-wait for 'DEM JUICY 7nm Radeons in a couple months from now...boy-oh-boy, what a treat that's going to be.

Attached: xrYUUUhhEbIsk3b8.jpg (500x290, 44K)

joke company

Attached: amd.png (608x981, 239K)

Go back to

U seem MAD

That's the cost of living your life with raytracing.

Attached: bda9488eea1214d29b5091e7882b816e098c9ed281808e3ce881970be0024bda.jpg (1041x1238, 256K)

funny pic. more cores at fewer nanometers are dope though.

Is the CPU bottlenecking at 1080p? wtf

PAY MORE FOR LESS PERFORMANCE
AHAHAHAHA

Attached: the_more.jpg (1280x720, 79K)

>Less than half performance with Faketracing® ON
Holy shit, what a blunder. 1080ti is looking better than ever.

Nvidiot, stop.

>less than half performance
>at 1080p
Now imagine 2560x1440 (the DE FACTO mainstream gaymen standard these days) and 4K...also, Star Citizen ALREADY supports up to 8K and the upcoming X4 THAT'S COMING OUT AT THE END OF THIS MONTH is going to support up to 16K. Just Saiyan.

>better graphics take more power
Woooooow

DLSS is GARBAGE, there are NO "better graphics" in this SHITE.

Of course it runs half as fast; it has to shade all the reflection samples too. What did you expect?
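Back-of-envelope on why that's not surprising (every number below is an illustrative assumption, not a measurement):

# toy cost model for hybrid raytraced reflections
pixels = 1920 * 1080          # primary visibility, still rasterized
reflective_share = 0.4        # assumed fraction of the screen with RT reflections
rays_per_pixel = 1.0          # BFV-style low sample count

reflection_hits = pixels * reflective_share * rays_per_pixel
relative_shading = (pixels + reflection_hits) / pixels
print(f"shading work: {relative_shading:.2f}x")   # -> 1.40x

# and that's shading alone; add BVH traversal and denoising on top
# and "performance halved" stops looking mysterious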

I'm just here to see the eventual GN video where they try and run it on a 2070

Attached: 3erXs29.jpg (960x1023, 91K)

Gotta love that faketracing. A decade of progress gone in an instant.

>tfw the whole raytracing meme is completely unnecessary and has been since fucking waverace
>tfw nvidia is doing this RTX shit simply so they have some new shiny thing to show off without actually innovating since radeon gpus are STILL fucking worthless so they have no reason to try

youtube.com/watch?v=bC5OAWqKF70

Attached: 1494783887150.jpg (153x256, 12K)

>Radeon gpus are STILL fucking worthless
RX 580 and Vega 56 are selling like hotcakes right now, cope.

AMD are looking to be kings of the midrange and I am all for it.

I picked up an RX 580 2nd hand for ridiculously cheaper than a 1060 because of the nvidia mindshare meme and couldn't be happier.

if they make the king of FPS per $ with navi I'm all in.

I don't know why they're trying to push this gaymur raytracing shit so hard right now. Dedicated RT hardware is nice, but we're still nowhere near what's needed to do it well in real time.

1080p60 seems pretty realtime to me. It's also done well in the sense that full scene reflections were impossible before and are possible now. I do think they could optimize it down some more though. My guess is it's the shading, not the ray tracing, that's the problem.

I wasn't actually aware they were even selling those again after being out of stock for so long, my bad. I was only counting the rest of the lineup. Point stands about nvidia not needing to innovate though.

To do QUALITY ray tracing you need to have AT LEAST 1000 cores dedicated strictly ONLY to ray tracing. RTX 2080 Ti has only 36 ray tracing cores. LOL.

>Buying a $1200 GPU to play at 1080p
>end of 2018

Attached: WWWY.gif (446x251, 721K)

>QUALITY
They're doing it in real time, which is somewhat impressive, but this RTX shit is going the way of PhysX pretty soon.

>They're doing it real time
Real-time IS quality. And you need 1000 cores for that. RTZ 2080 Ti has only 36 cores. Geddit?

I made pic related yesterday.
nobody cared.

Attached: rtx.jpg (1199x1233, 254K)

>Real-time IS quality
but it's not and never has been. Quality rendering requires time for a single frame. A long time, in most cases.
>RTZ
did I miss something? I thought it was RTX

Tanks, shaved.

>but it's not
EXACTLY, because THEY ONLY HAVE 36 OUT OF MINIMUM REQUIREMENT OF 1000! It's TRASH tracing, NOT actual real-time ray tracing.

That's not the kind of real time ray tracing I'm talking about. NVidia's ray tracing uses a hilariously low sample count per pixel with less than desirable accuracy, then uses a neural network to smooth out the terrible results. On top of that, the ray tracing is only used for a handful of effects in the game. Rasterized rendering is still responsible for the majority of what you see. And even with all that you can barely get 1080p60 on a $1200 card.
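To put a number on "hilariously low" (a sketch using Nvidia's own marketing figure; the in-game sample count is an assumption):

pixels = 1920 * 1080
fps = 60
gigarays = 10e9               # Nvidia's claimed throughput for the 2080 Ti

theoretical_spp = gigarays / fps / pixels
print(f"theoretical budget: {theoretical_spp:.0f} rays per pixel")   # ~80

# in practice BFV reportedly traces well under 1 reflection ray per pixel,
# while offline path tracers converge at hundreds to thousands of samples;
# that gap is exactly what the denoiser has to paper over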

It's still over 60 FPS™

Attached: 1492961312602.jpg (832x690, 130K)

This is tech meant for pro customers that they are forcing onto gamers. Gamers would be better served without ray tracing cores.

>NVidia's ray tracing uses a hilariously low sample count per pixel with less than desirable accuracy, then uses a neural network to smooth out the terrible results.
This d00d gets it. You need at least 1000 RT cores to do actual real-time ray tracing at the MINIMAL spec; the RTX 2080 Ti has only 36 RT cores, however. Fucking garbage shit...and it's $1200. Fucking laughing my fucking ass off at this TRASH and also at all of those marketing victim dumbasses who bought into this SCAM

Have you seen the demos? They look quite good. I like the metro exodus version a lot. You're right that they only have enough power for one effect at a time. Not ideal, but still a value add. I'm also pretty sure they aren't using neural networks for denoising yet, just a normal temporal + spatial filter.
It's a first generation GPU melting feature only for the dedicated graphics whores. Everyone else will get it in a couple years and appreciate the nicer graphics. I'll probably get a 3070 or something.
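For reference, the "normal temporal + spatial filter" described above is roughly this (a minimal numpy sketch, not DICE's actual code; real implementations also reproject the history buffer with motion vectors and clamp it):

import numpy as np

def denoise(noisy, history, alpha=0.1):
    """noisy, history: HxWx3 float frames; returns the filtered frame."""
    # temporal: blend a little of the new frame into the accumulated history
    acc = alpha * noisy + (1.0 - alpha) * history

    # spatial: 3x3 box blur to mop up what temporal accumulation missed
    h, w, _ = acc.shape
    padded = np.pad(acc, ((1, 1), (1, 1), (0, 0)), mode="edge")
    return sum(padded[dy:dy + h, dx:dx + w]
               for dy in range(3) for dx in range(3)) / 9.0

Skipping the motion-vector reprojection is why fast movement still shows the ghosting and noise people complain about later in the thread.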

t. nGreedia employee

>MINIMUM REQUIREMENT OF 1000!
you seem very dead set on that number, why?

>1080p benchmarks in 2019

Because it's the officially calculated spec. To produce a fully rendered, 100% ray-traced 3D scene at 1920x1080 (at 60 FPS, with ~16 ms frame times on a 60 Hz panel), you need at least 1000 RT cores. That is the official minimal spec of the industry. This doesn't take into account 2560x1440, 4K, or higher resolutions, or any refresh rates above 60 Hz. Now go back and read this, and you'll get the picture. Novideo's """""""""""ray tracing""""""""""" is a complete and absolute SCAM, a total-fucking-FAD.
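For what it's worth, that "1000 RT cores" spec isn't something I can source, but the throughput side of the claim is easy to sanity-check (the sample and bounce counts below are assumptions):

pixels = 1920 * 1080
fps = 60
spp = 100        # assumed: a low-ball sample count for a converged path trace
bounces = 4      # assumed: rays per sample, counting secondary bounces

required = pixels * fps * spp * bounces
print(f"required: {required / 1e9:.0f} gigarays/s")   # ~50

# against the ~10 gigarays/s Nvidia claims for the 2080 Ti, a fully traced
# 1080p60 scene needs 5x to 50x more throughput depending on the assumptions,
# which is why everything shipping today is hybrid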

Looks good on a compressed YouTube video... wait until it's running two feet from your face on a monitor with no color subsampling.

This is why nVidia is pushing DLSS, because they know people will have to run games under 1080p and upscale them.

don't have to imagine it desu

Attached: rtx meme.jpg (727x514, 46K)

(battlefield V obviously)

>X4 comes in 15 days
>supports up to 16K resolution
>noVideo GimpWorse RTX ON™

Attached: Laughing stops.gif (400x225, 1.48M)

When did nvidia become mid/late 90's ati in the drivers department?

416.94 driver was just released and this fuckery still happens
youtube.com/watch?v=U93IV8vv5EU

if you have a benq xl 2730 (zowie)
no 144hz at any resolution above 2k
If you want to use more than 2 monitors (3) all will go dark (driver crash)

all with 20 series cards.

bonus round:

if you set the resolution to 1920x1080 you can actually set the refresh rate of the benq monitor to 144hz.

but you still have multiple monitor crash.

on a whim i tried to set the resolution to 2k and then, in windows itself instead of the nvidia driver, i set the refresh rate to 144hz. driver crash. black screen on the benq. no response from the acer monitor.

1 hard reboot later and i decided to try 2016's doom. at 144hz and 1920x1080 it worked fine. but as soon as i changed the resolution to 2k in doom while the windows resolution was still 1920x1080, the monitor as predicted went dark and the other monitor did not respond (just frozen). but i could still hear the doom music and if i clicked the gun would fire etc. so the machine is not locked/crashed, the drivers are just taking a shit. something about nvidia's control panel allows the recovery.

there is something definitely fucked with the 416.81 and 416.93 drivers causing this issue but nvidia is still claiming it's benq's problem and will not do anything about it.

Attached: how to install ati drivers.jpg (1177x805, 260K)

>When did nvidia become mid/late 90's ati in the drivers department
Since after 980 Ti.

The newer drivers are also gimping performance on older cards like the GTX 1060.

youtu.be/mFSLro_OXLQ

Attached: wcpve33fnttttrp78e2a.jpg (940x500, 57K)

>Since after 980 Ti.
hrmm .. i've been using nvidia cards since i stopped using my old matrox mystique. i haven't had issues like this since.

i remember people talking about the problems they had on ati cards. i gave up on them because of the install/crash hell they became. (when you need a program like DDU to remove your drivers you know your coders/developers are shit)

i've never needed to resort to DDU till this last week. goddamnit nvidia, you're definitely driving me to switch to ati finally. I'm thinking about returning this 2070 and getting an ati card but i don't want to spend a bunch of money just to replace it with a new card in 1 or 2 months.

>100% ray-traced 3D scene
but that's never been the purpose of nvidia's RTX meme shit, it's basically just bounce lighting calculations drawn over a traditionally rendered scene, like the 32x on a genesis.

>The newer drivers are also gimping performance on older cards
They've been doing it since 6xx.

Lol, and the actual in-game graphics look terrible. This is like 90s-tier raytracing bad.

Attached: ZoQvJ2IvVQiRk0zt.jpg (2560x1440, 665K)

You can do top tier, 100% accurate reflections and lighting effects only if you fully trace the entire scene. Everything else is simply half-assed and downright cheating, BS. It's like comparing the ONLY true way of top tier anti-aliasing, which is Super Sampling and ONLY pure Super Sampling, to the "conventional anti-aliasing" half-assed cheating garbage such as FXAA/CFAA, etc.
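For anyone who hasn't seen it spelled out, pure Super Sampling really is just this (minimal numpy sketch): render at a multiple of the target resolution, then average down. FXAA, by contrast, only post-processes the final 1x image.

import numpy as np

def ssaa_downsample(hi_res, factor=2):
    """hi_res: (H*factor, W*factor, 3) frame rendered at factor^2 the pixel cost."""
    h, w, c = hi_res.shape
    # average each factor x factor block of samples into one output pixel
    return hi_res.reshape(h // factor, factor,
                          w // factor, factor, c).mean(axis=(1, 3))

Which is also the point: 2x2 SSAA at 1080p means shading a full 4K buffer every frame. Brute force, honest, and expensive.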

This looks like fucking PS3, what the fuck.

I think it looks good if I'm honest, but it is so not worth cutting your performance to anywhere from 1/2 to 1/3 for pretty reflections.

in this case playing the game at 4k runs better than 1080p with RTX on and that's a hugely noticeable visual difference, which this RTX shit ain't.

It's overly reflective junk, seriously, look at this.
DLSS is also MIA, as well as RTX ambient occlusion, global illumination, shadows and lighting.

Nothing (NO THING) is worth cutting your gaymen performance by more than 10 FPS. NOTHING. But this HALF-ASSED GARBAGE SHIT cuts it BY MORE THAN A HALF. Absolute-fucking-DOA TRASH.

Told you fags months ago that raytracing in games was a huge meme. It's like the modern day "the power of the cloud to make gaming AI better"

>1080p
>over 60 FPS
what fucking year is it?

It's not a meme (Quake demos back in the early 2000s clearly proved this can be feasible), it's just NOT ready AT ALL yet and is a thing of a very VERY (as in 2028 AT THE VERY EARLIEST, with 2048 being way more realistic) distant future.

2018

no, RTX actually works, while cloud-tech literally was never a thing.

RTX is just shit, but it does work.

Attached: LITERALLY LIMITLESS.png (640x547, 167K)

I'm agreeing with you, user; the purpose is not 100% accuracy, it's approximation. No developer is going to waste time on a fully ray-traced rendering pipeline from the ground up for the benefit of less than 1% of the already small high end gpu market.

I predicted it 4 months ago; I've been following raytraced games since the 90s.
It won't happen on midrange console/pc/mobile hardware for at least 10 years, and even then it still won't be able to do 4k raytracing.
Tbqh, until MCM and 7nm come out, monolithic gpus have hit the wall.
700mm² dies are insane

2018 if you're using an older radeon gpu and FX series processor.

>No developer is going to waste time on a fully ray-traced rendering pipeline from the ground up for the benefit of less than 1% of the already small high end gpu market
See techpowerup.com/249465/david-wang-from-amd-confirms-that-there-will-eventually-be-an-answer-to-directx-raytracing

$1200 card for playing in 1080p
ayy

Not him, but there are tonnes of RTX/Radeon Rays real-time rendering engines out there for cgi/pro use; just nobody bothers for gaming.

We have literally hit the wall in terms of what's possible on current hardware. Even if by some miracle amd/nvidia pooped out an 8-corelet zen2 Rome-like die + chiplet design that had 100tflops+ and dedicated RT core hardware with 100x the power of the current 2080ti crap, it would still struggle to do 4k 60fps even with 10x the power, because it couldn't use a hybrid approach: it needs to be 100% ray traced, no rasterized cheating like we are seeing now.

And then there's the problem of rtx not being able to do path tracing and more advanced stuff like wave tracing going forward.

Even if they charged 10k+ for hardware able to do 8k vr raytraced craziness, who the fuck would want it?

I'm getting flashbacks to shitty 90's CGI renders just looking at this thing.

Looked really good. RTX is the future, as much as you guys want to go autistic on it on 4chan.

see

Unironically, stuff from the 80s looks better/more beautiful: youtu.be/dhGz_7mYnj4

WE SYNTHWAVE NAO, BOIZ

Muh gigarays

>GIGARAYZ

The future of differentiation is things like RTX, though. Nvidia gets a step ahead and they have the lead, so it's a good time to do it. It's early tech, but it will get better, and it looks good now at a performance hit.

IMO it's good to push graphics forward and take the risk. You can buy a 1080ti if you don't want it. I don't see how this is bad for consumers at all, considering there is competition in the $400-and-below budget range.

>people spend thousands of dollars to brag about their LOL EA 1080P experience at a hair over 60fps

Attached: 1534044652665.png (521x737, 617K)

>shill citizen

Just you AMDtards wait until every game has this and based Nvidia makes it so you can't turn it off. AMDead BTFO.

Attached: If_you_scroll_up_and_down_It_looks_like_theyre_chuckling.png (1920x1080, 1.03M)

>t. owner of an oculus rift dk2

Attached: 970.jpg (625x352, 72K)

It's amdtards just being butthurt.

Who mopped the floor and didn't put down a Wet Floor sign? They're fired.

>Early adopter's remorse = the post

Using ML and RTX is the future of graphics though. You can go look at nvidia's published papers on denoising and temporal consistency etc. to see this. It's just going to take a while. Game devs aren't really using ML at all yet, even though it's applicable in areas. Nvidia is forcing this shit forward, and in a few years everyone will benefit when it reaches cheaper prices and more maturity.

It's just dumb to shit on it when such effects had been thought impossible 6 months ago.

>reddit spacing

That's not the point.

Nice buttmad there, kiddo.

Not gonna happen until consoles make it standard, like with dx11 (that took almost 10 years), and dx12 is almost 3 years old and still isn't standard.
No way proprietary nvidia-only dxr crap like rtx is gonna take off until consoles, phones and low-to-midrange pcs get the tech, and that's 5-10+ years away

I wasn't aware polished marble looked like glass lol
We have had these effects in real time for years, what the fuck are you on about? Stop falling for marketing bullshit, look up Radeon Rays.
Denoising/AI is the only new thing, not RT.
Oh yeah, paying 1k for shiny floors in one game, meanwhile DLSS and every other raytraced RTX marketing feature is absent months after the game launched

Uma delicia

But how will RTX spread and become the future of gaming if all the current consoles are AMD based? I smell PhysX again.

Attached: thsnek.jpg (680x835, 48K)

even povray scenes on a pentium ii are better than that shit

dead cards, preorders, 2-month waits, novideo doesn't know what the fuck they are doing
fuck off jensen

>delid CIA

also
>that resolution on the reflections
how the fuck is this better than a low res cubemap?
fucking leather jacket man
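Fair question. Both techniques start from the same mirrored direction; the difference is what you do with it. A cubemap resolves it with one texture fetch baked from a single probe point (cheap, but wrong parallax and no dynamic objects), while RT traces it through the live scene from the actual hit point. The shared bit, as a minimal sketch:

import numpy as np

def reflect(d, n):
    """mirror view direction d about unit surface normal n: r = d - 2(d.n)n"""
    d, n = np.asarray(d, float), np.asarray(n, float)
    return d - 2.0 * np.dot(d, n) * n

print(reflect([0.0, -1.0, 0.0], [0.0, 1.0, 0.0]))   # looking down -> bounces up

Whether correct parallax and off-screen objects are worth half your framerate over a cubemap lookup is, well, this thread.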

>Cannot-fucking-wait for 'DEM JUICY 7nm Radeons in a couple months from now...boy-oh-boy, what a treat that's going to be.

the 7nm vegas aren't consumer products, user..

It's THIS year's 7nm Vega that's not consumer. The 7nm Vega/Navi cards that are coming out in Q1 and Q2 of 2019 (hence, a couple of months) are gaymen cards.

Attached: PAY UP.png (307x358, 152K)

It's not better
/v/ and /g/ can blame dice as much as they want, but the fact of the matter is low-sample-count raytraced images look like ass: washed-out crap, even with meme tricks like denoising (which still leaves a tonne of noise in motion)

OH NONONONONONONONONO

Attached: 1534799470933.gif (250x188, 1.63M)

Wew I didn't see that 30fps minimum hahahaha

dlss is not rtx though

BS, nvidiot took some useless features from the scientific cards and made up pseudo raytracing + a mayonnaise denoiser to get used in gaymes; it's just a gimmick

Attached: rtx.jpg (1280x720, 276K)

>minimum
MAXIMUM*