New RTX feature!

>You can lower quality settings to improve framerate
Wow!

Attached: firefox_2018-11-25_17-40-33.png (1167x922, 1.16M)

Other urls found in this thread:

xbox.com/en-us/xbox-one-x
youtube.com/watch?v=uGmD2Vtw1f8
pcgamer.com/geforce-rtx-2060-benchmark-leaked/
blogs.unity3d.com/jp/2018/03/29/amd-radeon-rays-integrated-into-unitys-gpu-progressive-lightmapper/?sf185787736=1

>106 fps to 107 fps is a 1% performance improvement

Attached: stopped reading there.jpg (250x272, 62K)

You can also lower the resolution for improved framerate, something Nvidia calls DLSS, so innovative! AMD can only dream of having features like these.

Another case of

>NVIDIA EXCLUSIVE FEATURE
>BUY GREEN
>BUY THESE GREEN-ENABLED GAMES
>NVIDIA, THAT'S THE WAY WE'RE MEANT TO BE PAID

If Nvidia dumped their time into improving standards so games ran better across the gamut, and focused more on their hardware instead of helping devs implement the software, maybe 2080s wouldn't be catching fire.

>within margin-of-error "gains"
>raytracing that is not raytracing
My sides cannot handle this level of damage control.

Attached: Only_the_serious_know_how_to_truly_laugh.jpg (1200x793, 112K)

>4k gaming
>high-refresh 1440p gaming
>Ray tracing
>Power efficiency
All the things AMD can't do

>4k gaming
>high-refresh 1440p gaming
Nvidia raytracing takes away those two.

Redpill me on how that's wrong, please.

DLSS is anti-aliasing, you mong. If you don't care about AA you don't turn on DLSS.

Right now AMD has the chance to take over the CPU and GPU market. Intel is fucked with its 14nm++++++ process and Nvidia just dun goofed with Gay Tracing®

the purpose of a GPU is to provide hardware that accelerates certain programming instructions. Nvidia has been expanding what can be accelerated.

wrong
It is literally image upscaling, like consoles have been doing to get a 4K output from a smaller image buffer (usually with fewer horizontal pixels).
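If anyone wants to see how dumb that baseline is, here's a toy nearest-neighbour upscale in Python/numpy. The buffer sizes are made up for illustration and this is obviously not what DLSS actually does, just the "stretch a smaller buffer to the output size" idea consoles have used:

[code]
import numpy as np

def upscale_nearest(buf, out_h, out_w):
    """Stretch a smaller render buffer to the display resolution (nearest neighbour)."""
    in_h, in_w = buf.shape[:2]
    ys = np.arange(out_h) * in_h // out_h   # which source row each output row samples
    xs = np.arange(out_w) * in_w // out_w   # which source column each output column samples
    return buf[ys][:, xs]

# e.g. a 2160x1440 internal buffer (fewer horizontal pixels) stretched to 3840x2160
internal = np.zeros((2160, 1440, 3), dtype=np.uint8)
print(upscale_nearest(internal, 2160, 3840).shape)  # (2160, 3840, 3)
[/code]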

>Your Nvidia GPU needs to shoot itself in the foot to lower itself to AMD level of performance
Sounds about right

margin of error

>Right now AMD has the chance to take over the CPU and GPU market

If you are running a game with DLSS you don't run any other AA, as it's built into the implementation used in games. This is why it's significant.

They don't have the manpower to do that right now; all their shit is focused on the server business. It sucks. I hope I'm wrong, tho.

1% of 106 is 1.06, not 1.
106 fps to 107 fps is less than a 1% increase in performance.
Either use accurate percentages or don't use percentages at all.
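To spell out the arithmetic, a quick plain-Python check (the only inputs are the 106 and 107 fps figures from the screenshot):

[code]
# Percent change between 106 fps and 107 fps, computed both ways.
old_fps, new_fps = 106, 107

increase = (new_fps / old_fps - 1) * 100   # gain relative to the old value
decrease = (1 - old_fps / new_fps) * 100   # drop relative to the new value

print(f"{increase:.5f}% increase")  # 0.94340% -- less than 1%
print(f"{decrease:.5f}% decrease")  # 0.93458% -- the figure quoted later ITT
[/code]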

> all their shit is focused on the server business
That's where everyone's shit is focused because that's where the money is.

AMD is beating Intel badly, but they need a miracle to get to a similar level as Turing, even ignoring the Gaytracing """"""""features"""""""".
Upscaling is AA these days, thanks leather jacket man!!

CPU, yes.
GPU, I don't think so. Radeon needs a chemo first and Nvidia is not like Intel, they have actually been doing stuff and they don't have problems with the foundries.

You don't need to turn on AA because the upscaling already leaves enough of a blurry mess on the output that you don't even need to care about AA, stupid. What you'd want is an edge enhancer filter before displaying the final image.
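For the record, the "edge enhancer filter" would just be a sharpening convolution run on the finished frame. A minimal numpy sketch; the kernel and the toy frame are mine, not anything a driver actually ships:

[code]
import numpy as np

def sharpen(img):
    """Apply a basic 3x3 sharpening kernel to a grayscale frame."""
    kernel = np.array([[ 0, -1,  0],
                       [-1,  5, -1],
                       [ 0, -1,  0]], dtype=np.float32)
    h, w = img.shape
    padded = np.pad(img.astype(np.float32), 1, mode="edge")
    out = np.zeros((h, w), dtype=np.float32)
    for y in range(h):
        for x in range(w):
            out[y, x] = np.sum(padded[y:y+3, x:x+3] * kernel)
    return np.clip(out, 0, 255).astype(np.uint8)

blurry_frame = (np.random.rand(8, 8) * 255).astype(np.uint8)  # stand-in for an upscaled frame
print(sharpen(blurry_frame))
[/code]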

retard alert.

Attached: aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS82L0MvODA2MzQwL29yaWdpbmFsL1Byb3RvLTQwaS1hZnRlci1kb3VibGUucG5n (1142x839, 420K)

Well, Ryzen 2600 is literally the most selled CPU on amazon right now. All they need is a top tier gaymen CPU to put a nail in Intel's coffin.

If Ryzen 2 delivers a nice clock bump for desktop, Intel is finished.

I'm not deep into the math behind statistics, but I'm sure you can have a sample size that allows virtually arbitrary precision.

It's 0.94%, come on. That 0.6% probably actually is within the margin of error.

I don't know the methodology, but I'm not convinced that this is universally wrong.

Yes it's upscaled trash, no need to repeat yourself.

>leaves enough of a blurry mess on the output that you don't even need to care about AA, stupid

WHAT THE FUCK DO YOU THINK AA EVEN IS.

IT'S LITERALLY COMPUTER ALGORITHMS TO BLUR PIXELS.

Attached: best-anti-aliasing[1].jpg (400x263, 12K)

Over at Jow Forums we only want the highest degree of standards for our advanced mathematical models and algorithms. Did you think we are just random people who couldn't see a 0.6% performance difference?

Not that guy, but whether or not it's upscaling wouldn't matter to me personally, as long as the image has better fidelity.
If the net effect is functionally AA while offering better performance, it's a plus in my book.

>4K Gaming
>AMD can't do

Err...

xbox.com/en-us/xbox-one-x

Attached: RWhKvP?ver=b11f.jpg (1920x1080, 911K)

Didn't Shadow Warrior 2 use something similar to this? On Pascal...

>calls someone else a moron
>is the moron himself
Oh, user.

Yeah, this is very obviously doable on current cards.

Maxwell/Pascal couldn't do CAS (content adaptive shading). They did MRS (multi-res shading), which disabled certain graphical effects.

I like that we both went with the mistake. It's 0.06, not 0.6.
So I'd say, q.e.d., no, we don't see the difference.

Sorry buddy, but I'm a real gaymer. 0.06% margin of error is far too much.

>literal upscaling counts as AA these days
Jesus Fucking Christ what the fuck happened to people

>selled
maybe learn proper english first before engaging in an argument

Sold*

Sorry, english is not my first language.

sorry for my bad english im from USA

I am not a fucking moron. Read any of the fucking literature about DLSS applied to games. IT PROVIDES ANTI-ALIASING.

I am a developer with a focus on games. I couldn't give a flying fuck about all the stupid quibbles these PC builders have with company xyz.

>It's another "I'm too poor to afford the newest tech so I hate it and it sucks" episode

you don't know what AA is. prove me wrong.

ELI5: what's the real benefit of CAS? Not the perf gains, but the technology.

>It's another "I spent $1200 on an RTX card OH FUCK OH GOD OH FUCK" episode

Attached: WAKE ME UP.jpg (900x1200, 152K)

It's not running the game at absolutely lower settings; rather, it's trying to algorithmically determine which portions of the scene need the least GPU processing, so performance can be improved there. The issue, though, is that the scene in question is a really poor example to show off the potential capabilities of the technology.

It would be best to show it off during a scene where there's a lot going on on-screen, then break it down by what is key to the player's success, what benefits least from full shading accuracy, and how the technology can be used to improve performance for the player in such gameplay engagements.

Finally, it's entirely fucking useless to lock it to the 2070, 2080, and 2080 Ti. All three cards are crazy expensive, and this tech, on paper, seems to see the best value in the lower and mid-range tiers, where every drop of performance matters for delivering a solid, stable framerate.
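To make the "which portions need the least processing" part concrete, here's a rough CPU-side sketch of the idea, assuming a simple per-tile contrast metric. The tile size and threshold are invented for illustration; this is not Nvidia's actual heuristic, which also factors in things like motion and runs on the GPU:

[code]
import numpy as np

def pick_shading_rates(luma, tile=16, threshold=8.0):
    """For each screen tile, keep full-rate shading where local detail is high
    and drop to one shade per 2x2 pixels where nobody would notice."""
    h, w = luma.shape
    rates = np.full((h // tile, w // tile), "1x1")
    for ty in range(h // tile):
        for tx in range(w // tile):
            block = luma[ty*tile:(ty+1)*tile, tx*tile:(tx+1)*tile]
            if block.std() <= threshold:   # flat, low-contrast region
                rates[ty, tx] = "2x2"
    return rates

frame_luma = (np.random.rand(1080, 1920) * 255).astype(np.float32)
rates = pick_shading_rates(frame_luma)
print((rates == "2x2").mean())  # fraction of the frame shaded at reduced rate
[/code]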

Cool tech, but Nvidia's business decisions continue to fuck over the consumer. If I've dropped anywhere from 800 to 1200 dollars on a GPU, then I don't really give a shit about 1-7% differences. It's valuable if we're talking 15-20% differences.

Shutup John.

(1 - (106/107) ) * 100 = x%

OH NO NO NO NO NO

>0.93458 = 1
no.

Attached: .webm (960x568, 2.76M)

>4k gaming
>Xbox one X
Choose one.

I've never owned a nvidia card and I have a question: does nvidia install malware on your computer like AMD does when you update drivers? Cause I'm getting very sick of that.

It's supported by hardware changes. Low end 2000 series should have it too.

Yes.

Oh, so manually installing drivers is mandatory no matter which brand you have, huh

No, choose two.

youtube.com/watch?v=uGmD2Vtw1f8

Attached: forza-horizon-4-xbox-one-x-4k.jpg (3840x2160, 731K)

Why do PC people like to lie about the X GPU? It's faster than the RX 580 everyone on Jow Forums owns.

Upscaling from a shit resolution to 4K isn't 4K you mong.

Dynamic resolution is 4k, got you buddy.

I love Far Cry 5, which literally shills for AMD in its ads, but you can only play it at 4K with Nvidia cards. Vega finally has decent drivers after all this time, but even AMD's last released card is only mid-range. The RTX platform is dogshit right now, but Nvidia can afford to take heat for noisy water coolers, 100% GPU usage, and "optimization" because AMD has nothing to compete with. That's just right now, though, and I'm hoping AMD can come up with something decent in 2019-2020 because I'm waiting to upgrade from a 1060 6GB, and the 2070 is tempting, but idk bruh.

>Efficiency is bad!
>Why even have models if you're just going to have LODs diminish their quality with range???

>tfw bought a 1080 for $200 from a guy who was upgrading to a 2080
>tfw 1080/1080 Ti prices shot up again because people realized the 20 series is barely an upgrade. Should've just called them 1090s and 1090 Tis.

Attached: 478e5f1c146ae3ecbc54ec97ed47e787.gif (278x340, 74K)

I'm a developer too.
DLSS is an upscaling method. It provides AA merely by being blurry lmao.
It's 1080p upscaled to 4K. It looks and runs similar to native 1800p.

It's a bit below the RX 580.
It's 6 TFLOPS peak, not 6.1+ sustained.

You're a developer and you don't know the difference between 1080p, QHD, and 4K? You also don't know that DLSS uses a convolutional auto-encoder that does edge and multidimensional shape detection?

really joggin the noggin.
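Purely to illustrate what "convolutional auto-encoder" means here, a toy 2x upscaler in PyTorch. This is NOT Nvidia's DLSS network (that's proprietary and trained per-game against supersampled frames); it's just the general shape of the thing being described:

[code]
import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    """Encode a low-res frame into feature maps, then decode to twice the resolution."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            # transposed convolution doubles the spatial resolution
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

frame = torch.rand(1, 3, 270, 480)   # small demo tensor; a real input would be e.g. 1080p
print(ToyUpscaler()(frame).shape)    # torch.Size([1, 3, 540, 960])
[/code]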

Slightly lower FLOPS but tons of extra memory bandwidth.
DLSS isn't quite cooked yet, but neither is anything RTX.
Consider that DLSS cost might barely compete now, but render loads get higher every year and Nvidia will cram more tensor cores onto every new generation.
Subsampling is already an important option and it only gets more important from here.

MSAA, at least from my understanding, is a kind of supersampling: it samples at a retardedly high resolution, but only at geometry edges and the little extra they need, then downscales that to your current resolution. That's why some implementations don't cost too much, while with others you may as well be supersampling at that point.

FXAA and several other techniques do full-screen blurring of pixels, sure, but not all AA is blurring.
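A crude numpy sketch of that distinction (the 4-samples-per-pixel layout is my assumption; real MSAA sample patterns vary): an MSAA-style resolve averages each pixel's own coverage samples, which only changes pixels sitting on a geometry edge, while an FXAA-style pass blurs neighbouring pixels of the already finished image:

[code]
import numpy as np

def msaa_resolve(samples):
    """samples: (H, W, 4) colour samples per pixel; interior pixels have 4 identical samples."""
    return samples.mean(axis=2)

def fxaa_like_blur(image):
    """Mix every pixel with its 4 neighbours in the finished frame (very rough stand-in for FXAA)."""
    p = np.pad(image, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] + 4 * image) / 8.0

samples = np.random.rand(4, 4, 4)                      # toy 4x4 frame with 4 samples per pixel
resolved = msaa_resolve(samples)
print(resolved.shape, fxaa_like_blur(resolved).shape)  # (4, 4) (4, 4)
[/code]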

>QHD, and 4K
GJ outing yourself as a know-nothing retard. QHD is synonymous with 4K as far as consumer monitors and TVs go.

True, RX580/590 is really starved for memory bandwidth and the Xbox One X has significantly more.

yet another thread full of poorfags complaining about the only good company out there called Nvidia.

just stop being poor.

Look ma! I made a post on 4channel!
...about video games! Haha, isn't that funny?
What do you mean "Go back to /r/pcmasterrace/"?
I'm 40 years old ma, what do you mean "Videogames are for children"? Do you mean like those porn cartoons I watch all day?

Looks like the 2060 is just going to be a further cut-down TU106: pcgamer.com/geforce-rtx-2060-benchmark-leaked/
Or they're scamming people into buying tensor cores and RT cores for features the card isn't even powerful enough to run.

blogs.unity3d.com/jp/2018/03/29/amd-radeon-rays-integrated-into-unitys-gpu-progressive-lightmapper/?sf185787736=1

you were saying?