THANK YOU BASED NVIDIA

blogs.nvidia.com/blog/2018/08/22/geforce-rtx-60-fps-4k-hdr-games/

THANK YOU BASED NVIDIA

Attached: TuringVsPascal_EditorsDay_Aug22_v3-2.png (6000x3375, 143K)

Other urls found in this thread:

youtube.com/watch?v=CT2o_FpNM4g
computerbase.de/2018-07/hdr-benchmarks-amd-radeon-nvidia-geforce/2/
nvidia.com/en-us/geforce/products/big-format-gaming-displays/
youtube.com/watch?time_continue=194&v=m42XiyJgyco
developer.nvidia.com/announcing-cuda-toolkit-10
wccftech.com/resident-evil-2-support-nvidia-rtx/
videocardz.com/77696/exclusive-nvidia-geforce-rtx-2080-ti-editors-day-leaks
twitter.com/hms1193/status/1032351762018181121
guru3d.com/articles_pages/star_wars_battlefront_ii_2017_pc_graphics_analysis_benchmark_review,5.html
eurogamer.net/articles/digitalfoundry-2018-08-17-nvidia-geforce-gtx-1080-ti-benchmarks-7001
techspot.com/review/1478-destiny-2-pc-benchmarks/
twitter.com/SFWRedditGifs

based

inb4 this gets deleted while the faggot shill gets to spam his threads

There is quite literally no way that the 2080 is going to be more than twice as fast as the 1080. The numbers simply don't add up based on Nvidia's own specifications. Can't wait for the actual benchmarks, rather than Nvidia's usual bullshit bar charts containing no real data.

it's more like 1.5x without DLSS which is still good imo, and DLSS could work quite well

AdoredTV leaked the RTX video weeks before the announcement and he said the 2080 was about 1.5x faster than the 1080. Honestly I think NVIDIA might be showing true numbers or at least very close.

someone just photoshopped it and replaced the 1080ti text with 2080 text

Interesting stuff
youtube.com/watch?v=CT2o_FpNM4g

>We should note that NVIDIA will enable DLSS in games for developers for free, if a dev just sends them their code for processing on an NVIDIA DGX supercomputer. NVIDIA will hand back that code, which is reportedly just megabytes in incremental size, and enable the feature in their driver for that game. And, as with anything AI and machine learning, the more you feed the beast, the better it gets at the task at hand moving forward, in this case with a wider swath of game engines and visuals.
how can amd even compete
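To make the quote above concrete, here's roughly what that per-game training step might look like in spirit. Everything below is a made-up stand-in (the UpscaleNet model, the data, the file name), just to illustrate "feed low-res frames plus high-quality ground-truth renders to a big GPU box, get back a few megabytes of weights". Assumes PyTorch and a CUDA GPU; it is not Nvidia's actual pipeline.

```python
import torch
import torch.nn as nn

# Hypothetical upscaling network: maps a low-res frame to a 2x-res frame.
# A real DLSS model is far more sophisticated; this only shows the general shape.
class UpscaleNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3 * 4, 3, padding=1),  # 4 = 2x2 upscale factor
            nn.PixelShuffle(2),                  # rearrange channels into 2x resolution
        )

    def forward(self, x):
        return self.body(x)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = UpscaleNet().to(device)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()

# Stand-in for per-game training data: pairs of (low-res render, high-res "ground truth").
# In reality these would come from the game engine, rendered at very high quality.
low_res  = torch.rand(8, 3, 270, 480, device=device)   # tiny crops, just for the example
high_res = torch.rand(8, 3, 540, 960, device=device)

for step in range(100):                 # the real thing trains far longer, on a DGX
    pred = model(low_res)
    loss = loss_fn(pred, high_res)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The trained weights are what would get shipped back and bundled with the driver --
# a few megabytes, as the quote says.
torch.save(model.state_dict(), "game_upscaler_weights.pt")
```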

Notably Nvidia are also using the handful of HDR games that exist on PC, where the 1080 loses 10%+ of its performance.

computerbase.de/2018-07/hdr-benchmarks-amd-radeon-nvidia-geforce/2/

I mean, if they've solved that problem, great, but there's almost zero fucking support for HDR on PC right now, so who really cares?

nvidia.com/en-us/geforce/products/big-format-gaming-displays/

amd btfo once again

Lies

Attached: jews deception and scheming in the dark Memri+tv+part+16_c97a9f_6411091.jpg (1200x896, 72K)

very convincing argument shill

> cloud tethering
As if Geforce experience isn't enough.
Via Geforce Experience, they already lock access to Geforce's capture features behind their proprietary cloud software. If you think DLSS is a good thing when it ties the new features of the card to the cloud, you're asleep. I have this nasty feeling they're not going to allow raw access to the ray trace/tensor cores.

>immediately makes a new thread when the old one dies
this is the future we chose for Jow Forums

?

Those aren't the first HDR PC monitors and TVs have had it for years. Why would overpriced G-Sync monitors that can't even do the advertised 4K/144Hz without using chroma subsampling "btfo" anybody except the idiots who got scammed into buying one?

it's how the machine learning meme works, you feed it with a ton of data on a supercomputer, i bet you suck elon musk's dick, the self-driving meme works in a similar way

>comparing the cheaper 1080 to 2080
>not comparing 1080Ti to 2080

jews being jews again

I clearly indicated how it works for the current and upcoming gen and expressed my concern and dislike for it. Thanks for missing the point and reiterating it unironically.

Now for the thing that actually matters: what's the hashrate?

>blurry pictures with less detail yield some more fps
wew, who knew. it's not like console folks haven't been using that shit FOR YEARS now to upscale the picture to 4k output from a 2k source
give it up jensen, ain't nobody gonna be fooled by your lame shit

i'm not the one pushing for the machine learning meme, but this is how it's going to work, and in this case it actually looks useful and it's not harmful in the autistic freetard way you suggest

with DLSS it's different, imagine giving a reference pic to a highly skilled painter, they could make a baller ass painting for you in any resolution that you'd like

>1,2
This explains nothing.

>If Nvidia just refreshed Pascal, everyone gets mad and says Nvidia is being Intel
>When Nvidia innovates with Turing, concurrent FP & INT execution, Tensor cores & RT cores, everyone gets mad also

Ignorant faggots being ignorant as usual

> gaymen
> compute
Will the CUDA dev kit allow raw access to the ray tracing and tensor cores that make up the DLSS/ray trace meme functionality in gaymen?

> Will you be able to use the hardware w/o restrictions on your own?
Get the point of my comment now retard?

My only concern is whether or not there will be unrestricted access to the ray trace and tensor cores for general purpose compute. I could honestly give zero shits about the gaymen portion, which seems fine. For the card to be of value, the new features need to be accessible. Nvidia's track record with existing Geforce hardware isn't boding well.

idk, the tensor cores might just mean that you have more low precision flops tho

Shocking interview with an Nvidia engineer about Turing

youtube.com/watch?time_continue=194&v=m42XiyJgyco

Are you fucking retarded?

People with Volta GPUs, Titan, Quadro or Tesla have full access to Tensor cores

Nvidia wants developers to use their innovations, of course any developer will get full access to the hardware

developer.nvidia.com/announcing-cuda-toolkit-10

wait, what happened to the "you don't need AA" at 4K resolution meme

> Are you fucking retarded?
No. What makes you think that?
> People with Volta GPUs, Titan, Quadro or Tesla have full access to Tensor cores
They also have double the memory, ECC, and a whole slew of other features. What does that have to do with my question about whether or not Geforce will, especially when Nvidia has a history of restricting access to hardware on Geforce?
> Nvidia wants developers to use their innovations, of course any developer will get full access to the hardware
IS THIS TRUE ON GEFORCE FOR RAY TRACE AND TENSOR CORES?

>it's different
>blah blah
stfu, you have no idea what you are saying.
even fucking TVs have been using that shit to denoise mpeg artifacts and supersample the picture, both on keyframes and subframes. consoles go even a step further.
your shilling is NOT working, and your lousy attempts at it are pathetic

idiot, it's using DEEP LEARNING that has been tuned for the specific game, it IS different than some basic bitch upscaling/denoising algorithm

that's bs. if you're sitting that far away from the screen, or you have poor eyesight (get glasses ffs), you could just as well play at 1080p with AA. even 4k with AA could be improved upon; it's just about the limit of what is feasible with today's consumer technology

so is DLSS implemented on the game developer's end? will this card offer close to 2x the performance @ 4k out of the box, or will games need to be tuned first to reach that performance?

The comparison is probably the 1080 with heavy, slow AA (which is unnoticeable at 4K), like 4x TXAA, compared to fast DLSS AA.

This image actually confirms that the 2080 is roughly at or below 1080Ti performance.
You can see that it averages around 30% higher FPS in these cherry-picked games, which is roughly how much faster the 1080Ti is than the 1080.

So the $850 2080 is slower than the $650 1080Ti when you're using it in today's games with realistic non-cherry-picked settings.

So I was exactly right in my predictions.
Once again.
Go fucking figure.

GOD THIS IS HEAVY DEEP LEARNING YOULL WANT TO KEEP YOUR ABS TIGHT

Attached: 1534795878209.png (1199x773, 559K)

see

>This image actually confirms that the 2080 is roughly at or below 1080Ti performance.

Attached: 1506610021596.jpg (645x729, 46K)

It was him reiterating fake leaks sent to him by a fake nvidia employee via email.
How? Just how? How are you this dumb?

I'm an Nvidia fan, but the numbers don't look good, especially without 2080ti numbers.
Also
>PUBG
Fuck this

What is the scale on the left? What a shitty official image.

THANK YOU BASED NVIDIA

wccftech.com/resident-evil-2-support-nvidia-rtx/

Anyone saying Battlefield V doesn’t hit 60fps at 1080p is flat out wrong. That’s a lie or an incorrect understanding of what was being seen. It’s much higher.

Also, expect huge gains. This was less than two weeks of work by DICE. They barely had access to these cards prior to this event.

Let’s not judge so quickly here. Battlefield V is much faster and we know the team had very little access to this hardware. I suspect Shadow of the Tomb Raider is the same.

Also, Shadow of the Tomb Raider is perhaps the least impressive playable game of the bunch in terms of how RTX is used. Resident Evil 2 and Metro Exodus look dramatically more impressive and ran like butter.

thank you

a demo of the gtx1080 with TAA vs the rtx2080 with DLSS

from two different off-screen vids of exactly the same run, frame for frame.

a fake demo from nvidia?

Attached: DLSS.jpg (1177x1087, 298K)

why would it be fake? ~2x performance with DLSS looks right as in OP's pic.

is it just me or might this unironically be a historic leap? no meme for once

I mean that the demo constantly shows the same frames and fps without variation, run after run, and that seems strange

sorry for my poor english

yep it looks great, it's just that Jow Forums kids might be butthurt about not being able to afford one of the top cards. in 10 years we might have fully ray traced high end graphics as well, which is incredible

videocardz.com/77696/exclusive-nvidia-geforce-rtx-2080-ti-editors-day-leaks

Turing now matches Volta: 64 CUDA cores per SM, unlike Pascal's 128 CUDA cores per SM

Uneducated ignorant morons will still say it's Pascal with Tensor cores though

Attached: NVIDIA-TU102-GPU-Block-Diagram.jpg (1200x657, 173K)

It's as important as the first programmable pipeline and the first unified shader. It would take 1-2 gens to become mainstream and useable though.

another

Attached: DSSL2.jpg (1183x1033, 288K)

twitter.com/hms1193/status/1032351762018181121

Adored shill SEETHING

RTX 20 is a huge sales success

the temps with DLSS look pretty bad. it's to be expected i guess

Maybe Nvidia ran a pre-recorded video as if it were a real demo.

Probably, that's pretty typical with public demos. There's no reason to do real time rendering when we already know what the hardware can do.

I've been saying it's Volta with a ray tracing ASIC since before the announcement, m8.

There aren't many people saying it's just Pascal with tensor cores. Maybe 2 anons?

>$1200 screen tearing.
I thought they said this was 60fps? It shouldn't tear then.

>screen tearing happens only on low fps
brainlet.jpg

it's running on a fucking GSYNC monitor

>Comparing server and prosumer GPU to consumer GPU.

>I was right BECAUSE my baseless assumption is right
>nVidia BTFO

GTX 1080 = 1
RTX 2070 = 1.05
GTX 1080 Ti = 1.33
RTX 2080 = 1.42
RTX 2080 Ti = 1.87
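Taking those relative numbers at face value (they're an anon's estimates, not official figures), a quick sanity check of where they put each card against the 1080 Ti:

```python
# Relative performance numbers quoted above (GTX 1080 = 1.00 baseline).
scores = {
    "GTX 1080":    1.00,
    "RTX 2070":    1.05,
    "GTX 1080 Ti": 1.33,
    "RTX 2080":    1.42,
    "RTX 2080 Ti": 1.87,
}

base = scores["GTX 1080 Ti"]
for card, s in scores.items():
    delta = s / base - 1  # how far ahead of (or behind) the 1080 Ti
    print(f"{card}: {delta:+.0%} vs GTX 1080 Ti")

# By these numbers the RTX 2080 lands about 7% ahead of the 1080 Ti
# and the RTX 2070 about 21% behind it.
```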

lol @ rtx 2070

NVIDIA just made already ridiculously expensive PC gaming even more expensive.

I wonder if they are trying to kill off PC gaming on purpose.

>Resident Evil 2 and Metro Exodus look dramatically more impressive and ran like butter.
show some on/off pics from both games because Google isn't showing any.

literally no difference except the top one has deeper contrast.

With gaytracing and DLSS the card's gonna run like a bonfire, best to give this useless shit a miss until it can handle 1440p 144hz with everything enabled without burning up or eating a million watts.

Why is the screen tearing on a Gsync monitor ???

Everyone on Jow Forums was telling me that my baseline, reasonable assumption, which any knowledgeable tech analyst would agree with, was crazy, and that I only had such low expectations because I'm an AMD shill.

You should check the >>/pcbg/ threads prior to the announcement. There are still a few days of them up.
Granted, one of them was one guy pretending to be 3 people like he always does, so I guess it's not really that many people.

They're mad that Sony and Microsoft don't want to work with them, so they want to kill PC gaming?

they're literal ray tracing cards, older cards still exist, and 2060 is coming at a lower price point, $499 launch price for a high end card like 2070 isn't ridiculous, fucking entitled poorfag

>2070
>high-end card
hahahahaha what a fucking retard
it's not high-end even with regards to market segmentation, much less the actual hardware you autist

at minimum it's upper mid range, and it's priced above what the average amd kid can afford. the X80 card used to be the flagship, and they added the Titan and X80 Ti as the ultra-enthusiast cards for those with deep pockets; the X60 is the low-mid card which is affordable and still decent for gaming

and it's good hardware, it beats the gtx 1080, you idiotic shills. as the digital foundry guy said, they will likely get more performance and more interesting uses out of RTX, and DLSS looks superb; with DLSS you get a massive performance increase without really giving up any noticeable quality

No, everybody was telling you you're fucking retarded.

So they were telling me I was retarded when I'm right, yes.
Now it turns out I'm objectively correct, thanks to all these reveals and overwhelming evidence in my favor to back me up.
So I'm not retarded and they are?

Right, that's exactly what I was saying. Thanks. Not sure why you're agreeing with me in such a contrarian manner.

Launch price of the 2070 is $600, not $499. The MSRP is a lie, just like with the Pascal launch. 18+ board etc.
Also, it's a midrange card now. The x70 tier was only high end 30 months ago. It's almost 2019 now.

Navi is expected to be 2070 performance or better for $300-$350. AMD calls that performance level in 2019 a "mainstream range" card.

AMDelusion: the post

i am typing this on a p-series quantum, with picture quality you can only dream of, monitor peasant

it supports both HDR10 and dolby vision, which battlefield 5 on PC supports but NO MONITORS DO.

MONITORS BTFO

Attached: pquant.jpg (550x345, 63K)

>this guy has been correct about every single prediction for over a year running at this rate
>he's either a deeply connected insider or some sort of deep learning predictive AI
>haha I'm going to call him a delusional AMD fanboy again like I said in regards to his recent predictions which all came true. It'll be funny.
Yeah, I found it pretty funny. Good job. But you could do better.

Attached: i love pizza.png (980x1200, 1.71M)

Just filter or don't respond to the retarded attention whore drone.
Subhumans like him literally go mad irl when nobody acknowledges their existence.

CHECKED
nice whore too, would love to throatfuck her

I've only been in the /pcbg/ threads lately but he's the only guy who's ever offered actual advice and help without being a condescending cunt.

>trusting a biased retard who doesn't know what he's talking about
good luck

try putting intel or nvidia in your part lists while asking for advice and watch him turn on his autistic rage mode

this

I like your posts and I agree. Pascal just sucks at HDR and 4K, and it doesn't support FP16 or async shaders. That's why nvidia used a bunch of HDR-supporting games and even AMD-sponsored games like Hitman and Wolfenstein 2. The 2080 should be on par with the 1080ti in most games.

DLSS isn't supersampling? What are they doing then? Stretching images? How the hell does pubg run so bad? Only 3 games show 2x the performance of who knows what the fuck they're measuring.

they're rendering at a lower resolution and using deep learning to fill in the gaps based on its prediction of what a higher resolution render would look like, a bit like waifu2x

nvm maybe it's rendering at 4k and using deep learning for anti aliasing
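If it is the render-low-and-reconstruct interpretation, the per-frame inference side would look something like this in spirit. The little conv net here is an untrained, made-up stand-in (the real model ships with Nvidia's driver and nobody outside has seen it); it's only meant to show the shape of the idea, assuming PyTorch and ideally a CUDA GPU.

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Hypothetical stand-in for a per-game trained upscaler (see the training sketch
# earlier in the thread). It's untrained here, so the output is garbage -- the
# point is only the data flow: low-res frame in, higher-res frame out.
upscaler = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3 * 4, 3, padding=1),
    nn.PixelShuffle(2),   # 2x spatial upscale
).to(device).eval()

@torch.no_grad()
def present_frame(low_res_frame: torch.Tensor) -> torch.Tensor:
    """low_res_frame: (3, H, W) rendered at reduced resolution.
    Returns (3, 2H, 2W) for display, with the detail 'filled in' by the network."""
    x = low_res_frame.unsqueeze(0).to(device)
    return upscaler(x).squeeze(0).clamp(0.0, 1.0)

# e.g. render at 1920x1080 internally, present at 3840x2160:
frame_1080p = torch.rand(3, 1080, 1920)
frame_4k = present_frame(frame_1080p)
print(frame_4k.shape)  # torch.Size([3, 2160, 3840])
```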

I've samefagged in the thread and he's recommended Nvidia and AMD products based on my made-up needs/budget. I'm not gonna defend the guy further. This board has clearly already made up its mind about him and would rather shitpost than counter his suggestions without using memes.

>I've samefagged in the thread
is this an even bigger autist or is he samefagging to get more (You)s and to try to save face

>y axis unlabelled so you can't sue them later

it's clearly the relative performance (frame rate)

user... It's a ratio, it has no units...

He always recommends Vega crap, the 1070ti shits all over it. For anything beyond, there's the 1080ti.

Comparing 4k benchmarks to 1080ti:

Battlefront 2

* 2080: 65 fps
* 1080ti: 66 fps guru3d.com/articles_pages/star_wars_battlefront_ii_2017_pc_graphics_analysis_benchmark_review,5.html

Battlefield 1

* 2080: 84 FPS
* 1080ti: 82 FPS eurogamer.net/articles/digitalfoundry-2018-08-17-nvidia-geforce-gtx-1080-ti-benchmarks-7001

Destiny 2

* 2080: 66 FPS
* 1080ti: 88 FPS techspot.com/review/1478-destiny-2-pc-benchmarks/

>reddit spacing
>reddit linking

Attached: 1494878561280.png (200x176, 38K)

>reddit spacing
come on man it makes it easier to read, looks a lot cleaner too.

Nice y axis label, better luck next time Raheem.

>check /pcbg/
That namefag is a textbook example of a delusional amdrone. I seriously hope he doesn't shill for free.

If any of this is true I'm getting a 2080ti

It's probably in ray tracing benchmarks not in regular ones.

Lol, I know lots of people enjoy my posts. You don't have to validate me. This is just Jow Forums, where people love to get (you)s for yelling at namefags, even when they aren't tripfags and are right.

>doesn't support FP16
desu this isn't a big deal.
double-rate FP16 isn't nearly the insanity of quad-rate or octo-rate FP8 which is basically what tensor cores do at mixed precision in matrix ops.
Vega 7nm is supposed to have a higher rate quad precision and virtual tensor cores, though. And the die size is fairly small (if it were 14nm). It's a new arch. Vega is kind of disappointing, though much better than Fury.
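To put the "mixed precision in matrix ops" bit in concrete terms: the tensor-core primitive is a matrix multiply with FP16 inputs and (typically) FP32 accumulation. In PyTorch a plain half-precision matmul takes that path on cards that have tensor cores, so something like this is roughly the operation being described. Sizes are arbitrary and this is not a proper benchmark, just an illustration assuming a CUDA GPU.

```python
import torch

assert torch.cuda.is_available(), "needs a CUDA GPU"

n = 4096
a32 = torch.randn(n, n, device="cuda")
b32 = torch.randn(n, n, device="cuda")

# FP32 path: runs on the regular CUDA cores.
c32 = a32 @ b32

# FP16 inputs: on Volta/Turing this is the shape of work the tensor cores
# accelerate -- half-precision multiplies, usually with FP32 accumulation.
a16, b16 = a32.half(), b32.half()
c16 = a16 @ b16

# The trade-off being described: a big throughput win in exchange for
# reduced precision in the inputs.
err = (c16.float() - c32).abs().max()
print(f"max abs difference vs FP32 result: {err.item():.3f}")
```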

As for async compute, yeah.
Nvidia is now pushing DX12, Vulkan and async compute because without it, the 2080 is about even with the cheaper 1080Ti. But with async compute, the 2080 beats the 1080Ti at least slightly.
So Pascal is going to fall behind in new games, but hilariously Vega and Polaris, which already support async compute, will be much better value.
They're more concerned with cannibalizing their own customers than with actually competing with AMD.
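Not graphics code, but as a loose analogy for what async compute buys: letting independent GPU work overlap instead of queueing up behind each other. Below it's two unrelated matmuls on separate CUDA streams via PyTorch; in a game it would be compute work (lighting, post-processing) overlapping the graphics queue under DX12/Vulkan. A sketch under those assumptions, not how a game engine actually does it.

```python
import torch

assert torch.cuda.is_available(), "needs a CUDA GPU"

# Two independent chunks of work, standing in for "graphics" and "compute" jobs.
a = torch.randn(4096, 4096, device="cuda")
b = torch.randn(4096, 4096, device="cuda")

s1 = torch.cuda.Stream()
s2 = torch.cuda.Stream()

# Make sure the input tensors are ready before the side streams touch them.
s1.wait_stream(torch.cuda.current_stream())
s2.wait_stream(torch.cuda.current_stream())

# Issued on separate streams, the two matmuls are allowed to overlap on the GPU
# instead of waiting for each other -- the same idea async compute exposes to
# games through separate queues.
with torch.cuda.stream(s1):
    out1 = a @ a
with torch.cuda.stream(s2):
    out2 = b @ b

torch.cuda.synchronize()  # wait for both streams before using the results
print(out1.shape, out2.shape)
```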