Nvidia #BeForTheGame GPU launch livestream (STICKY)

The more you buy, the more you save!

twitch.tv/nvidia

Attached: 1514887155687.png (550x243, 7K)

previous thread

Attached: 1507273468048.jpg (1000x517, 94K)

Why did they cheap out on the 2080 with only 8GB of memory?

Attached: 1533856204493.jpg (1448x980, 204K)

reference design

Attached: 1515572287875.jpg (656x241, 37K)

>RTX 2080 $750 in 2018
>2944 cuda cores
>GTX 980 ti $679 in 2015
>2816 cuda cores
R.I.P. Nvidia

my drangus is tingling

Attached: brules rules.jpg (1280x720, 76K)

Attached: 1528686960734.jpg (1000x828, 155K)

It's time for more price drops

Attached: 1516958446269.jpg (1000x767, 110K)

Because only Samsung makes 16Gb (2GB) capacity memory modules.

Until SK Hynix and Micron catch up to Samsung, there's not enough supply to cover everyone's needs.
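
Napkin math on why bus width plus die density pins the capacity, assuming the usual GDDR6 layout of one 32-bit chip per channel (a sketch, not a board teardown):

```python
# Why a 256-bit card ships with 8GB today and needs 16Gb dies for 16GB.
# Assumes the standard GDDR6 layout: one 32-bit chip per channel.

def card_capacity_gb(bus_width_bits: int, die_density_gbit: int) -> float:
    """Total VRAM in GB for a given bus width and per-die density."""
    chips = bus_width_bits // 32           # one 32-bit chip per channel
    return chips * die_density_gbit / 8    # gigabits -> gigabytes

print(card_capacity_gb(256, 8))    # 8.0  -> RTX 2080 with today's 8Gb dies
print(card_capacity_gb(256, 16))   # 16.0 -> same bus with Samsung's 16Gb dies
```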

I already told you to stop that you fucking cunt

How much of a performance boost will the 2050 have compared to the 1050 Ti?
Is it worth the wait?

Attached: 1521503795737.jpg (964x847, 133K)

1440p 240Hz monitors when?

>moar coars is everything
How to spot amdfag.

Attached: 1529118129697.jpg (1000x750, 147K)

>3 DP
>1 HDMI
>Probably a Type-C
>No DVI
D R O P P E D

Cx

Why would you want 240Hz? CS:GO pros don't even bother with them

Attached: 1520892324941.jpg (1000x803, 151K)

We don't know that yet.
Heck we still don't know how much the 2080 will be over the 1080. You'll just have to wait and see.

You do realize the value of money decreases every year? $100 was worth significantly more 30 years ago.
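
Napkin math, assuming a flat ~2.5% annual inflation rate (an illustrative assumption, not an official CPI series):

```python
# What an old price is "worth" today under steady inflation.
# The 2.5% rate is an illustrative assumption, not official CPI data.

def inflate(amount: float, years: int, annual_rate: float = 0.025) -> float:
    return amount * (1 + annual_rate) ** years

print(round(inflate(100, 30), 2))  # ~209.76 -> $100 from 30 years ago
print(round(inflate(679, 3), 2))   # ~731.21 -> the 980 Ti's 2015 MSRP in 2018 dollars
```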

RTX 2080 Ti 2x SLI NVLink!
Finally 4K@240fps in all recent games?

Attached: 1516285004-risitas-hyperespace.gif (352x264, 2.51M)

we'll know only when benchmarks are released.

>shower

More Hz is always good.

SOON

Attached: file.png (1280x720, 1.13M)

>showering at night

>2015 was 30 years ago
Brainlet or bait, you decide.

Why have all of the leaks been about 2080 and 2080 Ti? Where's 2070? 2060? 2050?

you only get this quality if you buy the highest end Quadro RTX ($10K)

Probably only high end stuff today.

stop feeling enthusiasm for a soulless piece of hardware

you guys are pretty pathetic

Help us, Intel, you're our only hope.

Attached: Intel-Graphics-Card-Discrete-Gaming-Solution_7-740x396.png (740x396, 231K)

AMD
SOS

Attached: 1529332716094.jpg (464x357, 25K)

yikes

In GPUs?
Of course.

Placebo tier purchase

stop feeling butthurt for a soulless piece of hardware

you are pretty pathetic

>t. 60Hz

Attached: tTUYlCH.jpg (580x628, 49K)

Only if you're comparing cards in the same gen.
Did you fall for the amdtard meme that everything Nvidia does is a refresh?

When does the livestream start?

i hear they're adding ray tracing to the new Tomb Raider game

Attached: esolruyz7wjrqfm2asp8.jpg (3840x2160, 1.88M)

>Source: my dad works at nintendo

click the link you mongoloid

Anyone member 60Hz displays?
Yeah, they're coming back.
Thanks based Nvidia.

?
More ALUs is always better, stop being retarded.

>Crypto market literally in the fucking shitter
>New GPU prices are still fucking insane, like we're in the middle of the mining craze
>Used GPU prices are fucking insane too, and it's all burnt-out crypto GARBAGE
WHY

Attached: DP03JW0UQAAdH9I.jpg (1199x481, 168K)

Anon... you're going to buy me an RTX 2080, right?

Attached: 1528492997290.jpg (1080x1080, 159K)

look at how much better it looks

Attached: nusohul2oln00kxvlfmn.jpg (3840x2160, 1.61M)

nice meme faggot.
>hurr I added noise to the image because raytracing
did you forget the whole thing revolves around AI-denoising?
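
For anyone who hasn't seen why the denoiser matters: at low sample counts the raw ray-traced output really is mostly noise. A toy illustration, with a plain mean filter standing in for the actual trained-network denoiser:

```python
# Toy demo: low-sample ray tracing is noisy; a denoiser recovers most
# of the signal. A 3x3 mean filter stands in for the real AI denoiser.
import numpy as np

rng = np.random.default_rng(0)
truth = np.full((64, 64), 0.5)           # flat grey "ground truth" image

spp = 1                                  # samples per pixel
noisy = truth + rng.normal(0.0, 0.3, truth.shape) / np.sqrt(spp)

# Stand-in "denoiser": average each pixel with its 3x3 neighbourhood.
pad = np.pad(noisy, 1, mode="edge")
denoised = sum(pad[i:i + 64, j:j + 64] for i in range(3) for j in range(3)) / 9

rmse = lambda img: float(np.sqrt(((img - truth) ** 2).mean()))
print("noisy RMSE:   ", rmse(noisy))     # ~0.30
print("denoised RMSE:", rmse(denoised))  # ~0.10, far less noise
```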

>games are using 10GB+ of VRAM when it's available nowadays
>let's release a $700 8GB GPU.

AMD CPUs and GPUs actually have lower latency than Intel ones (the time between your mouse/keyboard input, the CPU sending draw calls to the GPU, and the GPU rendering and outputting the frame to your monitor), which makes things feel more responsive.
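
Back-of-the-envelope version of that input-to-photon chain (all stage times below are illustrative placeholders, not measured figures):

```python
# Sketch of the end-to-end latency chain described above. Every stage
# time here is an illustrative placeholder, not a measurement.

def end_to_end_latency_ms(fps: float, input_poll_ms: float = 1.0,
                          cpu_frames: int = 1, gpu_frames: int = 1,
                          scanout_ms: float = 8.3) -> float:
    frame_ms = 1000.0 / fps
    return (input_poll_ms               # input sampling / USB polling
            + cpu_frames * frame_ms     # CPU builds and submits draw calls
            + gpu_frames * frame_ms     # GPU renders the frame
            + scanout_ms)               # display scans the frame out

print(end_to_end_latency_ms(fps=60))    # ~42.6 ms at 60 fps
print(end_to_end_latency_ms(fps=144))   # ~23.2 ms at 144 fps
```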

Thank you sticky mods.

Just look at the damn page, it has a live countdown

60Hz is still the most used refresh rate in the world

Negative. 144Hz is the best refresh rate

Sexy actually.

They're experimenting with all the major game devs, but real-time ray tracing for games is a ways away, except for maybe simple effects at 720p or lower res

Well yeah it's horrifying

At what performance level would a $1000 price tag actually be justified?

raytracing fans on suicide watch. Just watch, one of them will start talking about how AI can conjure up information out of thin air and post a screenshot of a low-resolution "realistic" scene where all of the edges and textures are blurry

Look, at current prices, mining is basically converting your electricity bill directly into shitcoins.
If those new GPUs are any more efficient at mining, then miners will grab them all.
The worst mistake I made was to stop mining ETH back in 2014.
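
The arithmetic, with placeholder numbers (not real hashrates or coin prices):

```python
# The "electricity bill -> shitcoins" math. Coin yield, coin price and
# power draw below are placeholders, not real market data.

def daily_profit_usd(coins_per_day: float, coin_price_usd: float,
                     watts: float, usd_per_kwh: float) -> float:
    revenue = coins_per_day * coin_price_usd
    power_cost = watts / 1000 * 24 * usd_per_kwh
    return revenue - power_cost

# A card earning 0.01 of some coin/day at $30/coin, drawing 180 W:
print(daily_profit_usd(0.01, 30.0, 180, 0.12))  # ~ -0.22/day: mining at a loss
```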

>Nvidia
3.5GB

Memory bandwidth alone is up ~27%. On top of that, the core count difference is ~21%.

This will be at least 25% faster. More likely, around 40% faster.
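
Where those percentages come from, using the 1080 Ti's known specs and the 2080 Ti numbers floating around (still leaks at this point):

```python
# Source of the ~27% bandwidth and ~21% core-count figures. The
# 2080 Ti specs here are the leaked numbers, not confirmed ones.

gtx_1080_ti = {"cores": 3584, "mem_gbps": 11, "bus_bits": 352}
rtx_2080_ti = {"cores": 4352, "mem_gbps": 14, "bus_bits": 352}

def bandwidth_gbs(card):
    return card["mem_gbps"] * card["bus_bits"] / 8   # GB/s

bw_gain   = bandwidth_gbs(rtx_2080_ti) / bandwidth_gbs(gtx_1080_ti) - 1
core_gain = rtx_2080_ti["cores"] / gtx_1080_ti["cores"] - 1

print(f"bandwidth: +{bw_gain:.0%}")    # +27%
print(f"cores:     +{core_gain:.0%}")  # +21%
```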

Not on Jow Forums, if you believe posters.

>7pm
>night

Attached: 07.png (395x318, 334K)

you're right, i've enabled OptiX™ here.

Attached: esolruyz7wjrqfm2asp8.jpg (3840x2160, 553K)

AMD doesn't want to compete with Nvidia in the PC/consumer market (because obviously they can't with their current R&D investment). Rather, they're focused on bringing the high-end 7nm Navi to the professional server market. That said, since Lisa Su is in charge of RTG, we can't ignore that the team is focused on Epyc and the upcoming 7nm Ryzen 2, which is going to BTFO Intel in the next few years.

less than 10% of the people browsing this thread can afford a 2080 Ti

so why do you care?
what do you have to gain?
go outside, do something more productive than bandwagoning for some corporation that wants to overcharge you for useless features like ray tracing

Perfect.
This is The Way It's Meant To Be Played™.

From what I gathered, Pascal was never really memory limited, so it's not a simple multiplication thing.

I expect Turing to offer us "free" antialiasing. Check this out: developer.nvidia.com/rtx/ngx

DLAA is AA that runs on the tensor cores, so it won't load the CUDA cores, and it should have a near-zero performance hit since the tensor cores aren't being used by games anyway.

Seems like it can be easily injected into any game like FXAA too.
>The RTX feature is integrated into any game, application or plugin through the use of the NGX SDK. This is the main code for the AI-accelerated functionality, but it also requires a trained neural network in order to function. NGX has been designed to be thin - a header file that points to a DLL in the NVIDIA driver, making it simple to augment any application with these RTX features
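
For illustration, the "thin header pointing at a driver DLL" pattern that quote describes looks roughly like this from Python. The DLL and function names below are made-up placeholders, since the actual NGX symbols aren't given in the quoted blurb:

```python
# Sketch of the "thin shim + driver-resident DLL" integration pattern.
# "nvngx.dll" and "NGX_Init" are hypothetical placeholder names; the
# real NGX SDK symbols aren't in the quote above.
import ctypes

try:
    ngx = ctypes.WinDLL("nvngx.dll")        # placeholder driver-side DLL
    ngx.NGX_Init.restype = ctypes.c_int     # placeholder entry point
    available = (ngx.NGX_Init() == 0)
except (OSError, AttributeError):
    available = False                       # driver too old / not Windows

print("NGX features available" if available else "fall back to plain TAA/FXAA")
```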

Even with no node shrink, still at least 50% over the 1080 Ti, given a full die and no cut-down memory bus (the 1080 Ti's is 352-bit).

30 MINS

hey mongoloid, the main rendering is still done with rasterization. you clearly don't know shit about rendering.

Attached: 1503565968331.png (500x373, 201K)
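
He's right that it's hybrid: the frame is still rasterized, rays are only traced for a few effects, and the noisy result is denoised before compositing. A minimal structural sketch (function bodies are stubs, purely for illustration):

```python
# Structural sketch of the hybrid raster + ray-tracing pipeline being
# argued about. All function bodies are stubs for illustration.

def rasterize_gbuffer(scene):        # normals, depth, albedo per pixel
    return {"albedo": ..., "normal": ..., "depth": ...}

def trace_rays(gbuffer, effect):     # few rays/pixel, one effect only
    return f"noisy {effect} buffer"

def denoise(noisy_buffer):           # AI (tensor-core) denoiser slot
    return f"clean {noisy_buffer}"

def composite(gbuffer, rt_effects):  # combine raster + RT results
    return "final frame"

def render_frame(scene):
    g = rasterize_gbuffer(scene)                  # rasterization does the bulk
    rt = {e: denoise(trace_rays(g, e))            # RT only for select effects
          for e in ("reflections", "shadows")}
    return composite(g, rt)

print(render_frame("sponza"))
```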

So will 1000 series cards go down in price finally?

Mathematically, a 42% increase in performance over the 1080 Ti.
But that doesn't take into account that each new gen offers more performance for the same amount of money. Taking that into account, I'd say $1000 would be acceptable for a 70% increase in performance over the 1080 Ti.
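
The same value math spelled out, taking $700 as the 1080 Ti reference price and treating the extra generational perf/$ margin as a judgment call:

```python
# Break-even vs. "actually worth it" for a $1000 card, assuming a $700
# 1080 Ti baseline. The extra 20% perf/$ expectation is a judgment
# call, not a law.

old_price, new_price = 700.0, 1000.0

breakeven_gain = new_price / old_price - 1           # same perf/$ as last gen
print(f"perf/$ break-even: +{breakeven_gain:.0%}")   # +43%

expected_gain = (1 + breakeven_gain) * 1.20 - 1      # plus ~20% generational perf/$
print(f"gain to justify $1000: +{expected_gain:.0%}")  # +71%, roughly the 70% above
```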

Goddamn
I expected this but this is so cool.

That's actually a nice looking cooler.
About as nice as the silver Vega one, even.

>clear plastic
what.
And it's a 3 slot.

Nice meme.

Nah, it's likely 25% faster, maybe 30%.
It could be 45% in games optimized for it (in some tech demos they'll show) and using async compute, but on average I'm reasonably sure it's 25-30%.

How the hell are you getting 50% when you know there's no node shrink?
It's the same node the V100 was manufactured on.

No. New cards are just more expensive. You're getting ripped off.
>But that doesn't take into account that each new gen offers more performance for the same amount of money
Not this time.

But AA was only a problem back in the day.
Game developers opting for FXAA is just a console thing.
You can force MSAA or SSAA in the drivers.
You've got plenty of VRAM anyway.
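
On Linux, one way to do the forcing from the driver side is the __GL_FSAA_MODE environment variable from the NVIDIA driver README. Which index maps to which MSAA/SSAA mode is GPU-dependent, so the value and the game path below are placeholders:

```python
# Launch a game with driver-forced AA (NVIDIA Linux driver).
# __GL_FSAA_MODE is documented in the driver README; the index-to-mode
# mapping is GPU-specific, and "./game_binary" is a placeholder path.
import os
import subprocess

env = os.environ.copy()
env["__GL_FSAA_MODE"] = "5"                 # placeholder AA mode index
subprocess.run(["./game_binary"], env=env)  # hypothetical game executable
```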

>Nvidia cucks are also Twitch kiddies

I would have liked to say I am surprised,

But I'm not...

>people complaining about the price

You're still going to buy it. Be honest with yourself. Do some soul searching.

l2havefun, friendo

Attached: ruslan-pronin-344-2.jpg (1219x1208, 476K)

ho boy it's happening.

I don't know why you're saying AA was only a problem back in the day; it can still cause a big (~10%) performance hit.

Nah, I sold a kidney and bought a 1080 Ti for 720€; I'm never spending that much on hardware again.

that looks fucking disgusting
>sick looking skin
>that dumb expression

Nah man that GPU alone costs more than my PC build

PSA:

This will go down the same as always.

The high-end card is released with a lot of fanfare.
Then progressively lower tiers follow, with barely any information.
Then the very highest-end card is released.
The performance will be a 20-30% improvement for every tier.

Story will repeat in a year.

Nothing ever changes in this cycle of consumerism.

Why? It's great for 1080p/60

2x 2080 Ti will wipe the floor with that thing

>2015 was 30 years ago
Read everything again and show me where I made that claim. So is your post brainlet or bait? You decide.

As if review sites even test that anymore.
They just go with whatever in-game AA option is there and call it 'high settings'.

2060 will take a big poopoo on it.

>implying you're a cuck because you want the best performance

i kinda like how bad the skin looks, tbh.

Attached: chrome_2018-08-20_16-36-06.jpg (519x585, 185K)

You're wrong already