2080 has raytracing

>2080 has raytracing
>It's barely a hybrid between raytracing and plain rasterization, just gimmicky light effects and shadows
>No games actually allow you to test raytracing because they don't exist yet
>2080 is bulky piece of hardware with a gigantic, noisy fan
>Sucks up immense amounts of power for only a minimal jump in performance
>Almost sold out due to the massive amount of preorders

HAHAHAHAHA INTELFAGS, CANT WAIT TO SEE THEIR DISAPPOINTMENT WHEN THE REST OF US BUYS A NORMAL GPU FOR ADULTS

Attached: 1534508376128m.jpg (726x1024, 89K)

I'd argue it's the most pointless GPU Nvidia ever made; even a 1080 Ti is faster and has less VRAM bandwidth congestion.
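Quick napkin math on the bandwidth half of that claim. The bus widths and data rates below are the commonly quoted specs, written from memory rather than a spec sheet, so treat them as assumptions; this is a sketch of peak numbers, not of actual congestion under load.

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate in Gbps.
# Card specs below are assumed from the usual published figures.
cards = {
    "GTX 1080 Ti": {"bus_bits": 352, "gbps": 11.0},  # GDDR5X
    "RTX 2080":    {"bus_bits": 256, "gbps": 14.0},  # GDDR6
}

for name, spec in cards.items():
    gb_per_s = spec["bus_bits"] / 8 * spec["gbps"]
    print(f"{name}: {gb_per_s:.0f} GB/s peak")

# GTX 1080 Ti: 484 GB/s peak
# RTX 2080:    448 GB/s peak
```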

milkies!

>I'd argue it's the most pointless GPU Nvidia ever made; even a 1080 Ti is faster and has less VRAM bandwidth congestion
So, I see this a lot, and it is true.
But here a new 1080 Ti and a 2080 have a 50 euro price difference.
So if ray tracing costs 50 bucks, I don't see what the problem is.
That seems reasonable to me.
This only applies if you don't already own a Ti though, only when you need to upgrade.
Which I do.
So buying a 1080 Ti would be stupid for me.

Why pay 50 more for a bunch of gimmicks you're never gonna use?

What's she doing?

>Why pay 50 more for a bunch of gimmicks you're never gonna use?
If I'm already shelling out 800 euros, that 50 euro difference for ''gimmicks'' means shit.
Of course I'm gonna buy that. Also, the chance it becomes a big thing is pretty high.
So I don't feel that bad about spending 50 euros more on an already expensive device.
It's called future proofing.
Every GPU gen people say the same. In the end, the people telling you ''buy an older, better card'' are wrong 9 times out of 10.

>Jow Forums fags told me not to get a 1060 but to get a 1080 during the cryptomining inflation
>2080 is garbage
>Get to buy a cheaper 1080 soon, as all the Nvidiababies rush for the 2080

Attached: 1536624400539.jpg (768x1024, 234K)

>get a 1080 during the cryptomining inflation
Never buy a GPU at the halfway mark of its lifetime.
You got played, user.

What else do I get then? The 2080 didn't convince me at all.
The 1060 will be hooked up to a gaymen laptop as an eGPU for my gf.

>the 2080 didn't convince me at all.
Thing is, old cards go into a downward performance spiral after new cards get released.
The 2080 looks like shit now, but when more drivers and new games come out it'll shine.
People said the exact same shit every new GPU gen.
''Just get a xxxx, they'll be cheaper.'' A year later they're bitching that Nvidia/AMD is cucking them and that the card is shit.
Now, if a 1080 costs as much as a 1060 would by that time, go ahead.
But most places (aka anywhere but the US) don't drop prices that much.
And a 50-100 bucks difference means shit at the 800-900 mark.
But hey, if you are in the US, go ahead.

A 1080 is less than 500 euros; we're talking a 350 euro difference for something that might become a thing, but most definitely not this GPU generation, and you know it.

>she

>A 1080 is less than 500 euros; we're talking a 350 euro difference for something that might become a thing, but most definitely not this GPU generation, and you know it.
Wait, why would you choose between a 1080 and a 2080?
They're not in the same performance region.
You should be looking at the upcoming 2070.
>difference for something that might become a thing, but most definitely not this GPU generation, and you know it.
For the small price difference it's not an issue.
Also, you're retarded for trying to compare a Ti-level performance card with a base 1080.
Don't look at a 2080 at all if that's your aim.

Ever since I saw that one of the games I've been dying to play will have ray tracing, I've convinced myself this would be worth the buy. Now watch the game end up as total shit.

Fuck off.

Honest to God? She's actually just cleaning her phone screen.

>>Almost sold out due to the massive amount of preorders

I can't believe Jow Forums still doesn't get it: companies frequently put out a very low initial supply of a product precisely so it sells out, and that way they get some free PR out of it. Where I live they only had a few units in a city of over 2 million people.

Cute

>he doesn't know

Has anyone wondered why no test mentioned power consumption?

Attached: Thermi 2.0.png (1961x1084, 851K)

Jesus fuck, it's over 350 watts for the GPU alone. What a shitty housefire.
Imagine how hot it gets when they turn the RT and AI stuff on hahahaha

Looking at this chart, you now need a 1000W power supply for your 9900K/2080 Ti setup.
We've reached power levels never seen before.
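Rough sanity check on that 1000W figure. The wattages and the headroom factor below are ballpark assumptions (the GPU number is just the ~350W from the chart above), not measurements:

```python
# Back-of-the-envelope PSU sizing; every wattage here is an assumption.
gpu_w    = 350   # 2080 Ti board power, per the chart being discussed
cpu_w    = 200   # 9900K under a heavy all-core load, roughly
rest_w   = 75    # motherboard, RAM, drives, fans
headroom = 1.5   # keep the PSU at ~2/3 load for efficiency and transient spikes

total_draw = gpu_w + cpu_w + rest_w
suggested_psu = total_draw * headroom
print(f"Estimated draw: {total_draw} W, suggested PSU: ~{suggested_psu:.0f} W")
# Estimated draw: 625 W, suggested PSU: ~938 W
```

Which does land right around the 1000W mark, at least under these assumptions.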

Fuck that, my little 750W wouldn't handle it haha.
A 2700X + 1080 here suits my needs fine.

I just bought an FTW3 1080 Ti.

It cost me 3450 banana coins and still has 2 years of warranty, whereas a 2080 would set me back 4999 credits from my banana stash.

>I'm too young to remember DX11 being laughed at in this exact way

Who the fuck cares about power consumption, you brainlets? Any non-poorfag has a 1000+ watt PSU anyway.

Literally fucking useless.
I had a 1kW PSU for almost a decade before my 390X blew it up; most pointless part of the PC.
Actually, I remember when Glide and DX9.x/10.x were laughed at even harder.
DX11 actually brought some cool shit with it, like proper tessellation (cue Crysis 2 and the Nvidiot BS).
DX12 has been a flop and Vulkan is the only one showing promise.
Sadly devs are lazy as fuck and won't optimise either way.
Wolf 2 is the only game that scales well on these new GPUs.
Doom Eternal will as well, since it's on the same engine.
By then AMD might respond, or these Turing turds will drop in price.
RTX is a complete gimmick: it either makes games look worse or only really improves reflections, at great cost to everything else (especially performance).
Anyone buying gen 1 of anything is a moron; same with Fermi, VR, Ryzen and Vega.

people who pay their power bill

> RTX is a complete gimmick: it either makes games look worse or only really improves reflections, at great cost to everything else (especially performance)
Shut up, boomer, you can't even tell what looks good anymore. Go play with your lawnmower.

your washing machine and steam iron use more power than your computer.

Yeah, but my washing machine is on for at most an hour a week.
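That's the point: the bill charges for energy (kWh), not peak power. A toy comparison with made-up but plausible numbers (the wattages, hours and EUR/kWh rate are all assumptions):

```python
# Energy per week (kWh) = power while running (kW) * hours per week.
# All of these figures are assumptions, for illustration only.
washer_kw, washer_h_per_week = 2.0, 1    # heating the water is the expensive part
pc_kw,     pc_h_per_week     = 0.5, 40   # gaming box under load
eur_per_kwh = 0.25                       # pick your local rate

washer_kwh = washer_kw * washer_h_per_week
pc_kwh     = pc_kw * pc_h_per_week

print(f"Washer: {washer_kwh:.1f} kWh/week (~{washer_kwh * eur_per_kwh:.2f} EUR)")
print(f"PC:     {pc_kwh:.1f} kWh/week (~{pc_kwh * eur_per_kwh:.2f} EUR)")
# Washer: 2.0 kWh/week (~0.50 EUR)
# PC:     20.0 kWh/week (~5.00 EUR)
```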

Oh yeah, I'm so sorry your blown-out eyes can't see how bad the RTX-doctored bullshit looks, what with all the blue light from LCDs frying them on your shit screens.
Seriously, every RTX title I've seen demoed runs like crap and turns basic stuff off, like shadows and AO, just to make (((RTX off))) look worse. It's blatantly obvious how bad Nvidia's implementation of RT is for gaming, and it's only worth it for GI + reflections at most.
Shadows are still too expensive and the card is underpowered: it barely beats a 1080 Ti for 70%+ more money and worse thermals (without any of this RTX crap even enabled, so performance and thermals are obviously going to get a lot worse when they turn the AI + RT stuff on and bench it maxed out). On top of it all, Nvidia doesn't even have any first-party demos, not even a basic UE4 demo or anything, for their BIGGEST ARCH CHANGE IN OVER 12 YEARS. Makes me think, no, I KNOW, they are hiding how badly these cards perform.
Nobody will adopt this shit, and DLSS is a meme that looks no better than TAA and needs the game dev + your PC to have the GFE botnet enabled to download deep-learning crap to the card itself.
Honestly the whole situation is a joke.
Gonna sit this one out till 7nm, and if AMD doesn't respond for another 2-5 years I don't care; this tech will never be standard anyway, it's just a neat showcase for next-gen hardware and games going forward into 202x+.
No way in fuck will anything good come from Turing, especially on this shit-ass 12nm (16nm+) node.
Nvidia is already working on 7nm and just finishing off the old Volta crap, this time not even giving it HBM; these cards are memory-starved by slow GDDR6 (8GB on a 256-bit bus, 11GB on a 352-bit bus).

Attached: 1487720349817.png (900x600, 492K)

If your computer is on for 8+ hours a day, it means you are at least working with it, correct? That means your computer is bringing in money; it pays for itself.

>I don't care; this tech will never be standard anyway
Spoken like a true boomer. FYI, there were thousands of people like you when Nvidia came out with shaders around 2001-2002. Realtime raytracing is here to stay. Is it going to be an enthusiast tech for a while? Yes, mainly because of the console audience gaming on a three-generation-old piece of hardware.

RTX is like 4K: 4K has been supported for many years (personally I've been on 4K since 2015), and it's only now barely catching on with the normie audience. RTX and DLSS will be supported, but it's not until consoles adopt them that we'll see mass adoption by studios.

Personally I don't give a shit about AAA games; it's nice to have some expensive game to play with, but I'd rather experiment with the tech myself in Unity or Unreal.

>it's another amateur game dev larp
Mate, nobody gives a fuck about Nvidia or idiots like you gushing about dumb first-generation tech like it's gonna revolutionise gaming.
VR did that decades ago and still didn't catch on.
4K dates back to 2013, even earlier; now TV and monitor companies wanna push 8K now that 4K and HDR flopped.
I honestly don't care about native 4K, it's a meme, and even this shiny new 400-watt-sucking monster can't keep older games maxed out above a 30-50fps minimum; it's a joke, and the 0.1% lows are WORSE than Pascal.
As I said, you are a dullard for calling anyone who disagrees with you a dumb boomer who doesn't understand RT.
It's not good for gaming, period. It's a nice thought and tech experiment, but we are decades away from getting anywhere near decent results with it.
Even with the AI-powered denoising drivel and deep-learning AA, Nvidia is only pushing 3 rays per pixel.
Noise artifacts are gonna be a big problem as well; this new AI deep-learning meme can only help so far when the ray count is so low (rough sketch of why after this post).
As I said, wait for 7nm, or enjoy lighting 1200 USD+ on fire for features you can't even use in games for years.
This is a big post I am a big guy OooThis is a big post I am a big guy OOoThis is a big post I am a big guy Ooo

Attached: bane_I_love_it.png (800x449, 447K)
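The rough sketch promised above, on why a handful of rays per pixel is noisy: the error of a Monte Carlo estimate falls off roughly as 1/sqrt(samples). This is a toy integral, not a renderer, and the sample counts are illustrative assumptions only:

```python
import math
import random

def mc_estimate(f, n):
    """Monte Carlo estimate of the integral of f over [0, 1] using n samples."""
    return sum(f(random.random()) for _ in range(n)) / n

f = math.sin                  # stand-in for the light arriving at one pixel
true_value = 1 - math.cos(1)  # exact integral of sin(x) on [0, 1]

for spp in (1, 4, 64, 1024):  # "samples (rays) per pixel"
    errors = [abs(mc_estimate(f, spp) - true_value) for _ in range(2000)]
    mean_err = sum(errors) / len(errors)
    print(f"{spp:5d} spp -> mean abs error {mean_err:.4f}")

# Error shrinks roughly like 1/sqrt(spp): 64x more rays only buys ~8x less noise,
# which is why low-spp realtime raytracing leans so hard on a denoiser.
```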

Power consumption literally makes or breaks an architecture, brainlet.
Power consumption is the reason why the GTX 480, R9 290X and Vega 64 are considered shit.

Fuck, I wish this meme would die already.

They are all beasts at compute though.
All the lite archs like Polaris and Pascal sipped power because they were stripped down and designed for gaming and for lower power consumption in consoles.
Turing is literally Volta with (((RT cores))) strapped to it.
We won't see any decent efficiency gains from Turing until 7nm.

I'm not into game dev, retard, but machine learning.

> VR did that decades ago and still didn't catch on

VR is dead, right? That's why Facebook bought Oculus for 3 billion dollars and is coming out with Santa Cruz in 2019 and Oculus 2 in the near future. That's why the Vive was released in 2016 and the Vive Pro in 2017, with new tech coming in 2019. That's why Pimax recently came out with 8K and 5K versions of their headsets. The industry is dead though, sad.

> I honestly don't care about native 4K, it's a meme, and even this shiny new 400-watt-sucking monster can't keep older games maxed out above a 30-50fps minimum; it's a joke, and the 0.1% lows are WORSE than Pascal

It doesn't matter what you care about, you are not the audience. People care about 4K and Turing gets you there, with at least a 35% increase. Buy a 2080 Ti and test it for yourself instead of rambling nonsense.

> Even with the AI-powered denoising drivel and deep-learning AA, Nvidia is only pushing 3 rays per pixel

You are just parroting big words to seem smart; do some research on the kind of stuff Nvidia came up with, not the marketing material but the whitepapers written by researchers. Oh that's right, that would take effort; it's easier to watch videos and talk shit.

> As I said, wait for 7nm, or enjoy lighting 1200 USD+ on fire for features you can't even use in games for years

Anyone getting paid decent money can buy a 1200-dollar GPU. Again, a good chunk of people are not buying it to play AAA games; I don't give a shit about that.

Go back to watching TV, boomer.