THANK YOU BASED NVIDIA

videocardz.com/77439/pny-geforce-rtx-2080-ti-xlr8-series-leaked-by-pny

THANK YOU BASED NVIDIA

Attached: PNY-GeForce-RTX-2080-Ti-XLR8-1000x1163.jpg (1000x1163, 163K)

>boost speed 1545MHz
So there's hardly any node improvement here, as indicated by the die sizes.
They're calling it 12nm when it's really just a 16nm refinement.

So the 2080 is going to be slower than the 1080Ti and the 2080Ti just a bit faster than the 1080Ti, maybe 30%.
In line with my expectations, but I'm still surprised that people are excited and were overhyping this.
2 years later and you get a 30% performance increase for a 30% cost increase.

Then when Navi comes next year, they'll suddenly drop to half the price after a bunch of retards bought these.

One problem: game development is going to start implementing significantly more ray-tracing-based graphics, creating demand for better graphics cards with dedicated ray tracing processors. Then AMD will do their thing and add ray tracing cores, and older graphics cards will take hard hits to overall performance since they don't have them

>real time ray tracing

they showcased a low-poly demo on their new Titan that could barely keep up

but they advertise their gaming cards as capable? LOL

no they won't, there isn't enough horsepower for it. That's why every demo we saw was canned, confined to one place, and literally nothing more

Wake me up in 5 years when this actually becomes a thing

>b-but check out these physx!

Good point, guess it's just another mediocre overpriced GPU. Nvidia still likes being a dickish company and forcing developers to optimize for their cards.

they're gonna brand dynamic lighting like we see in Star Citizen (which is top notch by far) as ray tracing, mark my words

give me a quick rundown on

1080ti vs 2080

need to upgrade, and it seems like the 1080Ti is the way to go?

Attached: 1494973621652.jpg (476x687, 60K)

>285w tdp

twitter.com/ryanshrout/status/1029180324993888261

>The 300-watt TDP of the Quadro Turing card includes 30 watts of overhead for USB-C and VirtualLink. The claim is that you "won't get close" if you aren't using the interface with a headset.

285W - 30W = 255W TDP

B-but muh RTX 2060 with GTX 1080 level performance for $250??!

In GameWorks games, probably 20% faster.
In non-GameWorks games, 10-15% faster.

It'd actually be good if Nvidia started moving up the TDP segments. AMD is demonstrating that cooling and power delivery have reached the point where higher-power cards still fit into a PCIe form factor, but it's probably important to Nvidia to keep laptops using mostly identical chips to the desktop parts.
Maybe they could add a higher end model to the lineup, a 2090 or something like that, dunno.

>game development is going to start implementing significantly more Ray tracing based graphics
Is it? When? Because it sure won't be while this generation of consoles is still lingering around like a bad smell. You might get the odd exception, but the move to this being a common technique is years away, and these cards will be obsolete by the time it happens. Hell, they'll be obsolete by the end of next year when Nvidia moves to 7nm. They know it too, which is why we're getting the Ti at launch instead of six months down the line.

This whole 'RTX' thing seems like a scam designed to trick stupid people who think that these cards are going to magically add ray tracing into their existing games or something.

>HDMI 2.0b
not HDMI 2.1
wut

Attached: ayy.png (623x808, 418K)

>meanwhile at amd

HAHHAHAH HAHAHHDHAHHAHAHHAHAHAHHAHAHAHAHHAHHA

Attached: AMD-Radeon-Vega-Frontier-Edition-Liquid-Cooled-1-800x372.jpg (800x372, 34K)

@67193748
>just wait
filtered

why the fuck do we need three threads dedicated to this shit

I have a Vega 56 gathering dust, it couldn't even keep up with a 1080
Hopefully I'll sell it soon; gonna lose 50% of what I paid. It's just useless

>implying anyone claims amd's top line cards are any better

stop being a retarded fanboy and call bullshit when you see bullshit

Considering I get 100+ fps in 99.9% of games at 3440x1440 on a 1080Ti, and that rises to 100% when I don't use 8K shadows and AA, I see no reason to care. I'll be all over the 30 line though

2.1 adds variable refresh. Can't let improvements in the standards cannibalize their sales.

Vega 56 was never meant to compete against the GTX 1080, retard. It was meant to compete with the GTX 1070, which it did (and beat in many cases). If you flashed Vega 56 to the Vega 64 BIOS you could get close to GTX 1080 performance. If you had purchased Vega 56 on launch day (without the shitty games bundle) you got a good deal overall. I only paid £360 for mine. Flashed it to Vega 64 and tweaked it up to near GTX 1080 levels. Sure, it uses more power and is noisier with the stock cooler, but it performs fine.

>compete
>£380
fixed

It's the 2070

>It'd actually be good if nvidia made more housefire GPUs instead of focusing on efficiency and portability

No thanks bruh.

anandtech.com/show/12095/hdmi-21-specification-released

HDMI 2.1 was released in November 2017; it was too late to include in Turing because chips have to be taped out months to a year in advance

The devs are just now getting their hands on cards capable of some light raytracing features.
It'll be 3-5 years before you get a few releases a year using it.

Exactly. This'll be another PhysX given Nvidia's proprietary bullshit.

The only reason Gimpworks gets used in games is that it's high-level and practically "drag and drop": add it in and let it ruin your game for money. This ray tracing is going to be incredibly low-level and highly integral to the entire rendering pipeline.

>yfw you realize Nvidia has been sitting on this new arch for a year but not releasing it or giving any info because people were still happily buying overpriced Pascal GPUs.

Attached: smug pc inspector.png (720x534, 344K)

>30% performance increase for a 30% cost increase
>Just under 800 cores over 1080 ti's 3584
>boost clock slightly lower than 1080 ti's
>Roughly $1000~ vs 1080 ti's $700~
Maybe I'm retarded here, but it sounds more like a 25% performance increase for a 40% price increase.

I literally do not know if GDDR6 has a huge edge over GDDR5X, so maybe I'm retarded.

Attached: 1530829819764.png (333x333, 108K)
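
Napkin math version, if anyone wants to check me. Core count delta is from the leak, the prices are just the rough $700 / $1000 figures being thrown around ITT, so treat this as a sketch rather than confirmed specs:

```python
# Rough perf-per-dollar sketch. Core counts: 1080 Ti's 3584 vs a rumored
# ~768 more on the 2080 Ti. Prices are the ballpark $700 / $1000 figures
# from this thread, not confirmed MSRPs.
old_cores, new_cores = 3584, 3584 + 768
old_price, new_price = 700, 1000

core_gain  = new_cores / old_cores - 1    # ~21% more cores
price_gain = new_price / old_price - 1    # ~43% more money

# Even if the real uplift lands above the raw core count gain (clocks, GDDR6),
# perf/$ still drops unless the uplift beats ~43%.
for perf_gain in (0.21, 0.25, 0.30):
    ratio = (1 + perf_gain) / (1 + price_gain)
    print(f"+{perf_gain:.0%} perf at +{price_gain:.0%} price -> {ratio:.2f}x the 1080 Ti's perf/$")
```

Every scenario there comes out below 1.0x, which is the whole complaint.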

>Just under 800 cores over 1080 ti's 3584
Man, I worded this terribly.

>Just under 800 cores more than 1080 ti's 3584

That would have been better. I guess I am retarded.

A spec being finalised doesn't constitute HDMI 2.1 being "released", and the taping out process has literally nothing to do with it. The reason that these new cards, this year's HDTVs and literally everything else doesn't have HDMI 2.1 support yet is because chipsets and the certification process aren't ready. There will be zero HDMI 2.1-certified products on the market this year, because the HDMI Forum is still in the process of rolling out its compliance testing.

twice.com/product/ce-devices-certified-hdmi-2-1-unlikely-this-year-hdmi-forum-ces-2018

21% increase in cores.
GDDR6 will give some improvements at higher resolutions (4k).
A 30% improvement isn't too unlikely.

But it's not an increase in price:performance over a release that's more than two years old.

It'll basically be the minimum you need to consistently run 4K in unoptimized games. For $1000. It's quite a joke.

>40% price increase
I was being generous and comparing to AIB cards generally being $750.

youtube.com/watch?v=Q3ZFRgZMdlA

Lmao even the biggest Nvidia fanboy is extremely disappointed and disillusioned by these new cards. Starts at around 17 minutes

It's not worth it, they'd rather focus on efficiency. Anyone who feels like it can OC their shit anyway. My 1070 performs nearly as well as a stock 1080; it's clocked somewhere between 1850 and 1900MHz.

>maybe 30%.
That's not a small improvement

A 30% gain after two and a half years, while also raising the price by 30%, is not an improvement.
This is actually historically unprecedented in the 20 years I've been following GPUs.

That means the exact same price:performance we've had for a year and a half. Nothing's changed. You can now get 124 fps instead of 95 fps at 1440p, which you could have just turned settings down for anyway, since games nowadays are only demanding when they're shittily optimized, not because they look good enough to warrant how badly they sometimes run.

>THANK YOU BASED NVIDIA
preorder NAO
*this post is approved by videocardz, your wccftech tier newz site.

Attached: 1534507061976.png (976x1431, 362K)

Are you stupid? You can't even preorder those cards yet

Be patient, he's just a drone.

>Nvidia selling to retards who will buy these shit cards
2 and a half years and we got literally nothing price:performance-wise. Guess I'll stick to my OC'd 1070Ti at 1440p until next gen.

>videocardz being a shill

more at 11

GDDR6 has more bandwidth at higher production cost (extra traces and PCB layers) and power consumption (probably why the TDP got bumped higher)

This is why HBMx is actually the future of ultra-high-bandwidth applications while GDDRx is a walking corpse. GDDR5 will likely continue to persist on lower-end SKUs because it is mature and dirt cheap
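
The bandwidth delta itself is easy to ballpark: peak bandwidth is just bus width times per-pin data rate. Quick sketch below, assuming the commonly reported 352-bit bus with 14 Gbps GDDR6 against the 1080 Ti's 11 Gbps GDDR5X (leak-tier numbers, not confirmed specs):

```python
# Peak memory bandwidth (GB/s) = bus width in bits / 8 * per-pin rate in Gbps.
# Bus width and data rates below are the commonly reported figures, not confirmed.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(352, 11))  # 1080 Ti, GDDR5X @ 11 Gbps -> 484 GB/s
print(bandwidth_gbs(352, 14))  # rumored 2080 Ti, GDDR6 @ 14 Gbps -> 616 GB/s
```

So roughly 27% more raw bandwidth, which lines up with the "helps mostly at 4K" take above.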

Called it back a year ago, when Nvidia started to hype up their post-Pascal silicon.

Nvidia exhausted all of the low-hanging fruit, i.e. the 12/14nm process's clockspeed/thermal ceiling, with Pascal. The only realistic way to increase performance over Pascal is to add more blocks and fatten them up, which means poorer yields.

I suspect the big Turing die costs at least as much as Vega 64 to produce, which is partly why Nvidia is going to push for a $699-999 MSRP in order to keep the same insane profit margins as its predecessors.
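
The "fatter die = poorer yields" bit is easy to illustrate with a toy Poisson yield estimate. The die areas below are the commonly reported ballpark figures and the defect density is a made-up illustrative number, not actual fab data:

```python
import math

# Toy Poisson yield model: yield ~= exp(-die_area * defect_density).
# Die areas are the commonly cited figures; D0 is an illustrative guess.
def poisson_yield(area_mm2: float, d0_per_cm2: float) -> float:
    return math.exp(-(area_mm2 / 100.0) * d0_per_cm2)

d0 = 0.2  # assumed defects per cm^2 for a mature 16/12nm-class process
for name, area_mm2 in [("GP102 (1080 Ti)", 471), ("TU102 (2080 Ti)", 754)]:
    print(f"{name}: ~{area_mm2} mm^2 -> estimated yield {poisson_yield(area_mm2, d0):.0%}")
```

Under those assumptions that's roughly 39% vs 22% good dies, on top of fewer candidate dies per wafer in the first place, so cost per working die climbs fast.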

Will these have tensor cores?

All I want is fast training of neural nets, I couldn't care less about gayming.

Seriously? Did you not follow the announcement on Monday when Nvidia revealed the Turing GPU?

anandtech.com/show/13214/nvidia-reveals-next-gen-turing-gpu-architecture

RTX SKUs will have them, but I suspect some fraction (probably 1/2 to 1/3) is going to be disabled at the hardware level to prevent them from cannibalizing Quadro/Titan SKUs.

Yeah that's my worry after seeing little mention of AI or tensor anywhere. It's quite likely nvidia are going to jew us.

What else would be different between 2080ti and a titan to justify the price increase?

Any news/price on the 2070? It's either that or a used Vega 64 for me desu.

I've been saying the same.
Maxwell got its low-hanging fruit from tiled rendering and highly efficient, fast culling.
Pascal got a big jump in node optimization and some memory bandwidth improvements.
There's just nothing left. Just adding more cores, and this raytracing gimmick.

But people are so delusional they believed there would be a card 50% faster for the same price as current ones.

Instead it's maybe 30% more performance (probably closer to 25%) for 25-30% more money.
Yet they're still clinging to it, calling me a dumb shill, and saying the 2060 will btfo the 1080 and Vega for $250 in other threads, too.
It's insane. Nowhere else do I see people buying this bullshit in such large numbers except on Jow Forums.

It'll launch a few weeks later. Probably slightly (single digit percentage) above 1080 performance for $500-$550.

It's a repeat of the GeForce 3 launch.

GeForce 3 hyped up pixel/vertex shading as the next big thing at launch, yet it was only marginally faster than the GeForce 2 GTS/Ultra it replaced. By the time pixel/vertex shading started being used outside of tech demos, it struggled, which forced you onto newer silicon.

Yes, tensor cores for the masses
developer.nvidia.com/rtx/ngx

twitter.com/barba_toss

It's probably this meme-spouting Slav garbage hating on BASED NVIDIA. If you're still on Jow Forums, KILL YOURSELF FAGGOT

GTX 970 3.5GB edition owner here. Would it be a good idea to pre-order an RTX? I want to pay MSRP before the mining goys get them.

Just get a Vega 56, or used 1070Ti, 1080, 1080Ti.
The new RTX cards are a joke, as many Anons pointed out.

Miners aren't going to buy them, except a few with subsidized electricity in cold climates. The price:performance is shit.

Can they do more operations per clock? I know they said they added simultaneous integer and FP execution, but I don't know if that will help in games at all

Realistically, how much should I pay for a 1080Ti? I'm a wagie so I can afford a $1000 card new, but I'm not an idiot.