NVIDIA

How the fuck are the RTX cards all sold out even with the "founders" tax of a couple hundred bucks on top of the already bloated MSRP?

Attached: 1524002827543.png (1280x1280, 651K)

Marketing.

They're not called
>Nvidiots
for no reason.

Who's buying them?
Gamers? Just so they get 10 more fps when they already get about 200?
A $300 gfx card can run most games, and even then, why are they playing these new games?
From what I've noticed, games that require high-end cards aren't even good.

Attached: 1525722490987.png (430x288, 130K)

gamers

They're deluded by the mining boom that happened prior and think that buying a card on release is the best decision one could make.

I'm waiting for benchmarks. 10-series cards are hella cheap right now and the RTX platform is only going to see widespread adoption over the next few years. That said, I'm happy real-time ray tracing is finally being pushed, and shadows and reflections finally have a chance at looking good, but this isn't the time to upgrade. Maybe in a year or two, but not yet.

>hella cheap right now
I'm buying a second-hand Aorus 1080 Ti for 550 euros, with 20 months of warranty still left. lmao

Attached: shutitdown.png (752x529, 218K)

>shadows and reflections finally have a chance at looking good
oh boy, I didn't know the shitty FPS I'm playing right now has RTX jigarays! I also didn't know that my ye olde Nvidia GPU had RTX all this time as well!
Ye, nah, fuck off. Only idiots would fall for that. This just reminds me of that stunt Nvidia pulled before with reduced image quality and boosted FPS.

High-end gaymers who want the latest and greatest, and for whom cost is no object.

Resellers.

Then who's buying them off the resellers (for $1K on top of the price)?

Basically this.
If not for the crypto miners soaking up the entire market for muh mining, prices would hardly be an issue except for high-end cards.

I just saw a leaked benchmark.

4 fps over a 1080 Ti, for $300 more...

This. I don't give a shit unless they throw it into some porn game, but Illusion won't catch up until like 10 years from now.

Most people don't understand that a €200 card can run most games at 1080p high (60 FPS). You only need expensive cards for stuff like 4K or 144 Hz. It's luxury.

I've heard of a lot of people who sold their 1080Ti to get the 2080.
It's going to be hilarious when they get lower FPS.

/v/ and reddit had people convinced that the 1070 was a midrange card and that they should pay 80% more money for a 30-45% increase in fps.
The 1060 and RX 480 run all but maybe 2 badly optimized releases from the past 2 years at high or maxed settings, 1080p 60 fps.

They aren't. Nvidia intentionally makes very few cards on release in order to create the illusion of demand.

144 Hz will soon be standard.

Smarter to buy 1080 than 2080

We don't know what availability was.

Selling out is less impressive if they only had 1000 cards, for example. You'll always find that many people who are either suckers or so rich they don't care.

RTX Titan when? I like to spend money, but the Titan V isn't part of the RTX series and the current cards are pretty much as good as my Titan.

RESELLERS

>being poor

Attached: 39313182_466413863872633_3701595831523082240_n.jpg (768x960, 62K)

It's called creating a shortage so we sell our backlog of 10xx series.

For what it's worth, 2080Ti is this gen's Titan, let's be honest.

The movie industry? That ray tracing is really leet for their CGI workloads.

Except that last gen, the 1080 Ti was released and had a Titan Xp dropped right after it to claim the top spot. The 2080 Ti is better in some specs than a Titan V, but the Titan V has a wider memory bus. We need a new flagship.

Anything below 120 fps is unplayable for me.
That said, I'm not a retard who plays on meme ultra settings.

What really prevents the 2080 Ti from being a Titan is memory.
Nvidia really screwed us on this one. Only 11 GB for €1200 is fucking ridiculous.

>make a dozen cards
>trick a dozen retards into buying them
>look, our cards are so popular that we're sold out already!

Film/CGI peeps, if I had to guess.

Artificial scarcity

Attached: 1507815016285.jpg (408x402, 82K)

EUREKA

Hollywood, just like I said. They would sell like hotcakes there because everything in VFX uses ray tracing.

So with AMD chasing the 7nm meme and likely to present single-digit framerates with ray tracing features turned on, do you think they'll go back to single-digit market share?

The speed at which RTX sold out suggests so.

Attached: AMDead.jpg (321x222, 16K)

The best thing AMD can do right now is just release a monster GPU at everything that isn't ray tracing.

Because Tom's Hardware told us to just buy it right now.

3D artists/studios won't buy AMD then.

Yeah, but it will still be missing the meme-learning stuff.

As if Nvidia's raytracing can just be shoehorned into existing renderers... It's about as noisy as a lot of current CPU raytracers, the main difference being that clever AI denoising algorithm. They boasted Unity and Unreal integration for a reason; if it were in V-Ray I'd be impressed.

Attached: DlslegjU4AAYkif.jpg (1200x675, 69K)
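
For reference, the noise itself isn't new and neither is averaging it away. Here's a naive box filter in Python (obviously not Nvidia's AI denoiser, and the names are made up for illustration): smooth the noise and you also smear detail, which is why a learned, edge-aware denoiser is the one genuinely new part.

import random

def noisy_scanline(width=32, rays=4):
    # A row of pixels, each estimated from only a few "rays";
    # the true value of every pixel is 0.5.
    return [sum(random.random() for _ in range(rays)) / rays
            for _ in range(width)]

def box_denoise(row, radius=2):
    # Naive denoiser: average each pixel with its neighbours
    # (window clamped at the borders).
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

if __name__ == "__main__":
    raw = noisy_scanline()
    smooth = box_denoise(raw)
    spread = lambda px: max(px) - min(px)
    print(f"raw spread:      {spread(raw):.3f}")
    print(f"denoised spread: {spread(smooth):.3f}")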

You have no clue what you're talking about, do you?

Do you? 'Cause all I've seen so far is marketing dazzle and razzmatazz aimed at gamers who have never heard of ray tracing.

The software developers would be dumb not to "shoehorn" it into existing renderers; it would make their product more valuable.

And they can market it as "less render time, so you get more value out of your render farm's electric bill".

So I can play the new SJW Battlefield at 1080p 30 fps but with some more reflections.

You don't understand how ray tracing works. RT is noisy when not enough rays have been traced. RT cores trace rays; in real-time applications they trace fewer rays to keep up a decent framerate, and the result is then denoised with AI and whatnot. If you are rendering something that is not real time, you don't cut the ray tracing off early, so you don't need to denoise. RT cores are extremely fast at crunching rays regardless; whether you trace only a few rays because you need 60 frames per second, or use them to crunch a few hundred billion over 30 minutes, is up to the application. Either way, RT cores will trace far more rays than a normal GPU in the same span of time.

Of course you still need some level of implementation, but it would likely use an independent API, just like games use DXR (which a lot of people get wrong: they think RTX is something like GameWorks when it's not; the games use the DXR API, not RTX, and RTX is not an API).
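
If you want the rays-versus-noise point in toy form, here's a throwaway Python sketch (nothing to do with actual RTX or DXR code, all the names are made up): a Monte Carlo pixel estimate gets less noisy roughly as 1/sqrt(rays), so a real-time budget of a few rays per pixel is noisy and has to be denoised, while an offline render just keeps tracing until it converges.

import random
import statistics

def trace_one_ray():
    # Stand-in for tracing a single ray: a noisy sample whose true mean
    # (the "correct" pixel brightness) is 0.5.
    return random.random()

def pixel_noise(rays_per_pixel, trials=500):
    # Render the same pixel many times with a fixed ray budget and
    # measure how much the estimates scatter (standard deviation).
    estimates = [
        sum(trace_one_ray() for _ in range(rays_per_pixel)) / rays_per_pixel
        for _ in range(trials)
    ]
    return statistics.pstdev(estimates)

if __name__ == "__main__":
    for rays in (1, 4, 64, 1024):
        print(f"{rays:5d} rays/pixel -> noise ~ {pixel_noise(rays):.4f}")

Faster RT hardware just shifts where you land on that curve for a given frame time; the denoiser covers whatever budget is left over.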

Exactly - you just explained how it's clearly not fast enough to do it in real time without noise, which is what we've had for years now (minus the fancy denoising AI). Here's a video from 2011 of Octane Render running on two GTX 460s:

youtube.com/watch?v=uHY90epFf2g

You dolt, "not fast enough for 60 fps" does not mean "not considerably faster than previous hardware". We've had ray tracing since, what, the 80s? The difference is speed.

Attached: 1525707493729.png (485x443, 26K)

some people will disregard price for what the product represents

Sure, and it ought to be faster 7 years later, not just on Nvidia's cards but also on AMD cards. It's not like Nvidia invented real-time RT overnight - they just took what a 1080-ish card could already do, ran some denoising on it so it can be used in games, and started hyping it up for their new card that's not that much faster than their old one. The gamers will see it as revolutionary and that's enough to drive sales. CG artists will enjoy the speed but I doubt their minds will be blown, especially on complex scenes.

If you're so poor in mind, logic, and reason that you will buy one of these next-gen RTX cards,
then it is you who are poor.
So poor, you cannot outwit a bear, and that is a poor man indeed.
Go buy a ball of yarn; it will occupy you beyond measure.

I think this card will mostly make its way into the professional market. Ray tracing in real time is gonna be huge for people rendering CGI.