NVIDIA RTX cards could be a massive flop

Speculation so far:

Ray Tracing on BFV multiplayer at Ultra 1080p with a 1080 Ti shows massive stuttering and poor performance (granted this is an alpha build)
youtube.com/watch?v=RLV9ciJZnmg

Leaked score of what's assumed to be an overclocked RTX 2080 shows small gains over a 1080 Ti (unless this is actually an RTX 2070, which would be good, but the 2070 isn't being distributed to vendors yet, so that makes little sense)
videocardz.com/77763/nvidia-geforce-rtx-2080-3dmark-timespy-result-leaks-out

NVIDIA is being super paranoid about the RTX launch, controlling how it's reviewed. This isn't anything new, and AMD does this as well, but it's far more extreme than normal, which is a potential indication of poor performance
hardocp.com/article/2018/08/28/nvidia_controls_aib_launch_driver_distribution/

NVIDIA's estimated annual performance-per-dollar trend looks bad
i.imgur.com/dZPWUKd.jpg

The only thing left is DLSS; however, for 4K gaming AA is arguably not really needed. It remains to be seen whether this tech is really worth it for most gamers and how good it actually looks, since it appears to render at a lower resolution and then upscale via AI.
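
To be clear about what DLSS seems to be doing, here's a toy sketch of that flow (NVIDIA's actual network and training pipeline are proprietary, so the upscaler below is just a stand-in):

# Hypothetical sketch of "render low, AI-upscale high" -- NOT NVIDIA's
# actual implementation, just the idea being described above.
import numpy as np

RENDER_RES = (1920, 1080)   # internal render resolution (w, h)
OUTPUT_RES = (3840, 2160)   # displayed resolution (w, h)

def render_frame(res):
    # stand-in for the game's rasterizer
    w, h = res
    return np.random.rand(h, w, 3).astype(np.float32)

def ai_upscale(frame, out_res):
    # stand-in for the trained upscaling network; plain nearest-neighbour
    # repeat here so the sketch runs anywhere without a model
    sy = out_res[1] // frame.shape[0]
    sx = out_res[0] // frame.shape[1]
    return frame.repeat(sy, axis=0).repeat(sx, axis=1)

low = render_frame(RENDER_RES)
high = ai_upscale(low, OUTPUT_RES)
print(low.shape, '->', high.shape)   # (1080, 1920, 3) -> (2160, 3840, 3)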

Attached: nvidia.jpg (700x394, 42K)

it'll be a few years before anyone really gives a fuck about the raytracing shit

until then it'll probably just be for impressive looking tech demos and whatever add-on patches nvidia feel like paying for

That's true, no one will be buying first-gen RTX for ray tracing. Perhaps they should have waited a gen or two longer before releasing it? But at the same time they have no competition, so maybe it's good enough. I feel like RT will only be viable for single player, and only in certain games.

At the same time, RT could be a flop and we might not see much progress in the next 5-10 years at all. Either someone creates a competing solution with far better performance, or the graphical fidelity isn't that huge a leap compared to modern techniques like cube map reflections, so no one will care about it.

Also, no one cares that much about HairWorks or PhysX.
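
For reference, the cube map technique I mentioned is basically just this (a sketch of the standard reflection-vector lookup, nothing RTX-specific):

# Cube map reflections: bounce the view direction off the surface
# normal and use the result to index a pre-rendered texture.
# Cheap, but static -- it can't reflect dynamic objects.
import numpy as np

def reflect(d, n):
    # standard reflection formula: r = d - 2(d.n)n (n must be unit length)
    return d - 2.0 * np.dot(d, n) * n

view = np.array([0.0, -1.0, -1.0]) / np.sqrt(2.0)  # looking down and forward
normal = np.array([0.0, 1.0, 0.0])                 # upward-facing floor
r = reflect(view, normal)
print(r)  # [0, 0.707, -0.707]: ray bounces upward; sample the cubemap there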

Everybody should just wait until the reviews come out for these. Pre-ordering expensive shit when only Nvidia Approved™ benchmarks have been shown is stupid. Inb4 somebody calls me a poor fag.

RTX will end up being a waste of time. Convolutional neural nets will revolutionize video rendering.

This begins a golden era of cards for 3D artists and deep learning students.

But RTX for gaming is stupid. Nvidia really wants the render farm, 3D studio, and archviz money, and RTX is a half-baked product for gamers: building two chips, one for Quadro and one for gaming, is too expensive, so Nvidia prefers to say RTX is for gaming too.

wat the fuck did you just say

Just the overpriced GPU I need for my new i9 rig. :3

The independent benchmarks haven't been released yet; this is pure speculation, you AMD shill

OHNONONONONONO

Attached: E4OteZ_s.png (1178x843, 91K)

>Slower than similarly clocked 1080ti
LUL
3dmark.com/spy/2520142

gotta get it out there even if nobody's buying it for gaymen yet

nobody really used the hardware transform & lighting on the early GeForce cards for a while either; it takes time to build it up into something worth using, and for devs to get comfortable that it won't disappear next year and shit on all their investment in it

BFV beta ran great on Vega. Like 50% better than the Nvidia cards. Very smooth and high framerates. 70fps 1% minimums @ 1440p at ultra settings on Vega56.
Now it can't even do 1080p on a 1080Ti? Hilarious. Great optimization work on an already optimized game by Nvidia.

This is going to be especially bullshit to anyone who played the beta, where it ran great, especially on AMD cards, before Nvidia got their hands on it.

Yes they will. But not this gen. I know you know that; I'm just clarifying for others.

They're basically scamming gamers to lower their costs on the Quadro cards and increase their yields. It's also a way to sell a Quadro card without the actual certified drivers and service that are expected with a card that expensive.

The gameplay is using 144hz config

>i.imgur.com/dZPWUKd.jpg
Should make this graph look nicer.

It should really be a line graph, with the negatives red and positives green. Should also have *80 and *80 Ti overlapped and arranged by year instead of by gen.
Should also add in AMD cards.
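
Something like this, roughly (a quick matplotlib sketch; the numbers are invented placeholders, not the chart's real data):

# Sketch of the suggested chart. Placeholder data -- the real perf/$
# deltas would come from whichever benchmark gets picked.
import matplotlib.pyplot as plt

years = [2013, 2014, 2016, 2018]
gain = [30.0, 25.0, 40.0, -10.0]   # perf/$ change vs previous gen, %, made up

colors = ['green' if g >= 0 else 'red' for g in gain]
plt.plot(years, gain, color='gray', zorder=1)
plt.scatter(years, gain, c=colors, zorder=2)   # green positives, red negatives
plt.axhline(0, color='black', linewidth=0.5)
plt.xlabel('Launch year')
plt.ylabel('Perf/$ change vs previous gen (%)')
plt.title('*80-class cards (placeholder data)')
plt.show()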

Are you using perf/dollar based on 3dmark or what?

>this isn't anything new and AMD does this as well
AMD doesn't do this

>Everybody should just wait until the reviews come out for these.
But not even the launch day reviews, because they're already confirmed to be conducted by a list of Nvidia-approved shills. Anybody not hand-picked by Nvidia won't get a review sample, even from AIBs. It'll take at least a few days for reviews to come in from truly independent sources, whilst the likes of (((Tom's Hardware))) are busy calling them the best cards ever released.

Is this the AMDrone delusion general?

>That's true, no one will be buying first gen RTX for Ray Tracing.
All RTX cards are already sold out.

Attached: 778878787.jpg (330x319, 70K)

How is he playing with RTX on and DX12 disabled?
RTX/DXR runs only with DX12.

Just buy it™
Thanks Novidea.

I'm gonna buy an RX 580 or Vega 7nm later.

AYYMDPOORFAGS still SEETHING at massive success of Turing RTX cards

Sold out everywhere, widely adopted by developers everywhere, highest performance and power efficiency unmatched by AYYMD HOUSEFIRES garbage

>Be Jewvidya
>make a deep learning card
>tensor cores for deep learning bullshit
>want to save shekels
>refuse to design new cards for gaming
>instead pocket that R&D money
>rebadge the Tensor/deep learning card
>As an amazing new gaming GPU
>has shitty performance and TDP
>Is literally a housefire
>extra useless AI bullshit taking up space
>GPU die is too big so yields are horrible
>have to gimp 2080 die cause yields so bad
>double gimp 2070 die cause yields SO BAD!
>cost/performance is horrible beyond belief

And then a lightbulb clicks on.....

>Have codemonkey slaves cook up software
>make new jewvidya shitwerks feature
>RAY TRACING.tm
>isn't actually ray tracing
>just makes shit look like mirrors
>but it runs on the Tensor core bullshit
>now useless cores have a use
>force this shitwerks into all games
>can only run mirrors4days shitwerks on RTX
>mirrors look terrible
>game performance is the worst I've ever seen
>literally 35fps at 1080p with shitwerks
>on a 4,300 shader RTX 2080ti GPU
>for the low low price of $1,299

The most fucked up card lineup since the Fermi housefire cards. AMD needs to speed up the Navi mesh designs and get those bad boys out by Q1 or Q2 2019 to take advantage of this clusterfuck

Navi's MCM design will offer GTX 1080/RTX 2070 performance for $250, and if AMD scales it they can easily make an even larger mesh GPU to destroy the 2080 Ti, and Nvidia will have no counter until they make MCM cards.
>4000 shader 2080ti

> Ray Tracing on BFV
Oh cool, so a game nobody is buying will look nice, while not even being playable.

Also, Nvidia's new generation didn't slot into the price brackets of the old models; the prices are insanely high. I hope Navi won't fall into this too. I'm waiting for Navi, but I can't justify these prices for gaming use. I'll just have to keep my RX 480 8GB forever.

Oh, and by the way: all of these 2080 Tis are the leftover chips that were binned as too low quality for the Tesla card.

They are literally repackaging dogshit and selling it to you for a $1,300 premium.

stop kvetching

>Golden kek image
>D-Delusional!

Perfect

Attached: mmlol.jpg (496x499, 32K)

AMD 2019 lineup will be as follows

>RX680
480/580 design on 7nm with GDDR6. Will be between the 1070 and 1080 in performance, but closer to the 1070. It's gonna be over 2000 MHz on the core due to 7nm, and GDDR6 will give it near-1080 Ti bandwidth at 256-bit, or they can cut the bus to 192-bit to save costs and give it about the same bandwidth as the 1080.

I would make this card 192-bit GDDR6 and do a 6GB full model and a 4GB binned budget 670 model with GDDR5, and sell them dirt cheap to corner the budget gaming market at 1080p/1440p

>RX vega

Same Vega chip but on 7nm. Clocks will be close to 2000 MHz. They will likely remove the HBM controller and other useless shit to reduce die size further, then scrub HBM and go with a GDDR6 memory layout on a 384-bit-wide bus, giving it 2080 Ti-class bandwidth. 4096 AMD shaders at 2000 MHz will destroy the RTX 2080.
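
Quick sanity check on those bus widths, assuming 14 Gb/s GDDR6 (the speed Nvidia is shipping) and the usual spec-sheet figures for the cards being compared:

# GB/s = per-pin rate (Gb/s) * bus width (bits) / 8 bits per byte
def gbs(pin_rate_gbps, bus_width_bits):
    return pin_rate_gbps * bus_width_bits / 8

print(gbs(14, 256))  # 448 GB/s -- close to the 1080 Ti's 484 GB/s
print(gbs(14, 192))  # 336 GB/s -- close to the 1080's 320 GB/s
print(gbs(14, 384))  # 672 GB/s -- above the 2080 Ti's 616 GB/s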

>RX Navi

First mesh GPU ever made. Schedulers and all controllers on a command die in the center, with HBM on that package. 4 HBM2 stacks giving us 1 TB/s of bandwidth. In a cross pattern, four dies with 2000~4000 shaders each directly connect to the command die through a modified Infinity Fabric. This 8,000~16,000 shader monster will be the new flagship GPU.
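
And the 1 TB/s figure is just four HBM2 stacks at spec:

# Each HBM2 stack has a 1024-bit interface; at 2 Gb/s per pin that's
# 1024 * 2 / 8 = 256 GB/s per stack.
per_stack = 1024 * 2 / 8      # GB/s per stack
print(4 * per_stack)          # 1024 GB/s, i.e. ~1 TB/s from four stacks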

Wait for CPC Hardware to review them

>AyyMD shills this desperate for Nvidia to fail

KEK

>Speculation
Dropped here.

>RTX cards
such technology is basically useless, or more like non-existent, at a hardware level.

If the game/software doesn't support it, it's not there. My speculation is that these are basically slightly more OC'd last-gen cards with more CUDA cores (which is obviously to be expected, since that market brings them money) and "new" GDDR6 memory.
Besides those hardware changes, I would assume the actual RTX improvements will be made through drivers, which will use the classic Nvidia proprietary secret-sauce algorithms that enable those CUDA cores to do better ray tracing and other math-packed loads.

These cards will probably win new customers among people with 3D/ML and similar computing workloads; gayming improvements from the ray-tracing shilling will be small, considering games have to actually implement it.

So would a gamer still aim for new Vega and wait for Navi to become mainstream in future generations? It sounds like it might be expensive, if the RX680 even exists. Probably better as a profitable workhorse at first.

do the tensor/RT cores contribute to performance, or just do the gimmicky DLSS/ray tracing? why would they give up half the die to useless gimmicks when they could have 7k shaders???

Because it's not a gaymer card
The Fiji of Nvidia

It's just to have a new way of gimping older GPUs, just force enable """raytracing""" on every new Gimpworks game :^)

r9 fury was a good card though.

>Devs had access to hardware for less than a week
>"WAAH, poor performance. WAAH RTX Is a flop!!!!"

GDDR6 is actually pretty cheap. The pricing isn't much higher than GDDR5 for almost double the bandwidth.

Vega is a solid design, and without the HBM controller bullshit for pro cards it would be a decently small die even on 14nm.

On 7nm you will have a very small die with very good yields that will base/boost better than the 12nm RTX2000 series cards can. AMD will have the node advantage.

Vega on 7nm with 4,000 AMD shaders at 2 GHz and 384-bit GDDR6 would be cheap enough to manufacture that they could easily sell it for $499, and it would be significantly faster than the RTX 2080 and might even run close to the 2080 Ti.

Even without Navi, that easy-to-make, easy-to-sell Vega card would destroy Nvidia for an entire release cycle.

They contribute absolutely nothing to performance. They just handle ray tracing instead of the cores doing it, but here's the fucking kicker: everything they ray trace is extra shit that the main shaders STILL HAVE TO FUCKING RENDER!

The Tomb Raider ray tracing demo had performance crash to 36 fps at 1080p, with reduced settings and reduced render distance, on a fucking 2080 Ti.

It's worthless.
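
To spell out the hybrid pipeline (a stick-figure sketch with made-up function names; real engines do this through D3D12/DXR, not Python):

# The RT hardware only answers "what does this ray hit?". Rasterizing
# the frame AND shading the ray hits still runs on the shader cores.
def rasterize(scene):
    return {'base_color': 'raster pass over ' + scene['geometry']}

def generate_reflection_rays(gbuffer):
    return ['ray'] * 4                       # spawned from shiny pixels

def trace(rays, bvh):
    return ['hit'] * len(rays)               # the part RT cores accelerate

def shade_hits(hits):
    return ['shaded ' + h for h in hits]     # back on the shader cores

scene = {'geometry': 'triangles', 'bvh': 'acceleration structure'}
gbuffer = rasterize(scene)                   # full frame, same as always
reflections = shade_hits(trace(generate_reflection_rays(gbuffer),
                               scene['bvh']))  # the EXTRA work on top
print(gbuffer['base_color'], '+', len(reflections), 'traced reflections')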

ITT: SEETHING AYYMDPOORFAGS CAN'T GET OVER RTX REKTING THEIR PRECIOUS FOUL-SMELLING SHITS.

I hope your predictions are real.

old.reddit.com/r/nvidia/comments/9b33hx/hardocp_nvidia_controls_aib_launch_and_driver/e501cza/

>This is not surprising. In fact, AMD basically does the same thing. They force AIBs to send cards to their HQ, so only AMD can *officially* seed prelaunch samples.

REPEAT AFTER ME

IT'S OK WHEN AYYMD DOES IT AND AYYMD IS ALLOWED TO DO SHADY STUFF WITH NO REPERCUSSIONS OR OUTRAGE FROM AYYMDPOORFAGS

Your FUD and attacks have failed, Turing is selling out everywhere and there's nothing you can do about it except SEETHE about it

NVIDIA RTX cards could be a massive flop > NOVIDEO RTX cards are a massive flop

Tesla top
Fermi flop
Kepler top
Maxwell flop
Pascal top
Turing flop

Just get a 1080ti and wait for the Turing successor.

Don't bother buying a 20 series card, they're priced the way they are so that Nvidia can get rid of their huge 10 series inventory without having to cut the 10 series prices.

the 20 series prices will drop once that happens.

>RX Navi
Navi is not a high-end GPU.
fudzilla.com/news/graphics/46038-amd-navi-is-not-a-high-end-card
>RX vega
It is not a GPU. Vega 7nm, according to both Lisa Su and a few others at AMD's January technology gig, was always presented as an Instinct/artificial intelligence product.
fudzilla.com/news/graphics/46014-vega-7nm-is-not-a-gpu
>RX680
It doesn't exist at all. Unless you mean Navi as a successor to the RX 580? Yeah, the problem is that it will be released in 2020 or 2021.

Attached: Dqoj9VI_d.jpg (640x282, 16K)

I can't get over how much space is wasted on the die for the RT and tensor cores, especially since these cores are gonna be dead weight while playing games that can't utilize RT and tensor ops. I wonder what the power draw will be for the new additions while they're idle during other games? At best you'll have a bit of a heat sink to draw heat away from the CUDA cores; at worst the new cores will fuck with the drivers, making some non-RT/tensor games not work... I can't fucking wait.

> NVIDIA is being super paranoid about RTX launch, controlling how it's reviewed, this isn't anything new and AMD does this as well. But far more extreme than normal, potential indication of poor performance
> hardocp.com/article/2018/08/28/nvidia_controls_aib_launch_driver_distribution/
Maybe they are sandbagging, I can't tell.

Find out on Sept 14th

>Navi is not high end GPU
It is; the current RTG CTO directly said that.
The question is when.

The Navi architecture is the new multi-chip module design.

You can scale it up or down simply by using larger dies or by adding more modules.

The low-end Navi you are talking about is the design going into next-gen consoles. That's an ultra-low-power MCM design that's cramming GTX 1080-level performance into the low TDP budget of a console. That product in a desktop form factor will simply be clocked up.

>the full performance aspect
We know the core count and clock rates compared to the previous gen. Unless there is magic involved, there are not going to be many surprises in how these cards perform.

Navi is not MCM, their fucking CTO said that.

the fact that the only "leak" so far from the 2xxx series is a fucking 2 GHz card tells you shit is going to hit the fan

pcgamesn.com/amd-navi-monolithic-gpu-design

>“We are looking at the MCM type of approach,” says Wang, “but we’ve yet to conclude that this is something that can be used for traditional gaming graphics type of application.”

Keep on lying about MCM though

A rough paintjob of Navi design for the autists.

Purple is the command chip, blue is HBM2+ memory, red is the 2.5D interposer the whole package sits on, orange arrows are modified connectors based on Infinity Fabric, brown is the initial four dies positioned for best latency and performance, and yellow is the additional four dies that will be added to later, more advanced designs in the 2020s once they make sure the bugs are all ironed out. The yellow slots will have the biggest latency problems, so they come much later in successive generations.

The initial four-die design can do tiny, ultra-efficient 4x2,000 shaders or larger 4x4,000 shaders of the Vega architecture.

Attached: Rough Outline of Navi.png (1914x961, 56K)

Navi has no MCM anything.

Keep telling yourself that.

>Infinity Fabric
Enjoy your HOUSEFIRES and wasted electricity just for the fabric

Attached: IF Power EPYC.png (1527x999, 142K)

>RTG CTO sez they have no idea how to make MCM work for gaymen
>hurr Navi is MCM
?

>gimp 2080 die cause yields so bad
what did he mean by this

That just shows that Zen cores are stupidly efficient.
Full TU104 is reserved for Quadro RTX 5000.

Additional note: the Infinity Fabric design depends on memory bandwidth and latency for its performance. So Infinity Fabric connecting GPU dies and relying on ultra-fast, low-latency HBM will work MUCH better than Infinity Fabric on Zen CPUs, which relies on high-latency, low-bandwidth DDR4 system RAM.

1080p with ray tracing sister

>The infinity fabric design is dependent on memory bandwidth and latency for its performance
Data fabric being tied to MEMCLK is the design choice, not the rule.

do you even understand that it shows the load of the Zen cores being at 85.73%?

Fair point, but if they stay consistent with MEMCLK then they won't have the same bottlenecking problems that happened on first-gen Ryzen, because HBM > DDR.

>hardware shillnucks

>LOOK GUYS OUR NEW VIDEO
>WE ARE SWITCHING TO INTEL A DAY AFTER ADOBE MADE A PATCH TO USE THEIR IGP
>WE TOTALLY DIDN'T KNOW ANYTHING
>I PROMISE GOYS

32 cores and 64 threads almost fully loaded, and the entire package does not hit 180, not even once... absolutely fucking amazing...

>we've been briefed about performance
At least they are honest about just being a mouthpiece for official "benchmarks"

Stop kvetching and just buy it, goyim.

Let's face it, it's going to be quite floppy.

But they have been researching ray tracing and working on this for a decade, and now it has finally become a releasable product. It's a product absolutely worth releasing, but not worth hyping up.

Nvidia could expect a modest ROI on all the R&D sunk into it if it's placed right. Unfortunately, they are hyping it up so much that they will be vulnerable to backlash if it falls flat, which it likely will.

t. poorfag with bottom of the barrel card

Attached: I happen to be an expert on this topic3.jpg (427x427, 37K)

>GayTracing
>Turing was gay
Really makes you think

>Unfortunately, they are hyping it up so much they will be vulnerable to backlash if it falls flat which it likely will do.
That'd be why they're going for it now, while AMD aren't really competition and Intel are years off being a viable competitor. Even if shit sucks, cunts have no other choice so it won't hurt them too badly

>buying 1st gen anything
>not wanting others to buy 1st gen for (You) so they can beta test it
>not buying two or three gens later
the absolute state of Jow Forums

tomshardware.com/news/nvidia-rtx-gpus-worth-the-money,37689.html
>Just buy it you fucking idiots
Nvidia trying to literally steal money from users.

>he kvetches
stop and just pay up
its just worth it, trust me

>NVIDIA RTX cards could be a massive flop
But Jow Forums and Jow Forums say graphics card sales don't rely on gaymers and their success isn't reflected in gaymer sales.

youtube.com/watch?v=ma1gh-21diQ

njewdia is forcing AIB partners to buy unsold 10 series chips.
hahahahahahahahahahahahaha

>best cards ever released
i would hope so? otherwise what the fuck are they doing?

forums.guru3d.com/threads/nvidia-to-control-aibs-custom-rtx-2080-ti-reviews.422723/#post-5579179

> That's just a big can of nonsense (and I initially wrote another word there). NVIDIA always has tracked what media gets what AIB samples, period. You know who does that as well? AMD, they even regulate what brand and sample end up at what reviewer. How conveniently he forgets to mention that.

>Believing the lies of an AYYMD asslicker Kyle

>Now it can't even do 1080p on a 1080Ti? Hilarious.
It's a 2080ti with meme tracing on, OP fucked up. Watch the video.

What a shit 1080Ti. Here is mine, without touching anything

Attached: Timespy.png (722x238, 5K)

If true (I honestly doubt it), that would be a pretty interesting testbed for next generations of EBYN with a dedicated command chip, to keep pushing more cores and see what happens.

> I want nvidias new cards to be a flop so I can feel better about not having the money to afford one. please be a flop, please be a flop, please be a flop....

That can also mean that multi-chip Navi won't be a gaming card.

>not having the money to afford one.
A 2080 Ti is almost one month's salary at minimum wage.
You're basically NVidia's bitch for a month by paying for this shit.
Next thing you know it's two fucking months.
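
The math roughly checks out, assuming US federal minimum wage and the $1,199 Founders Edition price:

# Rough check: full-time at the US federal minimum wage of $7.25/hr
monthly_gross = 7.25 * 40 * 52 / 12    # ~$1,257 per month before tax
print(round(monthly_gross))            # 1257
print(round(1199 / monthly_gross, 2))  # a $1,199 2080 Ti FE is ~0.95 months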

youtube.com/watch?v=8kQ3l6wN6ns
OP BTFO

>OP BTFO

>9:50 DICE are currently not using Nvidia tensor cores or AI trained denoising filters to clean up the ray-trace reflections

Holy fuck, you do not even need the RTX to play with ray tracing, considering that it was not even used in development.

AMD cope thread.

This is no different from PhysX

Question :
What risk do I take by pre-ordering if I can freely cancel or return this shit 30 days after purchase? I have a good feeling Nvidia will do something to piss me off between now and then... at which point I'll send their shit back to them.

you are pretty bum blasted haha

youtube.com/watch?v=WNS8a60YVyc

lol stutterfest

Attached: 1535245990154.jpg (1024x785, 548K)

This stuff will die out like every Nvidia meme tech, but is there any open implementation of ray tracing on the horizon?

Attached: smiling_face.gif~c200.gif (200x200, 49K)

youtu.be/m42XiyJgyco

Actually, Turing isn't Volta; rather, it is a modified Volta design geared towards graphics.

Ray-Tracing acceleration isn't a meme. It is part of the transitional phase from rasterization to true ray-tracing rendering.

The RTX family are the "Voodoo 1s" of this transitional period. The Voodoo 1 was pricey back when it was introduced, and required a separate 2D card for output.