Ray Tracing on BFV multiplayer at Ultra 1080p with a 1080 Ti shows massive stuttering and poor performance (granted this is an alpha build) youtube.com/watch?v=RLV9ciJZnmg
Only thing left is DLSS, but for 4K gaming AA is arguably not really needed. Remains to be seen whether this tech is really worth it for most gamers, and how good it actually looks, since it appears to render at a lower resolution and then upscale via AI.
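For what it's worth, "render at a lower resolution and then upscale" in its dumbest possible non-AI form is just sample repetition; a toy sketch (function name and data made up for illustration, DLSS's actual network obviously does something far smarter):

```python
# Dumbest possible upscaler: repeat each low-res sample `factor` times
# (nearest-neighbour). An AI upscaler tries to infer detail instead.
def upscale_nearest(row, factor):
    return [px for px in row for _ in range(factor)]

low_res = [10, 20, 30]              # one hypothetical scanline
print(upscale_nearest(low_res, 2))  # [10, 10, 20, 20, 30, 30]
```

Rendering at 1440p and upscaling to 4K shades only ~44% of the pixels, which is where the performance win would come from; the open question is how much better the AI inference looks than dumb repetition.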
it'll be a few years before anyone really gives a fuck about the raytracing shit
until then it'll probably just be for impressive looking tech demos and whatever add-on patches nvidia feel like paying for
Noah Lopez
That's true, no one will be buying first gen RTX for Ray Tracing. Perhaps they should have waited a gen or two longer before releasing it? But at the same time they have no competition so maybe it's good enough. I feel like RT will only be good enough for single player and only on certain games.
At the same time, RT could be a flop and we won't see much progress in the next 5-10 years at all. Either someone creates a competing solution with far better performance, or the graphical fidelity isn't that huge of a leap compared to modern techniques like cube map reflections, so no one will care about it.
Also no one cares that much about hairworks or physx.
Xavier Gonzalez
Everybody should just wait until the reviews come out for these. Pre-ordering expensive shit when only Nvidia Approved™ benchmarks have been shown is stupid. Inb4 somebody calls me a poor fag.
Asher Wood
RTX will end up being a waste of time. Convolutional neural nets will revolutionize video rendering.
Adrian Stewart
This begins a golden era of cards for 3D artists and deep learning students.
But RTX for gaming is stupid. Nvidia really wants render farm, 3D studio, and archviz money, and RTX being a half-baked product for gamers comes down to this: building two pieces of silicon, one for Quadro and one for gaming, is too expensive, so Nvidia prefers to say RTX is for gaming too.
Gavin Scott
wat the fuck did you just say
Grayson Barnes
Just the overpriced GPU I need for my new i9 rig. :3
Evan Ortiz
The independent benchmarks haven't released, this is pure speculation you AMD shill
gotta get it out there even if nobody's buying it for gaymen yet
nobody really used the hardware transform & lighting on the early geforce cards for a while either, takes time to build it up into something worth using and for devs to be comfortable that it won't disappear next year and shit on all their investment in it
Carter Butler
BFV beta ran great on Vega. Like 50% better than the Nvidia cards. Very smooth and high framerates. 70fps 1% minimums @ 1440p at ultra settings on Vega56. Now it can't even do 1080p on a 1080Ti? Hilarious. Great optimization work on an already optimized game by Nvidia.
This is going to be especially bullshit to anyone who played the beta, where it ran great (especially on AMD cards), before Nvidia got their hands on it.
Yes they will. But not this gen. I know you know that; I'm just clarifying for others.
They're basically scamming gamers to lower their costs on the Quadro cards and increase their yields. It's also a way to sell a Quadro card without the actual certified drivers and service that are expected with a card that expensive.
It should really be a line graph, with the negatives red and positives green. Should also have *80 and *80Ti overlapped and arranged by year instead of gen. Should also add in AMD cards.
Are you using perf/dollar based on 3dmark or what?
Bentley Myers
>this isn't anything new and AMD does this as well AND doesn't do this
Luis Hill
>Everybody should just wait until the reviews come out for these. But not even the launch day reviews, because they're already confirmed to be conducted by a list of Nvidia-approved shills. Anybody not hand-picked by Nvidia won't get a review sample, even from AIBs. It'll take at least a few days for reviews to come in from truly independent sources, whilst the likes of (((Tom's Hardware))) are busy calling them the best cards ever released.
Caleb Gray
Is this the AMDrone delusion general?
>That's true, no one will be buying first gen RTX for Ray Tracing.
All RTX cards are already sold out.
How is he playing with RTX on and DX12 disabled? RTX/DXR only runs with DX12.
Levi Martinez
Just buy it™ Thanks Novidea.
I'm gonna buy an RX 580 or Vega 7nm later.
Justin Nelson
AYYMDPOORFAGS still SEETHING at massive success of Turing RTX cards
Sold out everywhere, widely adopted by developers everywhere, highest performance and power efficiency unmatched by AYYMD HOUSEFIRES garbage
Brody Jackson
>Be Jewvidya >make a deep learning card >tensor cores for deep learning bullshit >want to save shekels >refuse to design new cards for gaming >instead pocket that R&D money >rebadge the Tensor/deep learning card >As an amazing new gaming GPU >has shitty performance and TDP >Is literally a housefire >extra useless AI bullshit taking up space >GPU die is too big so yields are horrible >have to gimp 2080 die cause yields so bad >double gimp 2070 die cause yields SO BAD! >cost/performance is horrible beyond belief
And then a lightbulb clicks on.....
>Have codemonkey slaves cook up software >make new jewvidya shitwerks feature >RAY TRACING.tm >isnt actually ray tracing >just makes shit look like mirrors >but it runs on the Tensor core bullshit >now useless cores have a use >force this shitwerks into all games >can only run mirrors4days shitwerks on RTX >mirrors look terrible >game performance is the worst ive ever seen >literally 35fps at 1080p with shitwerks >on a 4,300 shader RTX 2080ti GPU >for the low low price of 1299$
The most fucked up card lineup since the Fermi Housefire cards. AMD need to speed up Navi mesh designs and get those badboys out by Q1 or Q2 2019 to take advantage of this cluster fuck
Navi's MCM design will offer GTX 1080/RTX 2070 performance for $250, and if AMD scales it they can easily make an even larger mesh GPU to destroy the 2080ti; Nvidia will have no counter until they make MCM cards.
>4000 shader 2080ti
Colton Diaz
>Ray Tracing on BFV
Oh cool, so a game nobody is buying will look nice, while not even being playable.
Landon Martinez
Also Nvidia's new generation didn't replace the price brackets of the old models but are insanely high. I hope Navi won't fall for this too. I'm waiting for Navi but I can't justify these prices for gaming use. I'll just have to keep my RX480 8GB forever.
Elijah Powell
Oh and by the way: all of these 2080ti's are the leftover chips that were binned as too low quality for the Tesla card.
They are literally repackaging dogshit and selling it to you for a $1300 premium.
>RX680 480/580 design on 7nm with GDDR6
Will be between the 1070 and 1080 ish in performance, but closer to the 1070. It's gonna be over 2000MHz on the core due to 7nm, and the GDDR6 will give it 1080ti bandwidth at 256-bit, or they can cut the bus to 192-bit to save costs and give it the same bandwidth as the 1080.
I would make this card 192-bit GDDR6 and do a 6GB full model and a 4GB binned budget 670 model with GDDR5, and sell them dirt cheap to corner the budget gaming market at 1080p/1440p.
>RX vega
Same Vega chip but on 7nm. Clocks will be close to 2000MHz. They will likely remove the HBM controller and other useless shit to reduce die size further, then scrub HBM and go with a GDDR6 memory layout on a 384-bit wide bus, giving it the same bandwidth as the rtx2080ti. 4096 AMD shaders at 2,000MHz will destroy the rtx 2080.
>RX Navi
First mesh GPU ever made. Schedulers and all controllers sit on a command die in the center, with HBM on that package. 4 HBM2 stacks giving us 1TB/s of bandwidth. In a cross pattern we will see four dies with 2000~4000 shaders each, directly connecting to the command die through a modified Infinity Fabric. This 8,000~16,000 shader monster will be the new flagship GPU.
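The bandwidth figures being thrown around in these posts can at least be sanity-checked with the standard formula (peak GB/s = bus width in bytes × per-pin data rate). The pin rates below are assumptions for illustration (14 Gbps GDDR6, ~256 GB/s per HBM2 stack), not confirmed specs for any of these hypothetical cards:

```python
# Peak theoretical memory bandwidth in GB/s:
# (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps.
def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gbs(256, 14))  # 448.0 -- 256-bit GDDR6
print(mem_bandwidth_gbs(192, 14))  # 336.0 -- 192-bit GDDR6, near a GTX 1080's 320
print(mem_bandwidth_gbs(384, 14))  # 672.0 -- the hypothetical 384-bit Vega
print(4 * 256)                     # 1024  -- four ~256 GB/s HBM2 stacks, the "1 TB/s"
```

So the individual numbers are at least arithmetically plausible; whether any of these cards exist is another matter.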
Mason Gomez
Wait for CPC hardware review them
Caleb Campbell
>AyyMD shills this desperate for Nvidia to fail
KEK
Ryder Flores
>Speculation
Dropped here.
Nolan Peterson
>RTX cards
Such technology is basically useless, or more like non-existent, at a hardware level.
If the game/software doesn't support it, it's not there. My speculation is that these are basically slightly more OC'd last gen cards, with more CUDA cores (which is obviously to be expected, since the market for them brings them money) and "new" GDDR6 memory. Beyond those hardware changes, I would assume the actual RTX improvements will be made through drivers, which will use the classic Nvidia proprietary secret sauce algos that enable those CUDA cores to do better ray tracing and other math-packed loads.
These cards will probably win new customers amongst people who have 3D/ML and similar computing workloads; gayming improvements will be small for all the ray-tracing shilling, considering games have to actually implement it.
Hunter Diaz
So would a gamer still aim for new Vega and wait for Navi to become mainstream in the future generations? It sounds like it might be expensive if even RX680 will exist. Probably better as a profitable work horse at first.
Christian Ward
do the tensor cores/ ray tracing contribute to performance, or just do the gimmicky dlss / ray tracing? why would they give up half the die to useless gimmicks when they could have 7k shaders???
Caleb Jenkins
Because it's not a gaymer card. It's the Fiji of Nvidia.
Owen Allen
It's just to have a new way of gimping older GPUs, just force enable """raytracing""" on every new Gimpworks game :^)
Jaxson Jackson
r9 fury was a good card though.
Elijah Torres
>Devs had access to hardware for less than a week >"WAAH, poor performance. WAAH RTX Is a flop!!!!"
Carson Harris
GDDR6 is actually pretty cheap. The pricing isn't much higher than GDDR5 for almost double the bandwidth.
Vega is a solid design, and without the HBM controller bullshit from the pro cards it would be a decently small die even on 14nm.
On 7nm you will have a very small die with very good yields that will base/boost better than the 12nm RTX 2000 series cards can. AMD will have the node advantage.
Vega on 7nm with 4,000 AMD shaders at 2GHz and 384-bit GDDR6 would be cheap enough to manufacture that they could sell it for $499 easily, and it would be significantly faster than the rtx 2080 and might even run close to the 2080ti.
Even without Navi, that easy-to-make, easy-to-sell Vega card would destroy Nvidia for an entire release cycle.
Mason Johnson
They contribute absolutely nothing to performance. They just handle ray tracing instead of the cores doing it, but here's the fucking kicker: everything they ray trace is extra shit that the main shaders STILL HAVE TO FUCKING RENDER!
The ray tracing demo of Tomb Raider had performance crash to 36fps at 1080p with reduced settings and reduced render distance on a fucking 2080ti.
It's worthless.
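For scale, the fps figures being thrown around convert to per-frame time budgets as 1000/fps milliseconds; a quick sketch (the specific framerates are just the ones quoted in the thread):

```python
# Convert a framerate into the per-frame time budget in milliseconds.
def frame_time_ms(fps):
    return 1000.0 / fps

print(round(frame_time_ms(36), 1))   # 27.8 ms -- the quoted ray-traced demo
print(round(frame_time_ms(60), 1))   # 16.7 ms -- what a 60 fps target allows
print(round(frame_time_ms(144), 1))  # 6.9 ms -- a 144 Hz monitor's budget
```

In other words, 36 fps with ray tracing eats roughly four 144 Hz frames' worth of time per frame.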
Leo Cox
ITT: SEETHING AYYMDPOORFAGS CAN;T GET OVER RTX REKTING THEIR PRECIOUS FOUL-SMELLING SHITS.
>This is not surprising. In fact, AMD basically does the same thing. They force AIBs to send cards to their HQ, so only AMD can *officially* seed prelaunch samples.
REPEAT AFTER ME
IT'S OK WHEN AYYMD DOES IT AND AYYMD IS ALLOWED TO DO SHADY STUFF WITH NO REPERCUSSIONS OR OUTRAGE FROM AYYMDPOORFAGS
Your FUD and attacks have failed, Turing is selling out everywhere and there's nothing you can do about it except SEETHING about it
Parker Rogers
>NVIDIA RTX cards could be a massive flop
NOVIDEO RTX cards are a massive flop
Adrian Hall
Tesla top, Fermi flop, Kepler top, Maxwell flop, Pascal top, Turing flop.
Just get a 1080ti and wait for the Turing successor.
Angel Gomez
Don't bother buying a 20 series card; they're priced the way they are so that Nvidia can get rid of their huge 10 series inventory without having to cut the 10 series prices.
The 20 series prices will drop once that happens.
Cooper Davis
>RX Navi
Navi is not a high end GPU. fudzilla.com/news/graphics/46038-amd-navi-is-not-a-high-end-card
>RX vega
It is not a GPU. Vega 7nm - according to both Lisa Su and a few others at AMD’s January technology gig - was always presented as an Instinct/artificial intelligence product. fudzilla.com/news/graphics/46014-vega-7nm-is-not-a-gpu
>RX680
It doesn't exist at all. Unless you mean Navi as a successor of the RX580? Yeah, the problem is that it will be released in 2020 or 2021.
I can't get over how much space is wasted on the die for RT and the Tensor cores, especially since these cores are gonna be dead weight while playing games that can't utilize RT and Tensor. I wonder what the power draw will be for the new additions while they're idle when playing other games? At best you'll have a bit of a heat sink to draw away heat from the cuda cores, at worst the new cores will fuck with the drivers making some non-RT/Tensor games not work....I can't fucking wait.
Hudson Martin
>NVIDIA is being super paranoid about RTX launch, controlling how it's reviewed, this isn't anything new and AMD does this as well. But far more extreme than normal, potential indication of poor performance
>hardocp.com/article/2018/08/28/nvidia_controls_aib_launch_driver_distribution/
Maybe they are sandbagging, I can't tell.
Noah Young
Find out on Sept 14th
Brody Wood
>Navi is not high end GPU
It is, current RTG CTO directly said that. The question is when.
Brody Wright
Navi architecture is the new Multi-Chip module design.
You can scale it up or down simply by using larger dies or by adding more modules.
The low end Navi you are talking about is the design going into next gen consoles. That's an ultra low power MCM design that's cramming GTX 1080 level performance into the low TDP budget of a console. That product in a desktop form factor will simply be clocked up.
Kevin Davis
>the full performance aspect We know the core count and clock rates compared to the previous gen. Unless there is magic involved there are not going to be many surprises on how these cards perform.
Christopher Flores
Navi is not MCM, their fucking CTO said that.
Jeremiah Young
The fact that the only "leak" so far from the 2xxx series is a fucking 2GHz card tells you shit is going to hit the fan.
>“We are looking at the MCM type of approach,” says Wang, “but we’ve yet to conclude that this is something that can be used for traditional gaming graphics type of application.”
Keep on lying though about MCM
Joshua Russell
A rough paintjob of Navi design for the autists.
Purple is the command chip, blue is HBM2+ memory, red is the 2.5D interposer the whole package sits on, orange arrows are modified connectors based on Infinity Fabric, brown is the initial four dies positioned for best latency and performance, and yellow is the additional four dies that will be added in later, more advanced designs in the 2020s once they make sure the bugs are all ironed out. The yellow slots will have the biggest latency problems, so they come much later in successive generations.
The initial four die design can do tiny ultra efficient 4x2,000 shaders or larger 4x4,000 shaders of vega architecture.
>RTG CTO sez they have no idea how to make MCM work for gaymen >hurr Navi is MCM ?
Juan Collins
>gimp 2080 die cause yields so bad what did he mean by this
Levi Lopez
That just shows that Zen cores are stupidly efficient. Full TU104 is reserved for Quadro RTX 5000.
Oliver Ward
Additional note: the Infinity Fabric design depends on memory bandwidth and latency for its performance. So Infinity Fabric connecting GPU dies and relying on ultra fast, low latency HBM memory will work MUCH better than Infinity Fabric on Zen CPUs, which relies on high latency, low bandwidth DDR4 system RAM.
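To put rough numbers on that comparison (the transfer rates are assumptions for illustration: dual-channel DDR4-3200 on Zen versus the ~1 TB/s four-stack HBM2 setup speculated above):

```python
# DDR4 channel bandwidth: 64-bit channel * MT/s / 8 bits-per-byte, in GB/s.
def ddr4_bandwidth_gbs(mts, channels=2):
    return channels * 64 * mts / 8 / 1000

zen_ddr4 = ddr4_bandwidth_gbs(3200)  # dual-channel DDR4-3200 feeding the fabric
hbm2 = 4 * 256.0                     # four ~256 GB/s HBM2 stacks
print(zen_ddr4)                      # 51.2 GB/s
print(hbm2 / zen_ddr4)               # HBM2 setup has ~20x the raw bandwidth
```

Raw bandwidth is only half the story though; the latency side of the fabric-clock argument doesn't show up in this arithmetic.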
Nathan Barnes
1080p with ray tracing sister
Ian Howard
>The infinity fabric design is dependent on memory bandwidth and latency for its performance Data fabric being tied to MEMCLK is the design choice, not the rule.
Joseph Myers
do you even understand that it shows the load of zen cores being at 85.73% ?
Juan Myers
Fair point, but if they stay consistent with MEMCLK then they won't have the same bottlenecking problems that happened on first gen Ryzen, because HBM > DDR.
Michael Nguyen
>hardware shillnucks
Kevin Williams
>LOOK GUYS OUR NEW VIDEO >WE ARE SWITCHING TO INTEL A DAY AFTER ADOBE MADE A PATCH TO USE THEIR IGP >WE TOTALLY DIDNT KNEW ANYTHING >I PROMISE GOYS
Caleb Campbell
32cores and 64 threads almost fully loaded the entire package does not hit 180, not even once......absolutely fucking amazing......
Charles Watson
>we've been briefed about performance At least they are honest about just being a mouthpiece for official "benchmarks"
Angel Jackson
Stop kvetching and just buy it, goyim.
Ian Long
Let's face it, it's going to be quite floppy.
But they have been researching ray tracing and working on this for a decade, and now it has finally become a releasable product. It's a product absolutely worth releasing, but not worth hyping up.
Nvidia could expect a modest ROI on all the R&D sunk into it if it's positioned right. Unfortunately, they are hyping it up so much that they will be vulnerable to backlash if it falls flat, which it likely will.
>GayTracing >Turing was gay Really makes you think
Gavin Clark
>Unfortunately, they are hyping it up so much they will be vulnerable to backlash if it falls flat which it likely will do. That'd be why they're going for it now, while AMD aren't really competition and Intel are years off being a viable competitor. Even if shit sucks, cunts have no other choice so it won't hurt them too badly
Nathan Morris
>buying 1st gen anything >not wanting others to buy 1st gen for (You) so they can beta test it >not buying two or three gens later the absolute state of Jow Forums
>he kvetches stop and just pay up its just worth it, trust me
Luke Gutierrez
>NVIDIA RTX cards could be a massive flop
But Jow Forums and Jow Forums say graphics card sales don't rely on gaymers and their success isn't reflected by gaymer sales.
> That's just a big can of nonsense (and I initially wrote another word there). NVIDIA always has tracked what media gets what AIB samples, period. You know who does that as well? AMD, they even regulate what brand and sample end up at what reviewer. How conveniently he forgets to mention that.
>Believing the lies of an AYYMD asslicker Kyle
Jaxon Hernandez
>Now it can't even do 1080p on a 1080Ti? Hilarious.
It's a 2080ti with meme tracing on, OP fucked up. Watch the video.
Angel Collins
What a shit 1080Ti. Here is mine, without touching anything
If true (which I honestly doubt), that would be a pretty interesting testbed for next generations of EBYN with a dedicated command chip, to keep pushing more cores and see what happens.
Dylan Walker
> I want nvidias new cards to be a flop so I can feel better about not having the money to afford one. please be a flop, please be a flop, please be a flop....
Ayden Diaz
That can also mean that multi chip Navi won't be a gaming card.
Xavier Cox
>not having the money to afford one.
A 2080Ti is almost 1 month's salary at minimum wage. You're basically NVidia's bitch for a month by paying this shit. Next thing you know it's two fucking months.
>9:50 DICE are currently not using Nvidia tensor cores or AI trained denoising filters to clean up the ray-trace reflections
Holy fuck, you don't even need the RTX tensor cores to play with ray tracing, considering they weren't even used in development.
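For context on what those "AI trained denoising filters" would replace: ray-traced images are noisy at low sample counts, and the classic non-AI fallback is plain neighbourhood averaging. A minimal 1-D sketch with made-up sample values (real denoisers work on 2-D images and are far smarter about preserving edges):

```python
# Minimal non-AI denoiser: average each sample with its neighbours
# (a 1-D box filter). This is the cheap smoothing an AI denoiser improves on.
def box_denoise(samples, radius=1):
    out = []
    for i in range(len(samples)):
        lo = max(0, i - radius)
        hi = min(len(samples), i + radius + 1)
        window = samples[lo:hi]
        out.append(sum(window) / len(window))
    return out

noisy = [0.0, 1.0, 0.0, 1.0, 0.0]  # hypothetical noisy ray samples
print(box_denoise(noisy))          # spread shrinks from [0, 1] to [1/3, 2/3]
```

So if DICE shipped the demo with something in this family instead of the tensor-core path, the tensor hardware really was sitting idle for that feature.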
Ryder Johnson
AMD cope thread.
Eli Williams
This is no different from PhysX.
Juan Thompson
Question: what risk do I take by pre-ordering if I can freely cancel or return this shit within 30 days of purchase? I have a good feeling Nvidia will do something to piss me off between now and then, at which point I'll send their shit back to them.
Actually, Turing isn't Volta; rather, it is a modified Volta design geared towards graphics.
Ray-Tracing acceleration isn't a meme. It is part of the transitional phase from rasterization to true ray-tracing rendering.
The RTX family are the "Voodoo 1s" of this transitional period. The Voodoo 1 was pricey back when it was introduced and required a separate 2D card for output.