What went wrong?

Attached: rtg-logo_678x452.png (678x412, 39K)

With less R&D than Nvidia or Intel, they can only do so much. It's hard to expect them to compete with both in all categories.

>Better
>Cheaper

What's not to fucking love?
If somebody is looking for a GPU right now, the XT seems like a great option.
Especially considering Nvidia basically killed overclocking with voltage and power locking but AMD didn't.

It's not like everybody upgrades on the release schedule, and right now it looks good.

>Inb4 I wanted it for $299
It's 7nm and GDDR6; those are expensive.

Attached: xyYhFMkeAXota8U3giTDJC.jpg (1280x720, 105K)

raja left

Incompetence and greed.

Raja fucked this release cycle. Just wait till next cycle, when David Wang gets to show his design.

>7nm
It's not the consumer's problem that they need an expensive node to barely match the competition while offering fewer features.

Looking forward to my super binned anniversary edition.

Attached: _20190524_204317.jpg (500x497, 79K)

everything

>AMD didn't
they did with Navi

That's more because it's already overclocked.

Navi is decent, which is surprising. It's not the clusterfuck of blue balls and broken promises that Vega was, with almost a year of "fine wine" talk while knowing damn well it had bottlenecks that needed an army of driver developers to work around.

They are cutting the fat on Vega, and it's showing promising gains as a rendering card, not just a computational behemoth.

AMD's CPU R&D is way smaller than Intel's, and yet they managed to come back after all these years. What they need is a Jim Keller for their GPUs.

That's because Intel fucked up.

And then the product after that with even more enhancements from Samsung. It's going to be very interesting.

Nothing; it's not profitable, so they won't put money into it.

Well that too

>implying raytracing isn't a fuckup
It will probably be ready by 2023

>$379 for 36 CUs to virtually match an RTX 2070 that retails for $500+
>$449 for 40 CUs for 90%+ RTX 2080 performance that retails for ~$800
I mean, it's not particularly amazing, but it's not bad either. RDNA did its job and outperforms Vega chips with 56 and 64 CUs.

Attached: download.jpg (1999x2187, 460K)
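For what it's worth, a quick Python back-of-envelope on those numbers; the prices are the ones quoted above, and the relative-performance figures are rough assumptions for illustration, not benchmarks:

[code]
# Back-of-envelope dollars-per-performance, using the MSRPs quoted above and
# assumed relative-performance figures (rel_perf is illustrative only).
cards = {
    "RX 5700 (36 CU)":    {"price": 379, "rel_perf": 1.00},  # ~RTX 2070 level
    "RTX 2070":           {"price": 500, "rel_perf": 1.00},
    "RX 5700 XT (40 CU)": {"price": 449, "rel_perf": 1.08},  # assumed ~90% of a 2080
    "RTX 2080":           {"price": 800, "rel_perf": 1.20},  # assumed ~20% over a 2070
}

for name, c in cards.items():
    # Lower is better value.
    print(f"{name:20s} ${c['price'] / c['rel_perf']:.0f} per unit of performance")
[/code]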

You moved both cards up a performance bracket

>RTX 2080
>90%
You mean 85%. You can get custom 2080s for $660

Considering it's only 40 fucking CUs out of a possible 64, can we expect the big 64 CU Navi to compete for the top spot?

No you can't; there are a few that get close to $700, but in general you're looking at ~$800 for a 2080.

nowinstock.net/computers/videocards/nvidia/rtx2080/

Imo pricing.
Not a big enough performance increase from last generation.
No reason to buy this at launch, really.
The Nvidia cards are better in most cases, and they cost the same or less because they have been out longer.

That's a double-edged sword, desu. They probably could push Navi that far, and nobody would give a shit if it chewed through 400W as long as it sent the 2080 Ti flying into the grave, BUT not many people would buy it, and AMD would most likely have to price it aggressively.

My gut feeling is AMD won't; they'd rather chip away at Nvidia's budget $200-300 offerings like the 1660 and 2060 Super so they can wiggle into the laptop gaymen market, strongarm Nvidia there, and get all the OEMs to jump onboard the Navi train in the process. More money there than in getting a few spergs to spend $1000+ on a video game card.

Attached: 1524276016303.jpg (1000x1000, 87K)

Well, the behemoth GPUs aren't really the meat of the market anyway; they're simply a token of prestige.

Like
>Oh, Nvidia? That's kinda old, AMD are making the fastest GPUs now

newegg.com/msi-geforce-rtx-2080-rtx-2080-ventus-8g-oc/p/N82E16814137350?Description=rtx 2080&cm_re=rtx_2080-_-14-137-350-_-Product

Wow, actually looking it up is hard

With blower fan and gaudy gold?

Pricing, just like with the new Ryzen chips.

Isn't it a good thing tho?

Attached: 1526667714154.jpg (270x320, 18K)

Guess that sale ended; it's now $739, which is okay for a dual-fan model. Don't know why they even bothered to OC it to 1800, this thing is going to run hot AF and die after a few days of use now that summer is here.

Well, we thought it was a good thing, but it appears things are going just as shitty, so it's not Raja.

Stealth black and sexy gold trim.

Wrong. It's still Raja's shit GCN.

Nothing. They just focused their efforts on things other than the high-end desktop market. AMD graphics will be powering both next-gen consoles, their APUs (both desktop and semi-custom) are unrivalled, they've just signed a big deal with Samsung, and they still offer competitive products in the mid-range and low end when it comes to discrete graphics cards.

The only market they don't cater to is people who enjoy Huang's cock being rammed down their throat. Can't win 'em all.

Attached: huh.jpg (401x554, 56K)

A pajeet joined and everything turned to shit. Then he left and things are starting to pick up again.

Radeon VII and its price point, honestly.

It cost $700. Now, a step down from there at the $450 range is a GREAT step down; it took the high end and made it low-high/high-mid in price point.

This is a SIGNIFICANT price drop.

But Nvidia wasn't competing on price, and AMD could take a large price drop as good enough and move on.

If we don't see a 64+ CU GPU, then this is still GCN with most of the guts of GCN replaced; 64 CUs was a hard "if we alleviate this bottleneck we may as well make a new arch" kind of problem. Fundamental things about GCN are great, but they had bottlenecks.

Personally, $450 for a 40 CU card does not leave a whole hell of a lot of room for a 'high end' card that is 64+ CUs, unless AMD wants to price themselves out of gamers' pockets.
This seems like a mid-gap between GCN and 'RDNA'. My understanding with Nvidia is they look at everything AMD makes, assume the best, and put a product out around that. AMD has something that is touching a 2080 and still has around 37% more to give, and you are looking at ~14 TFLOPS. With how Navi scales, it could realistically get close to a Titan if there isn't a crippling bottleneck between 40 and 64 CUs.
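A rough Python sketch of that scaling claim, using the usual CUs x 64 ALUs x 2 FLOPs-per-clock throughput formula; the 64 CU clock is an assumption, since a bigger die would likely clock a bit lower:

[code]
# FP32 throughput estimate for a GCN/RDNA-style chip:
#   TFLOPS = CUs * 64 shader ALUs * 2 FLOPs per clock * clock (GHz) / 1000
# The 64 CU clock below is an assumption.
def fp32_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"40 CU @ 1.90 GHz: {fp32_tflops(40, 1.90):.1f} TFLOPS")  # ~9.7, roughly a 5700 XT at boost
print(f"64 CU @ 1.75 GHz: {fp32_tflops(64, 1.75):.1f} TFLOPS")  # ~14.3, the quoted 14 TFLOPS ballpark
[/code]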

Nvidia could put out a new GPU lineup that replaces the current one at the same prices and competes better with AMD, or they could potentially undercut AMD's prices. Then, when Nvidia moves to 7nm, they get a ~30% base uplift.

AMD had a chance to grab some market share if they'd gone to the ~$300 range with the 40 CU part. Instead, they put it out to die, as it's priced close enough to Nvidia that people on the fence will go Nvidia.

>Raja's shit GCN
GCN isn't Raja's invention?

They have no Jim Keller to save them from their proverbial dumpster fire.
Also, don't discount how successful Nvidia's developer shilling/outreach/bribes/etc. has been. When Nvidia is helping you develop your engine, it leads to certain obvious benefits for Nvidia.

There will be a reactive price cut soon, especially if Nvidia's Super line puts the pressure on. Even if the RRP doesn't reflect it, retailers will have to adjust; my guess is ~$50 down.
Honestly, the RRP of all of AMD's top-end cards has always been pretty bad. The cut-downs, on the other hand, have been much better received: 5850, 7950, 290, Fury, RX 470, RX 570, Vega 56. The 5700 deviates from that and is too expensive.

LOL. Whose is it?

The problem here is that even the cut-down 5700 is too expensive relative to what I want to pay, and there is no true mid-range upgrade; everything is at an inflated 'costs too fucking much for me' price, and everything in my range still performs about the same as that range did when I bought new. The card beyond Navi will likely do the mid-range good, but the current cards are kind of pathetic for the price. Nvidia has SOME undercut potential being on a better-yielding/cheaper node, but they fucked up the die size to push ray tracing bullshit. Give me something with 2080 Ti performance without the ray tracing/AI bullshit; that would halve the GPU's die size, putting it in the 1070 price bracket if not lower.

Can't you find a cheap Vega 56?

I know how you feel about the performance thing. Got a 290 Tri-X at £215 before the 390X launch and even the 970 scandal. Then for years there was nothing beating it for the price, and only today is there better performance below £300, but not by much. I do have a "borrowed" 1080, so I'm good for another couple of years.
Nvidia is just taking the piss with prices. It started with the 8800 GTX, and then the Titan solidified the days where the media will peddle even the most inane shit; it's evolved into the current Titan price point being the 2080 Ti on a cut-down chip. I really expect the Super stuff to mix things up though; Nvidia's sales are dropping and AMD is now competing in the mainstream segments, although the 2060 and below were already technically better value, albeit a small improvement.

Where's my RX 5650 goddammit!

The AMD merger. ATI used to make legit amazing cards, but shit quickly went downhill once the AMD merger happened.
ATI exit when?

Never.

Can't make a dirt-cheap card on 7nm.

The last great "value" card was the 8800 GT. Pretty much everything after it ballooned in cost due to muh profits. It was also one of the last designs where you could get a performance card in a single-slot config. Thank god the games I play now are all older titles; a GT 740 is more than enough for them (1999-2004).

I was on a 280X, and MSI held the card hostage till they offered a 1060 6GB in return.
Nvidia's drivers bluescreening, black-screening, and other bullshit have put me off ever updating drivers unless required, because it takes a while till I get magic ones that work for everything without bullshit. I want off this fucking card so goddamn bad, but nothing has a performance jump that I can consider worthwhile.

The 56, while good, isn't a big enough jump. I want guaranteed 1070 performance in everything as a minimum, with better depending on optimizations, for the $250-300 range. The 56 is close, but it also comes with dealing with stock coolers on most of them or paying out the ass... honestly, I could probably find a used 1080 Ti in the same price range.

Honestly, here is what I see happening.

Vega was a dead end, and the VII got canned (with a potential cap increase if demand was there).
It got canned in favor of Navi, which showed great promise.
Navi is still not finished, but enough work has been done to justify an interim release.
Nvidia will counter either with better performance at the same prices, or by dropping prices.
AMD has some wiggle room to move prices down, as the best math I can do puts the card at costing around $180-280 depending on the cost of materials (this is for the non-cut-down ones; the cut-down ones you can more or less consider a salvaged part, so either the cost of the 7nm chip is written off, or you spread the cost across all chips on a wafer, which gives the full 40 CU part even more room to move down).
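For reference, the kind of napkin math that gets you into a range like that, as a Python sketch; the wafer price and defect density are pure assumptions (TSMC doesn't publish them), only the ~251 mm² Navi 10 die size is known, and this covers the silicon only:

[code]
import math

# Back-of-envelope silicon cost. Every input is an assumption except the
# ~251 mm^2 Navi 10 die size: 7nm wafer price and defect density are not public.
WAFER_DIAMETER_MM = 300
WAFER_COST_USD = 9000          # assumed 7nm wafer price
DIE_AREA_MM2 = 251             # Navi 10 die size
DEFECT_DENSITY_PER_CM2 = 0.2   # assumed

def dies_per_wafer(die_area_mm2, wafer_d_mm=WAFER_DIAMETER_MM):
    # Standard approximation: usable wafer area minus edge losses.
    r = wafer_d_mm / 2
    return int(math.pi * r**2 / die_area_mm2 - math.pi * wafer_d_mm / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2, d0=DEFECT_DENSITY_PER_CM2):
    # Simple Poisson yield model; die area converted from mm^2 to cm^2.
    return math.exp(-d0 * die_area_mm2 / 100)

gross = dies_per_wafer(DIE_AREA_MM2)
good = int(gross * yield_fraction(DIE_AREA_MM2))
print(f"gross dies: {gross}, good dies: {good}")
print(f"silicon cost per good die: ~${WAFER_COST_USD / good:.0f}")
# GDDR6, board, cooler, and margins come on top of this before you get to a card cost.
[/code]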

The next GPU from AMD will likely happen sooner than you think, because Nvidia has a 7nm part they can pull at any time, and they will fast-track shit when competition is actually solid, which it currently seems to be.

The next GPU will be the full new arch, not the part-GCN, part-new arch we have with Navi. If I'm wrong about Navi being half new, half old, AMD may as well give up, as Nvidia has a 30% boost they can trigger nearly at will.

AMD went all in on a "jack of all trades" GPU architecture (like they did with Bulldozer). GCN wasn't terrible for rendering early on, and cards like the 580 are incredible value. But GCN was/is only good at simple math, hence why it owned the entire mining market. Literally, without the mining craze they probably would have gone under; they got so fucking lucky that basically an entire industry popped up overnight that was the perfect market for their GPUs.

A Radeon VII has almost as much FP16 performance as a 2080 Ti, but that means dick-all when it comes to real-time rendering and geometry. Navi is the first step towards unfucking themselves in the rendering market, but at the same time Su bae mentioned at CES that Vega wasn't going anywhere, for a reason: it does compute better.
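A quick Python sketch of that FP16 comparison, assuming both chips do double-rate FP16 and typical boost clocks (the clocks are rounded assumptions):

[code]
# Peak FP16 estimate: shaders * 2 FLOPs/clock * clock (GHz), doubled for
# packed / double-rate FP16. Clocks are assumed typical boost values.
def fp16_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz * 2 / 1000

print(f"Radeon VII  (3840 SPs   @ ~1.75 GHz): {fp16_tflops(3840, 1.75):.1f} TFLOPS FP16")
print(f"RTX 2080 Ti (4352 cores @ ~1.55 GHz): {fp16_tflops(4352, 1.55):.1f} TFLOPS FP16")
[/code]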

>same price as fucking Nvidia's overpriced offers
>about the same performance but no ray tracing

>Especially considering Nvidia basically killed overclocking with voltage and power locking but AMD didn't.
>they did with Navi
>That's more because it's already overclocked.
HMMMMMMMMMMM

>If somebody is looking for a GPU right now, the XT seems like a great option.
Wow, aren't you breaking NDA? I didn't even know they were sampling them this early.

Attached: Raja-Koduri.jpg (711x400, 33K)

Yes?

Attached: _20190612_183435.jpg (720x410, 65K)

OK, so let's say you play older games, games that don't need a $300 or $400+ card in order to get high fps at max quality settings and high res. Pretty much any $100 card should do the job, because these games came out in 2004 or earlier, so the performance of a $100 card should blow away anything that was top of the line in 2004. But wait, it doesn't. I don't see any cheap 256-bit-bus cards. Meanwhile, the 8800 GT had a 256-bit bus and only cost $200. What I do see is higher clocks, but that doesn't mean shit if the data bus is thin as a toothpick.
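For reference, the formula behind the bus-width argument: raw bandwidth is bus width times effective data rate, not bus width alone. A quick Python sketch, with the 128-bit card being an assumed modern example:

[code]
# Memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in GT/s.
def bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
    return bus_bits / 8 * data_rate_gtps

# 8800 GT: 256-bit GDDR3 at ~1.8 GT/s effective.
print(f"8800 GT (256-bit @ 1.8 GT/s): {bandwidth_gbs(256, 1.8):.1f} GB/s")
# Assumed modern budget card: 128-bit GDDR5 at 8 GT/s.
print(f"128-bit @ 8.0 GT/s:           {bandwidth_gbs(128, 8.0):.1f} GB/s")
[/code]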

Read the three quotes to understand how dumb the double standards are.

The mining market bit AMD so hard in the ass that the GPU division nearly died because of it.

Nothing like having GPUs for gamers that no gamers own, while everyone who bought them is now flipping them for next to nothing, and the only way to sell new ones is to sell extremely cheap.

Now you have no gamer market, you have no miner market, and all the gamers bought Nvidia.

The only saving grace here for AMD is that Nvidia gave out such shit performance upgrades that people held onto older cards longer. The gen that comes next, the big performance-increase gen from both AMD and Nvidia, is going to be a floodgate gen, and AMD squandered it by pricing themselves out of the current gen. So it's even harder to get people to make the switch when Nvidia is going to have not only smaller dies but a more mature node and a 30% bare-minimum uplift, and AMD already has the 7nm node uplift gain.

The cards back then didn't have compression; a new 256-bit bus gets more through than an old one did.
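Roughly what that looks like in numbers; the 1.3x compression factor is an assumption (real delta color compression gains vary per game), and 448 GB/s is a 256-bit GDDR6 bus at 14 GT/s:

[code]
# Effective bandwidth = raw bandwidth * compression factor (factor is assumed).
def effective_gbs(raw_gbs: float, compression: float) -> float:
    return raw_gbs * compression

print(f"Old 256-bit card, no compression:    {effective_gbs(57.6, 1.0):.0f} GB/s effective")
print(f"New 256-bit card, ~1.3x compression: {effective_gbs(448.0, 1.3):.0f} GB/s effective")
[/code]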

>and AMD already has the 7nm node uplift gain.
This.
NVIDIA's better design is allowing them to use a cheaper, more mature 12nm node to compete with the 7nm node while probably making even more money than AMD. They can even drag the market forward on raytracing at the same time, which is yet another thing AMD will need to compete with.
When NVIDIA moves to the then cheaper and more mature 7nm next year, they can either destroy AMD on performance, or rake in the profits. Either way they extend the lead.
Hopefully Intel uses Samsung's 7nm to create something magical and brings more competition to the space.

traitor

No, Nvidia is on the (custom) 12nm node because it incorporates processes that make their retarded die sizes viable.

AMD and Nvidia have been looking at chiplets for a LONG fucking time, and ray tracing is honestly the best fit for them, as it's all retardedly simple but at the same time time-consuming math.
Nvidia forced ray tracing not because it was ready, or even good, but because it fucks their competition up. AMD had been able to do ray tracing better than Nvidia until tensor cores got used for it; even then, it's iffy whether Nvidia is better than AMD by an appreciable margin. All we know is that ray tracing is locked to Nvidia in their implementation.
The above with ray tracing does a few things:

1) It shifts the need from a strong CUDA GPU to one needing strong tensor cores, as Nvidia hit a bottleneck with Maxwell/Pascal/Turing performance, much like AMD did with Fury. But Nvidia, instead of making it better, decided to change how the game is played, because they have a head start.

2) Because ray tracing is VERY MCM-able (in fact it absolutely loves it), it enables Nvidia to brute-force ray tracing if they want to. Once ray tracing is done and can run acceptably in real time, there is no longer a need to reinvent the wheel every few years; there is just pushing to more and more passes, which will likely last for another 20+ years. So while it is an end game and will hit a point where more no longer matters, it has enough of a tail to justify the pursuit.
b) We already see Quake at 20 fps with full ray tracing at 4K, 40-50 fps at 1440p, and over 60 at 1080p... GG, this is it. If Nvidia goes chiplet and puts 2 RT cores on a package, they effectively double the ray capabilities; put 4 on it and they quadruple it (rough scaling sketched after this post).

3) It allows them to show in-game footage that looks better than AMD's when they lock the feature down.

It's honestly a very smart move, but AMD is a number-crunching king (at least traditionally), and a custom solution shits on Turing for AI... going to be a fun few years.
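A Python sketch of the scaling argument in point 2/b: assume the quoted ~20 fps at 4K fixes the ray budget of today's hardware, that fps scales inversely with pixel count, and that RT throughput scales linearly with the number of RT units; all three are simplifications:

[code]
# Illustrative scaling only: fps ~ (RT units * fixed ray budget) / pixel count.
RESOLUTIONS = {"4K": 3840 * 2160, "1440p": 2560 * 1440, "1080p": 1920 * 1080}

# Calibrate the budget from the quoted ~20 fps at 4K on a single GPU today.
BUDGET = RESOLUTIONS["4K"] * 20  # "pixels worth of rays" per second, hugely simplified

for rt_units in (1, 2, 4):
    for name, pixels in RESOLUTIONS.items():
        fps = rt_units * BUDGET / pixels
        print(f"{rt_units}x RT hardware @ {name:5s}: ~{fps:.0f} fps")
[/code]

With those assumptions, 1x lands on roughly 20/45/80 fps at 4K/1440p/1080p, which lines up with the figures quoted in b); the 2x and 4x rows just show the linear-scaling claim.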