Leatherman: "Radeon VII Lousy, Nothing New"

techspot.com/news/78202-nvidia-boss-jensen-huang-calls-radeon-vii-underwhelming.html
>Su showed off the Radeon VII’s power on stage with a series of benchmark tests and a 4K demo of Devil May Cry 5, which mostly ran at between 70 – 120 fps. Huang, however, was far from impressed. “The performance is lousy and there’s nothing new,” he said during a roundtable interview published by PC World. “[There’s] no ray tracing, no AI. It’s 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we’ll crush it. And if we turn on ray tracing we’ll crush it.”

>“It’s a weird launch, maybe they thought of it this morning,” he quipped.

>Responding to Huang’s comments about Radeon VII, AMD’s Su said: “What I would say is that we’re very excited about Radeon VII, and I would probably suggest that he hasn’t seen it yet.” Elsewhere in the interview, Huang said consumers on a budget were right to be angry at the RTX cards’ high prices, adding that the company now catered for this market with the $350 RTX 2060, which has “twice the performance of a PlayStation 4.” He also took a dig at Intel when asked about its move into graphics. “Intel’s graphics team is basically AMD, right?” Huang asked. “I’m trying to figure out AMD’s graphics team.”
I think we need more gigarays to properly render HOW MAD he is.

Attached: nvidia-jensen-huang-no-chill-amd-3_feature.jpg (1391x883, 108K)

leatherman bad
asian mommy good

He doesn’t even need to lift a finger to BTFO both intel and amdead.

Imagine being this naive and thinking any of this matters. Jensen Huang is Lisa Su's uncle and their family control the entire GPU industry.

Of course he would say that of competing products

interesting, I see he subscribes to the "be a cunt in public" school of business pr

>begins to recite “Ragie Wagie” by user

>And if we turn on ray tracing we’ll crush it.
Man I sure wish AMD would have a handy switch to cut FPS in half. It sounds so practical.

COPE

>fuck new technologies
>PROGRESS BAD
Sasuga AMDrone

he's probably talking about visual quality, and he's right. there's no reason at all to buy this radeon piece of shit as a gamer when the RTX cards have the latest and greatest cutting edge features and cater to grafixfags with ray tracing.

>>fuck new technologies
>>PROGRESS BAD
I have no problem with new technologies, but the pros must outweigh the cons. If FPS drops significantly I'm turning that shit off. Then it's just useless weight that I still paid for. When it stops affecting FPS I'll use it.

>tfw his company released a meme of a card that came out too early even tho they had more than 2 years of time and gazillions of dollars

>2070 can't even compete with 1080ti

>2080ti is at Titan prices

>"budget" 2060 is more than 100$ expensive than previous gen

Get out Nmemevidia shills.

bruh watch thos ,

>halving FPS for barely noticeable eye candy
>progress

Give the guy a break. He's got a lot on his mind after Nvidia's stock collapsed.

>tfw AMDead only catched up with 1080 released in 2016

I'm mostly mad about him saying they invented VRR and that FreeSync doesn't work.

ray tracing in real time will never not affect fps, you moron. it's computationally demanding, as are most graphical advances. hardware just gets better at dealing with it. by your retarded logic we may as well stick to N64 graphics with no AA, no AO, etc. because they all affect fps

Attached: 1486057857047.jpg (259x383, 81K)
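Quick back-of-envelope sketch in Python of what "real time" ray tracing actually asks for. Every number below (resolution, fps, samples, bounce count) is an assumption picked for illustration, not a spec or benchmark of any card:

```python
# Back-of-envelope: why real-time ray tracing eats frame time.
# All numbers are illustrative assumptions, not measurements.

width, height = 3840, 2160          # 4K resolution
fps = 60                            # target frame rate
primary_rays_per_pixel = 1          # 1 sample per pixel, already very low for RT
secondary_rays_per_primary = 3      # e.g. 1 shadow + 1 reflection + 1 bounce ray

pixels = width * height
rays_per_frame = pixels * primary_rays_per_pixel * (1 + secondary_rays_per_primary)
rays_per_second = rays_per_frame * fps

print(f"{rays_per_second / 1e9:.1f} gigarays/s just to trace the scene")   # ~2.0
```

And that's before denoising, BVH updates, or shading any of the hits, which is the whole reason the load gets shoved onto dedicated hardware in the first place.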

He's just seething that ray tracing will get better with a new generation narrowing the FPS gap while AyyMD has no business model but to keep selling overclocked Polaris and GCN trash.

The Bogdanoff of deep learning lmao

Nvidia really should have held off on the RT/Tensor core implementation until they had gotten fab space at TSMC for a node shrink. The die itself is already fuck hueg. I know the argument will be
>MUH MACHINE LEARNING
>MUH DATACENTER/ENTERPRISE
That's fine, there's entire product lines dedicated to those client use-cases: Quadro & Tesla
They should have segmented the Turing features where they made sense. Instead of waiting for the underlying tech's cost/performance penalties to decline and for developer support to mature organically, they tried to rush it and have left a bad taste in gamers' mouths with respect to the entire rendering concept; most now perceive it as bloat that drives up GPU prices in the wake of the mining craze, which breeds resentment. And honestly, unless they have some inside assurance from AMD that Navi will support hardware-accelerated ray tracing for consoles, Nvidia may have shot themselves in the foot, since no large number of developers will spend the time/money to add features that only a subset of PC players will be able to utilize. They can already fake reflections, soft shadows, and local lighting well enough.

except it catched(sic) up with the 2080 you dumb disinfo shill

>ray tracing in real time will never not affect fps
Let's all pretend that Nvidia doesn't have separate hardware to deal with ray tracing. If they had more of it, it would be smoother.

AMD will have to come up with a ray tracing solution because Microshaft and Snoy will demand it in future console generations, maybe not the next one but the one after.
I also think that they would not abandon AMD for Nvidia or Intel because those two companies aren't willing to sell their chips for peanuts (as we can see from the Switch price point of around 300€) which is ridiculous if you ask me.

I agree with Leatherman here, the only positive thing I can say about Radeon VII is that it will probably be really cheap in comparison to Nvidia's offering, but everything else is just disappointing. While at this point ray tracing is still a meme in the games industry due to a lack of games supporting it (same for DLSS/AI), those games will eventually be made and the tech will only continue to improve going forwards.

Nvidia making the RTX cards is a gamble, one which would usually fail if AMD had good high-end cards when they launched, but it will now likely pay off because AMD was fucking around doing nothing to challenge them. My guess is that AMD will suffer for it in the next generation, when ray tracing doesn't tank the FPS in half and the graphical upgrade looks good enough to always be enabled; then AMD, lacking these features, will appear to be antiquated in comparison.

>We supported freesync please make cucktracing a reality

Nigga why you so insecure

Why the fuck would Microsoft and Sony give a shit about cucktracing?

>Hey AMD, can you implement this really shitty idea that will utterly tank our consoles' limited graphics and fps?

Since AMD controls the consoles, it's up to Nvidia to make it worth AMD's time, and so far it isn't worth shit.

AMDjeets when RTX2080 was released
>I-it's the same as GTX1080 but with ray tracing nobody wants
Nice damage control, faggot.

Attached: 1497910377702.jpg (568x612, 66K)

The Xbone and PS4 will have had a 7-year lifespan by the time of their replacement in 2020, so unless the refresh cycle for consoles grows shorter (it well could) I don't think AMD has to worry all that much about a lack of design time before the next-next console gen.

Attached: maxresdefault.jpg (1280x720, 116K)

Yes, let's pretend that the RT cores, just like the Tensor cores, aren't just crippl-..specialized cores that do nothing for the general consumer and are wasting die space that could've been used for more CUDA cores. But no, everyone who buys a GPU these days also runs AI-assisted dataset filters and machine learning. I could tinfoil your shit up about closed source drivers and the fact that the only way to see if the RT cores do anything is by checking your card's power consumption with 12V rail clamps, but that's too far-fetched even for me.

Nvidia being cunts doesn't stop him being right. Other than the memory specs being totally badass, the level of performance it's looking to achieve is 18+ months too late and 50W too hungry.

Still, it's only one of their releases, and it competes with (and in some cases beats) Nvidia's second-highest card at 100 bucks less, not including driver refinement, which is AMD's bread and butter.

What the fuck are you talking about, Pajeet? You do realise that the 1080 and 1080 Ti are different cards, right? Your manager should have properly briefed you on the product stack before you started.

even if that's the case, (You) comparing it to the older card implies that its performance is outdated, which is disingenuous shilling. now I need to figure out why you are doing all of this for free

> nVidia pushing technology before it's ready, again
Reminds me a lot of their 3D Vision thing. Remember that? 3D but this time it's good, we promise! Buy our special 3D Vision monitors and our 3D Vision glasses and make sure to buy 3D Vision certified games! Oh and buy a second card to render that second frame!

At least it meant they had the tech mostly ready for when Oculus finally appeared and now I can look at waifu butts in 3D.

I'm still holding out for their navi gddr6 stuff, that's going to be a lot more interesting.

> 7-year lifespan
What? They were already replaced by the Xbone X and the PS4 Pro. I mean yeah, you can run new games on the old ones, but all serious gamers have moved to the new gen.

Nvidia Hates Intel, so what if this is just a setup to hide that they plan to kill Intel? Now this is some family dynasty level stuff right here. Nvidia gets the GPU market and AMD gets the CPU market.

Have the monthly sales or install base of the Scorpio/PS4 Pro exceeded the originals'? If so I'll retract my point.

Imagine being an Nvinigger

Attached: IMG_20190111_083525.jpg (1440x2267, 381K)

>Nvidia Hates Intel
Why?
Isn't it a "the enemy of my enemy if my friend" type of deal?
I know they used ARM cpu for their jetson but still

Not when AMD produces both GPUs and CPUs; if AMD only produced CPUs like in the ultrapast, they would still be friends with Nvidia.

jews hate other jews, plain and simple.

I remember back when Nvidia made chipsets for AMD motherboards. Simple times..

Attached: Capture.png (671x730, 563K)

Is he scared or what.
>nothing new
DLSS might be nice? Idk. But who gives a shit about ray tracing. Way too costly on performance for a gimmick. I want to play at 1440p with decently high fps without spending a fortune.

I still think they fucked up good buying ATI; maybe if they hadn't, AMD/Nvidia APUs would exist right now.

>ray tracing

Attached: IpDAgCi.gif (250x250, 396K)

Honestly, there's no need for this kind of behavior even if you're at the top. It's petty and childish, but it makes sense to make such statements to deflect attention from the fact that your company is getting ready to face a class action lawsuit brought against it by investors for lying to them.

Jensen, in investor earnings calls through Q4 '18, straight up lied for months, claiming the record-breaking growth (in the closing months of the FY, when GPU sales usually drop off because gamers save their money for the next lineup) came entirely from gamers buying their GPUs because everyone wanted to play PUBG and Fortnite and the competition simply couldn't deliver the price/performance. Except in those months, 90% of the sales were cryptominers gobbling up the cards no matter how crazy the prices got. Then, when investors pressed him on that and on inventory management, he lied further and told them that he and Nvidia are masters of inventory/channel management and that even if the cryptomemer bubble burst, they'd stay on top of their game--in time for their RTX release, which TOO would sell like hotcakes.

But then the cryptomining bubble burst. In a Jan 2019 earnings call, an analyst from a major bank asked him what happened--and he fucked up; he went on record stating something along the lines of "we were really surprised by everything, we didn't expect any of this to happen; we're still evaluating..."

Anyone who'd been reading in and heard Jensen's words over the prior three months, and then that statement, would realize that he had fucking lied and lied for 3 months. Their stock tanked from 290 to 145.

>"The Company's public statements were false and materially misleading," argues the complaint from a Los Angeles law firm, seeking investors who purchased shares in NVIDIA between August 10, 2017 and November 15, 2018.

yro.slashdot.org/story/19/01/05/0312215/nvidia-slapped-with-class-action-lawsuit-tied-to-cryptocurrency-implosion

He's about to get reamed.

Attached: 1489157908943.jpg (650x488, 32K)

Had an nForce2 DFI LanParty with an unlocked Athlon XP 3200+ at 2.5GHz and a Radeon 9700 Pro back in the day. Simple times indeed, and EVERYONE wanted them.

Meanwhile AMD has the opportunity to fuck em good and they only released Radeon V(ega) II instead of either Navi or a decent 2060/2070 competitor at a lower/similar price for gamers.

Those were the days man.

Attached: 1531379630446.png (719x768, 753K)

If Zen 2 is basically taping out of the engineering sample stage, then Navi is somewhere around there too. Early samples are promising, but driver work has a very long way to go before it's ready for demoing. Additionally, even if they had Navi ready, there's a good chance they can't show off its potential until much closer to 2020 (when the PS5 & Xbox Scarlett are expected to launch). There are probably contractual bindings/NDAs with Microsoft & Sony with regard to Navi, as both are footing part of the bill for developing that GPU core.

Navi is the first true semi-custom GPU core and is therefore a different beast altogether, which has massively positive long-term market implications in the desktop space. Vega II is most definitely a stopgap, minimizing losses on MI-50/60 accelerators that didn't make the enterprise cut. They're being sold to let investors know that AMD has bridged the performance gap between last-gen Nvidia and this-gen Nvidia at the 2080 tier. The margins on the Ti and TITAN cards are fucking garbage--those segments are purely dick-waving and can only be done when you have stupid amounts of money in the bank. It's meaningless to pursue them when you are trying to work in multiple markets: embedded, ultra-thins, mobile, desktop, enterprise, and console. Nvidia has no CPU division and basically only has to compete against AMD in 5 of the 6 markets, and only with GPUs.

They talk shit, and to some extent they're allowed to, because their products are better; but you have to look at that inside of a bubble. If you were to tell them "ok faggots, compete with CPUs and GPUs," they'd shut the fuck up like someone sewed their mouths shut. They have no answer for CPUs, because they have no x86 license. Nvidia is the biggest player at the fucking park, but it's still a small player on the world stage beyond its specialty. Their ARM ventures bore no fruit.

And in 2020, Intel enters the market with discrete GPUs. They're VERY afraid.

Well for now it's a switch for "hardware and driver not supported" LOL.

He's a "be a MASSIVE cunt in private" kinda guy though.

>barely noticeable eye candy
Just like the barely noticeable AMD texture filtering, right?

>when you were a 3DFX fanboy and it's 2019

Attached: 5344957855.gif (305x164, 1002K)

True that about Intel being afraid; they should be. At the moment the only thing they've gotten right are IGPs, and the last time they tried to make a "GPU", well, we all know how that turned out. Thank you user, you have some great arguments.

It's a weird situation to be in right now as a consumer.

If you own a 9 series card and didn't want to upgrade to the 10 series, the 20 series is not likely a big enough upgrade for the cost.

I mean, if you held off on buying a 1070, why would you go buy a 2060 for the price a 1070 would have cost you 2 years ago?

Now we live in an insane duopoly...

I wonder why he suddenly needs to bash the competition. Isn't that usually a sign of weakness/a mistake?

Attached: sophie questions.jpg (401x500, 33K)

Larrabee wasn't really a GPU so much as a many-core CPU repurposed for vector tasks. A CPU can do everything a GPU can do, and can do it more accurately--but the scaling for that is trash, and you need a crazy number of CPU cores to do what a single GPU can do in real time. There's a reason huge render farms are still massive piles of CPU cores rather than massive piles of paired GPU accelerators. Still, it'll be interesting to see what they do with their 2020-2021 discrete core. They'll most likely launch their first gen on their extremely mature 14nm+++++ process, and then refresh up to 10nm and so on.

Technically speaking, they could just take their IGP and scale it up to the size of an Nvidia Turing die and ship it with 8GB of GDDR5X and call it a day. The hardware is plenty capable, just the drivers are complete shit and that's their Achilles heel--even more a joke than AMD's driver situation at their worst.

I don't get it either. Nvidia has at worst matched AMD's price-to-performance at mid range, and has had better enthusiast cards for years now.

The 7970 and 7950 were the last worthwhile AMD cards outside of finding a good sale.

>The hardware is plenty capable, just the drivers are complete shit and that's their Achilles heel--even more a joke than AMD's driver situation at their worst.
If that's the problem, then they're already ahead of where AMD is against Nvidia, because they have the dosh to throw at a software team large enough to rival Nvidia's, not to mention the development budget to answer GameWorks.

In the end, money is the name of the game.

Looks like Lisa finally took the last step and went on hormone replacement. Why the fuck did she have to cut off those awesome milkers? Gonna miss them.

Attached: lisa_su.jpg (275x183, 5K)

leatherman
>new radeons are lousy and don't offer anything new
journalist
>but if they're on the level of your best card, doesn't that mean you're dissing your own product?
leatherman
>they can be above our products but our products are better even when they are bad at it
journalist
>not sure what world you teleported from, but here on earth that means bad=worse
leatherman
>no more free gpus for you

but that IS mommy. She's on hormones and had the titties cut off.

Money they might not have in the future, when the Taiwanese and mainland factories close up due to "State Inspections" right after the US imposes another tariff hike.

Remember how he said that the Titan series is a cut-down version of Quadro, yet somehow magically when the Vega FE was shitting on the Quadros and Titans he enabled the rest of the CUDA cores?

FUCK YOU, LEATHERHEAD

Intel is the hero we deserve.

Attached: maxresdefault.jpg (1280x720, 94K)

NONE OF YOU GET IT

LEATHERMAN AND LISA SU ARE THE SAME PERSON

>no more free gpus for you
Keep seething Steve

That's what you get for blatantly shilling exclusively for AMD

This never happened. They added some of the Quadro driver optimizations for pro visualization to the Titan drivers.

Intel's GPU drivers have traditionally been kind of rubbish. Whatever the case is, they don't invest a lot of effort into GPU drivers.

I believe the quadro-user

QUADRO GET.

Attached: 0001050_nvidia-quadro-p4000-8gb-gddr5-4displayports-pci-express-video-card.jpg (1280x960, 100K)

Driver Gains:
Catia - 72% Increase

Creo - 107% Increase

Energy - 54% Increase

Medical - 53% Increase

SNX - 654% Increase

Solidworks - 95% Increase
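For anyone eyeballing those figures, here's the same list converted to raw multipliers. The app names and percentages are from the post above; the underlying benchmark scores aren't given, so this is just the arithmetic:

```python
# Convert the quoted "% increase" figures into multipliers over the old driver.
# No actual benchmark scores implied, just percentage-to-multiplier math.

gains_pct = {
    "Catia": 72, "Creo": 107, "Energy": 54,
    "Medical": 53, "SNX": 654, "Solidworks": 95,
}

for app, pct in gains_pct.items():
    multiplier = 1 + pct / 100      # a 654% increase means ~7.5x the old score
    print(f"{app}: {multiplier:.2f}x the pre-update result")
```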

yes
>some optimisations

as if this kind of performance on a half-rate card comes without unlocking hardware features

What is his problem? It looks extremely odd, him acknowledging that AMD exists. Is he nervous? I mean, they're just bottom-of-the-barrel MI50s never meant to see the light of day on mainstream. I like how the die is much smaller and it's competing with fuck huge Xbox dies; what I don't like is the double 8-pin.
Also, this might be the last chance in a while to get SR-IOV on the mainstream. If this card doesn't have it, you can kiss it goodbye.

Attached: thinking Darkness.jpg (871x917, 101K)

>what I don't like is the double 8-pin.
You've got a 650W PSU, right? Why would anyone buy a smaller PSU unless you're a hipster faggot without a tower case?
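Rough power math behind the double 8-pin gripe and the PSU sizing above; the connector ceilings are the PCIe spec maximums, while the non-GPU headroom figure is just an assumption for illustration:

```python
# Worst-case board power a dual 8-pin card is allowed to draw per the PCIe spec,
# plus an assumed allowance for the rest of the system (illustrative only).

slot_w = 75            # PCIe x16 slot delivers up to 75 W
eight_pin_w = 150      # each 8-pin PEG connector is rated for 150 W
connectors = 2

gpu_ceiling_w = slot_w + connectors * eight_pin_w    # 375 W worst case
rest_of_system_w = 200                               # assumed CPU/drives/fans headroom

print(f"GPU ceiling: {gpu_ceiling_w} W")
print(f"PSU you'd want: >= {gpu_ceiling_w + rest_of_system_w} W")   # ~575 W, so 650 W fits
```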

Why would you have SR-IOV on a consumer card?
Enterprise pays $4000+ for that feature, and you are just a poor goy.

I got a 750W PSU but that's beside the point. Since it's dumb Vega, it's inefficient as fuck despite being on a newer node.

>why would a consumer GPU have tensor cores

Those were for viewport performance, which uses conventional renderers through DX or OpenGL. Double precision has zero impact on it; the gains come from tile-based rendering optimizations.

Why doesn't he get out of that uh, jabroni outfit?

>AMD will appear to be antiquated
Moot point. Most people buy Nvidia either way. AMD's strategy is paying off with its CPU division. The question is when it becomes viable for AMD to focus on its GPU division.

Wait...
Lisa Su
Leesa Sun
Lease Son
Leathe Mon
Leather Man

Holy shit, you're right! DUBS=TRUTH

Totally. Look closely, Lisa has been transitioning gradually over the last few years so we're not shocked. She only recently took the hormone therapy. Compare.

Attached: im_now_Man.jpg (210x247, 12K)

See, these latest photos of leatherman are actually Su AFTER the hormones.

Attached: compare_to_lisa_su.jpg (1920x1200, 209K)

Wtf are you on, they look nothing alike. Are you racist or something? All Asians look the same to you?

They are family so it's not racist to say that they look alike.

What a whiny little bitch

Not racist. They LITERALLY look similar even though they are chinks. Can't you see the jawline? It's obvious.

The second photo is trimmed. You can see "he" has mantits in the original photo. She/he hasn't had their milkers removed yet.

I ran their faces through nVidia's AI image processing framework and got 0% similarity between them. They're obviously different.

See? This photo looks like Leatherman but with lipstick/makeup on. It's really obvious if you squint.

Attached: leatherman.jpg (1800x1100, 86K)

Xie even has a dildo attached to its ear and near its mouth.

>I ran their faces through nVidia's AI image processing framework
you mean, you ran it by your cat?

Listen, nVidia has DLSS and RTX technology. I leveraged their unparalleled performance in synergy analytics for big data artificial intelligence image processing deep learning and ray-traced the results through a graphical user interface. They do not match.

kek