Why does the amd shill fear rtx?

Attached: nvidia_rtx.png (779x332, 7K)


Because AMD has nothing.

Because AYYMD will take 6 years before they have equivalent functionality, by which time Nvidia will have leapfrogged them in performance through continuous iteration on new GPU microarchitectures

Nvidia released the Tesla microarchitecture CUDA GPU, the GeForce 8, in 2006; AYYMD had no answer until 2012, and their Terascale turd garbage had poor compute performance

Attached: 1492573533249.jpg (796x805, 159K)

Your bitch cunt whore mother is a slut? Everyone already knows that, faggot

Too bad she didn't abort you, saving the precious oxygen in this world instead of being wasted by a cunt like you

Ooooh snap
he mad!

Why does the midrange RTX card cost $100 more than the midrange of last gen and only run on par with the flagship of last gen?

Shouldn't you be checking your stock price, Jensen?

(You)

>enable rtx
>frame rate nose dives to 50%
What is there to fear exactly?
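The "frame rate nose dives to 50%" complaint is easier to see in frame-time terms: halving FPS doubles the time budget per frame. A toy sketch of that arithmetic (the 90 FPS starting point is hypothetical, not from any benchmark in this thread):

```python
# Toy illustration: converting an FPS drop into added frame time.
# All numbers here are hypothetical examples.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent rendering each frame at a given frame rate."""
    return 1000.0 / fps

base_fps = 90.0           # hypothetical frame rate with RTX off
rtx_fps = base_fps * 0.5  # "nose dives to 50%"

extra_ms = frame_time_ms(rtx_fps) - frame_time_ms(base_fps)
print(f"{extra_ms:.2f} ms added per frame")  # ~11.11 ms at these numbers
```

The same 50% cut costs more absolute frame time the lower your starting frame rate is, which is why the hit feels worse on midrange cards.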

Because they aren't designing their chips for machine learning, and Nvidia is repurposing their machine/deep learning architecture to be used for ray tracing

i dont want it to explode my pc

I fear it may burn down my house

> ARE TEE REX
> IT'S IN THE GAME
What did Nvidia mean by this?

Same thing with tessellation, except AMD can't do it at all this time. It took them until Fury X to have a built-in tessellation engine.

Navi should already be deep enough into development that they can't slap it on. So it'll be at least another 3~5 years before AMD is capable.

Right now, that's accurate, because there's still just a single game that supports it.

Attached: if i see that fucking chitose one more time.gif (500x281, 1.73M)

Tessellation was yet another useless feature that added barely any visual improvement at the cost of FPS. Hairworks was worse, though. Raytracing has to be the worst by far.

RTX on suicide watch

Attached: oh no no no no.png (791x797, 53K)

Please don't post your ignorant comments here

Nvidia implements tessellation correctly per the Microsoft DX11 spec, which calls for 64x support, and handles 64x tessellation without losing performance, unlike AYYMD garbage

Even funnier, AYYMD bitched about tessellation being left out of DX10 when their own tessellation unit can't even handle it properly per the Microsoft DX11 spec

Just because you're extremely biased to AYYMD doesn't mean you should show your extreme ignorance by saying tessellation is a useless feature when AYYMD themselves agreed to the DX11 spec in the first place

Nvidia, a forward-thinking and visionary company leading the masses to a brighter future with real-time ray tracing, while AYYMD can't compete at all, being Sony's little bitch for the PS5

It was still useless, though.

based reddirt-spacer

Someone like you without any vision will say that, obviously, keep being an ignorant faggot

All of Nvidya's gimmicks are shit and remain shit.

t. brainlet gaymer who cannot distinguish between gimmicks like hairworks and actual technology

LOL what a retard you are, you don't even know history

AYYMD was first with tessellation gimmicks like TRUFORM

anandtech.com/show/773

Keep on being an ignorant retard, zoomer :^)

If it cuts FPS to 1/3, it's a gimmick. I don't care how useful it is outside of gaymes.

Their gimmick of dominating the high end seems to be working for them

But it was still useless. Actually, it was a lot more useful back in 2001 when poly models weren't nearly as complex.

1080 ti has 1.5% marketshare on Steam.

>Why does the amd shill fear ghosts?

NVIDIA SHARE PRICE IS TWENTY PERCENT DOWN

AYYMD's Truform "tessellation", they couldn't do it well in the past and they certainly can't do it well even today

Attached: ltuij49.jpg (461x354, 32K)

MID range
It's not supposed to outperform the flagship of last gen. It's midrange. End of, simple as.
Get done, idiot.

Amd for your cpu, nvidia for your gpu.
If you are doing anything else, especially out of ((brand loyalty)), you are a total dunce.

Meme tracing is useless in gayming cards.

>nvidia stock tanking
>RTX cards glitching and bursting into flames
>RTX ON cuts FPS down to 1/3

b-but look at this ancient image of some driver bug! (that was probably fixed the next week)

Attached: Loli_Laugh.gif (260x317, 390K)

AYYMDPOORFAGS can't accept the bitter truth

Nvidia is pioneering the future with RTX real time ray tracing and tessellation done RIGHT while AYYMD is stuck in the past with PS5 console garbage

RTX 20 series is being widely adopted and its marketshare increases every day while AYYMD serves you regurgitated HOUSEFIRES garbage like Poolaris 590 that doesn't even support Feature Level 12_1

>This is a D3D feature level 12_0 part – meaning it lacks 12_1 features like conservative rasterization and raster ordered views. Which was fine back in 2014, but NVIDIA has been shipping 12_1 hardware since 2014 and AMD since 2017. So from one perspective, a brand-new Radeon RX 590 in 2018 is still lacking graphics features introduced by GPUs 4 years ago.

AYYMD is a company with no foresight, no vision and no creativity, unlike Nvidia; well done holding back the industry as usual

What does it take to be this butthurt?

This was never fixed because Truform is garbage and AYYMD gave up on Truform almost immediately

Stay mad though, AYYMDPOORFAGs that don't understand 3D history :^)

The only butthurt I see is you, AYYMDPOORFAGS with no vision, no foresight and no willingness to take risks

I'm willing to bet that image is older than you.

It's older than an ignorant zoomer like you who doesn't understand history, that's for sure

>ignorant
Nothing of what you say refutes the post you replied to, which said that tessellation was another useless feature. Whether or not it's "implemented correctly" doesn't make it more useful as a feature, and it is still mostly useless. I actually use nVidia for graphics, but I can definitely see that AMD just hasn't cared about implementing tessellation efficiently simply because who cares.

>ctrl+f
>AYYMD
>17 results

Someone's a little analblistered.

Attached: nvidia tessellation goggles.gif (500x500, 994K)

Amd did tesselation first though.

Then AYYMD is a retarded company for implementing tessellation first, in the Turdscale 5000 series garbage, since it's a useless feature

That doesn't mean that nVidia cards doing tessellation is something to brag about.

Found some old videos and aside from the FPS being crap (because it was recorded on an old ass PC apparently) I don't see any obvious visual issues. youtube.com/watch?v=d8enuguM3lA
At least they realized it was a waste of time first.

You seriously just live on this board spamming AYYMD all day. You've been doing it for years. Pretty sad, honestly.

why are you so mad? LOL holy shit.

>DXR finally launches
>it's an utter disaster
>novidya stock tanks, as seen in
>within hours, OP comes and shills for novidya
Pic is related.

Attached: 229337994_fd22c810f9.jpg (500x327, 174K)

Wrong, it's part of the DX11 spec and Microsoft made the requirements clear; it's AYYMD taking shortcuts by making a garbage tesselator unit despite agreeing to the DX11 standard in the first place

Nvidia implements it correctly and handles 64x with grace and finesse without performance tanking like AYYMD, simple as that, stop trying to argue with your ignorant ways

AYYMD can't even implement primitive shaders that actually work, being broken in hardware in Pooga while Nvidia Turing mesh shaders just WORKS beautifully as tested by developers

Nvidia is the real innovator, first to CUDA compute GPUs, first to real tessellation that doesn't tank in performance, first to real time ray tracing

The pure bitterness on AYYMDPOORFAGS vile attacks on RTX pretty much says it all

Just stick to your PS5 babby console, it has Poovi in it

Okay kiddo. That's the normal protocol you fucking tool.

Are you seriously retarded?

can't ray trace my encoded videos so idk.
what I actually thought was a bit funny is that video decoding using vulkan works better on nvidia than on amd.
or at least it did half a year to a year ago. haven't benched it since then

>Wrong, it's part of DX11 spec and Microsoft made it clear the requirements
Who made Microsoft gods? Who even doubts that novidya forced tessellation into the DX11 spec because goyworks?

>first to CUDA compute GPUs
>nvidia was first to their own proprietary interface
Now there's a shocker. In other news, AMD was the first to implement OpenCL, an actual standard.

It's their platform and their API

Don't like it? TOO FUCKING BAD for you

Wow, RTX is great. I'm glad I spent $1500 for this.

Attached: gta_sa_2018-11-15_13-07-01.jpg (2560x1440, 1.16M)

RTX is just another gimmick thing no one cares.

LOL NO

Nvidia was first to OpenCL

nvidia.com/object/io_1240224603372.html

MONTHS before AYYMD

Again, people that don't know and fail to learn history are doomed to repeat it

>Wrong, it's part of DX11 spec and Microsoft made it clear the requirements
Well yeah, it's a standard interface for using tessellation, but that doesn't mean implementers who don't care about the underlying feature have to implement it well.

What are you doing dude? No one cares. It's literally money, you're arguing shit that means nothing. SAD.

Attached: 1300044776986.jpg (250x250, 17K)

Tessellation is very similar to a technique used in CG for film. It makes sense for GPU vendors to push the visual envelope for games. It's a shame tessellation hasn't taken off, largely due to politics.

en.wikipedia.org/wiki/OpenCL

>April 20, 2009: Nvidia announced the release of its OpenCL driver and SDK to developers participating in its OpenCL Early Access Program
>September 28, 2009: Nvidia released its own OpenCL drivers and SDK implementation.

NVIDIA, FIRST TO IMPLEMENT AND SHIP OPENCL TO CONSUMERS MONTHS BEFORE AYYMD

AYYMDPOORFAGS DON'T LEARN FROM HISTORY AND THINK THEIR GARBAGE COMPANY IS GOOD, TOP KEK

>63 replies
>24 posters
Oh, it's one of THOSE threads huh.

This board isnt fun anymore. The brand wars and people actually unironically raging about it every day is really a drag. Half the threads on this board is someone trying to defend a purchase

AYYMDPOORFAGs lie about things and history, correcting their lies isn't defending purchases

RTX buyers are very happy with their cards and they're the most power efficient, highest performance GPUs in the world; AYYMD has no answer until 2020

too bad initial OpenCL was SHIT. Not a fan of the fragmentation in GPU compute, but I can see why it happened.

dude reddit lmao xD

>APRIL 20, 2009
web.archive.org/web/20080916151600/https://www.amd.com/us-en/Corporate/VirtualPressRoom/0,,51_104_543~127451,00.html

How so?

There always were and will always be shit threads like this one. Doesn't mean there isn't a somewhat good thread every once in a while too.

AYYMD didn't ship anything that supported OpenCL on that day

en.wikipedia.org/wiki/OpenCL

>December 21, 2009: AMD released the production version of the ATI Stream SDK 2.0,[86] which provides OpenCL 1.0 support for R800 GPUs and beta support for R700 GPUs.

Stop lying about history, AYYMDPOORFAGS

Nvidia was FIRST to OpenCL support, MONTHS before AYYMD

>April 20, 2009: Nvidia announced the release of its OpenCL driver and SDK to developers participating in its OpenCL Early Access Program.
>September 28, 2009: Nvidia released its own OpenCL drivers and SDK implementation.

Point being that they were first to stand behind it, unlike nVidia, who were more keen on developing their own proprietary lock-in standard and didn't care about openness and compatibility.

What is Mantle? Stop trying to claim AYYMD is "OPEN"; they're not, they pushed proprietary stuff too, and no, Vulkan is not Mantle, it's totally different from Mantle even at 1.0

AYYMD IS NOT AN OPEN COMPANY PERIOD

They literally gave Mantle away and willingly abandoned it in order to bootstrap the Vulkan effort. They just pushed it to demonstrate what was possible and to get a standardization effort started.

For one, performance. CUDA ran circles around it.

It would be great if Nvidia would kiss and make up with OpenCL and really devote themselves to it, but they have no need. They got us by the balls and have zero incentive to let us go.

Because it might burn their house down.

>For one, performance. CUDA ran circles around it.
Sounds more like a driver issue than an issue with the standard.
>They got us by the balls and have zero incentive to let us go.
Exactly my point.

33%

>fear rtx
Uh, yeah. Literally shivering over here.

Attached: mpv-shot0001.jpg (1920x1080, 211K)

Nvidia and AMD stock tanked hard, but Intel's stock only tanked a little and is fast recovering. What did they mean by this?

>AYYMDPOORFAGS bitching about proprietary
>Using PROPRIETARY Windows OS, TVs with proprietary hardware & OS, microwaves with proprietary software, proprietary ECUs in their cars, even their AYYMD CPU or GPU is PROPRIETARY and the Windows GPU driver is PROPRIETARY

Always hilarious to see AYYMDPOORFAGS HYPOCRITES in action

Why don't you AYYMD guys stop using proprietary stuff that you use every fucking day in life? :^)

nope, it wasn't drivers. It was enough of a fuck up to allow CUDA to dominate. The only thing that will save us is that everyone hates Nvidia.

JEW FEAR THE SAMURAI

There's also a pretty big difference between proprietary implementations and proprietary standards. Also I'm not using Windows.

Stop using games consoles too, they have PROPRIETARY AYYMD hardware that certainly isn't OPEN

AYYMDPOORFAGS hypocrites, so funny when you undress them naked for all to see what a huge faggot these hypocrites are

>Stop using games consoles too
Can't really stop using something I never used to begin with.

>handles 64x tessellation without losing performance
i_cant_even.png

scalibq.wordpress.com/2010/10/25/amdrichard-huddy-need-to-lie-about-tessellation/

scalibq.wordpress.com/2010/12/01/amd-tries-to-do-more-damage-control-for-tessellation/

scalibq.wordpress.com/2011/12/24/amd-and-tessellation-a-difficult-relationship/

AYYMD tessellation has always been garbage in performance

>AYYMD
>tessellation

CHOOSE NONE

Attached: 101948.png (678x400, 26K)

>Cherry-picking a synthetic bench tailored towards vendor X that does not reflect real-world performance
>Being this much of a shill

Attached: 1267133339395.jpg (400x300, 38K)

>>frame rate nose dives to 50%
you are being very generous here

>They literally gave Mantle away
Once it became clear it was going nowhere.
Much like Close-To-Metal.
Pursuing open standards has only ever been a hail mary to give AMD relevance when its proprietary plans fizzled.

Honestly I don't think anyone should fear RTX except nvidia shills - the performance is pathetic, and no one will use the new tech until 5+ years down the line when a 2080 will be obsolete as heck anyway...
so RTX really is just a marketing ploy to try to get people to pay more for less.

>can't tell the difference between endless rofl and fear
OP, you're a tard.

Yes, RANDOMLY capitalising words like A five year-old WILL definitely convince us OF the error of OUR ways.

Why are you such a retard, user?

>ITT: nvidia shill pwnd, raped, and its belted, bleeding body left for dead in the hole it crawled out of
>and it's too fucken retarded to notice

>mantle is proclaimed dead in favor of vulkan less than two years after its announcement
>"that wasn't planned at all, they just switched targets after their lock-in attempt failed"

Attached: okay-1.gif (245x285, 916K)

Attached: file.png (398x131, 20K)