Was there a bigger fraud than RTX last decade?

>overpriced as fuck
>barely noticeable in a couple of places and doesn't even look any better (muh realism, who fucking cares)
>just a couple of games "support" it with just a couple of scenes
>it tanks FPS so much that it makes it barely playable
>DLSS makes it so fucking blurry that even consoles with their 720p graphics look better

It's a fucking joke.

Attached: iez4NMdhNoQKuJSz9KQJD9.jpg (2560x1440, 367K)

Other urls found in this thread:

semiaccurate.com/2009/10/01/nvidia-fakes-fermi-boards-gtc/
pcworld.com/article/3333637/nvidia-driver-freesync-monitor-support-geforce-graphics-cards.html
graphics.stanford.edu/~henrik/images/cbox.html
vimeo.com/290465222
tftcentral.co.uk/reviews/lg_34gk950f.htm

>Was there a bigger fraud than RTX last decade?
>3D vision.
>PhysX
>G-Sync
>Nobody will use DX12/Vulkan
>Wooden screws
>sub-pixel AA
>hairworks
>goyimworks
>the way it's meant to be played

>Wooden screws
what's that referring to?

Imagine being this new.

>>PhysX
>>G-Sync
What the fuck are you even talking about?
Both work just fine.

where were you when novidia faked even their paper launches?
semiaccurate.com/2009/10/01/nvidia-fakes-fermi-boards-gtc/

Attached: charlie.jpg (400x400, 27K)

nvidia showed off a new card. the card had wooden screws because it was a mock-up prototype; people went apeshit, nothing happened.

>PhysX
moved entirely to CUDA
PhysX is just dead.
>G-Sync
pcworld.com/article/3333637/nvidia-driver-freesync-monitor-support-geforce-graphics-cards.html

nice, the images aren't even there anymore

>moved entirely to CUDA
>PhysX is just dead.
Do you even understand what Physx is, retard?

Not all monitors support shitsync, not all Nvidia GPUs support shitsync. There's a bunch of issues with shitsync and Nvidia's GPUs as well.

Gsync works flawlessly.

you should totally buy that RTX card.
You don't want your game to look like the image on the left, right? :^)
wow, so good looking. I don't know how we even rendered games before.
The more you buy, the more you save!

Attached: rtx-cornell-box-side-by-side.jpg (2500x703, 90K)

ofc I understand what physx is.
do you know what physx is?
do you know what DC is?
do you know what openCL is?
do you know what vulkan is?

physx just got phased out by novidia; the dedicated hardware disappeared and the "marketed" brand name is now, like every other physics accelerator out there, implemented in opencl/gl, vulkan, or DirectCompute.
Apparently you don't know what physx was and what it has become.
As for freesync: as a standard, freesync is a superset of gsync; better implementation, less hardware, cheaper product, royalty free.
g-sync is a subset of freesync, but idiots like you are comparing $100 freesync monitors to $1000 g-sync monitors.
>hurr durr his $100 monitor has only 90% of the features of my $1000 monitor
if freesync were bad, novidia wouldn't pick freesync monitors to support.
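
For anyone unsure what a physics "accelerator" actually computes regardless of the API it runs on, here's a minimal sketch of one integration step in plain Python/NumPy. The particle counts and constants are made up for illustration; the point is that this kind of data-parallel update is what a CUDA, OpenCL, DirectCompute, or Vulkan compute kernel would run.

```python
# Minimal sketch of one physics integration step (explicit Euler) for a batch
# of particles. All numbers are illustrative, not from any real engine; the
# same data-parallel update maps naturally onto any GPU compute backend.
import numpy as np

def step(pos, vel, dt=1.0 / 60.0, gravity=np.array([0.0, -9.81, 0.0])):
    """Advance particle positions/velocities by one 60 Hz frame."""
    vel = vel + gravity * dt          # accumulate acceleration
    pos = pos + vel * dt              # integrate position
    below = pos[:, 1] < 0.0           # crude ground plane at y = 0
    pos[below, 1] = 0.0
    vel[below, 1] *= -0.5             # lossy bounce
    return pos, vel

if __name__ == "__main__":
    n = 10_000
    rng = np.random.default_rng(0)
    pos = rng.uniform(0.0, 10.0, size=(n, 3))
    vel = np.zeros((n, 3))
    for _ in range(120):              # simulate two seconds at 60 Hz
        pos, vel = step(pos, vel)
    print(pos[:3])
```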

>crippled shadows and reflections
vs
>regular shadows and reflections
get fucked faggot
check these renders from 2002
graphics.stanford.edu/~henrik/images/cbox.html
>RTX MY ASS

>based nvidia reinvented graphics
>fraud
kys retarded AyyyMD cucks

The point was that they literally went on stage and showed that shit off with the phrase "This is computer graphics today"
I know it's bullshit, but Nvidia keeps doing these retarded presentations.

Attached: nvidia.jpg (1049x654, 34K)

JUST

Attached: memeTX.jpg (2496x1460, 169K)

Back in the year 2000...

Was there a bigger fraud than pixel shaders last decade?

>overpriced as fuck
>barely noticeable in a couple of places and doesn't even look any better (muh realism, who fucking cares)
>just a couple of games "support" it with just a couple of scenes
>it tanks FPS so much that it makes it barely playable
>lack of MSAA makes it so fucking blurry that even the PS2 with its 480i graphics looks better

It's a fucking joke.

>The point was that they literally went on stage and showed that shit off with the phrase "This is computer graphics today"
That's right, without pre-baked shit THIS is real time computer graphics today.

Reminder that most amdrones are zoomers in their teens or in their early 20's with absolutely no knowledge of hardware history. I still remember the same kind of shitstains bitching about GF3 which was actually slower than GF2Ti

PS&VS were borderline silly before 2.0.
GF3 didn't pull a nearly ~30-50% ASP hike.

>being proud of being an nvidia drone for two decades
Fuck off, your kind killed off SGI.

SGI was even worse than Nvidia.

>Reminder that most amdrones are zoomers in their teens or in their early 20's with absolutely no knowledge of hardware history. I still remember the same kind of shitstains bitching about GF3 which was actually slower than GF2Ti
Why would you be an AMD drone? Just because you are poor?
AMD is always behind, both with CPUs and GPUs.

Attached: 1459552436484.jpg (511x509, 36K)

Time and time again the people who use terms like gay tracing and "meme" don't understand ray tracing at all.
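
For reference, this is roughly what "ray tracing" a hard shadow boils down to: cast a ray from the shaded point toward the light and test it against the scene. A toy Python sketch with made-up geometry, nothing to do with RTX hardware or any real renderer:

```python
# Toy illustration of a ray-traced hard shadow: shoot a "shadow ray" from a
# surface point toward the light and test it against a blocking sphere.
# Real renderers trace millions of such rays per frame against full scene
# acceleration structures; geometry here is purely illustrative.
import math

def ray_hits_sphere(origin, direction, center, radius):
    """True if the ray origin + t*direction (t > 0) hits the sphere."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c            # direction assumed normalized, so a = 1
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return t > 1e-4                   # (ignores hits beyond the light, for simplicity)

def in_shadow(point, light, blocker_center, blocker_radius):
    to_light = [light[i] - point[i] for i in range(3)]
    dist = math.sqrt(sum(d * d for d in to_light))
    direction = [d / dist for d in to_light]
    return ray_hits_sphere(point, direction, blocker_center, blocker_radius)

print(in_shadow((0, 0, 0), (0, 10, 0), (0, 5, 0), 1.0))   # True: sphere blocks the light
print(in_shadow((3, 0, 0), (0, 10, 0), (0, 5, 0), 1.0))   # False: clear line of sight
```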

>GF3 didn't pull a nearly ~30-50% ASP hike.
R&D costs aren't really comparable and nobody forces you to buy a new GPU. 15 years ago GPUs got obsolete in a year, while now you can max out games with 3-year-old cards. I can't even imagine the zoomer outcry if they couldn't even start a game with their 2-year-old card like in the early 2000's.

Too bad K7 and K8 were a thing.

>PS&VS were borderline silly before 2.0.
Exactly. At first they were useless because no games used them, then they picked up steam, and now they're the only way anything is rendered.

>prerendered cgi that took five months to compute vs real time rendering done in milliseconds
LOL

Attached: rarted.jpg (800x450, 26K)
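
Rough numbers on the offline-vs-realtime gap (the "five months" above is the post's hyperbole; the hours-per-frame figure below is just an illustrative ballpark, not a sourced number):

```python
# Rough comparison of an offline film render vs a real-time frame budget.
# offline_hours_per_frame is an illustrative assumption, not a sourced figure;
# the real-time budget simply follows from targeting 60 fps.
offline_hours_per_frame = 10
realtime_budget_ms = 1000 / 60                     # ~16.7 ms per frame at 60 fps
ratio = (offline_hours_per_frame * 3600 * 1000) / realtime_budget_ms
print(f"real-time budget: {realtime_budget_ms:.1f} ms/frame")
print(f"offline render spends ~{ratio:,.0f}x more compute time per frame")
```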

>R&D costs aren't really comparable
Amortized over the waaaaaaaaaay larger number of units moved than back in 2000.
>nobody forces you to buy a new GPU
Not an argument.
No, you dumbass, pre 2.0 SM was literal fucking ass.

Kids these days are not used to drastic jumps in graphics, hence all the bitching about a new feature.

Only 17 years and many many shrinks away.

>No, you dumbass, pre 2.0 SM was literal fucking ass.
You think you're contradicting me and yet you're supporting my point exactly. At first, it was ass. Nobody bought a card on the strength of it. No games used it. Sound familiar? And now it's everywhere, it's fundamental to how every game renders.

>At first, it was ass
It was ass because the spec was ass, took IHVs screaming at MS to make it non-retarded.
>No games used it
Few games used pre-2.0 programmable shading because pre-2.0 programmable shading was ass.
>Sound familiar?
That it doesn't.

This is the only time I feel kinda sad about not being able to get the Freesync version of the LG 34GK950 over the Gsync one.

The Freesync version has a native 34" 144Hz IPS panel while the Gsync version is gimped to 100Hz + 20Hz OC because of the outdated Gsync module that doesn't support 144Hz.

However, I can only order the Gsync version because our local stores don't have the Freesync one.

Attached: img_20190212_001855s3klw.jpg (2000x1500, 1.04M)

that's 20 years apart, idiot.
you had a 0.5 Gtexel/s card back then and now you have 200 Gtexel/s cards.
that's 400 times the fillrate, not even counting the computing difference or the api overhead (have you ever tried to do anything without vbos and vaos?)
nvidia claims that the shadows and reflections with rtx off are last year's rendering; they should go back and check Doom from 2004 rendering at 60fps on 15-year-old cards

Attached: minresdefault.jpg (1280x720, 125K)
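
Back-of-the-envelope check on that fillrate comparison, taking the 0.5 and 200 Gtexel/s figures from the post above at face value:

```python
# Rough check of the fillrate comparison above (figures as quoted in the post,
# not looked up): a ~0.5 Gtexel/s card circa 2000 vs a ~200 Gtexel/s card now.
old_fillrate = 0.5e9      # texels per second, as quoted for the old card
new_fillrate = 200e9      # texels per second, as quoted for a current card
print(new_fillrate / old_fillrate)   # -> 400.0, the "400 times" in the post
```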

>AMD is always behind, both with CPUs and GPUs.
Gee it's almost as if they're competing with two huge companies, each of them separately being bigger than AMD.

They're mostly competing with Intel right now.

maybe if we all fanboy enough and buy their shitty products they will be good one day

AMD is almost as old as intel. Nvidia went from zero to market dominance while AMD's ineptitude buried ATi.

>Gee it's almost as if they're competing with two huge companies, each of them separately being bigger than AMD.
Am I supposed to feel sympathy for a smaller corporation or something?
AMD is supplying Sony and their Playstations and that alone is making them more money.

>only 2 years and 0 shrinks away
vimeo.com/290465222

>They're mostly competing with Intel right now.
You mean they're competitive with Intel right now. Novideo currently rapes them in the GPU market and they just can't compete as they are.

>AMD is almost as old as intel.
But it was never as big as intel; for most of its existence AMD was Intel's retarded little brother that copied intel's stuff because intel's customers required an independent backup source for their products.

>Nvidia went from zero to market dominance while AMD's ineptitude buried ATi.
It's not like ATi was doing particularly well at the time, otherwise they wouldn't have had to sell themselves in the first place.

>AMD is supplying Sony and their Playstations and that alone is making them more money.
There were a lot of rumors at the time saying that the contracts with console makers were really unfavorable for AMD, as they simply had to win them to stay afloat after the bulldozer disaster.

>Was there a bigger fraud than RTX last decade?
Yes, the cut down RX 560 with 14CUs
The RX 590 which was just an overclocked RX 580
And the whole Bulldozer "8 core" which was a disguised 4 core

At least with nvidia and intel you know what you get, AMD is lying and deceiving

>You know what you get
>3.5GB

amdrone buttmad is pretty entertaining

Attached: 1527032318381.gif (250x194, 47K)

I run Windows 7 with my RTX 2060

Attached: untitled.gif (404x522, 25K)

I love my 1080

Should I get 2080 or 1080Ti for 34"?

>pre 2.0 SM was literal fucking ass
The spec wasn't great but it wasn't useless and actually plenty of games supported older SM models than 2.0.

Plenty of games released in 2004 supported both 2.0 and older models, and the older paths barely look any worse.

>Was there a bigger fraud than pixel shaders last decade?
yes

pentium 4 willamette with RDRAM is the ultimate PC fraud and it will never be topped

come on now

Attached: raja.jpg (678x484, 28K)

Can't you buy it online or something? A lot of FreeSync monitors work perfectly with NVIDIA cards now, you can have a look to see if anyone tested the model you're looking at.

Linux.
/thread

>Can't you buy it online or something? A lot of FreeSync monitors work perfectly with NVIDIA cards now, you can have a look to see if anyone tested the model you're looking at.
I can buy it only with the international shipping + taxes.

And I guess that still comes out more expensive than the other model with the GSync tax?

OpenCL and Vulkan are inferior, and will be phased out shortly.

Gsync would obviously cost less + they sell it locally, so I can at least return it if something is fucked up with the display (which is not a rare case despite the $1000 price).
Ordering the Freesync ver from Amazon would mean shipping costs + tax + shipping risks (the package can get kicked around, tossed, etc) + mindfucks with returning it if something is broken.

>You mean they're competitive with Intel right now. Novideo currently rapes them in the GPU market and they just can't compete as they are.
No, they're mostly competing with Intel.
The TAM for GPUs is so pathetic that AMD has years of things to do before actually bothering with GPUs again.

>The spec wasn't great but it wasn't useless and actually plenty of games supported older SM models than 2.0.
I can say the very same thing about DX10.
That makes it only a tiny little bit less ass than it actually was.

Maybe, but to say it was useless or a fraud is wrong. The GeForce 4 Ti was PS 1.3 and VS 1.1 and it could happily play anything in 2004 with decent image quality.

>Vulkan
Brainlet detected

But R300 played everything better, with better visual quality including AA.
Anyway, pre-DX11 age API vendor decisions were pretty often painfully retarded, whatever.

R300 was better than GeForce 4 Ti but it was also 6 months newer.

Nvidia trying to sell RTRT with like two games supporting it is weird.
Nvidia axing RTRT from lower-end dGPUs is even weirder.
It was also the very first DX9 GPU ever so being 6 months newer is excusable.
Good stuff.

This

>not getting a microcenter warranty on a 1080ti and then waiting until a good card comes out so you can cash it in and get a replacement for just the cost of the warranty.
I'm waiting for whatever comes after this RTX.

Attached: TheHoeRoganExperience.jpg (615x628, 160K)

>Nvidia axing RTRT from lower-end dGPUs is even weirder.
Why? Do you understand how the technology is working?

>Why?
Because they're pushing RTRT.
You need to push it across the entire stack, top to bottom, for it to make sense.

No, you can't push the technology onto low-end GPUs whose architecture leaves no die space for RT / tensor cores. It wouldn't make any sense, since they wouldn't be able to run it properly even if Nvidia did try to squeeze the cores in.
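
For a rough sense of the scale involved, here's the ray-budget arithmetic; resolution, framerate and rays-per-pixel are illustrative assumptions, not vendor figures:

```python
# Rough ray-budget arithmetic for real-time ray tracing. All inputs are
# illustrative assumptions (not vendor specs): even one ray per pixel at
# 1080p/60 already means well over a hundred million rays per second,
# before bounces, shadow rays or denoising.
width, height = 1920, 1080
fps = 60
rays_per_pixel = 1                     # a single primary or shadow ray
rays_per_second = width * height * fps * rays_per_pixel
print(f"{rays_per_second / 1e6:.0f} million rays/s")   # ~124 million rays/s
```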

>Was there a bigger fraud than RTX last decade?
What the fuck you mean son? I enjoy dropping 1200€ on a new GPU for a 30% performance increase. While you're here whining, I'm enjoying all these great RTX games such as BF5, Tomb Raider, BF5 and Tomb Raider

>for a 30% performance increase
It's a bigger increase at 3.5K / 4K.
You must be an idiot to use 2080Ti for 1080p.

Oh yeah, I'll just raytrace my gaymes on 4k, genius

Why not?
Cinematic 30 fps is fine.

Attached: 2019-02-15-image.jpg (3840x2160, 465K)

>LG 34GK950
jesus what a screen

tftcentral.co.uk/reviews/lg_34gk950f.htm

Freesync version is the best 34" screen on the market.
The Gsync version is unfortunately gimped by the Gsync module, which downgrades the native 144Hz panel to 100Hz + 20Hz OC, so 120Hz.

Freesync works flawlessly with 144hz without any OC needed.

thats a 2060 though

Even better then, solid 60 fps with 2080ti.

just get the gsync one and sell it when a new monitor comes out with higher hz. im sure it will retain a good proportion of its value

>just get the gsync one and sell it when a new monitor comes out with higher hz. im sure it will retain a good proportion of its value
Meh, I'm bad at dealing with random people and selling stuff; I still have old hardware like a 4670K, DDR3 memory, a Noctua cooler, etc that I wanted to sell but never made a post about.

im in london with a shitty laptop. how much would you sell it for? im a nice guy in med school

Well, truth be told there isn't much of a difference between 120Hz and 144Hz. You're unlikely to notice in practice, plus 120Hz is actually the better choice for general purpose use beyond gaming, since it can display both 24FPS and 30FPS video without judder. At 144Hz you'll get judder when watching 30FPS or 60FPS video, which are quite common online (YouTube and such). It sucks that you can't get the best option and it especially sucks that you'd have to buy GSync which locks you to NVIDIA cards for the future, but in practice the monitor is probably still going to be good (though I wouldn't buy curved shit at all, but that's another matter).
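
The judder point boils down to whether the refresh rate is an integer multiple of the video frame rate; a quick check over the standard video rates (nothing assumed beyond those):

```python
# Why 120 Hz handles common video rates more cleanly than 144 Hz: judder-free
# playback needs the refresh rate to be an integer multiple of the frame rate,
# so every video frame is held for the same number of refreshes.
for refresh in (120, 144):
    for fps in (24, 30, 60):
        ratio = refresh / fps
        ok = "even (no judder)" if ratio.is_integer() else "uneven (judder)"
        print(f"{refresh} Hz @ {fps} fps: {ratio:.1f} refreshes/frame -> {ok}")
# 120 Hz: 5.0, 4.0, 2.0 -> all even
# 144 Hz: 6.0, 4.8, 2.4 -> only 24 fps is even
```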

>Well, truth be told there isn't much of a difference between 120Hz and 144Hz. You're unlikely to notice in practice, plus 120Hz is actually the better choice for general purpose use beyond gaming, since it can display both 24FPS and 30FPS video without judder. At 144Hz you'll get judder when watching 30FPS or 60FPS video, which are quite common online (YouTube and such). It sucks that you can't get the best option and it especially sucks that you'd have to buy GSync which locks you to NVIDIA cards for the future, but in practice the monitor is probably still going to be good (though I wouldn't buy curved shit at all, but that's another matter).

I just feel like I wouldn't be able to sustain 144Hz anyway at such a huge resolution, probably not even 120Hz, at least not in AAA games at ultra settings.

>im in london with a shitty laptop. how much would you sell it for? im a nice guy in med school

16Gb DDR-III 1866MHz Kingston HyperX Fury Black (HX318C10FBK2/16) (2x8Gb KIT)
Noctua NH-U12P SE2
Intel Core i5 - 4670K OEM
MSI Z87-G43 GAMING

Dunno, $300 optimistically, probably less. I didn't check the prices so I don't know.

Intel is as old as IBM.

>I just feel like I wouldn't be able to sustain 144Hz anyway at such a huge resolution, probably not even 120Hz, at least not in AAA games at ultra settings.
That is likely true, but that's why you're buying a variable refresh rate monitor in the end. GSync and FreeSync are basically useless if you can always sustain FPS equal to the monitor's refresh rate. Of course, that doesn't mean you won't need a fast card for the resolution though. I probably wouldn't want anything slower than a 1080 Ti and at 3440x1440 and up to 120Hz you could probably make good use of a 2080 Ti as well. This shit is at the high end, neither the monitors nor the PCs to drive them will be cheap.
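
To make the "VRR only matters when you can't hold max refresh" point concrete, here's a small sketch of the frame interval on a fixed-refresh panel with vsync versus an adaptive-sync one; the 120Hz figure and frame times are made-up examples:

```python
# Sketch of why variable refresh rate helps when frames miss the target:
# with fixed refresh + vsync a late frame waits for the next refresh tick,
# while with G-Sync/FreeSync the panel refreshes when the frame is ready
# (within its supported range). Frame times below are made-up examples.
import math

REFRESH_HZ = 120
TICK_MS = 1000 / REFRESH_HZ            # 8.33 ms between fixed refreshes

def displayed_interval_fixed(render_ms):
    """Frame-to-frame interval with vsync on a fixed-refresh panel."""
    return math.ceil(render_ms / TICK_MS) * TICK_MS

def displayed_interval_vrr(render_ms):
    """Frame-to-frame interval with adaptive sync (clamped to max refresh)."""
    return max(render_ms, TICK_MS)

for render_ms in (7.0, 9.0, 14.0):     # faster than, just over, well over one tick
    print(f"{render_ms:5.1f} ms render -> fixed: {displayed_interval_fixed(render_ms):5.2f} ms, "
          f"VRR: {displayed_interval_vrr(render_ms):5.2f} ms")
```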

That's more of a future-proof screen, and I'm also tired of my old and busted 24" 60Hz Dell u2410.

After all, a monitor is something you use all the time and it's right in front of you; it's more important than any hardware upgrade and it can last a lot longer as well.
I doubt 120-144Hz will be pushed any further up to 200-240Hz any time soon. And even if a 34" could do 200Hz, how the hell would you get 200fps? Maybe in some old games, but getting 200fps in modern games at 3.5K / 4K?

You now realize that nvidia pays AMD off to remain in the otherwise unfavorable (for AMD) GPU market because otherwise nvidia would be a true monopoly and would be subject to federal regulations.

Wow we've got an insider who lives in his mom's basement.

On a scale from 1 to impossible, how hard would it be for a third competitor to enter the GPU race?

>Was there a bigger fraud than RTX last decade?

You probably forgot what happened at the beginning of 2015.

Attached: 1424308036140.jpg (640x706, 27K)

Dunno, ask Intel since they're going to do just that in a year or 2.

They have been threatening to do that for a few years now.

No, I don't think they have at all. It's been a while since their last failed attempt with the Larrabee or whatever it was called. That was like 10 years ago.

Piledriver and Vega

They've been poaching talent from several chip makers - including AMD - for quite a while for their GPU division. I'm pretty sure they're going to at least release something.

Vega 56 was pretty good. 64 started out pretty terrible, but is at least decent now with driver improvements.

>tfw still using gtx 980, skipped 10xx series, skipped 20xx series
Shieet, 20xx is even more expensive now, 30xx will be expensive as fuck as well most likely.

man, you AMDrones are just eternally salty, aren't you? must suck to worship a company that gets btfo every year by their competitor.

Intel Core processors in general, and the fake "generational gains" in IPC.

i want intel to get into dedicated GPUs and nvidia to get into CPUs. we'd finally have real competition.

Vega 64 was never "terrible", that was just marketing BS pushed by youtube influencers. It just had slightly worse value than Vega 56 before the prices came down.