>overpriced as fuck >barely noticeable in a couple of places and doesn't even look any better (muh realism, who fucking cares) >just a couple of games "support" it with just a couple of scenes >it tanks FPS so much that it makes it barely playable >DLSS makes it so fucking blurry that even consoles with their 720p graphics look better
>Was there a bigger fraud than RTX last decade? >3D vision. >PhysX >G-Sync >Nobody will use DX12/Vulkan >Wooden screws >sub-pixel AA >hairworks >goyimworks >the way it's meant to be played
Brody Hernandez
>Wooden screws what's that referring to?
Christian Campbell
Imagine being this new.
Christian Torres
>>PhysX >>G-Sync What the fuck are you even talking about? Both work just fine.
>moved entirely to CUDA >PhysX is just dead. Do you even understand what Physx is, retard?
Not all monitors support shitsync, not all Nvidia GPUs support shitsync. There's a bunch of issues with shitsync and Nvidia's GPUs as well.
Gsync works flawlessly.
Christian Taylor
you should totally buy that RTX card. You don't want your game to look like the image on the left, right? :^) wow, so good looking. I don't know how we even rendered games before. The more you buy, the more you save!
ofc I understand what physx is. do you know what physx is? do you know what DC is? do you know what openCL is? do you know what vulkan is?
physx just got phased out by novidia: the dedicated hardware disappeared and the "marketed" brand name is now, like every other physics accelerator out there, implemented in OpenCL/GL, Vulkan or DirectCompute. Apparently you don't know what physx was and what it has become. As for freesync: freesync, as a standard, is a superset of gsync. Better implementation, less hardware, cheaper product, royalty free. g-sync is a subset of freesync, but idiots like you are comparing $100 freesync monitors to $1000 g-sync monitors. >hurr durr his $100 monitor has only 90% of the features of my $1000 monitor If freesync was bad, novidia wouldn't pick freesync monitors to support.
The point was that they literally went on stage and showed that shit off with the phrase "This is computer graphics today" I know it's bullshit, but Nvidia keeps doing these retarded presentations.
Was there a bigger fraud than pixel shaders last decade?
>overpriced as fuck >barely noticeable in a couple of places and doesn't even look any better (muh realism, who fucking cares) >just a couple of games "support" it with just a couple of scenes >it tanks FPS so much that it makes it barely playable >lack of MSAA makes it so fucking blurry that even the PS2 with its 480i graphics looks better
It's a fucking joke.
Cameron Howard
>The point was that they literally went on stage and showed that shit off with the phrase "This is computer graphics today" That's right: without pre-baked shit, THIS is real-time computer graphics today.
Brandon Williams
Reminder that most amdrones are zoomers in their teens or early 20's with absolutely no knowledge of hardware history. I still remember the same kind of shitstains bitching about the GF3, which was actually slower than the GF2 Ti.
Xavier Carter
PS&VS were borderline silly before 2.0. GF3 didn't pull a ~30-50% ASP hike.
Brody Roberts
>being proud of being an nvidia drone for two decades Fuck off, your kind killed off SGI.
Henry Sanders
SGI was even worse than Nvidia.
Jackson Nguyen
>Reminder that most amdrones are zoomers in their teens or early 20's with absolutely no knowledge of hardware history. I still remember the same kind of shitstains bitching about the GF3 which was actually slower than the GF2 Ti Why would you be an AMD drone? Just because you are poor? AMD is always behind, both with CPUs and GPUs.
Time and time again the people who use terms like gay tracing and "meme" don't understand ray tracing at all.
Blake Price
>GF3 didn't pull a ~30-50% ASP hike. R&D costs aren't really comparable and nobody forces you to buy a new GPU. 15 years ago GPUs got obsolete in a year, while now you can max out games with 3-year-old cards. I can't even imagine the zoomer outcry if they couldn't even start a game with their 2-year-old card like in the early 2000's.
Jonathan Martin
Too bad K7 and K8 were a thing.
Robert Robinson
>PS&VS were borderline silly before 2.0. Exactly. At first they were useless because no games used them, then they picked up steam, and now they're the only way anything is rendered.
Elijah Barnes
>prerendered cgi that took five months to compute vs real time rendering done in milliseconds LOL
>R&D costs aren't really comparable Amortized over the waaaaaaaaaay larger number of units moved than back in 2000. >nobody forces you to buy new GPU Not an argument. No, you dumbass, pre 2.0 SM was literal fucking ass.
Matthew Torres
Kids these days are not used to drastic jumps in graphics, hence all the bitching about a new feature.
Jordan Rogers
Only 17 years and many many shrinks away.
Oliver Hughes
>No, you dumbass, pre 2.0 SM was literal fucking ass. You think you're contradicting me and yet you're supporting my point exactly. At first, it was ass. Nobody bought a card on the strength of it. No games used it. Sound familiar? And now it's everywhere, it's fundamental to how every game renders.
Samuel Russell
>At first, it was ass It was ass because the spec was ass, took IHVs screaming at MS to make it non-retarded. >No games used it Few games used pre-2.0 programmable shading because pre-2.0 programmable shading was ass. >Sound familiar? That it doesn't.
Gabriel Cook
This is the only time I feel kinda sad about not being able to get the Freesync version of the LG 34GK950 over the Gsync one.
The Freesync version has a native 34" 144hz IPS panel, while the Gsync version is gimped to 100hz + 20hz OC because of the outdated Gsync module that doesn't support 144hz.
However I can only order the Gsync version because our local stores don't have the Freesync one.
that's 20 years apart, idiot. you had a 0.5 Gtexel/s card back then and now you have 200 Gtexel/s cards. that's 400 times the fillrate, not even counting the compute difference or the API overhead (have you ever tried to do anything without VBOs and VAOs?). nvidia claiming that the shadows and reflections with RTX off are last year's rendering, they should go back and check Doom 3 from 2004 rendering at 60fps on 15yo cards
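For what it's worth, the fillrate arithmetic above checks out, using the post's round numbers (illustrative, not measured specs):

```python
# Fillrate comparison using the round numbers claimed above (illustrative only).
old_fillrate = 0.5    # Gtexels/s, a circa-1999 card
new_fillrate = 200.0  # Gtexels/s, a modern high-end card

ratio = new_fillrate / old_fillrate
print(f"{ratio:.0f}x the fillrate")  # prints "400x the fillrate"
```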
>AMD is always behind, both with CPUs and GPUs. Gee it's almost as if they're competing with two huge companies, each of them separately being bigger than AMD.
Asher Young
They're mostly competing with Intel right now.
Jaxon Hill
maybe if we all fanboy enough and buy their shitty products they will be good one day
Luke Nelson
AMD is almost as old as intel. Nvidia started from zero to market dominant while AMD's ineptitude buried ATi.
Colton Brown
>Gee it's almost as if they're competing with two huge companies, each of them separately being bigger than AMD. Am I supposed to feel sympathy for the smaller corporation or something? AMD is supplying Sony and their Playstations and that alone is making them more money.
>They're mostly competing with Intel right now. You mean they're competitive with Intel right now. Novideo currently rapes them in the GPU market and they just can't compete as they are.
>AMD is almost as old as intel. But it was never as big as intel; for most of its existence AMD was Intel's retarded little brother, copying intel's stuff because intel's customers required an independent second source for their products.
>Nvidia started from zero to market dominant while AMD's ineptitude buried ATi. It's not like ATi was doing particularly well at the time otherwise they wouldn't have to sell themselves in the first place.
>AMD is supplying Sony and their Playstations and that alone is making them more money. There were a lot of rumors at the time saying that the contracts with console makers were really unfavorable for AMD, as they simply had to win them to stay afloat after the Bulldozer disaster.
Justin Roberts
>Was there a bigger fraud than RTX last decade? Yes: >the cut-down RX 560 with 14 CUs >the RX 590, which was just an overclocked RX 580 >the whole Bulldozer "8 core", which was a disguised 4 core
At least with nvidia and intel you know what you get, AMD is lying and deceiving
Can't you buy it online or something? A lot of FreeSync monitors work perfectly with NVIDIA cards now, you can have a look to see if anyone tested the model you're looking at.
Lincoln Clark
Linux. /thread
Jonathan Smith
>Can't you buy it online or something? A lot of FreeSync monitors work perfectly with NVIDIA cards now, you can have a look to see if anyone tested the model you're looking at. I can buy it only with the international shipping + taxes.
Colton Bennett
And I guess that still comes out more expensive than the other model with the GSync tax?
Benjamin Cooper
OpenCL and Vulkan are inferior, and will be phased out shortly.
Landon Morgan
Gsync would obviously cost less + they sell it locally so I can at least return it if something is fucked up with the display (which is not a rare case despite the $1000 price). Ordering Freesync ver with Amazon would include shipping costs + tax + shipping risks (can be kicked / thrown away, etc) + mindfucks with returning it if something is broken.
Evan Howard
>You mean they're competetive with Intel right now. Novideo currently rapes them in GPU market and they just can't compete as they are. No, they're mostly competing with Intel. The TAM for GPUs is so pathetic that AMD has years of things to do before actually bothering with GPUs again.
Mason Moore
>The spec wasn't great but it wasn't useless and actually plenty of games supported older SM models than 2.0. I can say the very same thing about DX10. That makes it only a tiny bit less ass than it actually was.
Christian Murphy
Maybe, but to say it was useless or a fraud is wrong. GeForce 4 Ti was PS 1.3 and VS 1.1 and it could happily play anything in 2004 with decent image quality.
Anthony Baker
>Vulkan Brainlet detected
Brody Brown
But R300 played everything better, with better visual quality including AA. Anyway, pre-DX11 age API vendor decisions were pretty often painfully retarded, whatever.
Jordan Perez
R300 was better than GeForce 4 Ti but it was also 6 months newer.
James Young
Nvidia trying to sell RTRT with like two games supporting it is weird. Nvidia axing RTRT from lower-end dGPUs is even weirder. It was also the very first DX9 GPU ever so being 6 months newer is excusable. Good stuff.
Michael Gutierrez
This
Nathaniel Walker
>not getting a microcenter warranty on a 1080ti and then waiting until a good card comes out to cash in and get a replacement for the cost of the warranty. waiting for what comes after this RTX.
>Nvidia axing RTRT from lower-end dGPUs is even weirder. Why? Do you even understand how the technology works?
Hunter Murphy
>Why? Because they're pushing RTRT. You need to push it across the entire stack, top to bottom, for it to make sense.
Jaxon Stewart
No, you can't push the technology with low-end GPUs whose architecture leaves no space for RT / tensor cores. It wouldn't make any sense, since they wouldn't be able to operate properly even if they could somehow fit them in.
Ayden Adams
>Was there a bigger fraud than RTX last decade? What the fuck you mean son? I enjoy dropping 1200€ on a new GPU for a 30% performance increase. While you're here whining, I'm enjoying all these great RTX games such as BF5, Tomb Raider, BF5 and Tomb Raider
Bentley Gutierrez
>for a 30% performance increase It's a bigger increase at 3.5K / 4K. You must be an idiot to use a 2080Ti for 1080p.
Tyler Thompson
Oh yeah, I'll just raytrace my gaymes on 4k, genius
Freesync version is the best 34" screen on the market. Gsync is unfortunately gimped by the Gsync module which downgrades the native 144hz panel back to 100Hz+20Hz OC, so 120hz.
Freesync works flawlessly with 144hz without any OC needed.
John Bell
thats a 2060 though
Robert Williams
Even better then, solid 60 fps with 2080ti.
Carson Jackson
just get the gsync one and sell it when a new monitor comes out with higher hz. im sure it will retain a good proportion of its value
Julian Morales
>just get the gsync one and sell it when a new monitor comes out with higher hz. im sure it will retain a good proportion of its value Meh, I'm bad at dealing with random people and selling stuff, I still have old hardware like 4670k / DDR3 memory, Noctua cooler, etc that I wanted to sell but didn't make a post.
Adrian Brooks
im in london with a shitty laptop. how much would you sell it for? im a nice guy in med school
Gabriel Wilson
Well, truth be told there isn't much of a difference between 120Hz and 144Hz. You're unlikely to notice in practice, plus 120Hz is actually the better choice for general purpose use beyond gaming, since it can display both 24FPS and 30FPS video without judder. At 144Hz you'll get judder when watching 30FPS or 60FPS video, which are quite common online (YouTube and such). It sucks that you can't get the best option and it especially sucks that you'd have to buy GSync which locks you to NVIDIA cards for the future, but in practice the monitor is probably still going to be good (though I wouldn't buy curved shit at all, but that's another matter).
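The judder argument comes down to simple divisibility: a fixed refresh rate can show an N-fps video without judder only when it's an integer multiple of N. A quick sketch of that check:

```python
# Which common video frame rates divide evenly into a given refresh rate?
# If refresh % fps != 0, frames must be shown for uneven durations -> judder.
def judder_free(refresh_hz, fps_list=(24, 30, 60)):
    return {fps: refresh_hz % fps == 0 for fps in fps_list}

print(judder_free(120))  # {24: True, 30: True, 60: True}   -> no judder
print(judder_free(144))  # {24: True, 30: False, 60: False} -> judder at 30/60fps
```

This matches the point above: 120Hz divides cleanly by 24, 30 and 60, while 144Hz only divides by 24.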
Jose Cox
>Well, truth be told there isn't much of a difference between 120Hz and 144Hz. You're unlikely to notice in practice, plus 120Hz is actually the better choice for general purpose use beyond gaming, since it can display both 24FPS and 30FPS video without judder. At 144Hz you'll get judder when watching 30FPS or 60FPS video, which are quite common online (YouTube and such). It sucks that you can't get the best option and it especially sucks that you'd have to buy GSync which locks you to NVIDIA cards for the future, but in practice the monitor is probably still going to be good (though I wouldn't buy curved shit at all, but that's another matter).
I just feel like I wouldn't be able to sustain it well enough with 144hz anyway with such a huge resolution, probably not even 120hz, at least not in AAA games with the ultra settings.
>im in london with a shitty laptop. how much would you sell it for? im a nice guy in med school
Dunno, $300 optimistically, probably less. I didn't check the prices so I don't know.
Jack Rodriguez
Intel is as old as IBM.
Jonathan Richardson
>I just feel like I wouldn't be able to sustain it well enough with 144hz anyway with such a huge resolution, probably not even 120hz, at least not in AAA games with the ultra settings. That is likely true, but that's why you're buying a variable refresh rate monitor in the end. GSync and FreeSync are basically useless if you can always sustain FPS equal to the monitor's refresh rate. Of course, that doesn't mean you won't need a fast card for the resolution though. I probably wouldn't want anything slower than a 1080 Ti and at 3440x1440 and up to 120Hz you could probably make good use of a 2080 Ti as well. This shit is at the high end, neither the monitors nor the PCs to drive them will be cheap.
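A very rough model of why VRR only earns its keep below the maximum refresh (hypothetical range numbers; real GSync/FreeSync modules also do low-framerate compensation and per-monitor ranges):

```python
# Toy model: with VRR the panel refreshes at the GPU's actual frame rate
# (inside the supported range); without VRR it stays fixed at max refresh,
# so any FPS below that mismatches and you get tearing or stutter.
def effective_refresh(fps, vrr, vrr_min=48, vrr_max=120):
    if vrr and vrr_min <= fps <= vrr_max:
        return fps       # panel tracks the GPU: smooth
    return vrr_max       # fixed refresh: mismatch below max

print(effective_refresh(90, vrr=True))    # 90  -> VRR matches the GPU
print(effective_refresh(120, vrr=True))   # 120 -> same as fixed; VRR moot
print(effective_refresh(90, vrr=False))   # 120 -> mismatch, tearing/stutter
```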
Michael Price
That's more like a future proof screen and I'm also tired of my old and busted Dell u2410 with 60hz and 24".
After all, a monitor is something you use all the time and it's right in front of you; it's more important than any hardware upgrade and it can last a lot longer as well. I doubt 120-144hz will be pushed up to 200-240hz any time soon. And even if a 34" could have 200hz, how the hell would you get 200fps? Maybe in some old games, but getting 200fps in modern games at 3.5K/4K?
Leo Gutierrez
You now realize that nvidia pays AMD off to remain in the otherwise unfavorable (for AMD) GPU market because otherwise nvidia would be a true monopoly and would be subject to federal regulations.
Gavin Bell
Wow we've got an insider who lives in his mom's basement.
Lincoln Smith
On a scale from 1 to impossible, how hard would it be for a third competitor to enter the GPU race?
Alexander Morgan
>Was there a bigger fraud than RTX last decade?
You probably forgot what happened at the beginning of 2015.
Dunno, ask Intel since they're going to do just that in a year or 2.
Brandon Martinez
They have been threatening to do that for a few years now.
Blake Cox
No, I don't think they have at all. It's been a while since their last failed attempt with the Larrabee or whatever it was called. That was like 10 years ago.
Asher Young
Piledriver and Vega
Mason Stewart
They've been poaching talent from several chip makers - including AMD - for quite a while for their GPU division. I'm pretty sure they're going to at least release something.
Logan Walker
Vega 56 was pretty good. 64 started out pretty terrible, but is at least decent now with driver improvements.
Alexander Moore
>tfw still using gtx 980, skipped 10xx series, skipped 20xx series Shieet, 20xx is even more expensive now, 30xx will be expensive as fuck as well most likely.
Samuel Morris
man, you AMDrones are just eternally salty, aren't you? must suck to worship a company that gets btfo every year by their competitor.
Camden Russell
Intel Core processors in general, and the fake "generational gains" in IPC.
Thomas Campbell
i want intel to get into dedicated GPUs and nvidia to get into CPUs. we'd finally have real competition.
Jason King
Vega 64 was never "terrible", that was just marketing BS pushed by youtube influencers. It just had slightly worse value than Vega 56 before the prices came down.