post dem nvidia memes
Post dem nvidia memes
COPE thread?
SHITTER'S CLOGGED
AMD BTFO!
This one gave me a chuckle last night, but the first picture has been the best so far.
More memes boys.
BBBBBBRRRAAAAAAAAAAPPPPPP
dis is da best
AMD cope thread.
OY VEY GOYIM BUY IT
based and redpilled
Only one relevant
You know, this is all funny and shit, but on a side note why is nobody posting about how much nvidia fucking sucks on Linux these days? Member 2003? Member when Nvidia was a must for a Linux system? Well, those days are long gone, and now Nvidia is pure fucking garbage on Linux, especially if you run a rolling distro. Such a shame, I blame (You) for this, yes (You) Linus
Nope, I always remember free Radeon drivers and Nvidia being shit to set up. Way back when Beryl came out, pretty much the only non-work related reason to have 3D acceleration on Linux at that time.
rare merchant
Jensen should be the flamethrower guy
You realize US lost the Vietnam war?
I agree
Legitimate comedy genius at work.
What got worse?
The frequent updates and support for literally everything?
lmao..
considering the die area of the 2080ti
will it be the new 480 that burned up?
Die area is unrelated, think about physics. The power consumption of the new card is 25W more, which is something but not much, so don't quite expect fermifires.
the TDP is still the same at 250W
what are you on about?
miners stole all the gpus
cant win a war without weapons
Ah was thinking of the rumours that said 285W, guess those weren't quite accurate. Well, might be for aftermarket coolers.
its not a rumor, its when compared to the 1080 and not the 1080ti, +20W more then
>rumours
Jesus Christ, that's a real article
The only actually funny one so far.
On a side note, this whole turd-flinging shitfest is actually quite sad.
In my country, we have a proverb which goes something like this: one shouldn't show a fool half-done work.
Our ancestors were wise indeed.
Cringe
fpbp
He waxed his scalp!
can't be real
nice one
they have a point, the new cards are actually good and you're being drama queens, if it's time for you to upgrade and you can afford it i don't see why not, otherwise you'll be waiting several years for the next gen because amd won't have anything competitive any time soon, and when the next gen comes you'll probably be disappointed by those cards as well, you just want something to be salty about
Would you look at the time
Change the IRL grill to Kuzya.
>no proper TTY support
>no wayland support
>unfixable tearing with the proprietary driver
>they won't contribute to the free driver like AMD does
amd btfo
That misleading scale bar
This has been a wonderful thread.
Is that trademe?
Fuck off le redditfags
>Toddposting without a Todd picture or skyrim as a product
Fucking newfags.
DELET
How did ya get the address bar on the bottom?
Fuck off
You hold your phone upside down
AMDBTFO
swag fag terminology thread
This wasn't posted yet?
I fell for this meme. Kill me.
It was you fucking brainlet
2003? even 2013 nvidia was a must and amd was pretty fucking broken
really shows you that any company can flip the switch and push their shit program perfectly fine on linux in months.
>they have a point
No, they don't. We have no real testing with real-world games to base performance on, nothing using raytracing or DLSS to measure the performance impact and visual improvement.
There is literally no reason anyone should be saying right now to just buy them, which is why they also have an article saying not to pre-order the cards.
If you were hanging out to buy a top end card then it's probably gonna be better to get the 2080 in the long run. But if you were just looking to upgrade from a 980 or 1060 then the 2080 isn't going to blow your socks off to any degree the 1080Ti won't.
that's the joke, my redditor friend
just finished this
OY VEY GOYIM BUY IT. REMEMBER THE 6 GORILLION GIGARAYS
The founder of Jow Forums was originally a redditor.
It's a real article, this is just what they consider (((journalism))).
It still looks a fucklot like Toddposting.
We need an nvidia version of this.
If I learned anything from the 2016 elections it's that reality is now influenced by memes whether we like it or not.
>no proper TTY support
You mean the full KMS support and EFI framebuffer that Nvidia has zero issues with?
>no wayland support
They actually submitted patches for Wayland, which were refused because Wayland is full of a bunch of contrarian sjw cucks so the project failed.
>unfixable tearing with the proprietary driver
Use gsync.
>they won't contribute to the free driver like AMD does
AMD doesn't have a free driver, Nvidia does, and they do contribute to it (tegra).
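For what it's worth, the tearing complained about above does have a known workaround on the proprietary driver that doesn't require G-Sync hardware: forcing the full composition pipeline. A minimal sketch of the relevant xorg.conf Screen section, assuming a single monitor on the default output (the Identifier name is illustrative):

```
Section "Screen"
    Identifier "Screen0"
    # Render through the driver's composition pipeline; costs a bit of
    # latency but gets rid of tearing without G-Sync hardware
    Option "metamodes" "nvidia-auto-select +0+0 {ForceFullCompositionPipeline=On}"
EndSection
```

The same option can be toggled at runtime from nvidia-settings under the advanced display configuration, so you can try it before committing it to xorg.conf.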
Yep. It's not really that the Nvidia driver has gotten shit lately, it's just that AMD has done a great deal of work in making the situation a lot better. Not only has Marek done an excellent job at optimizing the hell out of it, but we've seen lots of contributions from other AMD devs as well as Feral Interactive, RedHat and Valve that have resulted in the driver being leagues ahead of the competition.
make one with old tomb raider and nu-tomb raider
so i see they've made lara 56%
it all makes sense now
Welcome to Jow Forums newfriend
>Goy why aren't the gayland devs implementing Nvidia's proprietary SheckelStream.
>Being antisemitic and not using Goysync.
>Look away from the open source AMD driver you semite! Nvidia has a better totally *not open source* solution!
way before 2016 normalfag
already been done, i saw it in a wccftech comment section, can't find it rn tho
>tfw my xiaomeme won't go upside down
>9800
>Graphed as 98
Again with the trickery
g-sync is superior because it is synchronized with the display, freesync is a hack and an afterthought, it's the wayland devs that are being unreasonable
put your trip back on retard
freesync is a marketing name for adaptive sync which is part of the displayport standard
it's about as much hack as HDMI audio, which was added in a later revision as well
>g-sync is superior
asus.com
>**Adaptive-Sync (FreeSync) technology supported (40Hz-144Hz)
>below 40Hz it goes to shit
>g-sync has no such limitation
>not superior
Underrated
>below 40 hz
>thinks its an advantage
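Below the 40Hz floor being argued about here, FreeSync panels that support low framerate compensation (LFC) don't simply "go to shit": the driver repeats each frame an integer number of times so the effective refresh lands back inside the panel's range, which is also roughly what a G-Sync module does. A rough Python sketch of the idea (the function name and the 40-144 bounds from the quoted ASUS spec are just for illustration):

```python
def lfc_refresh(fps, vmin=40, vmax=144):
    """Pick a refresh rate for a given frame rate on a VRR panel.

    Inside the panel's variable refresh window the refresh simply
    tracks the frame rate.  Below the window, each frame is repeated
    n times so that fps * n falls back inside [vmin, vmax] -- the
    essence of low framerate compensation.
    """
    if fps >= vmin:
        return min(fps, vmax)   # in range: refresh tracks the frame rate
    n = -(-vmin // fps)         # smallest integer multiplier reaching vmin
    if fps * n > vmax:
        raise ValueError("frame rate too low for this VRR window")
    return fps * n

print(lfc_refresh(60))   # 60 -> in range, panel refreshes at 60 Hz
print(lfc_refresh(25))   # 25 is below 40, frames doubled -> 50 Hz
print(lfc_refresh(13))   # quadrupled -> 52 Hz
```

Note that real LFC needs the panel's maximum refresh to be at least roughly twice its minimum, which is why narrow-range monitors can't do it; the 40Hz floor is a property of the specific panel, not of FreeSync itself.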