Nvidia Gimping 10 Series Cards Already

youtube.com/watch?v=mFSLro_OXLQ

Ready to upgrade, goyim?

Attached: cucked.png (1920x1080, 1.46M)

Other urls found in this thread:

youtube.com/watch?v=lY7_ujxbdNY
bugzilla.mozilla.org/show_bug.cgi?id=1461268
kitguru.net/components/graphic-cards/ryan-burgess/independent-tests-show-that-amd-has-the-most-reliable-graphics-drivers/
github.com/Xilinx/triSYCL
reddit.com/r/nvidia/comments/9ngovl/driver_41634_faqdiscussion/e7moxv5
youtube.com/watch?v=WtRYdAYgJTc

FreemanSync vs GoySync
Relive vs literal botnet goyforce experience
Opengpu vs CuckWorks

Nvidia is for cucks. Amd for free men. Fact.

if you can't make the "new" series good, gimp the "old" one, then let the retards fall for the "benchmarks"

will jensen be able to afford a new leather jacket after the low yields?

wow, they have zero competition and still this... i would be so furious...

well then, off to play quantum break on 1920x1200 with medium settings, courtesy of my 4yr old r9 270x gpu

funny, i even thought about upgrading to nvidia, what an idiot i am

My gpu still runs the same.

>wow, they have zero competition and still this
Nvidia is competing with their older gen cards.

look at my right leg, isn't it awesome?

>shoots left leg for benchmark purposes

>they went with the 1060 instead of the 580

Attached: 1536702960042.png (960x960, 378K)

How can they gimp my card if I don't update the drivers?

people are rolling back, that's not the point

Man, it's a shame so many devs neglected to take full advantage of vega's raw FP32/FP64 throughput. It will go down as one of the most underrated GPU architectures to date.

Attached: Dirt_1080p (1).png (1295x1392, 45K)

>shoots left leg for benchmark purposes
I'm more concerned about who is doing the benchmark. The temps are a little different, so maybe there's something going on in the background of one of the benchmark runs?

Install FineWine™

Attached: 1534625333650.jpg (512x288, 22K)

you really think that's an isolated case?

i'm romantic and a little naive but you take the cake

not the first time i've heard that

Attached: 4e5d757bf9b771914b0caee77d3d777a-1200-1017.png (1200x1017, 58K)

>i even thought about upgrading to nvidia
nice digits. the thought had crossed my mind, but then I see shit like this and I'm glad I've stuck with Team Red for a decade and a half. AMD ages like fine fucking wine (R9 390 owner, Futuremark now puts it right behind the 980 in performance whereas it used to butt heads with the 970)

I'm interested in this, can you explain it to me as if I were a child pls?

Guess it's time to start saving up money for those great RTX GPUs then

Novidia's cards are 110% dependent on driver-specific optimizations for games to run.
You might need to run a newer version of one of the games you already have, or you might want to run a new game.
With an old driver you are prone to low fps, crashes, broken features and so on.
Tl;dr you'll be forced to update.

Just keep using older driv... oh wait, windows 10.

This one is a moar coar mess only good for very specific int cases.

FX was favoured in many supercomputers for its huge I/O capability and the low overhead with which it supported fast peripherals.
There were companies that ditched Xeons in order to build GPGPU monsters around Opterons.

Same, GCN is such a fucking good architecture. Not the most efficient thing in the world but goddamn was it full of untapped potential.
Bought my 290 for €325 3 years ago and still haven't felt the need to upgrade, though I am considering buying a new monitor in the near future to take advantage of Freesync.

>oh wait, windows 10
wtf is that about? I still have to manually update my Nvidia drivers with Goyforce Experience

Except this was much easier to do than optimizing for more than 4 cpu threads.

Basically amd vega GPUs rape nvidia's pascal GPUs if you make them do single precision or double precision math (FP32/FP64). However, to tap into this raw power you must optimize your game's shaders very heavily, which is harder to do than for nvidia cuda cores.

Attached: aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS81L0wvNzAzOTI5L29yaWdpbmFsL0Rvb20tRlBTLTI1NjB4MTQ0MC1VbHRyYS5w (711x453, 72K)

youtube.com/watch?v=lY7_ujxbdNY

DERP

And that's unironically a good thing.
If it was handled by win10, it would just shove the nvidia gimping down your ass without anything you could do to stop it.

based and freepilled

Personally, I'm waiting to see what AMD has to offer. The RTX cards are literally Pascal with Ray tracing garbage tacked on, so if AMD doesn't deliver I'm just getting a GTX 1080 Ti for high refresh rate 1440p.

Look at the theoretical peak FP32/FP64 of GPUs; THAT is the true performance they can crank out, but only if you program your game in a way that can utilize all of that power (easier to do on nvidia GPUs).

pic related vega 64 vs gtx 1070

Attached: Comb15102018010134.jpg (1385x1119, 173K)
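If you want the back-of-the-napkin math behind "theoretical peak": it's basically shader count × clock × 2, since each shader can retire one fused multiply-add (2 floating-point ops) per clock. Quick sketch in C++ using the reference boost clocks, so treat the outputs as ballpark figures rather than gospel:

    #include <cstdio>

    // peak FLOPS ~= shaders * clock * 2 (one FMA = 2 floating-point ops per cycle)
    // 'rate' scales the result for reduced-rate FP64 units (1/16 on Vega 10, 1/32 on GP104)
    double peak_tflops(int shaders, double clock_ghz, double rate = 1.0) {
        return shaders * clock_ghz * 2.0 * rate / 1000.0; // GFLOPS -> TFLOPS
    }

    int main() {
        std::printf("Vega 64  FP32: ~%.1f TFLOPS\n", peak_tflops(4096, 1.546));
        std::printf("Vega 64  FP64: ~%.2f TFLOPS\n", peak_tflops(4096, 1.546, 1.0 / 16));
        std::printf("GTX 1070 FP32: ~%.1f TFLOPS\n", peak_tflops(1920, 1.683));
        std::printf("GTX 1070 FP64: ~%.2f TFLOPS\n", peak_tflops(1920, 1.683, 1.0 / 32));
    }

Actually getting anywhere near those numbers in a real game is the hard part, which is the whole point being argued above.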

So the only way for AyyMD GPUs to compete is to have programs specifically tailored to run good on them?
That's beyond pathetic. I'm sticking to Nvidia

>ShillwareUnboxed
He literally just says whatever he's paid to.

For vega and older pretty much. Something would have to drastically change in the upcoming navi architecture for devs to have an easier time optimizing games for AMD graphics.

It's the same problem on nvidia though, devs also have to work tirelessly to squeeze as much performance as possible out of cuda cores.

I was planning on buying a 1080 TI, but what's the fucking point? NVIDIA are a bunch of shifty fucking pieces of shit.

Attached: 1522287218730.jpg (500x500, 23K)

You know what? PC gaming can go suck a fat fucking dick. RAM prices up the ass, GPU market completely fucked with zero competition, the only way to go high end is with nvidia but the prices are insane and they will gimp your shit, PC getting cucked hard by Rockstar and Sony killing it with exclusives.

I bought a Dell AW3418DW with a 1080ti this year and I'm honestly balls deep into buyer's remorse. Gonna sell all this useless garbage and stick to consoles from now. Consoles for vidya, cheap laptop for work, as it should be.

Prove it

>How to make them buy my new screencards?
>well hell I'll just nerf the drivers a bit for the old gpus and just focus on optimizing for the new ones
>????
>profit
this is also why you want open source drivers

There's no reason to upgrade. I'm running most games at ~60fps (144fps on some) on ancient 2012 hardware.

FX was simply a flawed architecture. Having tons of cores with worse IPC than your previous architecture, and half that of your competitor, is just retarded.

>So the only way for AyyMD GPUs to compete is to have programs specifically tailored to run good on them?
You do realize that's also entirely true of nvidia? The only difference is that because of nvidia's shady business practices (almost Intel tier at times), they have the leading market share, so when devs are too lazy to tailor their games to run well on competing GPUs they invariably pick nvidia. If they picked AMD instead you'd see the exact opposite situation, with nvidia cards struggling to keep up or even failing miserably on the majority of games.

>Ready to upgrade, goyim?
I'm already on the latest nouveau, thanks

such a pity all major Deep Learning libraries were cucked by nvidia and only support cuda.

based and redpilled

You mean FP16?

>Nvidia gimps 10 series
>buys 10 series anyway

Are you retarded? Soon the 1080ti will perform below the 2070, a chip two tiers lower.

Then go to v and don't come back. Only a brain-dead nigger would choose to downgrade to a console after using your setup. Sony and Microsoft are the biggest jews around and you're willing to suck their dicks just because it's convenient.

can't you just not update?

At least you're paying like 200 bucks to get games without worrying that your 800-buck gpu is gonna get cucked to hell and beyond just a few months after you buy it. Go suck nvidia's dick.

Hol up how can I roll back?

Still trying to post fake news & FUD, AYYMDPOORFAGS?

Just don't update the fucking drivers. My 1060 has been running perfectly fine on stock evga drivers from the CD. Still, it's my last Nvidia purchase but you're asking to get fucked. If you don't like nvidia's jewry then you're in for a rude awakening with consoles

>devs have to tailor their renderer to specific chips to get decent frame rates
The only conclusion I can draw from this is that Microsoft royally shit the bed with DirectX

Attached: 1445649262664.gif (200x150, 2.48M)

youtube.com/watch?v=lY7_ujxbdNY

DEBUNKED HARD WITH NO SURVIVORS

Keep on trying though, AYYMDPOORFAGS

Your lies and FUD have failed over and over again

That too, though it was probably used even less than FP64 ops.

FAKE AND GAY

Only if I play new games.

Turing is much better than pascal because it can execute integer operations in parallel with floating point, and it has fewer shader processors per SM (64) than previous architectures (maxwell and pascal had 128).

Integer ops were completely underutilized in the past because they gimped the floating-point operations, which were more important, so integer ops were offloaded to the cpu when possible. Once that changes, in new games turing is going to gain 20%-30% above current results just from that + it's going to be much less cpu dependent (a gain for amd cpus, ironically) + less pcie bandwidth dependent.
The shader count is likely to _partially_ repeat the kepler story. Kepler had 192 shaders per SM, but maxwell and (gaming) pascal had 128. As a result new games utilized only 128 and the rest was wasted.

It's always easier to code for fewer shaders, because if you have more data you just repeat threads with the same shader count, or you run something else in parallel. On Pascal you must do the same thing 128 times; on turing you can do two different things 64 times each, independently.
Which means that unless games specifically optimize to use 128 shaders at once where possible, half is likely to get wasted in some cases on maxwell and pascal. In some cases even optimization doesn't help - what if you only have 64 (or fewer) data points to work on for one piece of code? There turing can run two different things at once, but pascal must run them sequentially, losing 50% just from that!
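To make that concrete, here's a toy model of the padding argument in C++. The 64/128 widths are this post's numbers, not official SM specs, and real GPUs schedule in warps with a lot more going on, so this only illustrates the arithmetic:

    #include <cstdio>

    // If a batch of work has to be padded up to a multiple of the partition width,
    // everything past the real work is wasted lanes.
    double lane_utilization(int work_items, int partition_width) {
        int padded = ((work_items + partition_width - 1) / partition_width) * partition_width;
        return 100.0 * work_items / padded;
    }

    int main() {
        std::printf(" 64 items, width 128: %5.1f%% busy\n", lane_utilization(64, 128));  // 50%
        std::printf(" 64 items, width  64: %5.1f%% busy\n", lane_utilization(64, 64));   // 100%
        std::printf("192 items, width 128: %5.1f%% busy\n", lane_utilization(192, 128)); // 75%
        std::printf("192 items, width  64: %5.1f%% busy\n", lane_utilization(192, 64));  // 100%
    }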

Same story applies to driver optimizations. At some point the added complexity of holding two different incompatible sets of optimization gets too expensive. I think they will try to fix performance regressions for a year, but that's it.

All in all, once games start to code for turing, I wouldn't be surprised at 2x+ performance over pascal (1080 ti vs 2080 ti).

Actually, Windows 10 can deliver an NVidia driver update to you, so watch out.
I had it once, though it solved a problem I was having with CUDA work units not showing up in boinc.

>There turing can run two different things at once, but pascal must run them sequentially, losing 50% just from that!
to add, that's also going to start showing up through driver updates. Nvidia is going to implement this optimization in major games, where possible, so even in current games turing is going to get faster and faster. You can't do everything with a driver, but 20%-30% over current results is realistic

Nvidia drivers are the best in the industry and improve performance, how many times have your gimping lies been debunked over and over again?

Meanwhile AYYMD gimps Rebrandeon garbage by removing features in their driver updates

bugzilla.mozilla.org/show_bug.cgi?id=1461268

>Recent AMD driver releases should have removed the Hybrid VP9 support already.

Straight from an AYYMD employee

Nice FUD, shill. Too bad it was all for nothing.

kitguru.net/components/graphic-cards/ryan-burgess/independent-tests-show-that-amd-has-the-most-reliable-graphics-drivers/

Keep on trying to COPE with your remorse over that 1080 Ti purchase though. You don't need the best anyway.

Nvidia does the same with Gameworks.

>Comissioned by AYYMD
>Paid shilling by AYYMD
>believing AYYMD's paid shilling lies

TOP KEK

KILL YOURSELF FAGGOT

COPE

Is it true that AMD cards these days actually do age well? If that's the case I might buy into Navi if it isn't a power hungry space heater.

>Intel: I can ruin i9 9900k launch by faking benchmark...
>Nvidia: hold my beer

Such a pity that amd's opencl compiler was fucking broken for years; even now it's still shit compared to any other opencl compiler out there, including intel's and nvidia's.

Seriously, doing something complex with opencl using the amd compiler was impossible. That's why cuda is more widespread: it's easier to use, and the lock-in didn't matter much since it wasn't going to work on amd either way.

...

BASED NVIDIA HOLY FUCK THIS IS BASED

>upgrading your graphics drivers religiously
LOL

>fine wine
>light-bodied garbage.
That’s a big yikes from me, bossman.
Sent from my iPhone

there's opencl c++ made by amd, but they forgot to advertise it. Sad because it's quite cool

>GoySync
>GoyForce
How have I not heard or thought of this before

Attached: 1471489322483.jpg (268x284, 17K)

disregard that, I meant CLC++ (OpenCL C++, an amd extension that adds templates and OO), which is quite a cool language, but it looks like SYCL is now advanced enough to supersede it
github.com/Xilinx/triSYCL
queue {}.submit([&](handler &h) {
  // write access to A, read access to B, within this command group
  auto accA = bufA.get_access<access::mode::write>(h);
  auto accB = bufB.get_access<access::mode::read>(h);
  // one work-item per element: A[i] = B[i] + 1
  h.parallel_for<class add>(myRange, [=](item<1> i) {
    accA[i] = accB[i] + 1;
  });
});
this is nice
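If anyone wants to actually run that, here's roughly what the host-side setup looks like with triSYCL (bufA, bufB and myRange are the names from the snippet above; the size and initial values are made up, and triSYCL doesn't strictly require the <class add> kernel name, though other SYCL implementations may):

    #include <CL/sycl.hpp>
    #include <cstdio>
    #include <vector>

    using namespace cl::sycl;

    int main() {
        constexpr size_t N = 1024;                     // made-up size
        std::vector<int> a(N), b(N, 41);
        {
            // buffers wrap the host vectors for the duration of this scope
            buffer<int, 1> bufA{a.data(), range<1>{N}};
            buffer<int, 1> bufB{b.data(), range<1>{N}};
            range<1> myRange{N};
            queue{}.submit([&](handler &h) {
                auto accA = bufA.get_access<access::mode::write>(h);
                auto accB = bufB.get_access<access::mode::read>(h);
                h.parallel_for<class add>(myRange, [=](item<1> i) { accA[i] = accB[i] + 1; });
            });
        } // buffer destructors wait for the kernel and copy results back into a
        std::printf("a[0] = %d\n", a[0]);              // 42
    }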

High end is a meme in general. Go midrange and enjoy using AMD you dork. It's still a significant step up from consoles.

fucks sake man, i have a 6gb 1060 / i7 8500h notebook on the way, am i fucked?

Attached: Norwegian-political-map.gif (1412x1797, 561K)

8750h*

This was due to the Windows Spectre patch.
reddit.com/r/nvidia/comments/9ngovl/driver_41634_faqdiscussion/e7moxv5

Indeed it is. It's just not worth it at all because games are shit. But in any case, pc gaming as a whole is fucking garbage. It only has shitty multiplats, while the best games are always the exclusive ones, with very few indie exceptions.

Still 20% less than the 1080 Ti and not even at 2100mhz. Only 3584/3840 with less memory bandwidth too...

I hate these fucking fucks so damn much. I would drop these faggots in a heartbeat if everything I needed wasn't based on fucking cuda bullshit damnit.

For the love of HolyC, AMD better put out a decent and not overpriced high-tier gpu. As someone with a gtx 960, I don't think spending $250 (more than my cpu) on a graphics card that'll only marginally boost my fps is all that worth it.

A used RX 580?

At $1k profit a pop, I think so.

look at the 580, so happy I was able to get my Nitro+ for $250 before the memecoins raped the market

Bulldozer was used in supercomputers because of cost per core, but that trend didn't last long.

>People arguing about the value of CUDA vs OpenCL on consumer grade hardware
Literally none of you use CUDA for anything serious or you wouldn't be having the discussion at all. Even the shittiest card is a ridiculous amount of power for any single-machine based CUDA processing, and if you're actually doing anything more than that you wouldn't be giving a shit about what the next gayman card that dropped did, or whether it was from nvidia or not.

Honestly who gives a fuck about a 7 frame difference?

>you must make your game very optimized for shaders
No
>which is harder to do than for nvidia cuda cores.
Also no
GPU shaders make heavy use of floating-point math almost universally; it's extremely rare to see integer-heavy work done at the shader level, and you don't write shaders in CUDA. Like, literally. At all. Direct linkage between CUDA applications and rendering is haphazard at best and is relegated mostly to the realm of realtime data visualization, not GAYMAN.

youtube.com/watch?v=WtRYdAYgJTc

Cucked by Intel. Funny.

t. Raja Koduri

Explain that to game devs. I mean, they did, but they also provided lots of documentation for it and just slightly more low-level access than OpenGL did, and hence the problem. Never mind that OpenGL is so much more fucking standard and easier to implement, and is extensible, so you can swap out one library for another, even at the firmware level.

Attached: Buy_THIS_Instead_-_RTX_2070_Review_-_YouTube_-_2018-10-19_12.54.52.png (1405x740, 801K)

>runs cooler with driver update
It almost seems fake, like the person is using a frame limiter. Using a frame limiter would increase the frame time (ms) and lower the temps.
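The frame-limiter point is easy to see in a toy loop: the limiter just burns the leftover time each frame, so reported frame time goes up while the GPU sits idle and runs cooler. Minimal sketch, with a made-up 60 fps cap and a sleep standing in for the real rendering work:

    #include <chrono>
    #include <cstdio>
    #include <thread>

    // stand-in for the real rendering work; pretend it takes ~9 ms on its own
    void render_frame() {
        std::this_thread::sleep_for(std::chrono::milliseconds(9));
    }

    int main() {
        using clock = std::chrono::steady_clock;
        const auto frame_budget = std::chrono::microseconds(16667); // ~60 fps cap

        for (int frame = 0; frame < 5; ++frame) {
            auto start = clock::now();
            render_frame();
            // the limiter: burn the rest of the 16.7 ms budget doing nothing
            std::this_thread::sleep_until(start + frame_budget);
            auto ms = std::chrono::duration<double, std::milli>(clock::now() - start).count();
            std::printf("frame time: %.1f ms\n", ms); // reads ~16.7 ms, not ~9 ms
        }
    }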

You're literally a cuck by being an nvidiot in this day and age

what gayme is this?

RTX 6000 > 2080 Ti > Vega > 2080

So the effect is only for the 10 series cards and not the 20 series cards?

And what's the alternative? AMD

i'd rather kill myself than go back to amd again

Attached: 1514881475776.jpg (324x291, 15K)

Does Nvidia really expect people who bought a $200~ GPU to go out and buy a $500 one now? Because that's the only new stuff they've released.