Why do people bitch about SLI?

Are people just too poor to test it themselves? They say SLI gives bad frame pacing, but they test with old SLI bridges and x8 PCIe lanes, then go on to say SLI is trash.
>pic related smooth frame time

Attached: A8CC38A0-278A-41B9-9B2A-59E0060306CE.jpg (4032x3024, 2.25M)

sir please do the needful

980 Ti SLI user here. Yeah, once I stopped using the old SLI bridges and went X299 for $140 (inb4 AMD user says house fire) it made a real difference in frame pacing. I see a lot of YouTubers testing with low-clock-speed CPUs, or a 9900K because it's a gaymer CPU, even though it only has 8 lanes per slot.

Attached: 84FF687F-A57B-4507-A314-13BDFCFE02A7.gif (399x152, 135K)
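Since the whole argument above is about frame pacing rather than raw FPS, here's a minimal sketch of how you'd quantify it from a frametime log (e.g. one exported by a capture tool). The numbers are made up for illustration:

```python
# Hypothetical sketch: quantifying frame pacing from a frametime log.
# Smooth SLI means low frametime variance, not just a high average FPS.
from statistics import mean, pstdev

def pacing_stats(frametimes_ms):
    """Summarize average FPS, 1% lows, and frametime jitter."""
    fps = [1000.0 / t for t in frametimes_ms]
    worst = sorted(frametimes_ms, reverse=True)
    one_percent = worst[: max(1, len(worst) // 100)]   # slowest 1% of frames
    return {
        "avg_fps": mean(fps),
        "1%_low_fps": 1000.0 / mean(one_percent),
        "frametime_stddev_ms": pstdev(frametimes_ms),  # the pacing metric
    }

# Even pacing vs. the same average frametime with alternating spikes:
smooth = [16.7] * 100
jittery = [8.35, 25.05] * 50   # same average frametime, awful pacing
print(pacing_stats(smooth))
print(pacing_stats(jittery))
```

Both runs average 16.7 ms per frame, but only the second one stutters; that's why a flat frametime graph like the pic is the thing worth posting.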

What does this even mean?

It's a meme about poos begging people to buy AMD.

Attached: amdramadan.png (578x644, 507K)

There are a lot of other things I'd rather spend money on than graphics cards.
Not that I hate games or anything, but a reasonable mid-range graphics card is enough to play most of the games I'm interested in playing.
Spending so much money just for graphics, just for more layers of shader techniques and resolution, doesn't turn me on.

So SLI requires PCIe 3.0 x16?

Attached: SLI 1080s.jpg (1280x720, 84K)

Or be smart? Jow Forums always acts poor rather than being smart. Scored these two 1080 Tis off eBay for $600 together and they paid for themselves mining. The only things I had to pay for were a 1200 W PSU and a water cooler for the top card, since it gets really hot sucking hot air off the backplate of the bottom card. Still worth it, I don't care what Jow Forums bashes, because they don't own half the hardware they bash. Also YouTubers are dumb as fuck.

That isn't your rig, is it? Also yes, if you play at high resolution. Maybe at 1440p 60 FPS you could get away with 8 lanes, but some games swap textures from RAM to the GPU frame buffer aggressively, so 16 lanes would be best. X99 or X299 would be a great start.

I used to be the same way till I got a Vega 64 and a 1440p 144 Hz monitor, and now I can't look back.

How long did it take for them to pay for themselves through so-called mining?

6 months. Still worth it though. Free 1080 Tis. You idiots cried about the mining craze, but didn't use your brains.
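The payback claim is simple arithmetic. Here's a back-of-the-envelope version; the PSU/cooler cost and the daily profit figure are assumptions for illustration, not real mining income:

```python
# Back-of-the-envelope break-even estimate for the setup described above.
# card_cost comes from the post; the other figures are hypothetical.
card_cost = 600.0          # two used 1080 Tis off eBay, per the post
extra_gear = 250.0         # assumed cost of the 1200 W PSU + water cooler
daily_profit = 4.70        # assumed net $/day for the pair after electricity

total_outlay = card_cost + extra_gear
days_to_break_even = total_outlay / daily_profit
print(f"break-even after ~{days_to_break_even / 30:.1f} months")
```

With those assumed numbers the pair pays for itself in roughly six months, consistent with the claim; a different profit rate or power bill shifts it accordingly.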

Damn, wish I thought of that. Instead i'm sitting here with zero 1080Tis.

SLI works great in the titles where it is supported, the biggest issue is that in some titles it just doesn't work at all since it's fundamentally incompatible with the game/engine.

9900K can be a good choice with a PLX mobo, there are a couple Z390 PLX mobos around and SLI runs fine on those since the bridge provides PCIe 3.0 x16 bandwidth between the cards.

Yes, if you're on Pascal or older, and especially at 4K. The GPUs need to transfer a lot of data between each other, and most of it goes through PCIe since the SLI bridge is actually slow as fuck. The exception is the RTX cards: those have a much faster NVLink connection instead of the old SLI bridge, so they don't need to rely on PCIe to communicate, and RTX 2080/2080 Ti SLI works fine on x8/x8. More than fine, really: it works even better than x16/x16 did with Pascal or older generations.

PLX. Found the dumbass

No, you're the retard who doesn't understand what it achieves or what SLI requires in order to function properly.

I was a big fan of the whole concept.
The fact is most developers don't spend time optimising their games for SLI.
Great examples of well-optimised games are Star Wars Battlefront and Battlefield.
With two GTX 760s I could hit a steady 60+ fps at full HD on ultra settings, and ~40 fps at 4K.
Which was fucking amazing since I got them both for around 300€ back then.
Other games are glitchy as fuck or don't scale performance.
It's sad because it's clearly a software issue.
I now have a 1080 Ti and I don't think I would ever go SLI again, although I still like the idea.

I ran two GTX 770s in SLI until just recently; they only really stopped performing because of a lack of VRAM. Not sure I would do it again though, developers just don't really cater to it.

Shutter, especially microshutter on heavy loads.

>inb4 screenshot that stares at the wall

Attached: 1550949501655.png (1167x800, 1020K)

Do you think console giants Microsoft and Sony would consider making a "two GPU" console?

When Moore's law comes to a stop soon, that will be the only choice. It's not that hard to build, but software is where it becomes a mess. I own 980s in SLI and 70% of my games support it. I don't mind when a game doesn't work with SLI, since I can simply turn it off just for that title.

Is the SLI meme worth it? I need to upgrade my shitty card, and it would be much cheaper to buy a second copy of it than to get a newer card.

Poos aren't Muslim

stutter

Probably not. SLI tends to be better at running stuff you could already run, just faster. If your current GPU is really struggling, a second one probably won't help.

Depends on your card, CPU, and RAM. Memory bandwidth plays a role in keeping frametimes stable. You can't be running 2133 MHz kits anymore if you want stable FPS in future games.
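For reference, the memory bandwidth difference between kits is straightforward to compute: DDR4 moves 8 bytes per transfer per channel, so peak theoretical bandwidth is transfer rate × 8 × channel count. A quick sketch (dual channel assumed):

```python
# Peak theoretical DDR4 bandwidth: MT/s * 8 bytes per transfer per channel
# * number of channels. Real sustained bandwidth is lower.
def ddr4_bandwidth_gbs(mt_per_s, channels=2):
    return mt_per_s * 1e6 * 8 * channels / 1e9

for speed in (2133, 3200, 3600):
    print(f"DDR4-{speed} dual channel: ~{ddr4_bandwidth_gbs(speed):.1f} GB/s")
```

Going from DDR4-2133 to DDR4-3200 is roughly a 50% jump in peak bandwidth, which is the kind of headroom that helps keep frametimes stable when the CPU is feeding two GPUs.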

SLI (and multi-GPU in general) is nice as a concept because in theory it provides easy horizontal scaling when you're bottlenecked on graphics performance. You'd think GPUs would be a natural fit, too: each one already has thousands of cores, so the load is clearly horizontally scalable, and adding more cores on another card should work the same way. In practice it's not so simple, mostly because the cards do not share the same memory pool. That means explicit management of memory, plus a potential performance bottleneck from the bandwidth and latency available between cards. That management has to be done either by the devs in the game itself, or by the driver (hence the per-game profiles).

Devs don't care much, since too few people use multi-GPU, and they don't make engine design decisions around it; sometimes support is actually impossible even through a driver profile. So multi-GPU isn't in a great state nowadays, and it most certainly isn't a universal way to boost performance. It's situational.
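To make the separate-memory-pool problem concrete, here's a toy sketch of alternate-frame rendering, the mode SLI commonly uses: each card renders every other frame, and every asset has to be duplicated into each card's private VRAM. This is an illustration of the scheme, not any real API:

```python
# Toy sketch of alternate-frame rendering (AFR), the usual SLI mode.
# Each GPU keeps its own full copy of the assets: memory is duplicated,
# not pooled, which is exactly the management burden described above.
class ToyGPU:
    def __init__(self, name):
        self.name = name
        self.vram = {}                     # private memory pool

    def upload(self, asset, data):
        self.vram[asset] = data            # must be copied to EVERY gpu

    def render(self, frame):
        return f"frame {frame} rendered on {self.name}"

gpus = [ToyGPU("GPU0"), ToyGPU("GPU1")]
for gpu in gpus:                           # same textures uploaded per card
    gpu.upload("texture_atlas", b"...")

# Frames alternate between the cards round-robin.
frames = [gpus[i % len(gpus)].render(i) for i in range(4)]
print(frames)
```

Anything one frame renders that the next frame needs (shadow maps, temporal history) has to be shipped between the pools, which is where the bridge/PCIe bandwidth and the per-game profiles come in.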

I think if they were to build a multi-GPU console they might take a different approach: multiple GPUs sharing the very same memory pool. That could be done in a purpose-built console, and it would entirely remove the problem of keeping two copies of data in sync across separate pools. It isn't something that can really be done in a PC, where you just expect to plug another graphics card into a PCIe slot; there's no way to reach a shared memory pool without a serious performance downgrade on access (such as going through NVLink or PCIe).

Generally no, because the boost is situational, not universal. You're better off buying a new card that will work with everything and come with newer tech.

I'd say save the money from the second card and put it toward one that will be two to three times as fast as yours by the time you need a new PC.