AMD:
>runs hot
>is a power glutton
>heavily oc'd just to keep up with nvidia
>supports loonix to compensate; popular with street shitters as a result
Nvidia:
>frosty subzero temperatures are the norm
>requires only a fraction of the power
>underclocked by default, still runs just fast enough to win the race
>has no interest in soviet operating systems

Attached: save-disk-space-by-cleaning-up-old-nvidia-driver-files-featured.jpg (1280x720, 133K)

>>has no interest in soviet operating systems
News to me.

Attached: Screenshot_2018-06-13_20-22-44.png (1087x195, 79K)

>supports loonix to compensate;
t. I don't know what I'm talking about but I'm going to keep making monkey noises anyway

t. street shitter

>Nvidia:
>>frosty subzero temperatures are the norm
youtube.com/watch?v=sRo-1VFMcbc
hehe, nothing personal, kid

Attached: burns up in your sleep.jpg (2591x1727, 512K)

ah fuck, wrong video link
youtube.com/watch?v=WAbl0fLY06U

this burns the novidiots

I have a 1080 Ti FTW3. I bought it because it's EVGA and I figured it would have the best cooling of the normal cards.

Oh boy was I wrong. It's hot and loud. And overclocking is useless unless you up the vcore, and if I do that I get a 50MHz OC with 25C higher temps, reaching the 90s.

Even my 290X was not this hot lol

Has any other company had this issue, or is it just EVGA with this particular model?

Only the first batch of reference-design FE 1080s had VRM issues. Nvidia pushed the reference 1080 too hard and saved money by recycling 1070 PCBs/layouts for the 1080 FE.

The AIB 1080s never had any of those issues.

Pretty sure that was EVGA's fault, and they gave owners the choice of getting the fix kit shipped to them or sending their GPU in and they'd fix it for you, all for free.

They knew what they were doing when they released that shit.

Profiting off the lazy buyers who don't feel like making a phone call.

Because they limited themselves to a retarded two-slot form factor. All the other cards are two and a half or three slots, meaning they're much quieter.

Let's not forget all the reference Kepler and Maxwell x80s/x80 Tis that are blowing up their memory VRM inductors. Reminder: we're not talking about AIBs cutting corners, it's literally the reference design, which most models use.

Nvidia:
>Next gen graphics cards with a midrange chip will cost $1000+

>evga make shit
>"muuhhhh nvidia fault xDDDDD"
Go back Jow Forumsamd, manchild.

Attached: 1492573533249.jpg (796x805, 159K)

I have a GTX 760 that I was expecting to use until 4K was the standard. What can I do to save it? Underclock it?

Nvidia has superior Linux support you spastic

Hey Lisa. How are those primitive shaders coming along?
You and your slavish coterie kept spamming Jow Forums for months about it so I'm just checking in on that front to hear the latest.

Attached: hqdefault.jpg (480x360, 42K)

You probably don't have to worry about a 760; if that was a typo and you meant one of the x80 cards, first confirm whether or not you have a reference board. The easiest prevention plan is to put a thermal pad between the memory inductor and the fin stack to try and keep it from burning out. The best solution is to replace the inductors with better ones. The kicker is they even silk-screened a marking leaving space for a larger inductor, but chose to use a bare-minimum part instead.
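And if you do end up underclocking, actually watch temps and clocks while you test instead of guessing. A minimal sketch, assuming the nvidia-ml-py (pynvml) bindings on top of the proprietary driver; device index 0 is an assumption:

# Sketch, not gospel: poll GPU temp and core clock via NVML.
# Assumes nvidia-ml-py (pynvml) is installed and the blob driver is loaded.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # device 0 is an assumption
try:
    while True:
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        print(f"{temp} C @ {clock} MHz")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()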

I didn't follow the primitive shaders meme that closely, but not that long ago it got changed from implicit to explicit because of the effort required from the driver team or something.

Nvidia is hot like a furnace too. It just consumes less power at stock.

No one is going to mention the bullshit of Gsync pushing $2000 local dimming monitors, yet the Nvidia driver itself has absolutely no support for forcing dithering in games, so you get atrocious, unsightly banding?
Or the Gsync shit in general?
How the voltage is locked?
How when you overclock them they often run SLOWER because of Nvidia nanny power states? (see the sketch below)

Meh, I don't feel like going on about all the other shit that makes modern Nvidia GPUs unbuyable for anyone with a brain. Someone else will, I'm sure. Suffice to say that OP is a retard.
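Not making the power-state thing up either, you can watch it happen. A minimal sketch, assuming the nvidia-ml-py (pynvml) bindings and a driver recent enough to report throttle reasons; device index 0 is an assumption:

# Sketch: read the current P-state and why the clocks are being held back.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # device 0 is an assumption

pstate = pynvml.nvmlDeviceGetPerformanceState(handle)  # P0 = full performance
reasons = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(handle)
print(f"P-state: P{pstate}")
if reasons & pynvml.nvmlClocksThrottleReasonSwPowerCap:
    print("held back by the software power cap")
if reasons & pynvml.nvmlClocksThrottleReasonHwSlowdown:
    print("held back by hardware slowdown (thermal/power brake)")
pynvml.nvmlShutdown()

If SwPowerCap lights up while your OC benchmarks slower than stock, that's the nanny at work.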

>le redidiot boogeyman
what a nice way to out yourself, summer

the proprietary nvidia drivers for Linux are better than anything AMD has.

>Linux
>proprietary

Do you see the problem?

nope, just werx :^)

>t.pajeet

?

Attached: 1494217889661.jpg (345x271, 39K)