DVIpilled

DUDES. UNPLUG YOUR HDMI CABLES AND SWITCH TO DVI.

IT'S SO MUCH BRIGHTER. OOOH MY GOD SO GOOOOOD.

Attached: dvi-connection-types.svg.png (750x438, 12K)

Same brightness 4 me

Literally the same quality as hdmi but without audio

It's the exact same TMDS signal being sent on both cables; if there's any difference between the two, it's because of your monitor or GPU settings.

why would you even want brighter? are you blind?
you're sitting 2 feet away from your screen how fucking bright do you need it?

... tried using full colour range instead of 16-235?

Literally the same signal.

the connectors are shit tho

the hdmi one is also shit, and worse shit desu

component is the peak video connector

Almost guaranteed this is it; OP is using NVIDIA hardware and the driver thinks his monitor is a TV

This. Nvidia uses limited RGB by default over HDMI for some stupid TV compatibility reason. I'd been using it like that for years; changing it to full range felt like buying a new monitor.

hmmm

Attached: AppleDisplayConnector.jpg (960x594, 85K)

>component
My nigga

DisplayPort, anybody?

It actually does carry audio, at least the digital variant: GPUs embed HDMI audio in the DVI-D signal for DVI-to-HDMI adapters.

I irrationally hate DVI. It is the worst shit imaginable. Thank god the 20-series cards are replacing it with USB-C.

Maybe just do a digital, dual RCA LVDS connector?

If everything were configured correctly you wouldn't see much of a difference between limited and full
Limited and full on a TV look the same (albeit with some quality loss), as the TV recognizes the limited signal and treats 16 in limited RGB as pure black and 235 as pure white
Monitors often don't do the proper correction, so it gets fucked up
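Rough sketch of the correction a well-behaved TV applies. It's just the standard 16-235 to 0-255 expansion; the numbers are the usual limited-range constants, nothing specific to any driver:
[code]
import numpy as np

def limited_to_full(x):
    """Expand limited-range RGB (16-235) to full range (0-255)."""
    x = x.astype(np.float64)
    # 16 -> pure black, 235 -> pure white; anything outside gets clipped
    return np.clip((x - 16.0) * 255.0 / 219.0, 0, 255).round().astype(np.uint8)

print(limited_to_full(np.array([16, 126, 235])))  # [  0 128 255]
# A monitor that skips this step shows 16 as dark gray instead of black,
# which is exactly the washed-out look people mistake for "dimmer".
[/code]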

Yes I love DVI. I love using 40hz on one of my two monitors because the laptop dock only handles single link.

Attached: thanksDVI.png (388x413, 12K)
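
Quick napkin math on the 40hz thing. Single-link DVI tops out at a 165 MHz pixel clock; assuming a 2560x1440 panel with CVT reduced blanking (~2720x1481 total, my guess since the resolution isn't stated), 40Hz squeaks under the limit and 60Hz doesn't:
[code]
SINGLE_LINK_MHZ = 165.0  # single-link DVI TMDS pixel clock limit

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    # pixel clock = total pixels per frame (incl. blanking) * refresh rate
    return h_total * v_total * refresh_hz / 1e6

for hz in (40, 60):
    clk = pixel_clock_mhz(2720, 1481, hz)
    print(f"{hz} Hz needs {clk:.0f} MHz -> "
          f"{'fits' if clk <= SINGLE_LINK_MHZ else 'exceeds'} single link")
# 40 Hz needs 161 MHz -> fits single link
# 60 Hz needs 242 MHz -> exceeds single link
[/code]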

hipster fag

I do get audio out of my Nvidia's DVI output though. Was the same from my old HD4850. I'm still using the DVI->HDMI adapter that came with my ATI card, even.

>accidentally plug it into a DVI GPU
>destroys GPU
nothing personal kid

step aside, kid.

Attached: DMS-59.jpg (575x323, 50K)

As long as you make sure to buy the overpriced cables, because the chink ones have pin 20 connected and will fry your GPU.

Based. When I needed a cable for one of these, nobody sold them outside of eBay.

it's quicker, brighter, and better. hdmi fags BTFO

Look at this image: if you see 4 colors you're good. If you only see two columns, you're in TV mode.

i.imgur.com/Bw61l68.jpg

This image is only relevant if your display is set to limited and you're feeding it a full-range signal
Setting both display and signal to limited would look exactly the same as setting both to full
Setting the signal to limited while the display is set to full would also let you see all 4 squares
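If you want to test it yourself, here's a sketch of a similar test card (not the exact imgur image, the layout is my own): four gray patches at 0, 16, 235 and 255. On a full-range chain all four are distinct; with the display in limited mode and a full-range signal coming in, 0/16 crush to black and 235/255 clip to white, so you only see two:
[code]
from PIL import Image, ImageDraw

img = Image.new("RGB", (400, 200))
draw = ImageDraw.Draw(img)
for i, level in enumerate((0, 16, 235, 255)):
    # four vertical patches: black, near-black, near-white, white
    draw.rectangle([i * 100, 0, (i + 1) * 100 - 1, 199],
                   fill=(level, level, level))
img.save("range_test.png")
[/code]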

So what are we calling these people?
We already have audiophiles who buy $10,000 headphone setups just because they can hear music 3% more clearly than normal people without spending problems

Displayphiles? Eyediophiles?

lol what the fuck, just changed it and now my monitor is so much nicer
why is this a thing

Attached: 1517449995361.jpg (596x600, 43K)

Nvidia's fucking driver software not only looks like it's from 2004, it also has default settings from 2004
That's really the only explanation I can come up with
No HDMI display made in the last 10 years, not even a TV, is restricted to limited RGB only, and HDMI isn't limited to TVs anymore either
Even the most shitty god-awful TV can display full RGB. It's a default that is no longer relevant, and my only guess for why it's still around is that the Nvidia driver UI is archaic

optical link is superior and cheaper

I'm using audio through DVI right now

are you retarded? dvi in general is digital.

DVI-A is analog only, and DVI-I is analog and digital. It's a VGA compatibility thing, probably.
yuo are the ratard

I got a new card some years ago with DVI only. I had to buy 4 different cables because the companies selling them didn't know how to label them properly. All were sold as "DVI cable", yet when they arrived they were everything except DVI-D. Even searching online, there was only like one remote backwater blog explaining the difference back then.

Now it's all HDMI, and HDMI connectors fucking suck because you can only plug them in so many times before they start failing. They are nearly as bad as SATA in that respect. Ironically, I have 30-year-old VGA connectors that still work as well as the day I bought them, after something like 5,000 plug/unplug cycles.

Do not get me started on USB.

>Error: Bad video stream. Only VP8 is supported.
Chink Moot!!!!

Attached: Five Pounds of TNT and a Volvo!.webm (320x240, 169K)

You don't have to unplug the video cable every time you turn your computer off.

Videophiles you fucking dumb shit

Never made the switch from dual link DVI to HDMI on my desktop. Been using 144Hz these last few years too. Not a single problem. Still on that 1080p shit though. Not sure if I could go 1440p144Hz with this connector. Might have it maxed out right now.
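Napkin math says 1440p144 is way out of reach for dual link anyway. Assuming roughly CVT-RB2 blanking (2000x1111 total for 1080p, 2640x1489 for 1440p; illustrative numbers, real monitors tweak them) and the 330 MHz dual-link pixel clock limit:
[code]
DUAL_LINK_MHZ = 330.0  # dual-link DVI pixel clock limit

modes = {
    "1920x1080@144": (2000, 1111, 144),
    "2560x1440@144": (2640, 1489, 144),
}
for name, (ht, vt, hz) in modes.items():
    # pixel clock = total pixels per frame (incl. blanking) * refresh rate
    clk = ht * vt * hz / 1e6
    print(f"{name}: {clk:.0f} MHz -> "
          f"{'fits' if clk <= DUAL_LINK_MHZ else 'exceeds'} dual link")
# 1920x1080@144: 320 MHz -> fits dual link
# 2560x1440@144: 566 MHz -> exceeds dual link
[/code]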

Uhhhh no. HDMI is for shitty TVs, it does 59Hz. DVI is for computers, it does 240Hz

>being a dipswitch online in 2019

Attached: 1563320956418.jpg (894x894, 368K)