DVI or HDMI?

Does anyone still use VGA?

Attached: dvihdmi.jpg (1280x720, 128K)

Displayport

this
everything else is trash
and yes, VGA is still very much used to connect to projectors

>DVI or HDMI?
Same shit. One just includes DRM.
Answer is, Displayport.

>Does anyone still use VGA?
Only for /retro/.

>HDMI
For TVs, blu-ray players and game consoles
>Displayport
For computers and their monitors
>VGA
For RGB consoles and old computers / CRTs

VGA is not as sharp as DVI. There is a noticeable difference. I'm only saying this because I didn't know about it until recently.

HDMI is legitimately DRM. DVI is mediocre but honestly how else you gonna do HD?

The superior option is to shit out SVGA through GPIOs or something.

VGA is a video standard with a set of standard resolutions, it has nothing to do with connectors. DVI can also push VGA.

I had noticeable screen tearing using VGA on my monitor, solved it moving to DVI.

Didn't know about the DRM on HDMI. Isn't DVI supposed to be just as good in terms of image quality, but without audio?

Displayport if you can, else hdmi.

I use VGA to connect to old projectors at uni and so do my colleagues. So yes, it's still being used.

>Connect pc to monitor through HDMI
>works
...
>Connect pc to monitor through Display Port
>works

Jesus, Jow Forums is an unhappy board

My monitors only have DVI/VGA, I can't be arsed to dispose of 3 perfectly good running Dells for 3 new ones just for the video input.
So I'm using DP to DVI cables.

The only time you should be using HDMI is if you need to carry an audio or ethernet signal, or if you're a pleb.

Attached: 51Rpjz4DUyL._SL1000_[1].jpg (1000x667, 50K)

Help me Jow Forums.

I'm getting a new BenQ monitor, it has three versions.

DVI x2, VGA x1
HDMI x2, VGA x1
DVI x1, HDMI x1, VGA x1

They increase in price, but just a bit. I thought about getting the HDMI one for future proofing, but it seems to offer few advantages over DVI.

Also I already have a monitor with HDMI input just in case I need it, and my GPU has 2 DVI outputs.

New cards are going to drop DVI support in favor of more HDMI and DP ports, heck a lot of cards in the 10 series Nvidia dropped the third DP port for dual DVI because of the VR meme.

meant Dual HDMI
Fuck I dunno what I'm smoking tonight.

So, new trend is DVI + DP?

DVI, HDMI and DP all theoretically have perfect image quality, though HDMI carries some stupid legacy baggage with it and is prone to squashing the dynamic range, making everything washed out and making you sift through your graphics card settings to make it not look like ass. There's also overscan for extra fun.

it also costs manufacturers a shitload of money to LICENSE HDMI for their parts. Paying for your own DRM.

DVI doesn't do compression afaik because it is kind of a dumber spec. This accounts for the shittier range but also more bandwidth for HDest of stuff.
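
For anyone wondering what the washed-out thing actually is: a minimal Python sketch of the full-range (0-255) to limited-range (16-235) mapping that HDMI's TV legacy drags in. The 16/235 endpoints are the standard studio video levels; the function name is just for illustration.

```python
# Why "limited range" RGB looks washed out: PC full range uses 0-255,
# TV/video "limited" range only uses 16-235 (studio levels HDMI
# inherited from broadcast conventions).

def full_to_limited(v):
    """Map a full-range 8-bit value (0-255) into limited range (16-235)."""
    return round(16 + v * 219 / 255)

print(full_to_limited(0))    # black becomes 16 -> looks grey
print(full_to_limited(255))  # white becomes 235 -> looks dim
```

If the display then treats those values as full range, black is lifted and white is dimmed, which is exactly the washed-out look people complain about until they flip the range setting in the driver.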

No, it's DP and HDMI.
Normies and retards dunno what DP is and people want HDMI for TVs, VR and easy to obtain cables.

That said DP to DVI is easy as they make DP to DVI cables.
I don't like paying the HDMI tax on my monitors and DP monitors are still much more expensive than DVI monitors.

>does anyone still use VGA?

Unfortunately yes.

I still have to carry a DP or HDMI to VGA converter with me because every business I visit only has VGA-enabled projectors and displays.

No joke. Why haven't they changed it out by now?

Why? 640x480 is fine for looking at charts and shit. The real question is why a VGA connector instead of some RCA shit?

Sometimes it's not, if you're showing demos

It does mostly support resolutions up to 1920x1080 or 1600 something

Good projectors cost a fuck ton, what isn't broken doesn't get replaced.
I've received many calls from clients who ordered new machines without monitors, saying that they can't find the port for the monitor; when I follow up on it they all have fucking VGA-only LCD monitors.

I always ask if they have monitors and what ports they have on them but some of them don't even know what a D-SUB-VGA port is called and just wave me off saying "yes, yes we already have monitors."
They then can't comprehend why they need to buy a dozen dongles to connect their new computers if they don't want to buy new screens.

At least they still make motherboards with VGA for IGPs or APUs so I just make sure the ones without dedicated GPUs are shipped with VGA motherboards.

Most businesses use VGA.

I love them new little Intel NUCs

I think they still come with VGA ports. They are absolutely perfect for most of our clients and pretty fast with an SSD

We mount them on the back of their monitors, looks really clean and professional.

>VGA
Projectors, projectors everywhere.

Anyone in the corporate world either has a laptop with a VGA port or has a dongle capable of speaking down to any projector built in the last twenty years.

>DVI x2, VGA x1
>HDMI x2, VGA x1
>DVI x1, HDMI x1, VGA x1
What, no composite input? How am I to run my 8-bit computers from the mid-80s?

Funny, I have an old plasma monitor which processes 10-bit per channel internally, DVI-DVI is super sharp but when I use DVI-VGA it adds about half a bit of noise and softens 8-bit banding which "looks better" overall.

The difference in sharpness is striking though, especially because it is a high contrast display, I never noticed such differences on TN monitors which are mostly 6-bit + dither. NVIDIA has no option to add dither so it creates a false contour effect when frames are dropped (frame discontinuity reduces the apparent bit depth in DLP and plasmas) so my next card will be a Radeon Vega, I like that AMD doesn't lock basic features away for no reason.

Can't you just use passive adapters for composite to VGA?

fpbp

Attached: 1528895288672.png (600x600, 507K)

I honestly can't see the difference between VGA/DVI/HDMI/DP on 1080 displays. Maybe higher resolutions are different?

DVI/HDMI/DP should look the same because they are interchangeable digital signals that don't require active converters.

VGA/D-SUB on the other hand is noticeably worse even at 1080P, especially when motion comes into play.

Displayport > HDMI > DVI > VGA

this

Um, why is Displayport better than everything else?
Bonus: Is Mini-Displayport just as good but with a different connector at the end, or does it suck more?

>make 16:9 presentation
>show up at place
>4:3 VGA projector with a half dead bulb and a piss-colored screen smaller than my monitor

Attached: 1520903702391.jpg (1274x720, 97K)

Stop carrying your own dongles like a good goy and complain to whoever organizes the meetings that they're unprepared
Obviously if you and presumably others keep bringing your own VGA dongles they won't notice the problem

Why do older TVs have "overscan"?
I've seen HDTVs that claim to have 1080p image but if you account for overscan it's always something bullshit like 812p or some shit. Why?
Also, my 1080i TV is actually 960i, and the 480p mode is actually like 428p.

I have always carried these converters with me ever since my laptops stopped coming with a VGA port 4 years ago. There's always this friend who only has a VGA monitor when you try to attach another device at a LAN party.

It's small, not really inconveniencing me. Deal with the fact almost everyone is a normie

Displayport is newer, has support for freesync, higher bandwidth for higher refresh and resolutions, it's royalty free, and can be converted to any of the older standards with a dongle.

Mini DP is fucking stupid, barely anything uses it, and dongles and cords are a fucking nuisance to obtain.

>go to another company and bitch and whine instead of demonstrating that you're thinking ahead and prepare for problems that might arise

Attached: your post.jpg (1162x747, 85K)

>Complaining to companies that they are ill-prepared.

No, they will just turn that around and say that you're ill-prepared, make an embarrassment out of you and maybe get you fired.

Thanks for the explanation.
My only guess is mini-Displayport will show up on a lot of laptops where full-size Displayport won't, because the latter is marginally larger.

DisplayPort

>vga
I still use it regularly during meetings. You don't need fancy stuff to display spreadsheet and slide presentation

VGA is the best connector.
>no DRM
That's enough for it to be the best one.

This question actually gets a lot more interesting when you throw in revisions!

HDMI before version 2.0 was actually pretty shit. For television connections at 1080p and above it was beaten by component input, which was capable of the same resolution at higher refresh rates without any loss of color depth. Did you know that before HDMI 2.0, HDMI connections were unable to provide full color bandwidth above 1080p60? It's a minor thing, and obviously it's been fixed for a while now, but for a good few years after HDMI came out it didn't reign quite as supreme as people thought; the humble component connection was still king.

>company holds a business meeting, you're invited and you're supposed to prepare a presentation
>you have to bring 30 different converters and adapters just in case they're still using a CRT TV with only a coaxial input
>if you dare to complain that they're using obsolete tech you risk getting kicked out and fired
This is the future you chose

Some of the cables connected to my KVM switchbox are VGA.

i have a 5 meter DP 1.2 cable. Whoever makes these 3 to 5 meter cables says they'll do 4K in 30 Hz instead of 60 Hz. How can I test if the cable can run 100 Hz at 3440x1440? If it lets me set that mode in Windows, does that mean it's really running 100 Hz at 3440x1440?
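
Not a direct answer, but a back-of-the-envelope Python sketch (assuming ~10% blanking overhead rather than exact CVT reduced-blanking timings, and the commonly quoted DP 1.2 HBR2 rates) suggests that mode fits with headroom:

```python
# Does 3440x1440 @ 100 Hz, 8-bit RGB, fit in a DP 1.2 link?
# Assumption: ~10% extra pixels for blanking (rough reduced-blanking guess).

active = 3440 * 1440          # visible pixels per frame
blanking_overhead = 1.10      # assumed, not from the CVT spec
refresh = 100                 # Hz
bpp = 24                      # 8 bits per channel, RGB

needed = active * blanking_overhead * refresh * bpp   # bits per second
hbr2_effective = 17.28e9      # DP 1.2 HBR2: 21.6 Gbit/s raw minus 8b/10b coding

print(f"needed:    {needed / 1e9:.2f} Gbit/s")
print(f"available: {hbr2_effective / 1e9:.2f} Gbit/s")
print("fits" if needed < hbr2_effective else "does not fit")
```

In practice a marginal cable makes the link train down to a lower rate (so the mode disappears) or produces visible sparkles/blackouts. If Windows accepts 100 Hz and a frame-skipping test pattern shows no dropped frames, it's genuinely running.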

Yep, I do consulting work, and when I arrive on site for a new project, I bring ALL the adaptors for the first few weeks until I know what the client has and what the client doesn't have.

HDMI? Sure. VGA? Of course. DVI? A bit of a rarity but it's still out there on 2nd monitors you'll be gifted with on site.

>Not bringing a single AIO adaptor.
This is why the country is full of brainlets that only complain and want communism.

I'm using a DVI to HDMI cable and the fucker passes sound through it. I really don't get HDMI acceptance; it can't be HDCP, since DVI also has it.

>complain
You don't complain, you fix the problem, stop being a whiny bitch.

One also carries audio in addition to video

Analog signals had noise at the edges of the picture. Cutting off some of it prevents you from seeing it.

Yes, analog is not as sharp as digital. I've even seen monitors that had poor signals to their LCD, or the LCD itself had noisy analog, so even a DVI image vibrated a little. Maybe the monitor preferred a DVI-I connection instead of DVI-D.

USE THE COLORS AND SHAPES. It's a lot more obvious to someone if you ask "does it have a blue trapezoid, a wide white one, or a little HDMI?"

I deal with monitor bullshit a lot because I used to like showing people stuff when I hung out and I collect old monitors. You can get HDMI->VGA+audio adapters for cheap and they work great.

This mate.
Have a 16x10 in the middle, and 5x4s on each side. They are all of the same product line, and are the same height, so it's aesthetic as fuck. Will probably use them until they die.

Thunderbolt 3

Yep. What do you think the fuckin servers use, you piece of shit?

I'd put DVI above HDMI because it just works with no bullshit. I've had enough of trying to direct people to turn off the stupid ass dynamic range compression.

DVI is technologically worse than HDMI in every possible way. It's obsolete and there is nothing better about it.

It's not HDMI's fault that Nvidia is dumber than a sack of rocks and decides to default to throwing dynamic range compression on every HDMI connection until the user changes it in their driver.

Basically, old CRT TVs had a habit of drawing some of the picture outside of the viewable picture area, so broadcast signals had a bit of padding around the edges to compensate. Fast forward to today and overscan is still a thing, despite the only reason for its existence being completely fucking wiped off the face of the earth over ten years ago. But here we still are, rolling the dice on whether you actually get a pixel-perfect image or get to do some fucking bullshit menu delving to try and unfuck your image. It's goddamn retarded and annoying.
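
To put rough numbers on it, here's a quick Python sketch (the per-edge percentages are typical ballpark guesses, not from any spec) of how many of your 1080 lines actually survive overscan:

```python
# Rough illustration of how much picture overscan eats.
# Per-edge overscan percentages here are illustrative guesses.

def visible(lines, overscan_per_edge):
    """Lines left after cropping overscan_per_edge off top and bottom."""
    return round(lines * (1 - 2 * overscan_per_edge))

for pct in (0.025, 0.05):
    print(f"{pct:.1%} per edge: 1080 lines -> {visible(1080, pct)} visible")
```

Even a modest 5% per edge leaves you at 972 visible lines out of 1080, plus whatever the TV's scaler does to smear the result back over the full panel.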

>muh DRM
Do you retards realize that DisplayPort has support for the same DRM that HDMI does? Otherwise you couldn't watch HD Netflix or Amazon over DP.

Also DP has its own issues.
>the sleep bug where the screen loses signal after the computer falls asleep, have to unplug and replug the cable
>long DP cables are rarer and more expensive than long HDMI cables
>DP1.2 has worse 10 and 12 bit color support than HDMI 2.0
>More of a niche thing but HDMI has better backwards compatibility with component video, while DisplayPort is better with VGA
Overall DisplayPort is better than HDMI but people who say HDMI is useless are retarded.

You get major brownie points for being over prepared...and you know what that means ;)

/thread

Dual-link DVI is the most underrated connection. Almost as much bandwidth as HDMI 1.4, and it plays most DRM-protected content like HDMI and DisplayPort. Only disadvantage is many monitors don't support it; they might be pin compatible but refuse to operate above single-link spec.
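
Rough numbers, for what it's worth. A quick Python sketch using the commonly quoted TMDS clock figures (8-bit RGB payload only, ignoring blanking and protocol overhead; treat the values as ballpark): dual-link DVI lands next to HDMI 1.4, about half of HDMI 2.0.

```python
# Ballpark pixel-data rates: TMDS clock x 24 bits (8-bit RGB),
# ignoring blanking and protocol overhead.

links = {
    "Single-link DVI (165 MHz)":   165e6 * 24,
    "Dual-link DVI (2 x 165 MHz)": 2 * 165e6 * 24,
    "HDMI 1.4 (340 MHz TMDS)":     340e6 * 24,
    "HDMI 2.0 (600 MHz TMDS)":     600e6 * 24,
}
for name, bps in links.items():
    print(f"{name}: {bps / 1e9:.2f} Gbit/s")
```

That 7.92 Gbit/s is why dual-link DVI comfortably does 2560x1600@60 or 1080p@144, but not the 4K60 territory HDMI 2.0 was built for.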

Can someone just list the pros and cons of DisplayPort vs HDMI vs DVI? I get the feeling half the people talking about the topic just think one is better than the other based on how new it is or how expensive a cable is. And other people think one is best because "High definition is in the name" or "It's called DisplayPort so it's best for when u just need a Display"

Can someone comprehensively list all the facts that compare the 3?

I don't know what's up with my new BenQ XL2411P, the latest version of their 1080p 120/144Hz 24" monitor.
It came with a DP cable but I can only run it at 60Hz with the included cable. I doubt it's my MSI GTX 1080's fault either. Had to scrounge for an older DVI-D cable to get it to run 1080p 144Hz.

My two monitors are sadly 2x HDMI. Should I get DP-to-HDMI cables for my new build, or not be autistic and just do HDMI-to-HDMI like a normal person? (I'll have a TV plugged into it as well)

Attached: gigabyte-rx-vega-64.jpg (1000x750, 20K)

My HDMI is slower than DVI/VGA, especially for the BIOS POST check.

this

the dvi port on my gtx 1080 carries audio just fine

vga is still decent. I use it for a crt, and connecting my secondary pc to an lcd. My digital inputs were already taken. It looks just fine. Can't tell it from the digital ones at all.

Why do you need dual link if the panel's native display mode runs just fine and dandy on single link? Practically every monitor that actually benefits from a dual link connection does have support for it.

>the dvi port on my gtx 1080 carries audio just fine
that is DVI-D and it is through an adapter OVER HDMI. There is no display on earth that can accept audio signals through an actual DVI cable.

my nixeus edg27 monitor does

>It looks just fine. Can't tell it from the digital ones at all.
Someone please corroborate.

again, if it does, it's DVI-D, not DVI.

to quote from wikipedia:

Some DVI-D sources use non-standard extensions to output HDMI signals including audio (e.g. ATI 3000-series and NVIDIA GTX 200-series). Some multimedia displays use a DVI to HDMI adapter to input the HDMI signal with audio. Exact capabilities vary by video card specifications.

yes, on my poweredges at home and at work