Does the GPU influence the way your monitor displays its color...

Does the GPU influence the way your monitor displays its color? For example, if I used an RX 480 on one computer and an RTX 2080 Ti on another computer, will the monitor's colors look different? Since the GPU is the one processing the images sent to the monitor.


No, not since we switched to digital connections between the GPU and the panel/controller.

No, but retards with an agenda will try to tell you otherwise.

Only in that color profiles exist, and some GPUs support certain color profiles while others don't.

That said, every GPU nowadays has support for all color profiles, and if one doesn't, it's because of shitty drivers. Occasionally shit drivers will break color profiles, even in 2019. And this can be seen from all 3 sides - Nvidia, AMD, Intel - although Intel has much higher quality drivers and far fewer bugs like this, in my experience.

AMD cards have 10-bit colour; I don't know if Nvidia ones have it.
Also the drivers are a big factor: that trainwreck of an HD 4850 is way darker in Win10 and also has a more yellowish undertone compared to all the cards I have that support Win10.

Yeah, Nvidia cards will play video in limited range by default, IIRC. There are also differences between the temporal dithering options, but I don't think they're enabled by default on most systems. Anisotropic filtering used to be hugely different back with the 600 vs 7000 series, or maybe it was the 500 vs 6000 series, but some driver updates made them mostly the same.
There are probably some other differences too, but usually they're small enough that nobody gives a shit.
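To illustrate what "limited range" means: 8-bit video levels put black at 16 and white at 235 instead of 0 and 255, so a display expecting full range shows washed-out blacks and dim whites unless the signal is expanded. A minimal sketch (the function name is just for illustration; the levels are the standard 8-bit video levels):

def limited_to_full(value):
    # 8-bit limited ("video") range puts black at 16 and white at 235.
    # A full-range display expects 0-255, so the signal has to be expanded,
    # otherwise blacks look grey and whites look dim.
    expanded = (value - 16) * 255 / (235 - 16)
    return max(0, min(255, round(expanded)))

print(limited_to_full(16))   # black -> 0
print(limited_to_full(235))  # white -> 255
# If the driver sends limited range but the monitor expects full range,
# this expansion never happens and the picture looks washed out.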

They all have 10-bit, but you usually need to use DisplayPort.

Yes, but not significantly, unless you're retarded and use the Nvidia defaults as pointed out.
I've got two UltraSharps, an AMD card for the Linux host and an Nvidia card for the Windows guest, both connected via DP. There is a slight difference between displaying the same picture on both with AMD versus one on AMD and the other on Nvidia. Colors are a (very little) bit more washed out on Nvidia.

I don't know, but I think STALKER, Metro, and movies look better since I switched to Nvidia.

What else would you use? VGA and DVI are deprecated, and HDMI is something for media devices.

> Does the GPU influence the way your monitor displays its color?
Color depends on:
- Compression used by the GPU to deliver the signal (e.g. chroma subsampling on Nvidia Kepler to deliver 4K@60Hz over HDMI 1.4 - see the sketch after this list)
- Signal range (in the past, Nvidia used limited range by default, so colors looked washed out)
- Monitor calibration
- In some cases, even the port matters: my calibrated 8-bit LG displays the calibrated palette only over HDMI
- In other cases, color depends on TV settings: LG TVs, for example, require you to tag the HDMI input as a "PC" port; only then will they pass 4:4:4 full RGB through.
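Rough sketch of what chroma subsampling does to the data rate, assuming 8-bit YCbCr (the helper function is just illustrative):

def bits_per_pixel(subsampling, bit_depth=8):
    # Average bits per pixel for common YCbCr subsampling modes:
    # 4:4:4 keeps full chroma, 4:2:2 halves it, 4:2:0 quarters it.
    chroma_share = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}[subsampling]
    return bit_depth * (1 + chroma_share)  # one luma sample plus the chroma share

for mode in ("4:4:4", "4:2:2", "4:2:0"):
    print(mode, bits_per_pixel(mode), "bits/pixel")   # 24, 16, 12

# Dropping from 24 to 12 bits per pixel is how Kepler squeezed 4K@60Hz
# through HDMI 1.4, at the cost of smeared color edges (text fringing etc.).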

>What else would you use
Either HDMI or DisplayPort.

Reminds me that for years now you've been able to see people claim there's a difference in colors between AMD and Nvidia GPUs.
Years without a shred of evidence, of course, all based on muh feelings.

> HDMI
Can't push 4K@60Hz at 10-bit 4:4:4 over HDMI 2.0. Some panels are already 10-bit, so it's not enough.
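The arithmetic, roughly, using the standard 4K@60 timing of 4400x2250 total pixels (including blanking) and HDMI 2.0's roughly 14.4 Gbit/s of usable bandwidth after 8b/10b encoding; treat the exact figures as approximations:

# Data rate needed for 4K@60Hz RGB/4:4:4 at different bit depths.
total_pixels = 4400 * 2250        # active pixels plus blanking per frame
refresh_hz = 60
hdmi20_usable_gbps = 14.4         # ~18 Gbit/s raw minus 8b/10b overhead

for bit_depth in (8, 10):
    bits_per_pixel = 3 * bit_depth    # three full channels (RGB or 4:4:4)
    gbps = total_pixels * refresh_hz * bits_per_pixel / 1e9
    verdict = "fits" if gbps <= hdmi20_usable_gbps else "does not fit"
    print(f"{bit_depth}-bit 4:4:4: {gbps:.1f} Gbit/s -> {verdict}")

# 8-bit: ~14.3 Gbit/s, which barely fits; 10-bit: ~17.8 Gbit/s, which does
# not - hence DisplayPort, HDMI 2.1, or dropping to 4:2:2/4:2:0.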

It's probably just my imagination, but I swear the colors look better on my RX 580 than on my old Intel iGPU.

No shit. What are you getting at?

Matrox looks the best for 2D.

The amount of bullshit in this thread, it's like a cattle herd walked through here.
As a matter of fact, the truth is that anything which processes the data output to the screen can, and does, affect the colour displayed. There is a whole industry dedicated to colour profiling and getting the correct output. Photographers spend inordinate amounts of money on hardware and software that will profile the camera's images, given which camera is being used, and will match the monitor, graphics card etc. to produce a desirable output. It comes at a price. The professional photographers I know use cameras and lenses that cost around 8 to 12 thousand dollars, and the monitors they use are well in excess of $5000.
They use hardware to measure what the actual output from the screen is versus what the input is - I forget what it's called, but I know it was expensive.
I can tell you for an absolute fact that even the same make and model of graphics card can give slightly different colours; even the leads going into the monitor can affect that.

The way it is controlled is by colour profiling, where the input is matched to the output.
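In spirit, a profile is a per-channel tone curve plus a matrix that maps the device's RGB into a known reference space. A crude sketch of the idea (the gamma value and matrix below are placeholders, not a real measured profile):

# Crude illustration of what a colour profile does: linearise each channel
# with a tone curve, then map device RGB into a reference space with a
# 3x3 matrix. Real ICC profiles built from colorimeter measurements do the
# same thing with far more precision.
GAMMA = 2.2                      # placeholder tone curve
MATRIX = [                       # placeholder device-RGB -> reference matrix
    [0.41, 0.36, 0.18],
    [0.21, 0.72, 0.07],
    [0.02, 0.12, 0.95],
]

def profile_pixel(r, g, b):
    linear = [(c / 255) ** GAMMA for c in (r, g, b)]
    return [sum(m * c for m, c in zip(row, linear)) for row in MATRIX]

print(profile_pixel(255, 128, 0))  # a device orange expressed in the reference space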

No, color is just a number, and that number is the same regardless.

A graphics card isn't a lens, you stupid fuck; it's a processor that passes digital color information from the computer to the monitor.

Who says that it can't do the reverse, and that it cannot record what's in front of the monitor?

>I can tell you for an absolute fact even the same make and model of graphics cards can give slightly different colours
>Muh 0.01% difference is important
Now ask them to test that with their eyes and watch those professionals crumble.

On Linux there's a difference since the Linux Nvidia drivers dither.
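Dithering itself is simple enough to sketch: to show a 10-bit value over an 8-bit link, alternate between the two nearest 8-bit levels so they average out over time. A toy illustration, not the actual driver algorithm:

import random

def temporal_dither(value_10bit, frames=10000):
    # Show a 10-bit value (0-1023) on an 8-bit output by randomly picking
    # one of the two nearest 8-bit levels, weighted so the time average
    # matches the original value. Drivers use more structured ordered or
    # temporal patterns, but the principle is the same.
    exact = value_10bit / 4                  # ideal (fractional) 8-bit level
    low = int(exact)
    frac = exact - low
    shown = [low + (random.random() < frac) for _ in range(frames)]
    return sum(shown) / frames

print(temporal_dither(514))  # averages to ~128.5, a level plain 8-bit can't show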