When will there be a computer monitor that is
ultrawide - 21:9 or more
IPS
HDR VESA 600 or 1000 or at least a 10bit panel
120+ hertz
Freesync
plus points for more than standard 1080p resolution
maybe a glossy screen, non-matte coating
when the consumer is fed up with the current stuff we have. just buy something now and you'll get your next tech faster
plus for getting normies into high-end monitors: it will bring demand up, bring down manufacturers' costs, and accelerate new technologies.
look at smartphone market for proof
The perfect monitor is
1:1 (4096x4096)
OLED
Matte
Freesync
Man you're too deep into that shit.
actually i prefer gsync even though it costs more. the quality is justified, whereas freesync is all over the place, and even AMD admitted that they fucked up since they had to create a whole new spec label called FreeSync 2 HDR
I just have high standards for it. and the monitor market doesn't have very high standards.
Did you know there are like only 2 monitors at the moment that support HDR... Fucking ridiculous
why 1:1 ratio? and what screen is this
yeah OLED is better than IPS, but there are barely any OLED monitor options out there. plus it would be expensive
matte isn't good bro! I swear it looks washed out compared to a glossy screen. matte is like having a fuzzy washed-out coating on top of your screen and it makes colors look dull
I don't know if I should wait for at least hdr screens to come out or go ahead and buy something. Right now I'm using a decade old 1440x900 19" screen
>matte isn't good bro!
en.wikipedia.org
There's a reason why glossy displays are illegal at German workplaces.
OLED is a literal meme technology. People have been hyping it up since the 2000's and the price is still too high and the panels still have burn-in/color fade issues. Worthless for monitors.
Grandpa here.
Same was being said for LCD back in the day.
well there have been HDR-ready monitors out in the wild already, starting at $200
IPS is fucking trash and you should be embarrassed for wanting it. Especially when you're mentioning HDR, as IPS lacks the capability to display anything closely resembling black, making HDR in anything but incredibly bright scenes pointless. VA is far better.
OLED is completely unsuitable for a computer monitor, but only retards don't acknowledge its overwhelming superiority when displaying moving content. MUH SCREEN BURN is a far less impactful downside than the inherent compromises of LCD displays, especially when only a retard would ever get screen burn in the first place. As for uniformity issues, that's not a road anyone promoting LCDs wants to head down, when 99.9% of LCD monitors out there have utter garbage backlight uniformity and look like shit unless they're displaying a white screen.
Yes. But they will cost like $2000-$4000. LCD panels suck dick though.
but isn't IPS better than TN when it comes to HDR and contrast?
and what about this VA, what's a good VA monitor?
Hopefully never
like what?
>VA is far better
Nice meme
like fucking google?
TN has better black levels than IPS and is therefore likely to have better contrast. it's the off-axis colour shift that is an absolute deal breaker for me.
my issue is that 99% of the monitors that fall within my required specs are either TN or IPS, in which case I have a solid preference for IPS. i'd love to give VA or even OLED a try but it doesn't seem to be a reasonable option.
okay i googled for your sorry ass.
here is one example
BenQ EW277HDR
>google
Please avoid using the term “google” as a verb, meaning to search for something on the internet. “Google” is just the name of one particular search engine among others. We suggest to use the term “search the web” or (in some contexts) just “search”. Try to use a search engine that respects your privacy.
better blacks don't mean better contrast, user. contrast is the ratio of peak to minimum luminance, so a monitor with like 0.5 cd/m² minimum luminance can still end up at only a 1000:1 native contrast ratio.
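The relationship between black level, peak brightness, and contrast is just one division, so here's a quick sketch of it. The panel figures below are typical spec-sheet numbers for IPS and VA, not measurements of any specific monitor:

```python
# Black level you actually see = peak luminance / native contrast ratio.
# Two panels can quote similar peak brightness but land at very different
# blacks depending on native contrast. Numbers are illustrative only.

def black_level(peak_cd_m2: float, contrast_ratio: float) -> float:
    """Black luminance in cd/m^2 for a panel at a given peak brightness."""
    return peak_cd_m2 / contrast_ratio

ips = black_level(350, 1000)   # typical IPS spec: 0.35 cd/m^2 black
va = black_level(300, 3000)    # typical VA spec: 0.10 cd/m^2 black
print(f"IPS black: {ips:.2f} cd/m^2, VA black: {va:.2f} cd/m^2")
```

This is why a VA panel with lower peak brightness can still show visibly deeper blacks than a brighter IPS: the ratio, not either number alone, is what matters.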
ever heard of common speech? when you want a tissue in german, you simply say Tempo. or Zewa instead of paper towel. and for searching the web, we google
now get off my lawn
>you simply say tempo tissue in german. or zewa instead of watersoaking tissue
No.
>we
Reddit.
i'm fine with this
the special snowflake syndrome has to leave
Asus and Acer are releasing an ultrawide 200hz monitor later this year.
predator x35 and z35 already exist
my main complaint with G-Sync is that Nvidia gates it off as a "premium" feature.
I don't care that I have to pay a little extra if I'm out for a 1440p 144hz high-end model, since the extra cost has gone towards the custom hardware and feature curation.
But adaptive sync is WAY more important on 60/75hz monitors (144hz takes care of tearing and smoothness issues nicely already), and is arguably needed more on slower cards because those won't be as likely to hit framerates at or above refresh.
And that is where Nvidia fucks you over. I am currently using the 200€ 32" 1440p 75hz VA Freesync monitor AOC makes along with an RX 580, and there is just no way you can get close to the same gameplay experience value-wise with Nvidia. Spend all your fucking money on high-end hardware or forget about adaptive sync - or buy a midrange graphics card and be limited to a 1080p 144hz TN model that still costs a lot if you want adaptive sync on your midrange budget.
Same thing with laptops. Adaptive sync for underpowered laptop GPUs, or for limiting framerates for power usage benefits, is just insanely useful on paper - and all Nvidia is doing is putting software based "G-Sync" in fucking expensive gamer models.
not trusting these one bit until reviews, TFT-Central tore the Z35 panel a new one for the marketing stunt of offering 200hz overclocking when the VA panel barely performs well enough for 120hz
Freesync is shit
Why freesync
Don't tell me you actually bought an AMD GPU
yeah but if you consider g-sync, you play above mid-tier hardware, so that's the normal case
i get your price/performance thinking for mid-level hardware starting with the 1050/560 and ending with the 1060/580, but there are people who simply demand more, from the 1070 up
enjoy 21:9 movies on a 1:1 monitor, you retard
I have one with all those features except HDR.
LG 34UC79G-B
freesync is open and doesn't require a $200 module to work. It's also supported by Xbone and some TVs. It's objectively superior.
HDR is pretty new so maybe next year. I don't think any 21:9 monitors are glossy, at most I've seen is semi glossy.
>matte isn't good bro! I swear it looks washed out compared to a glossy screens. matte is like having fuzzy washed out coating on top of your screen and makes colors look dull
it's better than watching my ugly mug and everything behind me
Good point.
Now if they can start making DisplayPort-only monitors without HDMI, it could be 100% royalty-free and they could pass the savings on to customers.
Too bad HDMI is locked in because of HDCP (DRM) to prevent 4K rips.
>watching movies on computer
???
>hdr
>10bit
>120hz
>ultrawide
>1440 minimum
>WHEN user WHEN
Do you even understand the bandwidth requirements for what you're asking? Let's completely ignore other necessary metrics like the bare minimum theoretical pixel response and look only at the bandwidth for the signal.
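The post above is right that bandwidth is the bottleneck, and the arithmetic is easy to sketch. This is a back-of-the-envelope estimate: the ~12% blanking overhead is an assumption (real CVT-R2 timings vary), and the link rates use 8b/10b coding efficiency, which DP up to 1.4 and HDMI 2.0 both use:

```python
# Rough bandwidth check for OP's wishlist: 3440x1440 (21:9), 120 Hz, 10-bit RGB.
# Blanking overhead (~12%) is an assumed approximation; effective link rates
# apply the 8b/10b coding used by DP 1.2/1.4 and HDMI 2.0 TMDS.

def required_gbps(h: int, v: int, hz: int, bits_per_channel: int,
                  blanking: float = 1.12) -> float:
    """Uncompressed RGB video data rate in Gbit/s, including blanking."""
    return h * v * hz * bits_per_channel * 3 * blanking / 1e9

# Effective (post-coding) payload capacity of common links, in Gbit/s:
LINKS = {
    "HDMI 2.0": 18.0 * 8 / 10,        # 14.4
    "DP 1.2 (HBR2)": 21.6 * 8 / 10,   # 17.28
    "DP 1.4 (HBR3)": 32.4 * 8 / 10,   # 25.92
}

need = required_gbps(3440, 1440, 120, 10)
for name, cap in LINKS.items():
    verdict = "OK" if cap >= need else "NOT ENOUGH"
    print(f"{name}: need ~{need:.1f} Gbit/s, have {cap:.2f} -> {verdict}")
```

Under these assumptions the signal needs roughly 20 Gbit/s, so HDMI 2.0 and DP 1.2 fall short and only DP 1.4 (HBR3) has headroom, which is part of why monitors with this exact spec list were slow to appear.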
Everything they said about LCD back in the day was and still is true.
Everything they said about OLED back in the day was and still is true.
>I have one
>34"
>1080p
once again, the nips lead the way.
>burn-in/color fade issues
CRTs also had these issues, and people are still memeing them today as superior to LCD.
>not OLED or Gsync.
The real question is when the fuck will Nvidia adopt freesync? Or more realistically, an identical variable refresh standard based on DisplayPort that they can use and put a badge on?
Why should they when Nvidia already owns 75% of the gaming market? It's not miners buying Gsync or Freesync monitors. The only way I could see Nvidia ever adopting the DisplayPort Adaptive Sync standards is if Intel supported Freesync and Freesync started being on every single monitor, even low end ones. Basically if Freesync became so widespread it would be impossible for Nvidia to ignore.
AMD still doesn't have a competitor to Lightboost/ULMB, which is by far the greatest "gaming"-related feature a monitor can have. There are only a couple vendor-neutral panels with backlight-strobing, the vast majority of them are Gsync. Meanwhile AMD is working on meme shit like low framerate compensation which is only useful for Xbone.
>Why should they when Nvidia already owns 75% of the gaming market? I
Cause they are losing the variable refresh race with their snake oil scalers.
>The only way I could see Nvidia ever adopting the DisplayPort Adaptive Sync standards is if Intel supported Freesync
Good news. Intel already stated they will support DisplayPort Adaptive-Sync.
>Freesync started being on every single monitor, even low end ones.
It is on every monitor?
>Basically if Freesync became so widespread it would be impossible for Nvidia to ignore.
I mean it is at this point, but other things are holding back AMD GPU sales, so Nvidia can just tell everyone to use Gsync or fuck off. (The latter being more frequent)
More people use Freesync monitors with Nvidia GPUs than Gsync ones. Hell, Xbox supports it now and Samsung is fucking shipping Freesync TVs.
when will we have 2:1 monitors
>ultrawide
>120+hertz
retarded /v/irgin
When there's a market for it.