Which one of you cucks wrote this?

Attached: Screenshot_20190918-171759_Chrome.jpg (1080x2280, 694K)

...

CRT is better, get over it. We all fell for the LCD jew but it's time to admit we made a mistake.

SED when?

Yeah, I do miss huge monitors that took up my entire desk, weighed 30 pounds, ran hot, and burned in over the years. Goddamn good times.

>the image quality is worse but surely all that superficial shit makes up for it

I enjoy having a 4K 50" sitting on my desk that only has the footprint of a toaster oven. CRT just can't compete.

Old LCDs sucked, but they're good enough now. As long as it's a good IPS, you will have no reason to complain.

Trading the CCFL backlight for LED was a much bigger mistake, I think.

Work out.

>trinitron
If these idiots had actually used what most people had, they wouldn't be so quick to praise it

LED backlighting is better on quality screens, but it's way easier to make a shit LED implementation

it's a spread desu

>weighed 30 pounds
i hope you aren't saying that like 30 pounds is at all heavy.
god forbid you lift an infant

???? literal budget shit-heap CRTs were still great

Those are some serious rose-tinted glasses you're wearing.
There's a reason LCD took over the mainstream so quickly: people no longer had to pick between a usable resolution and a refresh rate that isn't ass to look at.
The people who unironically ask why CRTs left sit behind monitors that cost someone $1000 new back then, and you're not going to catch a single one of them using a mainstream tube. There's a good reason for that: most of the CRTs people actually used were complete shit in their capabilities

Except none of color quality, accuracy, responsiveness, or black levels matter in the slightest when all you're doing is reading documents (i.e. programming) or playing games at the non-pro level.

LCD/LED is better in every way except refresh rate, and even that is reaching CRT-level speeds nowadays. Only nostalgiatards want CRTs.

>We played modern games on a $4000 24" 1920x1200 CRT monitor (using retardedly low resolution for no reason) - and the results are phenomenal compared to $100 WLED trash.
Now go play modern games on a modern $4000 display setup...

>>trinitron

Trinitrons had thin horizontal damper wires stretched across the aperture grille, which were distracting once you noticed them. Also geometric distortion and color fringing in the corners.

The response times on TVs are terrible.

only thing quality-wise that LCD does better is picture geometry (due to having a fixed pixel matrix vs. an electron gun steered by magnets; at the cost of being stuck at one "native" resolution)
CRT has a better dynamic picture not just because of refresh rate, but because of very low image persistence.
"LED" in regards to monitors refers to the backlight; the display is still LCD. LCD/LED monitors often have backlight bleed, a non-existent issue with CRT displays

LED is better mostly because of economic/ergonomic factors, i.e. it's cheaper to produce, smaller, more energy efficient, easier on the eyes

Yes, and? Why the fuck would you use a TV for anything?

My new 4K 65-inch HDR TV gets me more Netflix and chill than any CRT ever could

Attached: crt vs modern display.jpg (500x500, 88K)

How can this be better than an OLED? I thought OLEDs had like zero motion blur or something, which is the chief reason people wanted CRTs.

Not me. CRTs fucking suck.

OLEDs by default are still sample-and-hold, and as such will suffer from the inherent blur, since this kind of blur isn't about how quickly pixels respond. They still need to be strobed in order to mitigate the issue (just like LCDs), though strobing obviously reduces brightness. Honestly, I don't even know if OLED TVs use strobing or not.
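
To put rough numbers on that inherent blur: for an eye tracking motion, the smear is about how long each frame stays lit times how fast the object moves. A minimal Python sketch, with purely illustrative values (not measurements):

[code]
# Persistence blur: blur width (px) ~= time the image stays lit (s) * motion speed (px/s).
def blur_width_px(persistence_ms, speed_px_per_s):
    """Approximate smear seen by an eye tracking a moving object."""
    return (persistence_ms / 1000.0) * speed_px_per_s

speed = 960.0  # object crossing the screen at 960 px/s (assumed)

print(blur_width_px(1000.0 / 60.0, speed))  # 60 Hz sample-and-hold (~16.7 ms hold): ~16 px smear
print(blur_width_px(2.0, speed))            # strobed/BFI, ~2 ms of light per frame: ~1.9 px
print(blur_width_px(1.0, speed))            # CRT phosphor lit ~1 ms per refresh: ~1 px
[/code]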

>my $1800 CRT back then was soo much better than the 1366x768 TN panel in my new walmart laptop! LCDs all suck!!!

Attached: boomer.jpg (223x226, 11K)

I'm not sure either, but I think OLEDs don't strobe, since each pixel is lit individually; they have a similar technique called Black Frame Insertion, which simulates strobing. It does affect brightness, I believe.
Why does sample-and-hold cause blur, and not whatever CRTs use? What do CRTs use?

CRTs are rolling scan: the electron gun sweeps across the screen, and only the phosphors hit by the beam light up (and then take some time to decay). Due to how human eyes work, it looks like the whole screen is lit, but at low refresh rates you can notice the flicker; that's why.
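
If it helps to visualize the rolling scan, here's a toy Python model: at any instant only the lines the beam just passed are still glowing (the 1 ms decay constant is a made-up illustration):

[code]
import math

# Toy rolling scan: the beam sweeps top to bottom once per frame; each line's
# phosphor decays exponentially after the beam passes it.
lines, refresh_hz, decay_ms = 10, 60.0, 1.0
frame_ms = 1000.0 / refresh_hz

t_ms = 8.0  # look at the screen 8 ms into a ~16.7 ms frame
for line in range(lines):
    hit_ms = line / lines * frame_ms             # when the beam hit this line
    age_ms = (t_ms - hit_ms) % frame_ms          # time since it was last hit
    print(line, round(math.exp(-age_ms / decay_ms), 3))  # only freshly hit lines glow
[/code]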

Black frame insertion on OLEDs should basically be the same thing. I mean it's not like there's a backlight to strobe on an OLED display.

It's true, there were some amazing CCFL LCDs back in the day.

>>B b b but they're slightly thicker

They had phenomenal colours. Really eye popping and they didn't even cost that much.

Bonus trivia: Sony made 100 Hz HD CRTs at 32" and larger in the noughties. They were awesome

So basically, the way I understand it, the "hold" part of sample-and-hold is perceived as persistence of motion, and this leads to blurring.
But the way CRTs light the screen means the image is never held static between refreshes, and this is why there is zero motion blur on them.
Is this correct?
Also, does interlacing have something to do with how CRTs work?

Fuck me, these hipster retards are going to destroy the market, fuuuck.
The fuckers already ruined vintage audio, PVM CRTs, and retro gaming in general, and now this. Fuck normies, FUCK them.

The CRT beam will hit a phosphor and light it up for only a fraction of a second. Hence the light output is lower than an LCD's, but the response time is so much faster

LCDs use sample and hold, BUT the problem is that on a high-refresh-rate "gaming" monitor the pixel can be driven past the color it's supposed to reach. It's called pixel overshoot
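
Roughly what overshoot looks like, as a toy sketch: model the pixel as a sluggish first-order response and apply overdrive (deliberately driving past the target to speed the transition); all values here are made up for illustration:

[code]
# Toy LCD pixel: first-order response plus overdrive.
# Driving 40% past the target speeds the transition but overshoots the color.
target, value, tau_frames = 0.6, 0.0, 1.5
overdrive = 1.4  # assumed aggressive overdrive gain

for frame in range(6):
    drive = target * overdrive if value < target else target
    value += (drive - value) / tau_frames
    print(f"frame {frame}: {value:.3f}")  # climbs past 0.6, then settles back down
[/code]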

Does pixel overshoot contribute to motion blur, though?
I'm trying to understand why CRTs are so much better at motion than OLED, MicroLED, or other cutting-edge LCD tech.

Yeah I love dim screens, low resolutions, thick bezels, burn-in, glass that curves AWAY from you (okay later CRTs had flat glass - we had a TV like this - but earlier ones didn't), and I also love having something that takes up a gigantic amount of space and weighs a fucking ton.

Oh no wait, I don't.

I don't really miss going blind.

>EUROgamer
lmao yuropoors and also

>pro level

HAHAHAHA, casual, it doesn't matter what you think at all.

Screens on CRTs are brighter, capable of high resolutions, less susceptible to burn-in than LCDs, and the curved glass is fine.

What you played on was an '80s piece of shit, and you're trying to compare it to "high-end" products from today.

>screenshot of an article thread
>jewgle amp
>phone poster
go and stay go

Five countries have a higher median per capita income than you (in international dollars, purchasing power adjusted).

All of them are European. Namely Norway, Sweden, Luxembourg, Denmark, and Finland.

Lol, Ameripoors.

en.wikipedia.org/wiki/Median_income

(Mods, I hope you do not consider this off-topic; I am responding to an issue that's already been raised.)

Attached: Screenshot 2019-09-19 at 01.55.57.png (1260x570, 118K)

cope

And it's all lost to socialist policies

Maybe I'm a brainlet, but how the fuck could a CRT monitor be better? It's not a high-definition display.

This. And if you bother to shell out for the equivalent top-of-the-line modern panels, the difference between CRT and LCD/OLED is virtually indistinguishable.
If CRT were that much better, then industries that thrive on color accuracy would still use them. Not even laboratories still use CRTs.

He isn't wrong - I tried gaming on a 17" 1024x768 85 Hz CRT. Thing cost me all of $7. Having >60 fps looks gloriously smooth even if it's only 85. There's not much visible flicker, though a CRT capable of 100-120 Hz without dropping to 800x600 would be nicer. Colors pop way more than I've seen anywhere but my phone's AMOLED and are generally brighter and more lifelike than that. It also runs perfectly smoothly with a rather poor GPU since it's only 1024x768.

Ahh, Wizardry 1.
That takes me back

Probably just essentially making the image blurrier, hiding imperfections.

The one mentioned in the article is.

Less image persistence in CRT means sharper movements for games.

Lmao sorry you got butthurt kid.

Oh no, people can get healthcare even if they lose their job so they don't have to kill themselves instead? How terrible.

A decent one can do more than HD. They also do high refresh rates and motion-blur performance that LCD and OLED can only dream of.

>except refresh rate, and even that is reaching CRT-level speeds nowadays
Arcade CRTs can do like 15 kHz

Is this all kids do these days? You know I what like to say?

Shit fuck damn bitch cunt cocksucker asshole twat faggot slut

But keep using words like that

This.
That's why I buy used LCDs for $10. Currently have 5, but only 4 plugged in, because the GPU has only 4 outputs. Should get an extra GPU; the cool thing with Linux is that you can just throw random GPUs in and get extra outputs. None of that CrossFire or SLI or whatever else shit

This is how it was during the days of the Xbox 360/PS3.
Games looked more realistic on a crummy CRT than they did on an HD panel or a high-def CRT.
The lower resolution and blurriness really did mask some of the less detailed stuff.

>what is horizontal vs vertical refresh rate

Enjoy funding Somalis to come over and fuck your wife, Sven. Sweden yes!

FPBP, fucking kill yourselves redditfags.

What makes you think it's not high definition? In 2005, 1600x1200 CRTs were common; then LCDs came along with that 1024x768 bullshit. I'm sure there are CRTs with better resolution than a mid-range 2005 model.

>but it's not HD
it's a different technology
one isn't objectively better than the other
same with HDDs and SSDs
most technology people think is "obsolete" tends to have something better than its replacement, though sometimes it's niche
with CRTs, they're extremely good for old games, and good use of them can make something look much more detailed; with LCD screens the details don't appear how they were intended to
even modern games have a nice feel on a CRT, though that's more just novelty (but some of it is due to not having input lag and feeling much smoother)
the most common thing I hear against CRTs is that they're heavy, but that doesn't even apply while you're using it

>1600x1200 CRTs were common; then LCDs came along with that 1024x768 bullshit
LCDs existed at 1600x1200 even in 2003; the Dell 2001FP is an example

The article was specifically about the Sony Trinitron FW900, a 24" 1920x1200 display (that cost thousands of burgerland dollars decades ago).
>HD is considered to be a display with greater than 480 vertical lines

If I get a CRT, will I have big fucking lines moving vertically up my screen like I see on literally every CRT?

Attached: Screenshot from 2019-09-18 21-29-26.png (820x478, 154K)

We're at 1440p and 4k now, gramps

That's an artifact of a mismatch between the camera's frame rate and the screen's refresh rate.
You don't see it in real life

>there are people on Jow Forums who have literally never seen a CRT

>people are born at different times
The more you know

No, that's caused by the camera being out of sync with the display.

But on hi-def CRTs there are 2 or 3 thin damper wires stretched horizontally across the screen, and they're very visible on bright/light scenes at HD resolutions. All Trinitron fanboys conveniently never mention them.

People born after 1990 will always be too young to use internet.

CRTs stopped being used by the majority in, what, 2007? You're literally a zoomer.

I get that when filming or photographing my 85 Hz CRT, but if I drop it to 60 Hz it flickers visibly to me yet looks golden on film. It's all about matching the camera speed to the refresh rate.

>Secondly, PC hardware has evolved now to the point where running at higher refresh rates than 60Hz is relatively simple - and a great many CRT monitors can easily run at much faster frequencies, up to 160Hz and even beyond, depending on the display and the input resolution.
240 Hz? They're acting like LCDs are stuck at 60 and that a high refresh rate is something special

Attached: AFuckingDell.png (1011x519, 304K)

CRTs are nice for things designed for CRTs. trying to force modern stuff on there is retarded.

You can also set the CRT refresh to be a multiple of the camera frame rate:
a 60 fps camera filming a 120 Hz CRT also works
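
If you want to sanity-check a refresh rate before filming, the rule is just divisibility; a trivial Python sketch (the mode list is an assumed set of common CRT refresh rates):

[code]
# A CRT films cleanly when its refresh is an integer multiple of the camera's frame rate.
camera_fps = 60
crt_modes_hz = [60, 75, 85, 100, 120, 160]  # assumed common CRT modes

clean = [hz for hz in crt_modes_hz if hz % camera_fps == 0]
print(clean)  # [60, 120]: these won't show the rolling dark band on a 60 fps camera
[/code]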

I used to have that FW900 monitor back in the day. They were all very blurry in the corners right out of the box.

I'd do this but I'd probably have to drop to 640x480 to get 120 Hz on my CRT.

>I'd do this but I'd probably have to drop to 640x480 to get 120 Hz on my CRT.
CRToddlers BTFO!

Attached: satania.gif (540x304, 1.52M)

Yeah, it wouldn't be great for resolution.
My P275 would do 1280x960 at 120 Hz, but it didn't like it; there was some odd distortion on the left side.
1024x768 was okay, but that's pretty poor for that size of screen

>afuckingdell
you say this like it's a bad thing, but Dell has been making some of the best monitors for years...

>30 pounds
You mean 2 tons

To be fair, this was possible in the very early 2000s, possibly even in 1999;
a good 120 Hz LCD wouldn't come for over a decade

What can I say, it was $7, doesn't weigh a ton, and I mostly use it for retro stuff that 1024x768 is generally plenty for. As much as I'd enjoy a big high res CRT, it would break my budget and my desk, so I use a 27" 1440p 144 Hz IPS screen for modern stuff.

You forgot about how it irradiated your eyes and made you eventually have to wear glasses.

This is bullshit. The flicker caused dry eye, and staring at something at a short fixed distance could result in nearsightedness, but so long as you didn't have like a '50s TV set, the glass blocked almost all the radiation; using one gave you less dose than a banana.

>2019
>$300 for a 1080p screen
Things sure have advanced

If you really take a look at what $300 got you in a new CRT, you'd think you were getting a hell of a deal with the LCD

While they do make a nice case for the advantages of CRT over modern displays, even if they were made today I don't think they'd be more than a niche item. The relative advantages CRT has (low lag, instant response times, high motion clarity, not being resolution dependent) are all only really useful for gaming. In addition to the motion benefits, the high contrast and color intensity would stand out against the mediocre competition in today's PC monitor space, although they wouldn't be able to hold a candle to a decent HDR display, let alone an OLED.
Many consumer OLED displays (as well as some high-end LCD PC gaming monitors) have a black frame insertion mode, which flickers the image on and off in order to increase motion resolution. While it is true that OLEDs could in theory strobe in a similar way to CRTs and achieve some interesting results, since they can light each pixel individually, none of the OLEDs out there today flicker in such a way. The BFI modes on today's OLEDs effectively treat the panel as if it had a backlight, and just flicker the entire screen off and on 60 times a second.

Even then, using one of these modes on a sample and hold display is not as clear when it comes to objects in motion as a CRT or plasma display.

Even if a CRT made today would be a niche item, PC monitors are already a niche market, and gaming is a pretty big sub-sector. If the industry can justify very expensive FALD monitors and MiniLED monitors intended for gaming, then they should be able to justify a CRT by saying it's "better for gaming".
I think the real reason they don't is because CRTs are expensive as hell to manufacture nowadays and quite toxic to dispose of properly.

>If the industry can justify very expensive FALD monitors and MiniLED monitors intended for gaming, then they should be able to justify a CRT by saying it's "better for gaming".
The innovation for desktop monitors comes off the back of the reference and TV display industries. There is basically nothing new nowadays introduced on monitors that hasn't already existed elsewhere.
For example, everyone is somewhat excited for FALD monitors, but TVs have had them for years.
Monitors are only getting it now because the tech has matured and is cheap enough to be applied to an overall declining desktop monitor market.
Because this tech already exists, FALD and MiniLED monitors are not actually that expensive to make, but since the PC market is willing to pay more for what it perceives as new technology, they are going to sell those monitors at a premium

>I think the real reason they don't is because CRTs are expensive as hell to manufacture nowadays and quite toxic to dispose of properly.
It's really because the tooling just doesn't exist anymore and there is no one willing to make new tooling

I have the FW900. It is good for retro games and does look beautiful for that, but forget about CRT for modern games or content. The good contrast of a CRT only holds for a theoretical black-to-white transition; the image just looks more washed out the more you increase brightness. The image is never perfectly sharp even under optimal conditions, and the glass panel doesn't help either, making it even worse. These things are just not capable of displaying an image as good as the average LCD these days; even with theoretical benefits in contrast, it just doesn't look good in practice.

Modern displays are far more capable, especially ones made for HDR, which have the required brightness. So forget about CRT: if you are after the best image quality, get a 10-bit, high-nit OLED or a similar LCD.

Thanks for the response, user. What you said makes sense, and you're right about FALD and MiniLED.
>It's really because the tooling just doesn't exist anymore and there is no one willing to make new tooling
But why get rid of the old tooling at all? I think some of those reference CRTs were still being made well into the late 2000s, so obviously the creative industry at least understood top-end CRTs were still better than LCD. Why just outright stop making them? They could absolutely refurbish an old CRT factory and use it to make CRTs now if there were demand for it, no? It's only been a decade or two since they shut them down.

big deal, I had a 1440p CRT in 1999 and to this day it's considered more than enough

>But why get rid of the old tooling at all? I think some of those reference CRTs were still being made well into the late 2000s, so obviously the creative industry at least understood top-end CRTs were still better than LCD. Why just outright stop making them?
Many did hold on to their CRTs for a few years after production stopped, but the industry did eventually move on, and that's partially why new-old-stock CRTs still float around: no one was hardcore enough to buy them at the price they were going for, not when good LCDs existed

>They could absolutely refurbish an old CRT factory and use it to make CRTs now if there were demand for it, no?
That is assuming the factories still exist, and that the ones which made the good CRTs haven't been torn down to make LCDs
If there were any demand to hunt down tooling for new CRTs, it would come from /vr/ types who need new screens for their old arcades or retro consoles

> no one was hardcore enough to buy them at the price they were going for
Yeah, I didn't think about how expensive they must have been, even with LCD as a relatively new technology
>/vr/ types who need new screens for their old arcades or retro consoles
Thanks, user, that makes sense. Hopefully /vr/ types, who despite being a small market truly love CRTs, can save us all.

Based.

Attached: 1568094250771.jpg (4032x3024, 2.02M)

>CRTs were still being made well into the late 2000s, so obviously the creative industry at least understood top-end CRTs were still better than LCD

If you were a professional at that time, there were already much better options than a CRT for image creators, like the CG210. The best CRT monitors might have had the most accurate color reproduction, but you could simply use one as an output check; for actual work I don't know anyone who would have preferred CRT over LCD, due to CRT's geometric problems and lack of uniformity.

Thanks, user, for the info.
>CG210
Wow, I looked at the specs of this and was surprised by how far LCD had come. I would have thought an Eizo from 10 or so years ago would rival at least a 5-year-old consumer display, but nope.

CRTs are nice in a lot of ways, but I don't fucking miss eyestrain one bit. It's been years since I've felt like my eyes were hurting from being on the computer too long.

That's the horizontal rate.
Multiply how many progressive-scan lines you need (240 on a standard arcade monitor) by your vertical rate (most arcade machines are somewhere around 60 Hz, some slightly lower, some above). You get 240*60 = 14400 Hz, which fits under 15 kHz; there's also a bit of signal outside the visible display area, which pushes the number up a bit.
An ordinary SD-res TV is 15 kHz. A "high-res" arcade monitor (640x480, 60 Hz vertical refresh, progressive scan), like you'd see late-'90s and early-2000s arcade games running on, is 31 kHz.

A CRT computer monitor driven at 1024x768, 75 Hz will have a 768*75 = 57600 Hz horizontal rate (so you'll need a 58 kHz or better screen).
On that 58 kHz screen, you could maybe almost get 120 Hz at 640x480 and 96 Hz at 800x600.
Computer monitors can go pretty high, like 100 kHz horizontal scan (e.g., a 1600x1200@80Hz picture needs a horizontal rate of at least 96 kHz).
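
The arithmetic above generalizes into a one-liner; here's a rough Python sketch (the ~5% vertical-blanking overhead is an assumption, real modelines vary):

[code]
# Horizontal scan rate a CRT mode needs:
# active lines * vertical refresh, plus some vertical-blanking overhead.
def h_scan_khz(active_lines, refresh_hz, blanking=0.05):
    """Required horizontal rate in kHz; 'blanking' is a rough overhead fraction (assumed)."""
    return active_lines * refresh_hz * (1 + blanking) / 1000.0

print(h_scan_khz(240, 60))    # ~15.1 kHz: SD TV / standard-res arcade territory
print(h_scan_khz(480, 60))    # ~30.2 kHz: the "high-res" 31 kHz arcade monitor
print(h_scan_khz(768, 75))    # ~60.5 kHz: 1024x768@75 with generous blanking
print(h_scan_khz(1200, 80))   # ~100.8 kHz: top-end ~100 kHz computer monitors
[/code]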

Well, it's a bit older (2005), and it depends what you mean by rival: these monitors were designed for uniformity of the display, were 14-bit capable, and had high luminance for their time. It took a long time for consumers to get access to 10-bit panels and to LCDs that stayed uniform at higher brightness.

Oh, 14-bit? Yeah, nvm then, that's incredible. I only sort of glanced through the basic specs, like contrast and a few other things. And 2005 too? What bit depth are reference panels up to nowadays, 16? 20?