+ higher resolutions
+ higher refresh rates
+ better latency
+ better colors

- bulky as fuck
- heat

All things considered, did we get memed with LCDs? It seems like a downgrade for me.

Attached: crt.jpg (225x225, 6K)

Lots of radiation from the back, though most of it is caught by the shielding.

Fuck off braunfag, CRTs are obsolete

No, the power savings alone justify LCD monitors, plus the fact that you can have huge LCDs that don't weigh as much as a small car.

It had to light pixels one at a time.
That's the big problem.

Surely that's what gives the better response time

I know this is full autism, but I have the monitor in your pic, it's an LG Studioworks something or other
>Higher resolutions
Only does up to 1024x768 while looking blurry as shit
>higher refresh rates
Highest you're getting on that is 85Hz @ 640x480
>better latency
Right on the money on that one.
>better colors
Certainly a lot brighter and a tad more accurate (when compared to an LCD of a similar category), but nothing too special.
A cheapo CRT is way worse than a cheapo LCD. An extremely high end (think a LaCie or a Barco, none of those FW-900 memes Jow Forums seems to love so much) CRT will absolutely destroy the highest end LCD/OLED/Plasma in every single respect in terms of image quality. SED could've been the only worthy competitor to CRT, but we all know how that ended up.

>full autism
>doesn't even know the model of his own monitor

>All things considered, did we get memed with LCDs? It seems like a downgrade for me.

I was born in 1997 and didn't get to really experience the CRT-LCD transition properly (I obviously wasn't knowledgeable about response times and refresh rates at that age). How did so few consumers notice or care about 60Hz? Going back to that shit after using >100Hz for even a short amount of time makes me feel nauseous.

Here you go. It's 3AM and my night anti-autism pills have just kicked in, so please forgive my stupidity.

Attached: IMG_20180725_031159.jpg (2560x1920, 2.1M)

CRTs don't work above 20"

This post is retarded.

Early LCDs were mostly crap. People were taken in by the flatness more than anything else.

>CRTs don't work above 20"
Care to explain? And remember we're talking about PC monitors here, not 15kHz TV trash
>This post is retarded.
Why is it so?

i'm not the OP but when I make general threads about CRTs i just tend to grab the first image i see of any model that's beige and box-y looking because i know it's what most people using the catalog would immediately recognize

CRTs gave me headaches when I used them in the early to mid 2000s. Got my first LCD monitor in 2006, a Dell 4:3 with 1280x1024 resolution. Haven't missed CRTs at all, never played any games on them either.

Everyone also believed the meme that CRTs gave you eye cancer and was eager to switch so they could feel good about spending hours in front of their screen.

You forgot:
- susceptible to magnetic fields
- susceptible to screen burn-in
- longer warm-up period
- shitty screen-to-bezel ratio
- blurry at higher rez
- usually VGA-cable, analog only
- pincushion effect,
unless you get a Trinitron display, then you have to deal with
- even more weight
- 1-2 permanent, dark lines across the screen

Yeah, no. CRTs can definitely fuck off for good.

We did get memed. Thinness sells despite not being remotely useful most of the time.

CRT monitors over 20" are rare and expensive, almost all consumer monitors were 17-19". Honestly that's too small nowadays, I have a 27" 1440p monitor and it's amazing. I imaging 40" would be even better.

Most people only use it for office shit and since it doesn't flicker, it doesn't cause as much eye strain as the CRTs did. Only gamers stuck with CRTs till the end since they're mostly the people that noticed the terrible smearing/ghosting and lower refresh rate that came with really early LCDs.

The amount of energy it uses is the downside.

>CRTs don't work above 20"
I had a ViewSonic monitor in university, produced in the mid 1990s, that was 22". Easily the best monitor I've ever owned. It did 1800x1440 at 76 Hz, and 1600x1200 at 91 Hz. Colours were amazing. But, I concede that it was a heavy cunt, at least 60 lbs if not more, and it took a SHITLOAD of electricity to run it (~160W) compared to my current Ultrasharp monitor (23W). I was still very sad when I had to sell it at the end of uni, thing was a fucking champ. I'd like to think some Braun nut is still using it to this day.

>CRTs gave me headaches

Shitty CRTs gave me headaches. As a child, I had some garbage TTX monitor that obviously didn't have a high enough refresh rate, probably 50-60 Hz or so. That gave me really serious headaches, but my parents never replaced it because they figured it was an easy way to control my computer usage. But good CRTs are fine. My dad had a really nice Dell D1025TM (which we still own to this day), that sucker did 1280x1024 at 75 Hz and didn't give me a lick of trouble. Sony Trinitron underneath, I believe.

they also used more energy, right?

Barely +50% above background reading on my Geiger-Muller counter at the faceplate.

Attached: 1523866134827.png (574x567, 325K)

- blurry all the time
- worse colors unless you get models much more expensive than an equivalent LCD

Attached: 1525519780037.jpg (250x250, 7K)

Born in 1994 and was 9 by the time my parents got their first LCD monitor. First thing I noticed was that the picture itself was flat instead of curved, lines looked sharp, and small text was actually readable. It wouldn't be until 2007 that we got rid of our CRT TV, and that's when I noticed my consoles looked like shit on it. Went to the thrift store and picked up a smaller CRT (there were walls full of them) to play my PS2 and SNES on, and I still have it to this day. In 2018 the only reason you'll ever need one is to play old consoles without HDMI output.

I must be the only one who can read text more easily on CRTs; matte LCDs are the worst for casual reading, feels like eye cancer.

I got a CPD G500 Trinitron for free because some pleb switched it from HD15 to BNC input.
I still have to repair it every 5 years of use, and often find myself working around 25-30kV rails. At least I don't have to toss it.

LCD dies, it goes in the bin and I have to buy a new one.

Yes, it was a shame to send my old IBM G72 (I think?) to the dumpster. The shitty LCDs I got to replace it are, of course, shitty.

That's communism

75Hz exists solely for people like you. If it weren't for that, the manufacturers would've gotten away with not putting a clock generator in there and just letting the electrical grid dictate the refresh like a conventional TV. However, unlike TVs, a PC monitor is meant to be used up close.

Communism sure was advanced in the early 2000's.
VGA/BNC Inputs.
1024x768 @ 120Hz.
2048x1536 @ 75Hz (in case your MX440 can't handle it)

Why does tech in 2018 suck?

I had a Dell flat screen monitor and it was great except for the space it took up. It was something like 20 inches and the res was 1600x1200. Much better than the LCDs of the day, I think. Only downside is it had VGA only.

Unfortunately my FW900 died years ago.

It was god tier pretty for consuming anything analog/low resolution.

I have to admit, though, the thing isn't nearly as crisp as a modern 4k monitor. If you do a lot of work that involves looking at tiny details and text, LCD has finally surpassed CRT with the recent super high resolutions.

>almost all consumer monitors were 17-19".
Yes, well, at the time they were contemporary, almost all consumer TFT displays were also 17-19".

>+ higher resolutions
Care to point out those 4K CRTs?
>- susceptible to screen burn-in
This. I was a CRT-fag for the longest time, until I discovered that "burn-in" doesn't necessarily mean static patterns, but just that the screen ages non-uniformly. I didn't even discover until I had used TFTs for a while that my last CRT was all patchy in brightness, even though it didn't have any particular patterns burnt in from static UI elements or anything.

Could be a multitude of things, really. I had overbrightness across the entire tube and the cause was a single failing resistor.

Same, from the time I was a young child right up until I got my first LCD around 2004, I could not watch TV or sit at a computer for more than maybe 2 hours straight before I would get pounding headaches.

Thought it was just the result of watching video, motion or something, until I got the LCD and it stopped. It still comes back any time I use a CRT for a long time, even today (last time was playing a SNES on an old CRT with my cousin).

Too much energy use. They would burn phosphors, leading to a short life, had geometry problems due to alignment, and also needed lensing to achieve proper rear projection, and of course, they were fucking space heaters filled with mercury the size of a small AC unit. They did have infinite contrast ratio and faster refresh and response times at the time, but that's largely been erased by modern tech, and now that OLED TVs with 144Hz and 4K are around for really not that much and quantum dot technology is around the corner, there's really not that much reason to have them around besides playing Mario Party 2 and Smash sometimes to relive your shitty youth.

I miss the resolution flexibility, high refresh rate, nice blacks, and that sound the degaussing coil made.

But boy do I not miss geometry adjustments. I always managed to get everything perfect except one corner. Fixing that corner would fuck up a different corner, and so on.

Nowadays we have better flat screens, but this was true for most of the 2000's and early 2010's.
CRTs are still the go to monitors for old computers and retro gaming.

Attached: 0.png (500x500, 184K)

Wrong though.

Funnily enough, the early LCDs of the early/mid 2000s, which were usually ~14", actually could draw just as much power as a 19" CRT of the same era did.

CRTs don't have pixels.
LCDs did the same though, screens were still refreshed one line at a time.

If you were going to play anything low-resolution or analog, you wanted (and still want) 4:3, not a widescreen meme CRT.

LCDs always display the image at once; even with VGA, the monitor just reads the entire frame into a framebuffer before updating the entire screen. This is why LCD TVs usually have horrible lag when dealing with analog signals, while they fare much better with an all-digital HDMI signal. It's even worse with interlaced video, since the TV has to apply deinterlacing.
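Rough numbers on the lag, if anyone cares. A minimal back-of-the-envelope sketch in Python; the assumption that each fully buffered frame costs one refresh period, and the frame counts in the examples, are illustrative rather than measured from any particular TV:

# Rough estimate of added display lag from frame buffering.
# Assumption: each fully buffered frame adds one refresh period of delay.

def buffer_lag_ms(refresh_hz: float, buffered_frames: int) -> float:
    """Delay added by holding `buffered_frames` complete frames before scanout."""
    frame_time_ms = 1000.0 / refresh_hz
    return buffered_frames * frame_time_ms

print(buffer_lag_ms(60, 1))  # ~16.7 ms for one buffered progressive frame (e.g. digital input)
print(buffer_lag_ms(60, 3))  # ~50 ms if the deinterlacer holds a few fields/frames (analog input)

At these made-up numbers you can see why analog-in feels so much worse than HDMI on the same panel.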

>+ higher resolutions
>+ higher refresh rates
Not at the same time though.
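To put numbers on the tradeoff: a CRT's ceiling is its maximum horizontal scan frequency, and lines per frame times vertical refresh has to stay under it. A quick Python sketch; the 121 kHz limit and the ~5% blanking overhead are example assumptions, not specs for any particular monitor:

# Why high resolution and high refresh trade off on a CRT:
# total lines per frame * vertical refresh must stay under the max horizontal scan rate.

def max_refresh_hz(active_lines: int, max_hscan_khz: float, blanking_overhead: float = 1.05) -> float:
    """Highest vertical refresh the tube allows for a given line count (assumes ~5% vertical blanking)."""
    return (max_hscan_khz * 1000.0) / (active_lines * blanking_overhead)

print(round(max_refresh_hz(768, 121)))   # 1024x768  -> roughly 150 Hz is possible
print(round(max_refresh_hz(1536, 121)))  # 2048x1536 -> only about 75 Hz

Same tube, same electronics; you spend the scan-rate budget on either lines or refresh, not both.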

Not really how it works, kiddo. You know about screen tearing? Yeah, that's one of the results of how displays refresh, not exclusive to CRTs.
Also, I said *did the same*, not *do the same*, as anyone these days should use a DisplayPort monitor that can refresh any part of the picture individually.

I wish I could find some decent CRTs nowadays

Screen tearing is when the game engine runs faster than the rate at which the video card generates frames. It takes place before the signal is even sent out the card.

interesting. I owned 2 as a kid and both were over 20". I guess I was a pimp.

>touch the glass
>*REEEEEEE*
>3rd degree burns
>mfw

>LCDs always display the image at once
No.
LCDs scan from top to bottom (usually) as well.

>Screen tearing is when the game engine runs faster than the rate at which the video card generates frames. It takes place before the signal is even sent out the card.
No.
Screen tearing takes place when the video buffer is updated while the monitor is reading from it.

Are you a /v/ spillover? Screen tearing is when the game framerate is not an integer multiple of the vertical sync of the screen. Hence V-Sync fixes it by syncing the framerate to an integer multiple.
That's why screens will render plits in the image because the image changes while they are still drawing it. DisplayPort screens can fix this with Adaptive Sync or Enhanced Sync.

Boy, I hope you're being an annoying ironic troll right now.

>plits
splits*

screen tearing isn't directly related to the game's rendering speed, and it's not specific to the engine rendering faster than the display, either, you can tear when it's too slow as well
tearing occurs when the framebuffer, for one reason or another, has information from more than one instance in time written to it during scanout to the display
in other words, the framebuffer is updated while it's being sent to the display
double-buffered vsync works by using two framebuffers, the 'back' buffer, which the program draws to, and the 'front' buffer, which is scanned out to the display. once the front buffer has been scanned out, there's a short period of time in between frames called the blanking period in which the buffers are 'flipped' (front becomes back, back becomes front), then the cycle repeats
this ensures that only complete frames are sent to the display, no matter how long it takes the program to draw the frame, at the cost of 1 frame worth of latency
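for anyone who'd rather read code than prose, here's the same idea as a toy sketch in python; draw_frame and wait_for_vblank are made-up stand-ins for whatever the program and the display hardware actually do:

# Minimal double-buffered vsync loop (toy stand-ins, not a real graphics API).
import time

WIDTH, HEIGHT, REFRESH_HZ = 640, 480, 60

def draw_frame(buf: bytearray, value: int) -> None:
    # stand-in for the program rendering into the back buffer
    for i in range(len(buf)):
        buf[i] = value & 0xFF

def wait_for_vblank() -> None:
    # stand-in for waiting out the blanking period between frames
    time.sleep(1.0 / REFRESH_HZ)

front = bytearray(WIDTH * HEIGHT)  # scanned out to the display
back = bytearray(WIDTH * HEIGHT)   # the program draws into this one

for frame in range(3):
    draw_frame(back, frame)        # render a complete frame off-screen
    wait_for_vblank()              # only swap during the blanking period
    front, back = back, front      # flip: the finished frame becomes the front buffer
    # the display only ever scans out complete frames from `front`, so no tearing,
    # at the cost of up to one frame of extra latency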

fucking this. I love CRTs but trying to unironically argue that they're better than LCD is just retarded. They're fun for aesthetics and just neat to have around in general, but in terms of functionality there's just no comparison.

Because, at some point, computers stopped being tools, and became fashion accessories.

Unless you're using low resolution input to begin with. Low resolution content does look better on a CRT than a flatpanel because of CRT dithering.

>+ higher resolutions
Blur fuckfest
>+ higher refresh rates
But "backlight" at same frequency
>+ better latency
Yes
>+ better colors
No
>- bulky as fuck
Yes
>- heat
Doesn't use any more power than average ccfl lcd

Both HD15 and BNC are identical in terms of what they can do (unless we're talking cheap chink VGA cables).
Usually just workstation monitors used BNC because their cards' outputs were non-HD15 to begin with.

We know that now, but there was a lot of woo around BNC in the mid-90s. I know my uncle (who was into CAD workstations at the time) was sucked in by it.

They scan but they don't flicker; it just replaces old pixels with new ones without turning off, which is why they don't cause as many headaches. The obvious problem with this is that a terrible GtG response time means terrible smearing artifacts.

Sure, okay. But that's still different from displaying the image at once.

>Blur fuckfest
This depends, actual 20" Trinitrons that can do 1920x1200 can look quite nice.

nice blogpost

>terrible smearing artifacts
It's part of the reason why we didn't need 200fps to be happy with our games.

>We know that now
You had to be pretty dumb to think otherwise. Maybe it was an American thing; they were dumbfounded by RGB in general, as it was "only for high-high-high end and a no-no for consumers".

Office people, on the other hand, are much happier now that they don't have to deal with flicker.

Yes, instead they used terrible CCFL backlights (for a pretty long while, until LEDs became widespread) that gave you an even worse headache, instead of running a CRT at a +60Hz refresh rate.

Most of the woo I heard was based on the assumption that those 15 teeny-tiny VGA wires had interference/crosstalk, and that separated, shielded, magic BNC cables would solve that.

>Office people, on the other hand, are much happier now that their shitty 60Hz default refresh rate that every CRT monitor since 1989 can beat is hidden from them
FTFY. You're welcome. Also H/T .

But these are top-of-the-line, super expensive things that no one could afford till they were obsolete.

The average CRT back in the day had horrible focus/sharpness, even the more expensive ones. Had an EIZO with a non-Trinitron tube and it was still not as sharp as even the first generation of LCDs.

You had to shell out an insane amount of money for a sharp picture back then.

>But these are top-of-the-line, super expensive things that no one could afford till they were obsolete.
Not really true, a Trinny that was a few years old was pretty well priced. Apple used to ship nice Trinnies with their consumer machines also. Second-hand smaller ones were even more fairly priced as people upgraded to bigger ones.
Also still have a Compaq consumer shadow mask 14" SVGA monitor from the mid 90's that looks pretty decent. Is it sharp? No. But is it blurry? At 800x600, no, it's not.

First generation LCDs were sharper for sure, we are comparing pixels to phosphor elements here, but they had a ton of other flaws that made switching from CRTs to LCDs a retarded thing for anyone concerned about picture quality for several years after LCDs had already become the norm.

I'm a grandpa and never had a sharp CRT in my life despite not having the cheapest models (NEC MultiSync 3D -> EIZO 17") :-) The only sharp, good-focus CRTs I saw back then were Macintosh monitors.

LCD was godsend for all of us poorfags.

Pic related, 90's aesthetics, lava lamp & shit

Attached: eizo.png (1024x768, 821K)

>The only sharp, good-focus CRTs I saw back then were Macintosh monitors.
That's pretty much the thing, CRTs of all shapes and sizes can look non-blurry when they are run at a resolution that their shadow/slot/aperture mask/grille can handle. Like Macintosh monitors always did.
If you run them over or under their spec, they will become blurry.

Nah most didn't look good even at lower resolutions.

Also many VGA cards back then had horrible output. Just recently tried connecting some old Trident card to an LCD and no shit, lots of shadowing and reflections in the signal causing a blurry image. Even tried some quality thicc VGA cables but it didn't help.

Interestingly, lots of cheap black & white VGA CRT monitors had a very sharp image.

the glass was cooler than an lcd screen. all the heat came from the back

I meant the electricity when you put your finger on the screen

you mean the light static like rubbing a balloon? you must have terribly sensitive skin

It was also not really much heat. A 32" CRT TV only uses about 80W.

it was enough to heat up my small room in summer. would've been unbearable if i weren't autistically absorbed in some game

>Nah most didn't look good even at lower resolutions.
Like I said, that's the thing. When they were run fine, most also looked fine; that's just how it works, it's electrons coming through holes and exciting a phosphor layer.
Were they as sharp as square pixels on LCDs? Like I said, no. Were they blurry? No, just not as sharp.

Be it high res, low res, cheap, expensive, shadow mask, aperture grille, slot mask, etc.

Attached: CRTs0.jpg (1000x1874, 2.7M)

Yeah

>Doesn't use any more power than average ccfl lcd
wrong. My 19" Sony CRT had a power consumption of 110W (tested with a wattmeter). The 19" LCD CCFL one, only 35W.

Because new cards aren't meant to output analog signals, and LCDs have bad ADCs and simply can't process the image as well as an all-analog connection.

>Even tried some quality thicc VGA cables but it didn't help.
It helps as much as using a $1000 audio cable from a shitty $100 receiver to a good pair of B&W/DALI speakers; it will only be "up to" as good as the source.

While I love my old 2048x1536 19" CRTs, they don't see any use and are stored in the closet because they are inefficient and I like the wider ratio of my 16:10 and 16:9 monitors.

Old arcade machines are depressing to look at.
Especially knowing that, had they implemented a system that turned the arcade cabinet screen off until it was needed, those poor monitors could still be in decent shape today.
So much burn-in from running up to 18 hours a day on a loop. Racing cabinets suffered the most from it. Particularly Sega Rally or Daytona USA.
LCD mods look so damn ugly.

You forgot
>tiny screens

Attached: IMG_20180722_021749.jpg (2592x1944, 911K)

People are more likely to play it if it's blinking and showing a demo.

This is why you get a Trinitron. No, it'll never be as crisp as an LCD, but the color is great, no ghosting, good response time, and it looks pretty good at all resolutions rather than great at one and dogshit at all others.

>+ higher resolutions
With shitty blur and distortions
>+ higher refresh rates
That still managed to flicker
>+ better latency
Maybe
>+ better colors
Don't even have "true" blacks like OLED.

>- bulky as fuck
>- heat
-Dust magnets
-Noisy
-Consume fuckton of power

>All things considered, did we get memed with LCDs? It seems like a downgrade for me.
Go buy one.

His point was more about all CRTs:

Obviously Trinitrons are the go-to for high-end CRT technology available on the consumer market, be it VGA or TV.
Even Trinitrons, though, have a go-to mode, a combination of a resolution with a specific horizontal and vertical refresh that looks the best on them; it's just that the other modes don't look as bad.

>posting before reading the thread
why do people do this?

How would a faulty resistor cause the screen to be of non-uniform brightness, rather than just effect a global brightness error?

Reading comprehension?

>"burn-in" doesn't necessarily mean static patterns, but just that the screen ages non-uniformly.
That's exactly what it means though, parts that get more exposure wear down quicker. There is no magic behind it, just what patterns your display shows and for how long.
Same with OLED, same with plasma. Etc, etc. CRTs have it the least compared to those.

>+ higher resolutions
>+ higher refresh rates
>+ better latency
>+ better colors

CRTs lose in all those categories except the latency thing (maybe, some of them are quite fast), meme-kun.

this

Attached: 1436001615603.gif (200x204, 40K)

I would agree with you if this was 2012.

>a decade after CRTs died LCDs became superior
Well no shit, but imagine if CRTs weren't dropped.

They wouldn't be much better, and flatpanels would still be better. Also don't call them LCDs, since we have much better flat display technologies.

What's ironic is that LCD hasn't really evolved as much as CRT would have in that time. Sony had 120/240Hz zoned backlight scanning models out ten years ago and only now have others caught up; basic LCD panels of today are no different than in 2006, only cheaper.