It's time for the almighty CRT to rise again. Those of you who are weary of shitty blacks, washed-out colors, displays that show weird shimmery colors when viewed from the side, and non-scalable images must unite. We, some 30-40 million strong, will blitz Sony, Sharp, Panasonic, and others with angry letters and letter bombs. The truth must come out that LCDs suck dick and balls and must be banished.

Power to the people and death to false display technology!

Attached: u2qx7nx484lz.jpg (4032x3024, 755K)

Other urls found in this thread:

youtube.com/watch?v=m2sTMLQ513M
twitter.com/NSFWRedditVideo

>Shitty blacks
That's racist

OLED>CRT

just buy an IPS screen

>convex
>one pixel at a time
>warm analog

>mom says that if you leave the fancy brand new LED HDTV on for too long the images will burn into the screen and stay there
>try to tell her that you're thinking of CRTs, not HDTVs
>she doesn't fucking listen
This is why I hate old people.

You know what would be cool? A T-shirt that says "CRT LIBERATION ARMY" on it and you could go into Best Buy with it on. It would make a bold political statement.

They'd probably just think it was the name of a band.

I can't wait to downgrade from 3840x2160 to 720x576

Have you ever lugged a CRT TV up a flight of stairs? Protip: It really sucks

That's because you're a pissweak piece of shit.

>she doesn't fucking listen
Because she's right you retard, all new TVs can get burn-in.

CRTs were fucking grey by default; I wouldn't call that amazing black levels.
You only get good black levels in the dark with no other light sources.

*SIP *SIP *SIP *SIP *SIP *SIP *SIP *SIP

Attached: IMG_20180815_204350.jpg (4160x3120, 2.13M)

The damn thing smells like deep frying when warmed up

Attached: IMG_20180815_212417.jpg (4160x3120, 1.7M)

Gr8 b8 m8

Does anyone actually know where to find an

HP A7217A or Sony FW900

or would they all be dead by now because they use that aperture grille shit?

I have 3 shadow-mask CRTs, Philips 109B6 (2005 model), that will last me until using a CRT is no longer possible, 2035 or something. But I wouldn't mind quietly looking for some obscure better CRT in the next two decades.

I completely agree that CRT were great, and that LED and OLED suck disgusting nigger balls with the shitty viewing angles and color distortions.

But the next technology that will be just like CRT is already here. It's called MicroLED. It will be like OLED only without burn-in and without the green hue when viewed at an angle; it will be like a perfect IPS.

Apple has bought a microLED company and rumor has it they will appear in Apple watches first

Attached: E4296527-E5D5-4F3D-9113-FFFED9225896.png (600x515, 472K)

It will still have 10-30ms of lag though. People think LCD has improved because there are 1ms monitors on the market now, but the sad truth is that 1ms LCDs existed in the mid 2000s; they just weren't branded that way, because they used the proper measurement of changing the image, not just turning the image off. "1ms" monitors are all really 9-15ms. LCD hasn't "improved"; that's a myth.

That looks worse than LCD black and it's not even turned on.

Attached: CRT.jpg (250x250, 7K)

Actually, later tubes that were more expensive were very dark grey and some were almost black; that's what happened in the 2000s. But all the 90s tubes were grey, yes.

So do you have data to back that up? Those screens haven't even properly come out yet, so you can't assume that; plus it will depend on processing, it always depends on processing.

Apple has never delivered garbage; we'll see.

And if you don't think 10-30ms of lag matters, consider this.

Motherboard/OS lag is about 3ms. Keyboards are as fast as 0.2ms now (not your shitty Cherrys or DIY keyboards, those are all 30+). Mice are still kinda laggy, and it's hard to work out how laggy they actually are, but the best ones are probably 5ms or something.

So with that considered, you can get your keyboard inputs into the system as fast as 3.2ms. Why would you want to make your system lag 4-8x as much with an LCD when you could get a CRT for $20?

Sure, you might say that with a default system/input-device lag of 5-10ms, adding another 10-30ms on top doesn't matter, but it totally does. Just play any FPS game: people with pings of 3 destroy people with pings of 50+. That's why CRT will always be best. In the recent Fortnite online tournament that was broadcast on Twitch it was super obvious the top players were using CRTs, because their aspect ratios were all fucked up and the noobs had no idea why and thought they were playing on phones.

Attached: EC48EE2D-E2B6-4C06-AF3E-C17CFABBCBA8.png (600x729, 730K)

Also, LCD lag isn't consistent across the screen. The top of the screen on "1ms" monitors lags 10-12ms, the middle lags 15ms, and the bottom lags 20ms.

And that's on the "best you can get", and they literally can't improve the technology any more, because monitors with these exact same speeds and TN/IPS panel types came out 15 years ago.

That's the response time of turning the image off, not changing the image to something else. They are literally bragging that you can turn the screen off quickly when you're done using it, not that it can update an image that fast.

best "1ms" lcd is 9-15ms (240hz some times 9ms)
best oled is even slower than that.

Any LCD TV demolishes your shitbox TV you hipster faggot
top of the line LCD TVs like the upcoming Z9F are absolutely insane with zero viewing angle issues and deep colors. Kys filth

LCDs don't get along well with industrial or extreme operating environments. I've worked in heavy industry for 15 years and I've never seen a flat panel used here. They can't handle the heat and dust.

I play at 3.2ms lag thru my keyboard including OS/motherboard lag
I play at 15ms lag thru my mouse
I play at 3ms lag thru my CRT

you play at 45ms lag thru your keyboard on a "1ms LCD"
you play at 45ms lag thru your mouse on a "1ms LCD"
you play at 15-20ms lag thru your eyes.

which would you rather use.
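
If you want to sanity-check those numbers, here's a throwaway Python sum. Fair warning: these are the ballpark figures from this thread, not measurements, and I'm reading the 45ms lines as input-device lag plus panel lag.

    # Quick back-of-the-envelope totals for the two chains above.
    # All figures are the rough claims from this thread, not measurements.
    def press_to_photon(input_ms, display_ms):
        # time from pressing/moving something to seeing the result on screen
        return input_ms + display_ms

    # CRT setup: 3.2ms keyboard (incl. OS/mobo), 15ms mouse, ~3ms CRT
    print(press_to_photon(3.2, 3))   # keyboard -> CRT  ~6.2 ms
    print(press_to_photon(15, 3))    # mouse    -> CRT  ~18 ms

    # "1ms" LCD setup: ~30ms keyboard, ~30ms mouse, ~15ms panel mid-screen
    print(press_to_photon(30, 15))   # keyboard -> LCD  ~45 ms
    print(press_to_photon(30, 15))   # mouse    -> LCD  ~45 ms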

They weren't on CRTs; they stretch their res in Nvidia to get a higher vFOV.

I know OLED sucks fucking ass; you can see the rubber-banding lag on every Android phone, part of why they feel so damn slow.

I don't think we should jump to conclusions; the tech seems super simple, the only problem is welding everything together.

Apple is supposed to release an Apple Watch with microLED later this year; I don't suppose it would be too hard to go to an Apple Store and record the response time with a slow-motion camera.

That's like a 1/20th of a second disadvantage. That might not sound like a lot, but look at a clock and see how long a second actually is. In an FPS game that will make you lose like 90% of the time, regardless of skill.

SED/FED NOW

SED/FED NOW

SED/FED NOW

SED/FED NOW

The rubber-band lag is just Android being slow to respond to touch.
Although on AMOLED you can see this strange bluish fringe when you scroll with dark screen elements.

4:3 gives worse FOV, you dumb cunt. They were on CRTs.

So it's like how USB falls short for industrial equipment and RS-232 is preferred instead.

LCDs are excellent for browsing the web or using MS Office. When it comes to a lot of tasks, they fall short, including:

>watching sports
>gaming
>handling harsh operating environments
>being able to produce colors accurately

They are SOL then. For normalfags like myself it's a real good thing.

Agree. 4:3 is the best, and I'll not use crap flatties until they have as little input lag as CRTs

>They are SOL then. For normalfags like myself it's a real good thing.

Attached: 70.jpg (480x359, 13K)

This is a good thing. Shaniqua can't run away with my 30-ton glass and lead brick.

Upcoming flagships solve all those problems except the work-environment one. Even the Z9D from 2016 could hold its own against OLED, which sucks anyway because of burn-in and bad HDR performance in games. The brightness sucks too; mLED is the future.

Yeah, but the point is: if in 20 years of LCD development they haven't made them faster than 10ms at the top, 13ms in the middle, and 15ms at the bottom of the screen, why would they suddenly make "true 1ms" LCDs now? It would be nice, I admit, I would like to give up on CRT, but I would be surprised. For the average user 10-30ms of lag is fine, so I just can't see it being a priority, and I expect it's actually impossible. The reason CRT is so fast is that it's analogue, and analogue shit is literally 0ms. The fastest "digital" thing I know of is my keyboard, and that's 0.2ms, but pressing 1 of 70 buttons is far less complicated than displaying 2 million+ pixels; digital shit has its limits.

I'm suspecting shitty touch hardware too, since they couldn't rip it off from Apple; their calibration for where you "mean to tap" is garbage too.

Anyway, if OLED has a good response time, I just don't see how microLED with so few components would be worse.

>watching sports
>gaming

Irrelevant because only manchildren do these two things.

Basically, mostly things involving lots of movement. It's really visible on very early LCDs, where you could see a fuckton of smearing. Okay for spreadsheets but terrible for Doom. Nowadays, some LCD screens get around this either by having a good g2g response time or by a blur-reduction mode that just strobes the backlight.

You will be forced to switch to a shitty LCD in 20 years' time, because no supported OS will have driver support for any devices that can do VGA output. You might be able to last a bit longer if you use onboard graphics from the 2020 era, but I expect that will be unusable and unsupported on a 2040-era OS.

They do but in hardened enclosures

CRTs are so bad that you have to try, poorly, to emulate their smoothing with ClearType.

4:3 is the aspect ratio of human FOV

We’ll all be dead by then who cares

There was a short-lived technology based on CRTs called SED. That might be more interesting; it's essentially CRT tech in a flatter space.

Look up how FOV works on UE4, you stupid nigger monkey.

Attached: 4L_iwG3fyg9.jpg (640x429, 39K)

I DEMAND SED/FED MONITORS NOW

Gayming monitors like the PG279Q nuke that issue. Doom runs like a dream.

Twitch gaming on an LCD is almost impossible.

Saw a Sony Trinitron HD CRT smashed on the side of the road the other day. It had a DVI and CRT port, so it was probably 720i? Threw a big-ass rock at it.

I know, I know. This didn't actually happen and you're hoping to trigger some CRT autists with this low effort b8. I've seen it all before, kiddo.

I actually did. Kinda sad imo, not really into CRTs but I would have picked it up if it wasn't smashed. Kinda rare actually, I'm from a 3rd world country and those things were not sold in great numbers, back in the day it must have been the same price as a small car.

The radiation distortion shockwave from breaking the glass should have vaporized you. How are you still alive

Yes. OLED has perfect blacks and a response time measured in hundredths of a millisecond rather than in milliseconds like LCDs. The technology is taking its time getting off the ground, but it's the best chance we have to finally escape the LCD scourge.

is the response time really that good? what are the refresh rates like?

I don't think it'll ever take off though; it's been nearly twenty years since it started appearing on small devices like mp3 players and multimeters, and that's it.

The response time measurement is faked; it's just the speed it can turn off, not change the image. OLED lags worse than LCD, and LCD hasn't improved its lag in 15 years. There were "1ms" LCDs in the early 2000s; the only reason they sell them now is that they use the fake measurement of turning the image off, not changing the image.

LCD has had 3 kinds since it was invented: ones that run at 10ms and ones that run at 15/20/30ms; the slower ones have nicer color.

OLED is like 20-30ms; it's not faster than LCD. Actually use one and you will realize that in a second; my Vita looks like it has motion blur turned on compared to an LCD.

MicroLED, like someone mentioned, might be faster, but I doubt it's a priority; I wouldn't be surprised if they are like 15-20ms.

Digital shit can't get close to 0ms; it can't even get under 1/100th of a second. It never will, so stop lying to yourself and thinking LCD has improved from 30ms at launch in the early 2000s to 1ms now. It simply has not; they just found a fake way of measuring it to make it seem like they improved it.

Asus and LG made "1ms" monitors in 2007; they just branded them as 10-15ms because they used the proper test.

The best LCD lags 10ms at the top of the screen, 13ms in the middle, and 15ms at the bottom.

For aiming, that means like 0ms vs 13ms.

LCD will never improve; those are the limits of digital. Scientists still use analogue equipment for tests because they can never get digital stuff under 8-9ms.

So then why did the industry settle on such crap technology?

In my onion, 240p looks better on slot mask tubes.

Y'all heard about RetroArch's nifty run-ahead feature? It makes it possible to play many games on a fast gamer LCD with less lag than the actual console on a CRT. It's nice not having that CRT eyestrain, whine, and geometry issues.
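
For anyone curious how run-ahead pulls that off: as far as I know it's a savestate-rollback trick, roughly like the Python sketch below. The DummyCore and its method names are made-up placeholders for illustration, not RetroArch's actual API.

    # Rough idea: emulate N frames into the future with the current pad input,
    # show only the newest frame, then rewind so the real game state doesn't
    # skip ahead. That hides N frames of the game's internal input lag.
    import copy

    class DummyCore:
        """Stand-in for an emulator core; its whole state is a frame counter."""
        def __init__(self):
            self.frame = 0
        def run_frame(self, pad_input):
            self.frame += 1
            return f"frame {self.frame} rendered with input {pad_input!r}"
        def save_state(self):
            return copy.deepcopy(self.__dict__)
        def load_state(self, state):
            self.__dict__ = copy.deepcopy(state)

    def run_with_runahead(core, pad_input, runahead_frames=1):
        core.run_frame(pad_input)               # advance the real timeline (not shown)
        state = core.save_state()               # remember where "now" is
        for _ in range(runahead_frames):
            shown = core.run_frame(pad_input)   # emulate into the future
        core.load_state(state)                  # rewind to the real timeline
        return shown                            # only the future frame gets displayed

    core = DummyCore()
    for _ in range(3):
        print(run_with_runahead(core, "A"))     # shows frames 2, 3, 4: one ahead of the real state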

put it in the bag

Attached: 1512725741986.png (344x273, 103K)

Shit proliferates.

Muh slim and light.

In HDR, sure, but not in fluidity and response time.
I just repaired this Trinitron g520 a day ago.

LCDs have their uses, but there are a lot of things they're also lousy at.

Because the display industry shifted from Japan to countries that don't innovate. Look at all the amazing products that were shown at CEATEC 2007/2008 to see what could have been:

Super Top Emission RGB OLEDs, active-3D 4K local-dimming LEDs, the final Trinitron series, 100-inch plasmas, Mitsubishi laser-based DLP projection, the Pioneer KURO series, Field Emission monitors from Canon, etc. etc. There was a proliferation of different technologies, but the entire industry was scaled back or dead by 2010 with the onslaught of 600:1 low-quality garbage panels from AUO/LG/Chimei.

youtube.com/watch?v=m2sTMLQ513M

Obviously because LCD is cheaper to make and ship around the world. 10-30ms isn't "shit technology"; LCD is still fine for many, many things, just not for games you want to win at.

>LCD is still fine for many, many things, just not for games you want to win at
And watching sports, and having decent blacks and non-washed-out colors.

>10-30ms isn't "shit technology"
10-30ms of mouse-to-eyeball lag being contributed by a single component is shit

All Cherry-based keyboards lag 30ms and no one complains; people think they are instant. You would be surprised what people get used to. Mouse movements lag less, I think, but mouse clicks lag about 20-30ms as well.

So most people are playing with like 50ms of lag at the moment, even with a "1ms" monitor, because the monitor is like 13ms in the center of the screen, the motherboard/OS is like 3-6ms, the keyboard is 30ms, and the mouse is 20-30ms. And that's with what most people consider an "optimal" gamer setup these days, from watching esports and thinking Zowie monitors are "best".

It's not just switching to a CRT that will help you: getting a faster mouse can give you a 25ms improvement, and getting a Romer-G or Bloody LK keyboard can give you a 30ms improvement, which is way better than what the CRT gives you.

If you're using a "1ms" LCD, at least get a better mouse and keyboard to make up for the choice.

If you really like high res, being 13ms behind optimal isn't that bad. 99% of popular "gamer" mice and keyboards have 3x more lag than "gamer" LCDs do.
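
Rough numbers for that argument, same caveat as before: these are the thread's ballpark claims (keyboard ~30ms, mouse ~25ms, OS ~5ms, "1ms" LCD ~13ms mid-screen, CRT treated as roughly 0ms), not test data.

    # What each swap buys you, using the rough figures above.
    chain = {"keyboard": 30, "mouse": 25, "os": 5, "display": 13}
    print(sum(chain.values()))    # ~73 ms for a typical "gamer" setup

    chain["keyboard"] = 0.2       # fast keyboard: saves ~30 ms
    chain["mouse"] = 5            # fast mouse:    saves ~20 ms
    print(sum(chain.values()))    # ~23 ms, still on the same LCD

    chain["display"] = 0          # CRT instead of the LCD: saves another ~13 ms
    print(sum(chain.values()))    # ~10 ms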

That's 16:10.

This is why CRT haters are weak asocial faggots: they don't even have a single friend or acquaintance to help move a TV, or some uncle willing to show his boomer superpowers in helping taking it to his truck.

190/135 is a lot closer to 4/3 than to 16/10

John Carmack says the bandwidth of the human eye's fovea or whatever it's called is lower than 4K or even 1080p. We can focus really well on a small point, but we can't take in that much data at once, so in the future, with eye tracking, games will just have to be like 1080p in a 6x6cm space and the rest can be downscaled.

Your eye literally can't absorb all of 4K or even 1080p at once.

Pictures of human FOV I've seen in medical books look weird, basically like super-wides, but yeah, your focal point is not "widescreen", so 4:3/1:1 makes more sense if you don't care about peripheral vision.

The ideal monitor setup is probably a 4:3 CRT with a DLP projector behind you, projecting everything else around the room, with your shadow keeping your CRT free of reflections from the projector, like that dumb Microsoft thing they tried to push. To make it work though, you would have to render the game twice: like 1024x768 at 80 FOV on your CRT, and then 720p or 1080p on the projector at a massively drawn-out FOV.

>muh blacks
get a better tv

I've literally never heard anyone say 4:3 is close to the FOV of the human eye. 16:9 was invented for the MUSE HD system to better approximate the way eyes see. I prefer 16:10 monitors, but I'd still choose 16:9 over 4:3; 4:3 is almost unusable once you get used to widescreen. 1:1 is completely useless outside of very specific tasks.

This isn't 4:3, and regardless it's pointless; you don't look at monitors with your periphery, and your focal point is 1:1.

16:9 and 16:10 just got popular because idiots wanted a TV that looked like a movie theater; it's a retarded format.

Like, people should have gone 2:1 for TVs and kept computers at 4:3 or even 1:1.

Attached: ttg20130407011.gif (1000x905, 220K)

Play a slightly older FPS designed with 4:3 resolutions in mind: it plays fine. After you get used to it, start playing in 16:9: you feel overwhelmed, the screen feels large and confusing.

Now do it the other way around: play in 16:9 first, get used to it, then switch to 4:3: you feel "boxed in", but it's perfectly fine, you don't feel overwhelmed with "too much visual information".

I know because just months ago I was playing at 1280x960 or 1280x1024 on a Full HD screen thanks to shitty Intel HD Graphics, but I got a GPU now and needed an hour or two to re-adapt to 16:9.

4:3 just feels comfy, like you're not missing anything on the screen, which is great for some uses. I've never lost the cursor at a 4:3 resolution; on widescreen, though, I have.

No, 16:9 was made because people were trying to work out DVD standards and the idiot TV makers didn't know how to make 2:1 CRT tubes, but they could make 16:9 (I had a 16:9 1080i TV).

It's a retarded aspect ratio dictated by the limitations of CRT, not by actual science; movie theaters are 2:1 because it's dramatic and landscape, etc. If LCD had become mainstream faster and people hadn't used CRT TVs in the 90s to watch DVDs, 16:9 wouldn't exist.

It's shit. If I were buying a projector I would get 2x 720p, and if I were buying an LCD I would get an ultrawide or one with a cinema aspect ratio. But I would never get an LCD to watch movies; that's retarded.

It's your periphery, not your focus though. The actual area you're looking at when you're gaming is only about 50x50 pixels on a 1024x768 monitor; the rest is periphery.

On 1080p or 4K you still only focus on 50x50 pixels even though the pixel density is much higher. That's why pro players refuse to play at high res. It doesn't help, because your brain can't comprehend that much data; you just end up looking at a smaller area in finer detail, not actually absorbing more pixels of information, and all the stuff around the sides is pointless and just a blur in your brain.

>I've literally never heard anyone say 4:3 is close to the FOV of the human eye.
Numbers don't lie. In fact, if you only care about binocular vision rather than merged binocular and monocular vision, then the vertical FOV even exceeds the horizontal by 15 degrees.

>16:9 was invented for the MUSE HD system to better approximate the way eyes see.
lolollol

>4:3 is almost unusable once you get used to widescreen
i prefer working at 4:3

Yeah, it's not 1.333; it's 1.417 if you take the maximum horizontal arc. If you average out the eccentric curvature of the left and the right, it's even closer to 1.333.
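
For whoever wants to check that arithmetic, here's the two-line version. The 190°x135° field-of-view figures are just the approximate numbers quoted in this thread, not a medical reference.

    # Distance of the ~190x135 degree visual field ratio from common aspect ratios.
    fov = 190 / 135              # ~1.407
    print(abs(fov - 4 / 3))      # ~0.074 -> 4:3 is the closest of the three
    print(abs(fov - 16 / 10))    # ~0.193
    print(abs(fov - 16 / 9))     # ~0.370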

Honestly I do miss those old times

Like on 1080p or 4K you still just focus on a 50x50 pixel area, but it's 1/4 of an inch big instead of 1/2 an inch.

Resolution is keno though; in some games it does sorta help, like things with super long view distances, player draw distances, and high geometry, like DayZ or WarZ etc., but not many games like that exist.

1600x1200 is the best res

Also, for BF1 sniping or something, high res probably does help.

Not really, because no CRT can run that at a high enough Hz. Probably like 75Hz max even on the best, unless you know something I don't.

I'm interested in how far the best CRTs can actually be overclocked past their rated specs. Mine says it can only do 75 or 85 at 1024x768, but it can actually do 120-130, I think.

I wonder if the HP A7217A CRT (a re-badged Sony FW900) or the FW900 can actually do 240Hz... apparently you can work it out by looking at:

Horizontal Scanning Frequency: 30-97 kHz
Vertical Scanning Frequency: 50-160 Hz

That's from my Philips 109B6; apparently you can work out from those numbers that 1024x768 can only do 120-123Hz or something, but I don't know the math.
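
The math is roughly this: the max refresh at a given resolution is the max horizontal scan rate divided by the total scanlines per frame (visible lines plus vertical blanking). The ~5% blanking overhead below is an assumption; real modelines vary, so treat it as a ballpark.

    # Ballpark max refresh from the scan-rate spec quoted above (Philips 109B6).
    H_SCAN_MAX_KHZ = 97          # horizontal scanning frequency ceiling
    V_SCAN_MAX_HZ = 160          # vertical scanning frequency ceiling
    BLANKING = 1.05              # assumed ~5% extra scanlines for vertical blanking

    def max_refresh(visible_lines):
        total_lines = visible_lines * BLANKING
        return min(H_SCAN_MAX_KHZ * 1000 / total_lines, V_SCAN_MAX_HZ)

    print(max_refresh(768))      # ~120 Hz at 1024x768, matching the 120-123 figure
    print(max_refresh(1200))     # ~77 Hz at 1600x1200
    print(max_refresh(480))      # capped at 160 Hz by the vertical scan limit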

>nostalgia for fucking 200lb 34" TVs
hell no

truly LCDs are perfect for the transient goy on the go!

Attached: l32757nopeevenidontwantthisone.jpg (700x606, 30K)

@67182329
fuck off newfag

Really obscure question, hope someone knows.

So apparently on some motherboards with Intel integrated graphics you can run the integrated gfx as well as the GPU card at the same time and have like 3-4 monitors that way: 1 on the integrated and 3 on the GPU. What I'm wondering though is, when you set the game to run, can you make it run from the primary GPU in windowed mode and drag the window to the integrated gfx screen?

The reason I'm asking is that a lot of integrated gfx still has VGA output, and I'm thinking of a creative way to get 0-lag VGA output on cards that don't support it.

I'm imagining this wouldn't work, but then I don't see how Windows would stop it working, other than the image just going black on the integrated gfx screen.

I did this, and no you can't; if the game is on the integrated gfx screen it will run like shit, and it will also slow down the rest of your PC as well.
Just buy two graphics cards; get a cheap one for your auxiliary monitors.

No wait, you're not answering my question. You can make the game run in a window and, when you launch the game, set it to run on the primary GPU. If you drag half of the window over to the integrated graphics screen and keep half on the monitor from the main GPU, what happens? Does the part on the integrated gfx screen just not show, or what? It doesn't sound like you did what I asked.
no wait you not answering my question. you can make game run in window and when launch game set it to run on primary GPU if you drag half of window over to intergrated graphics and half on monitor from main GPU what happens. does the part on the intergrated gfx screen just not show? or what. doesn't sound like you did what I asked.