Why does no one talk about microLED?

Because it isn't commercially viable yet, and the real developments on it are under wraps inside corporate R&D departments.

Because it's vaporware.

MicroLED news:
Samsung presented a MicroLED TV at CES 2018 ...
Maybe new microLED for AR glasses or VR visors from 2020 on

if they don't at least make µLED flicker each frame then it will be no better than LCD

So, it never gets released?

Is it really smaller than a sand
Like just a single sand?

It'll probably show up again next week at CES.

Think I heard Samsung might have some early consumer models.

Dude, if it works like normal LEDs you can do that all you want.
That's just a matter of the firmware driving the display.

LEDs have no burn-in.
This could be EPIC.

e-ink is the future of displays

What if they paint the microLEDs onto the inside of a glass screen and then use an electron gun to scan across the interior surface of the screen and light up the LEDs?

no because retards want lcd tier refresh rates

i had this idea too actually. just having a layer of µLEDs on the end of a glass tube, all connected to a common ground, and having an electron gun shoot electrons at the individual µLEDs. it would be even better than µLEDs in a matrix because it wouldn't have input lag.

then do you think someone could use a modified firmware on an LCD to make the backlight flicker once after all the pixels changed to the right colors?

>electron guns

Stop watching Star Trek.

>LEDs have no burn-in

They do, but it's nothing like OLED.

W-would this basically be like a CRT but better?

Why?

>LEDs have an expected lifespan of 100,000,000 hours so technically if you left a static image displayed on there for 11,000 years you could get burn in
>microLEDfags BTFO

it would be a crt

it would be better, since phosphors have burn in

Do you know how small electrons are relative to micro LEDs? That'd be like firing randomly into a crowd and expecting to hit someone.

>smaller than a sand

en.wikipedia.org/wiki/Cathode-ray_tube
>The cathode-ray tube (CRT) is a vacuum tube that contains one or more electron guns

This kills the pixels.

A valid point, but the more pressing concern is how to prevent the electrons from hitting all the air atoms between the gun and the target.

Would it be cheaper to just use normal sized LEDs but view them from further away?

>would it be cheaper to just buy a bigger house so I can have my desk 10 feet away from my monitor

What's the difference between this and the regular shitty screens we already have other than it's smaller? What advantages does being smaller have other than greater PPI?

I wish I could tell you guys about the leds I’ve seen behind closed doors, 5% the size of this shit and can be powered over the air, seen it in multiple companies dev labs
The future is gonna be an amazing place once it leaves R&D my dudes

Only 5% the size of microLED and powered over the air? I wish I could tell you guys about the leds I've seen behind even more secure closed doors, 1/6000000th the size of that shit and can be powered over the subconscious, seen it in multiple companies' dev labs

Extremely low power usage, like, approaching the hypothetical minimum. Dirt cheap material costs meaning low cost displays once the manufacturing process is refined. True blacks, awesome contrast and colors, etc... LEDs can flip between on and off in nanoseconds unlike OLEDs, so you can play your DOOM or whatever kids play these days at 16,000 FPS.

are you retarded? this is new technology. right now we use either LCD or OLED, and both of them are flawed: LCD because you need a backlight, so you don't get true blacks and you get light bleed, and OLED because of burn-in and lack of brightness.
mLED is like OLED, except instead of organic material they use inorganic LEDs, which have ~50000x the lifetime

no, at large sizes the electrical interference makes it impractical

ideally you'd want to make nano LEDs and view them through a magnifying lens

On my night walks, I like to visit several R&D labs to see what's up. You wouldn't believe the things I've seen.

been there and done that
let me tell you a little truth:
>antigravity engines are real

Well, you can do black frame insertion, yeah. And doing it with the backlight after every frame would be better than with the crystals.
Blur Busters back in the day modified some 120Hz 3D monitors to do this with software called "LightBoost"
blurbusters.com/zero-motion-blur/lightboost/
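
For the curious, strobing is conceptually just this timing loop. A minimal C sketch, where backlight_set() and wait_ms() are made-up stubs (real hardware would hit a PWM/GPIO pin), so read it as an illustration rather than a driver:

#include <stdio.h>
#include <stdbool.h>
#include <time.h>

/* made-up stubs standing in for real backlight control and timing */
static void backlight_set(bool on) { printf("backlight %s\n", on ? "ON" : "off"); }
static void wait_ms(long ms) {
    struct timespec ts = { ms / 1000, (ms % 1000) * 1000000L };
    nanosleep(&ts, NULL);
}

int main(void) {
    /* 120Hz panel: ~8.3ms per frame. Keep the backlight dark while the
       crystals settle, then flash briefly once the frame is fully drawn.
       Short flash = short persistence = less smear on your retina. */
    for (int frame = 0; frame < 10; frame++) {
        wait_ms(7);            /* panel finishes updating in the dark */
        backlight_set(true);
        wait_ms(1);            /* ~1ms strobe; brighter LEDs compensate */
        backlight_set(false);
    }
    return 0;
}

The whole trick is the duty cycle: the image is visible ~1ms out of every ~8ms, so the eye gets a crisp snapshot instead of a smear, at the cost of brightness and of visible flicker at low refresh rates.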

THIS

>not a houselet

>Blur Busters back in the day modified some 120Hz 3D monitors to do this with software called "LightBoost"
could you do this to any LCD monitor using that software?

>actually wanting your monitor to flicker like the old CRTs

Enjoy your headache and worsening eyesight.

>Enjoy your headache and worsening eyesight.
i use a CRT as my main monitor and i don't notice either of these

Why were these killed? Were they just too good?

en.wikipedia.org/wiki/Surface-conduction_electron-emitter_display

Depends on the frequency. 85 Hz or better was acceptable. But anything below that (75 Hz, God forbid 60 Hz) always made my eyes hurt after a time.

The one advantage LCDs have over CRTs (besides form factor) is a non-flickering image.

for me 80Hz is high enough to not notice flickering
though LCDs do also have the advantage of sharper text.

no they are shit, they have the negatives of both LCDs and CRTs.

>The one advantage LCDs have over CRTs (besides form factor) is a non-flickering image
yes, but the way it achieves this is through sample-and-hold, which, while flicker-free, utterly ruins motion quality, making basically anything smear across the screen

>no they are shit, they have the negatives of both LCDs and CRTs.

You mean positives.

>SEDs combine the advantages of CRTs, namely their high contrast ratios, wide viewing angles and very fast response times, with the packaging advantages of LCD and other flat panel displays. They also use much less power than an LCD television of the same size.

they also have burn in like a CRT and input lag like an LCD

>burn in like a CRT

It's fucking nothing.

>input lag like an LCD

Well, perhaps the top 1% of competitive twitch-shooter players will stay on CRTs... other people don't care.

I don't see why not; the monitor just displays whatever it's provided. If it's provided a full black image every other frame, it'll display it.

LEDs are low-voltage, retard.

It's physically impossible to provide enough energy to 9.5 million LEDs. You would have to break them up into controller chunks, and that would lead to even more absurd energy usage, not to mention timing issues.
I can confidently say µLED TVs will NEVER hit the consumer market. It's basically SED/FED 2.0.

>SEDs combine the advantages of CRTs, namely their high contrast ratios
CRTs had AWFUL contrast ratios in every real-world setting. The only time they had contrast you could call "good" was in a completely dark room, preferably with a felt-lined monitor hood.
CRT/SED phosphors reflect incoming light, which has the same effect as projecting in a lit room. The black level is only as dark as the ambient light in the room. CRTs are so dim that even overhead lighting can wash the display out. If the sun shines on a CRT it is impossible to use, because more light is reflected from the sun than is emitted by the phosphors.
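
The arithmetic behind that, with illustrative numbers (say 100 cd/m² peak white, 0.01 cd/m² native black, and 1 cd/m² of room light bouncing off the phosphors):

\[
\mathrm{CR}_{\mathrm{eff}} = \frac{L_{\mathrm{white}} + L_{\mathrm{refl}}}{L_{\mathrm{black}} + L_{\mathrm{refl}}}
= \frac{100 + 1}{0.01 + 1} \approx 100{:}1
\]

With the ambient term at zero the same tube is back to 10000:1, which is exactly the dark-room, felt-lined-hood case.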

NO.
Only specific monitors designed for 3D, because they had actual drivers to control the backlight and were designed to strobe it, every other frame for each eye.
they list supported monitors.

you don't know what you're talking about.
It's backlight control, not black frame insertion.

I knew what I was talking about, it just wasn't what he was talking about.

It'll probably be the 8K TV they've been talking about

BTFO

1. You don't power 9.5 million LEDs at the same time. You can power as few as one LED at a time.
2. Even if you did power many simultaneously, you're on mains power, and the light output (which is the main power consumer) is equivalent to existing LED backlights.

And no, controlling chunks leads to less energy usage, and there are no timing issues. The timing is a function of scanning the matrix, which is already how existing LCDs operate. They have a horizontal refresh rate, and that is essentially the speed of the scanline through the matrix.

So basically you have no idea what you are talking about.

you can run them all at once, what retarded reasoning is that? You won't blast them with 20mA or whatever it takes for normal 3-5mm LEDs, that's the fucking point. You supply a few µA per LED, enough to make it light up without burning your eyes out.
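
Back-of-envelope on the µA claim (illustrative figures, not anyone's spec sheet): at an average of 1µA per LED and a ~3V forward drop,

\[
P \approx N \cdot \bar{I} \cdot V_f = 9.5\times10^{6} \times 1\,\mu\mathrm{A} \times 3\,\mathrm{V} \approx 28.5\,\mathrm{W}
\]

which is the same ballpark as a conventional LED backlight. Multiplexing changes the peak per-LED current, not this average.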

youtube.com/watch?v=3BJU2drrtCM

get rekt

You actually can't. You could power them all with a static image, but that isn't a display.
The controllers don't have enough pins to individually address each input or output.

So the sanest way to address tons of inputs/outputs is to do one at a time and go through them all very quickly.

Ok, and do you want to elaborate on how i "got rekt"?

An LCD with a backlight also only updates one pixel at a time, but it holds the old image. An LED cannot hold an old image, because the LED is both the light source and the image/color data.
A microLED TV would be similar to a CRT: again, only one is being powered at a time, just very fast.
So no, I'm still right.

Then explain this

flatpanelshd.com/news.php?subaction=showfull&id=1523611680

how would an LED not hold an image? it would stay the same color and brightness until it was told to switch
it would function exactly the same way
the only real difficulties with MicroLED are the cost of manufacture, especially regarding yields, which doesn't seem very attractive given how OLED will only bring you more money through planned obsolescence

Because the LED is only going to put out light while being powered.
And you are going to be multiplexing through an array, because you don't have 9 million outputs on a processor/driver chip.

>Here is your schooling on the basics of multiplexing
Say you have 6 pins to send power or ground to.
You could control 6 LEDs simultaneously.
Since an RGB LED is actually just 3 separate LEDs combined, you could control 2 pixels (2*3 = 6).
Not very good.

You can take the same 6 pins and instead place your LEDs into a grid:

    1   2   3
    |   |   |
4---x---x---x
    |   |   |
5---x---x---x
    |   |   |
6---x---x---x

You can control 9 LEDs with 6 pins, but only one at a time, by selecting a row and a column.
You can't hold the image. If you try to power everything at once it's just going to turn them all on and be white.
This scales up very well: all you have to do is go faster, so that every LED gets addressed within the same time slice.

Ok you have been schooled now.
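
Here is that 3x3 scan as a minimal C sketch. pin_write() is a made-up stub (a real driver would write GPIO registers), so treat it as an illustration of the row/column trick, not firmware:

#include <stdio.h>
#include <stdbool.h>

/* made-up stub; a real driver would write a GPIO register here */
static void pin_write(int pin, bool high) {
    printf("pin %d -> %d\n", pin, high);
}

/* pins 1-3 source the columns, pins 4-6 sink the rows, as in the grid */
static const int col_pins[3] = {1, 2, 3};
static const int row_pins[3] = {4, 5, 6};

/* desired image: 1 = LED on */
static const bool frame[3][3] = {
    {1, 0, 1},
    {0, 1, 0},
    {1, 0, 1},
};

/* one full scan: set up a row's columns, enable the row, move on */
static void scan_once(void) {
    for (int r = 0; r < 3; r++) {
        for (int c = 0; c < 3; c++)
            pin_write(col_pins[c], frame[r][c]); /* drive columns high */
        pin_write(row_pins[r], false);  /* pull row low: its LEDs light */
        /* ...hold for this row's time slice... */
        pin_write(row_pins[r], true);   /* row off before the next one */
    }
}

int main(void) {
    scan_once(); /* real firmware loops this forever, hundreds of Hz */
    return 0;
}

At most one row is ever lit; sweep the rows fast enough (hundreds of full scans per second) and persistence of vision fills in a steady image.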

Vacuum.

of course you can, there's daisy chaining of shift registers and all kinds of possible schemes. Multiplexing is just done because it's cheap and if done fast enough there's no flicker anyway.
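
And the counterpoint in the same style: with daisy-chained latching shift registers (74HC595-style) the outputs do hold between updates, so nothing has to be rescanned. pin_write() is again a stub and the pin wiring is assumed:

#include <stdio.h>
#include <stdbool.h>

/* made-up stub again; real code would toggle GPIO pins */
static void pin_write(int pin, bool high) {
    printf("pin %d -> %d\n", pin, high);
}

enum { DATA = 1, CLOCK = 2, LATCH = 3 }; /* assumed wiring */

/* clock 9 bits through a chain of 74HC595-style registers, then latch.
   after the latch pulse the outputs HOLD the image with no rescanning:
   the registers store the state, not the LEDs */
static void write_frame(const bool bits[9]) {
    for (int i = 8; i >= 0; i--) {  /* last LED's bit goes in first */
        pin_write(DATA, bits[i]);
        pin_write(CLOCK, true);     /* rising edge shifts one bit in */
        pin_write(CLOCK, false);
    }
    pin_write(LATCH, true);         /* copy shift stage to the outputs */
    pin_write(LATCH, false);
}

int main(void) {
    const bool image[9] = {1, 0, 1, 0, 1, 0, 1, 0, 1};
    write_frame(image);  /* outputs now hold until the next write */
    return 0;
}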

hello mount stupid

ur mom gay

ever heard of active-matrix LCDs?

no. whoa thats cool.

oh look, apple is trying to push yet another retarded name for tech that already exists, again

remember "retina" display. fuck. that. shit.

butthurt fanboy is butthurt

you're the kind of retard apple retards laugh at

i bet you think your new "4K LED TV" actually uses LEDs for pixels, it doesn't. what people know as an "LED TV" is actually just an LCD TV with LEDs for backlighting, as opposed to a traditional CCFL

-- oh, and apple didn't invent or name microLED, of course.
that said, they probably will come up with a dumb marketing name for them if/when they start using them, but we're not there yet

Why the fuck do you WANT flicker, you fucking nword? Flicker causes massive headache for most people. Kys faggot.

it's not that people want noticeable flicker, it's just that flicker is a side-effect of something people want; good motion quality.
an electronic display image is a band-limited signal, that is, it can't have an infinite framerate, so it can only display instances in time
your eyes, on the other hand, can track movement smoothly; watching a 30fps image doesn't make your eyes 'jump' from one position to the next 30 times a second. rather, it's enough to trick your eyes into moving smoothly with the picture contents, anticipating movements. here lies the problem.
with a common LCD, the picture is drawn, then left on screen until it is overwritten, this causes old information to smear on your retina as your eye continues to follow the assumed motion of image contents
ideally, as the display is presenting data which is a set of instances in time, you want to display them for a very short period of time, with blank periods in between. this eliminates 'smearing' on your retina, providing much sharper and realistic motion, at the cost of flicker
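
to put rough numbers on that smearing (the standard eye-tracking approximation; figures are illustrative): the perceived blur width is about tracking speed times the time each frame stays lit,

\[
w \approx v \cdot t_{\mathrm{hold}}
\]

so an object tracked at 960px/s smears across roughly 960 x 0.0167 ≈ 16px on a 60Hz sample-and-hold panel, versus 960 x 0.001 ≈ 1px with a 1ms strobe. same framerate, completely different motion clarity.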

i've written this once before, but again;
- flicker-free
- low framerate
- good motion
pick two.

you want flicker to have good motion, but you also need a high framerate to prevent that flicker from being visible

I, for one, am glad that so many patents related to this technology have already been issued despite the complete lack of products on the market. Surely this will benefit consumers and hasten advancement of the technology.

You can just do black frame insertion via software

sometimes that is viable, if you have a monitor that supports suitably high refresh rates (there's not much use for 60Hz with black frame insertion, the flicker it produces is pretty intense)
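
what that looks like in practice, sketched in C with SDL2 and vsync. assumptions: a 120Hz monitor, so the content lands at 60fps with half-frame persistence; an illustration, not a production renderer:

/* build: cc bfi.c $(sdl2-config --cflags --libs) */
#include <SDL.h>
#include <stdbool.h>

int main(void) {
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("BFI sketch", SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED, 640, 480, 0);
    /* PRESENTVSYNC: each RenderPresent waits for one vblank */
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_PRESENTVSYNC);

    bool running = true;
    int x = 0;
    for (long vblank = 0; running; vblank++) {
        SDL_Event ev;
        while (SDL_PollEvent(&ev))
            if (ev.type == SDL_QUIT)
                running = false;

        SDL_SetRenderDrawColor(ren, 0, 0, 0, 255);
        SDL_RenderClear(ren);
        if (vblank % 2 == 0) {
            /* even vblank: real content (a moving square) */
            x = (x + 8) % 640;
            SDL_Rect r = { x, 220, 40, 40 };
            SDL_SetRenderDrawColor(ren, 255, 255, 255, 255);
            SDL_RenderFillRect(ren, &r);
        }
        /* odd vblank: nothing drawn = the inserted black frame */
        SDL_RenderPresent(ren);
    }
    SDL_Quit();
    return 0;
}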

>The display measures almost 10 meters in width and 5.5 meters in height, which translates to 440 inches diagonally (in 16:9 aspect ratio).
It probably takes a few kilowatts of power to operate.

Nope, 24fps film stock actually records with infinite temporal depth versus rendered images. Film records ALL the light hitting it in EVERY frame, for the length of the exposure, creating blur that the human eye naturally interprets as motion. It takes a friggin 8,000-core render farm for Pixar to simulate this decently.

For computer displays and rendered images and scenes and motion, higher framerates yield non-diminishing returns. Eventually kilohertz and gigahertz monitors will be a thing. You can see 244Hz vs 190Hz like black and white.
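
One nuance on the film side: the camera doesn't expose for the whole frame period; the shutter angle sets the window. With the typical 180° shutter at 24fps,

\[
t_{\mathrm{exp}} = \frac{\theta}{360^{\circ}} \cdot \frac{1}{\mathrm{fps}} = \frac{180^{\circ}}{360^{\circ}} \cdot \frac{1}{24} \approx 21\,\mathrm{ms}
\]

so each frame still integrates about 21ms of real motion, which is why 24fps film reads as smooth while 24fps sample-and-hold game footage reads as a slideshow.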

>Eventually kilohertz and gigahertz monitors will be a thing.
Seems a bit excessive and unnecessary when variable refresh exists.

That will probably be used to manage heat and power consumption

I'd choose flicker-free and low framerate any day over nauseating, eye-cancer-causing flickering garbage shit.

Look at him.
Look at him and laugh

It takes a friggin 8,000-core render farm simply because they're trying to simulate each ray out of probably billions.
If you have a quantum computer with enough qubits you could do it perfectly and almost instantly.
vimeo.com/180284417

How do you even manage to breathe while being so retarded?

Typing on a low refresh rate display with high input lag is nauseating.

Are you retarded? OLED TVs are already a thing.

Enjoy your space heater TV.

OLED is not MicroLED.

>OLED is not MicroLED.
They are still LEDs, and if we can power OLEDs just fine, how much harder is it going to be to power normal LEDs?

MicroLED TVs are already on sale. What are you talking about?

>MicroLED TVs are already on sale
[citation needed]

What is the difference between OLEDs and mLEDs that makes it impossible to build an mLED TV?

>Enjoy your space heater TV

2019 year of the xled?

I'm not here to educate you on the design of MicroLED backplanes. Unless you're working with a superconductor you're going to have resistance, and having millions of points of resistance will cause tons of heat. Higher heat causes higher resistance as well, so you end up drawing more energy.
OLEDs have far less resistance to deal with, and they get hot as fuck.