Wanna upscale some of my old movies to UHD. Is there any way I can add HDR color as well? Any software suggestions? I don't understand the process desu, but if anyone could help it'd be you guys.

no
also dynamic range and color are completely unrelated

yeah dude just increase the contrast in davinci resolve

Thanks user, you the man

Good luck trying to get the multi-billion-dollar companies that shit out monitors to stop misusing the term HDR.

Just look at how they downgraded 4K (DCI, 4096x2160) to UHD (3840x2160).

After reading several articles on HDR I still don't totally understand what HDR is. I don't even notice much of a difference on a display which is praised for its HDR capability (LG C8).

that being said, standards that incorporate HDR do often also enforce 10-bit media and processing, along with a wider gamut than the old-ass Rec. 709 "SDR" standard, so it somewhat affects the colors. but it doesn't dictate the palette at all, you can literally make a black and white HDR movie just fine. nowhere is it stated that you have to overuse ridiculously saturated rainbow color puke, but that's exactly what people expect they're getting with muh HDR

It's bad enough that they don't state the bit-rate on TVs that claim to use HDR.

Then it's the HDR10 and HDRover9000 and HDR1234567890

It's about as retarded of a "standard" as retina.

Some TVs are capable of adding an HDR effect to SDR content in real-time as you view it. Sony call it X-tended Dynamic Range for example. I know LG TVs can do it as well. You still need a display with a 10-bit panel and local dimming (or an OLED) to display HDR anything like it's meant to be seen though. You gain nothing from watching it on your shitty old 24" IPS monitor that you bought in 2009.

Bit depth, not bit rate. And I don't know why you're throwing HDR10 in as some sort of pseudo-standard. It's a very clearly defined standard, actually. As are HDR10+, Dolby Vision and HLG. Direct your anger at deceitful TV manufacturers looking to con normies who don't know any better by using words like "compatible" and "capable" to describe their products' HDR support. Something they can get away with by making the TV able to process an HDR signal, but omitting the fact that the TV lacks the ability (and indeed the basic hardware) to actually display that signal anything like how the creator intended.

There's nothing wrong with the standards themselves, beyond the usual anti-consumer cripple fight between two competing formats in HDR10+ and Dolby Vision. Though the Betamax-vs-VHS question between the two is very quickly being settled, no matter how much Samshit kicks and screams.

HDR is a form of active brightness control basically
An HDR TV is aware of what brightness a scene should be based on metadata and will map the brightness to what the TV can actually produce
SDR can only do 100nits of brightness and increasing brightness means also increasing the brightness of dark scenes
HDR, depending on the standard, can describe up to 10,000 nits of brightness while at the same time keeping dark scenes dark, and with 10-bit color you don't lose any detail when there are contrasting areas on the screen
I wanna say that is how it works, I still have a hard time wrapping my head around it
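
Something like that, yeah. In toy Python form it's roughly this; the curve, the knee and all the numbers here are made up just to show the "keep dark scenes dark, only compress the highlights the panel can't reach" idea, not any standard's actual algorithm:

def tone_map(scene_nits, mastering_peak=4000.0, display_peak=800.0, knee=0.75):
    # everything below the knee passes through untouched, so dark scenes stay dark
    knee_nits = knee * display_peak
    if scene_nits <= knee_nits:
        return scene_nits
    # compress the range above the knee so the mastering peak lands exactly
    # on the display's peak instead of hard-clipping
    t = (scene_nits - knee_nits) / (mastering_peak - knee_nits)
    return knee_nits + (display_peak - knee_nits) * 2.0 * t / (1.0 + t)

print(tone_map(50))      # 50.0  -> shadows untouched
print(tone_map(4000))    # 800.0 -> a 4000-nit mastered peak fits the 800-nit panel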

The higher light output of the display allows it to portray objects more naturally, closer to how brightly they were actually illuminated in the scene. Add to that how the digital values of light get mapped to the capabilities of the display, and you have HDR. This of course heavily depends on the maximum amount of light the display can emit, usually measured in nits. Typically HDR is targeted at 1000 nits and above, but some displays advertised as HDR may have only half of that and are simply closer to conventional displays. For reference, typical PC monitors max out at around 300 nits.
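
For reference, the mapping from code values to absolute nits in HDR10/Dolby Vision uses the PQ curve (SMPTE ST 2084). As far as I understand the spec, it boils down to this (constants are the ones from the standard):

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(code_value: float) -> float:
    # code_value is the normalized signal in 0..1; 1.0 maps to 10,000 nits
    e = code_value ** (1.0 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)

print(pq_to_nits(1.0))     # ~10000 nits, the ceiling of the format
print(pq_to_nits(0.58))    # ~200 nits, most of the code range sits in the dark end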

Whoops, meant bit depth.
Just came from an audiofool thread.

The thing that's wrong with the standard is that it shouldn't exist, since movies are pre-encoded and HDR is something you use in photo/image editing.
For the end user it's just a fancy 10-bit display showing colors as intended at 10-bit instead of 8-bit or chroma 4:2:0 like most trash is encoded as.

It's not just a blanket effect across the entire scene. HDR, when used on a proper display with good (ideally full-array) local dimming (or an OLED), also creates specular highlights within the image. So if you have an extremely dark scene and somebody holding a lit torch for example, the torch will be extremely bright (as in actually measurably bright in terms of nits, not perceptually), whilst leaving the rest of the scene that's not being illuminated dark. If you imagine in real life what a neon sign looks like in the darkness, that's the kind of effect that HDR is capable of creating.

Obviously there's a whole lot more to it than that, but specular highlights are generally the "wow" factor that people talk about when it comes to HDR.
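
If it helps to picture the local dimming part: the backlight is split into zones, and each zone only gets driven as hard as its brightest pixels need. Toy Python sketch with made-up zone counts and a simplistic max() policy, nothing like a real TV's algorithm:

import numpy as np

def zone_backlight(target_nits, zones=(6, 8)):
    # target_nits: 2-D array of desired per-pixel luminance; returns per-zone drive levels
    h, w = target_nits.shape
    zh, zw = h // zones[0], w // zones[1]
    drive = np.zeros(zones)
    for i in range(zones[0]):
        for j in range(zones[1]):
            block = target_nits[i*zh:(i+1)*zh, j*zw:(j+1)*zw]
            drive[i, j] = block.max()       # light the zone for its brightest pixel
    return drive

frame = np.full((1080, 1920), 0.5)          # dim scene everywhere
frame[500:520, 900:920] = 800.0             # small 800-nit "torch"
print(zone_backlight(frame))                # only the torch's zone gets driven hard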

But that isn't true, and I don't believe you've actually experienced HDR for yourself on a capable display if you're saying that. Movies are encoded with metadata that tells the display how to present the content in HDR. Static metadata that covers the whole content when it comes to HDR10, or dynamic metadata that can change the image right down to a frame-by-frame basis with HDR10+ and Dolby Vision.

Colour is only a part of what HDR does, and by far the least-important part for the average normie buying an HDR-capable TV. The specular highlights are what people are drawn to. And don't get me wrong, they ARE very nice, and create an image with far more depth and realism when used correctly (though some just try to burn your retinas out for no reason).
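
To make the static vs dynamic metadata distinction concrete, this is roughly the shape of the data involved. Hypothetical structures and made-up numbers, just for illustration, not any spec's actual layout:

from dataclasses import dataclass
from typing import List

@dataclass
class StaticHDR10Metadata:
    max_cll: int          # Maximum Content Light Level, nits (brightest pixel anywhere)
    max_fall: int         # Maximum Frame-Average Light Level, nits
    mastering_peak: int   # peak luminance of the mastering display, nits
    mastering_min: float  # black level of the mastering display, nits

@dataclass
class DynamicFrameMetadata:
    frame_index: int
    frame_peak: int       # brightest pixel in *this* frame, nits
    frame_average: int    # average light level of *this* frame, nits

# HDR10: one record for the whole movie...
static_md = StaticHDR10Metadata(max_cll=1500, max_fall=400,
                                mastering_peak=4000, mastering_min=0.005)

# ...HDR10+/Dolby Vision style: a stream of records the TV re-tone-maps against as it plays
dynamic_md: List[DynamicFrameMetadata] = [
    DynamicFrameMetadata(0, frame_peak=1200, frame_average=180),
    DynamicFrameMetadata(1, frame_peak=90,   frame_average=12),   # dark scene
]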

>SDR can only do 100nits of brightness and increasing brightness means also increasing the brightness of dark scenes
100 nits is just a guideline, and it had already been superseded by displays pushing 700 nits even before HDR was introduced. The only people I'd guess actually set up their environment for proper end-viewer viewing conditions work in cinema. Broadcast is even less likely, since a lot of the content comes out of the camera already stripped of any extremes if it's set up correctly, or even with the default settings. And internet video creators most likely don't have a single idea about this; they just have their displays pushing maximum contrast.

>though some just try to burn your retinas out for no reason
It's up to the editor to use their HDR budget correctly over the course of the scenes. Yes, budget, because the technology means they can't make every scene that bright. HDR production monitors even have an indicator that tells you when the image is too bright.
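
I'd guess that indicator boils down to something like tracking the frame-average light level against a budget. Rough Python sketch; the 400-nit budget is a made-up number, not a value from any standard:

import numpy as np

def flag_over_budget(frames_nits, budget_nits=400.0):
    # frames_nits: iterable of 2-D arrays of per-pixel luminance in nits
    flagged = []
    for i, frame in enumerate(frames_nits):
        fall = float(frame.mean())          # frame-average light level
        if fall > budget_nits:
            flagged.append((i, fall))
    return flagged

frames = [np.full((1080, 1920), 50.0),      # dim frame
          np.full((1080, 1920), 900.0)]     # searingly bright everywhere
print(flag_over_budget(frames))             # [(1, 900.0)]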

how the fuck do you guys know so much about TVs/panels?

No job

They never get laid

Ever heard of Wikipedia

They didn't even mention anything outside general superficial knowledge, lol

Okay, thanks. That helps.

lurk more, look up stuff you don't understand.

>why do people on a technology board know things about technology
are you daft?

hdr is a meme, why do you want to ruin your old movies with a modern meme the film makers couldn't even comprehend if you explained it to them?

You can do fake HDR effects using madVR and tone mapping. It's very complicated and resource-intensive. I don't recommend it.
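
If you want to picture what that kind of tone mapping is doing: not madVR's actual algorithm, just a toy inverse tone map that leaves shadows roughly alone and stretches only the top end toward a higher peak. All numbers made up:

import numpy as np

def inverse_tone_map(sdr, sdr_peak=100.0, hdr_peak=1000.0, gamma=2.0):
    # sdr: array of linear-light values normalized to 0..1 (display-referred)
    sdr = np.clip(sdr, 0.0, 1.0)
    # boost grows from 1x in the shadows toward hdr_peak/sdr_peak at white
    boost = 1.0 + (hdr_peak / sdr_peak - 1.0) * sdr ** gamma
    return sdr * sdr_peak * boost            # output in nits

pixels = np.array([0.05, 0.5, 0.95, 1.0])
print(inverse_tone_map(pixels))              # shadows stay ~5 nits, white heads toward 1000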