Yeahhh boiiii

Yeahhh boiiii
2160p is going to have some juicy releases in the scene.

Attached: file.png (1274x809, 669K)

It's going to be worse than 4k Blu-ray releases due to compression from bandwidth limitations. So really it's nothing to get excited about.

>disney
>watching that garbage

Considering most of their releases are still mastered at 2k, what's the point? The UHD bluray releases they do are almost all just 2k upscales.

This. Streaming is irredeemable garbage.

>what is bitrate

AUDIO 10
VIDEO 10

Thanks YIFY!

Attached: 266863D0-D56E-47F7-987D-7991352B7650.jpg (265x190, 7K)

>it's going to be worse than 50gb of data
NO SHIT FUCKING RETARDED CAPTAIN OBVIOUS

Clearly not obvious to the OP

Can you screenshot the exact part where in the OP I said it's going to be the exact same quality as a bluray rip?

>2160p is going to have some juicy releases in the scene.

This implies the scene will bother grabbing something from Disney+ instead of just using the UHD bluray.

For their movie releases, it will be useless.

For TV shows like The Mandalorian, sure great. But they were never gonna get a UHD bluray release anyway.

Attached: ULTRAHD+.gif (960x540, 552K)

>2160p is going to have some juicy releases in the scene

Attached: tumblr_m1wuljUPZE1qc4b7mo1_250.gif (250x251, 999K)

Thanks for confirming that you are a retard and that I didn't claim it would be the same as a bluray.

Do you consider movies on iTunes to be 'juicy' releases too, or has consuming lowest-common-denominator media rotted your brain?

STOP LIKING THINGS I DON'T LIKE

Attached: file.png (384x458, 516K)

Ok thanks for clarifying

you implied it desu senpai

You imagined it.

>Paying subscription fees to watch Marvel and Star Wars

maybe by upscaling the shit out of everything, then recompressing it, things will average out to 2K again, like how youtubers will film in 2K to unfuck 1080p compression

>youtubers will film in 2K to unfuck 1080p compression
2k is 2048x1080p

literally almost identical to 1080p.

DCI 2k is basically 1080p. 2k is what MOST 1080p blurays are mastered from.

The fact they're STILL using 2k masters for their 4k UHD blurays is fucking disgusting.

You can't unfuck lossy compression, only make it less unsightly

Finally someone here understands what 2k actually means, when it's constantly and erroneously used as a buzzword for 1440p.

>Finally someone here understands what 2k actually means
I'm not bothered by its use in consumer monitor resolutions, but when specifically discussing film, it's important to note that DCI 2k is 2048x1080p, VERY similar to 1920x1080p.

Similarly of course DCI 4k is 4096x2160p whereas consumer 4k is 3840x2160p

It'd be nice if monitor manufacturers didn't make it confusing by calling 2560x1440p "2k", but it's not TOO bad considering no one is selling 1920x1080p panels as "2k".
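All that naming soup is easier to see as raw pixel counts; a quick Python sketch (labels follow the DCI usage argued for above, not monitor marketing):

```python
# Pixel counts for the resolution names argued about in this thread.
resolutions = {
    "1080p (Full HD)":   (1920, 1080),
    "DCI 2k":            (2048, 1080),
    "1440p (QHD)":       (2560, 1440),
    "UHD (consumer 4k)": (3840, 2160),
    "DCI 4k":            (4096, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:18} {w}x{h}  {px / 1e6:.2f} MP  ({px / base:.2f}x 1080p)")
```

DCI 2k comes out at roughly 1.07x the pixels of 1080p, which is why the two are near-interchangeable, while consumer 4k is an even 4x.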

2k is 1440p, retard.

Not when you're talking about movies.

DCI 2k is 2048x1080p

DCI 4k is 4096x2160p

Based.

Ha ha can't wait for 24/7 Disney, bing bing yahoo!

>streams 4k
>saturates the entire neighborhood's telecommunications shit for a nearly unnoticeable increase in quality

> be 2019
> 4k is standard resolution supported by all modern GPUs
> offer 4k
> kvetching commences
> MUH BANDWIDTH!
> MUH NO NOTICEABLE DIFFERENCES!1
> MUH THEY BE USING LOW RESOLUTIONS AND UP-SCALING!
they're offering something that everyone should be offering in 20 fucking 19. Christ, even YouTube serves up 4k videos and that site is far more cancerous than anything Disney could vomit up. what's the issue? I'm not really seeing it. did this same level of mindless kvetching happen when HD arrived too? the only issue I have with Disney is how it's yet another streaming platform some people will have to spend more money to access, as Disney has taken most of their IP away from competitors in preparation for this service's launch.

>mfw got my official UHD capable 4K blu-ray drive properly BIOS flashed so it can rip 4K UHD now.

My time has come.

Attached: 1561423240082.jpg (707x707, 116K)

> mfw got my official UHD capable 4K blu-ray drive properly BIOS flashed so it can rip 4K UHD now.
> just flash the bios to bootleg discs
how things just never change. good for you, user.

Thanks! It required a little voodoo to get it working because the vendors who actually make official UHD drives firmware-lock them so they can't be flashed back. Mine shipped with firmware 1.3. I needed 1.0 to properly rip. Required cross-flashing to a modified Asus Firmware with write back enabled. Then I was able to properly cross-flash BACK to proper LG 1.0 firmware.

It was an afternoon, but it fucking worked and I'm thrilled. Going to have to buy bigger HDD's for my server though. Even encoded properly, a single UHD movie is like 25GB

Do you mean you flashed a firmware that handles encryption and don't need MakeMKV to do it?

nope. I use MakeMKV. Basically Asus and LG both have similar drives that you can firmware-flash back and forth (they'll still function, of course). So because my LG UHD drive was on 1.3, I needed 1.31 or higher. Only Asus had a firmware that high, at 1.4. Because of that, someone on the MakeMKV forums modded the 1.4 firmware to allow firmware rewind/write-back. Then, since my drive is recognized by my OS as an Asus drive (even though it's actually LG), I cross-flashed the proper UHD-ripping-capable 1.0 LG firmware onto my drive.

Now MakeMKV opens/rips the discs like they're regular 1080p blu-rays.

oh and about the encryption. The encryption is only enforced on higher version firmwares. MakeMKV will throw a shitload of errors if you try to rip. Whereas 1.0 firmware didn't even check for encryption.

>Required cross-flashing to a modified Asus Firmware with write back enabled. Then I was able to properly cross-flash BACK to proper LG 1.0 firmware.
nice
>It was an afternoon
i could imagine.
> but it fucking worked and I'm thrilled. Going to have to buy bigger HDD's for my server though. Even encoded properly, a single UHD movie is like 25GB
can you make 1:1 disc copy (minus the protection but still preserve menus, etc.) ?
>The encryption is only enforced on higher version firmwares
that's damn nice, user.

>can you make 1:1 disc copy (minus the protection but still preserve menus, etc.) ?
I can pull a full unencoded copy, of course. They're usually in the 65GB range. As far as preserving menus, I don't think you can pull those digitally so far as I know. I never bothered because I don't even use a dedicated blu-ray player. I just have them sitting on my server and stream to my living room TV via my HTPC

back when I was in my early 20s (many moons ago) we had the joys of doing the same thing: hunting down drives that would let us easily get around the various protections for DVD. That became pointless when the protection was broken wide open and any drive could rip an entire disc. anyway, much joy is sparked within me to see such protections be smacked around like cheap whores. i'm sure the industry would love to believe their protections are holding up well, but sadly for them, from streaming to physical discs, someone is ripping everything.

exactly. By law we're allowed to make backups of optical media. Fuck these encryption pushing fuckers in hollywood.

It's so barren compared to the Flix.

Attached: Screen_Shot_2019_08_23_at_8.00.09_PM.png (998x557, 762K)

Streaming doesn't even need full HD to show all the detail left after compression. 4K will only render artefacts more accurately. How about we put a number on compression for marketing instead of muh pixels.

>Disney
>2160p
Oh boy..
caps-a-holic.com/c.php?a=2&x=399&y=163&d1=13201&d2=13200&s1=131893&s2=131879&l=1&i=0&go=1

>using toy story 1 as a comparison
you know the original render was not even 1080p, right?
i wonder if they even have the resources to render it again at a higher resolution

Yes, yet ANOTHER streaming platform.

Attached: Adobe_20190827_173422__01.jpg (1080x525, 140K)

>Each completed shot then went into rendering on a "render farm" of 117 Sun Microsystems computers that ran 24 hours a day.[37] Finished animation emerged in a steady drip of around three minutes a week.[63] Depending on its complexity, each frame took from 45 minutes up to 30 hours to render. The film required 800,000 machine hours and 114,240 frames of animation in total.[38][58][64] There are over 77 minutes of animation spread across 1,561 shots.[60] A camera team, aided by David DiFrancesco, recorded the frames onto film stock. To fit a 1.85:1 aspect ratio, Toy Story was rendered at a mere 1,536 by 922 pixels, with each of them corresponding to roughly a quarter-inch of screen area on a typical cinema screen.[37]

This took from 1993 to 1995 to finish. Can't the original source files be rendered at 1536x922 in real-time at this point? Nvidia is always bitching about its RTX tech, and even though it's not yet a fully featured rendering arch, the underlying hardware ought to be there for an application like this.

At the time they were using a farm of 117 Sun SPARCstation 20s
en.wikipedia.org/wiki/SPARCstation_20
So, at the very maximum, roughly 468 cores @ 50 MHz if they fully populated each dual-socket board, with 512 MB of system memory and 8 MB of VRAM each.

With a modern EPYC Rome cluster and Quadro RTX cards there's no reason they couldn't re-render the film at 4K or even 8K at 60+ FPS. They could even update the textures/colorspace if they wanted. It isn't like Disney doesn't have the fucking money. But instead we get upscaled UHD trash on disc.

Attached: KH3 ain't even raytraced.jpg (1280x720, 116K)
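The Wikipedia numbers quoted above hang together arithmetically; a back-of-envelope Python check (assuming perfect parallelism across the farm, which is generous):

```python
# Figures from the Wikipedia quote above.
machine_hours = 800_000      # total render time
frames        = 114_240      # total frames of animation
farm_size     = 117          # SPARCstation 20s, running 24 h/day

avg_hours_per_frame = machine_hours / frames
wall_clock_days = machine_hours / farm_size / 24

print(f"average per frame: {avg_hours_per_frame:.1f} machine-hours")   # ~7.0
print(f"ideal wall-clock:  {wall_clock_days:.0f} days on the farm")    # ~285
```

~7 machine-hours per frame sits comfortably inside the quoted 45-minutes-to-30-hours range, and ~285 days of ideal farm time fits the 1993-1995 window once re-renders and scheduling are accounted for.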

>any audio besides PCM or LPCM post 90's
Why do major corporations fall for scams?

>STOP LIKING THINGS I DON'T LIKE
Sure showed him

>Forky asks a question
a fucking fork!!! toy story 4 revolved around a fucking SPORK! REEEEEE

I can't wait to finally store all the Disney classics in glorious 4K for my kids to watch them when the time comes

Enjoy shitty compression

>Can't the original source files be rendered at 1536x922 in real-time at this point?
while i haven't run the numbers, i'd say that's still a stretch.. well, unless you're throwing it at a datacentre still, then yea most likely
and yes, it would be totally reasonable to do an actual UHD render today, assuming they have all the assets still

4? did i miss 3?

>Forky asks a question
>World according to Jeff Goldblum
Why the fuck is the equivalent of YouTube shorts/marketing material being used as a selling point for a streaming service?

Attached: Bruh.jpg (640x480, 50K)

this content is really similar to what Disney already offers to Sky TV UK customers, pic is one of them

Attached: d8z8pkm-fdd5324f-933f-475d-a799-198598dac398.jpg (1024x1228, 298K)

>what's the issue?
The issue is they're releasing UHD content that was mastered at 2k, far below actual 4k resolutions.

It's just kinda bullshit when they have the resources to just do a 4k master with no need for upscaling at all. But they don't.

Video streams don't have the concept of resolution, beyond the obvious need to set an upper limit on the amount of bitrate retained before diminishing returns sets in.
If it's mastered in 2K, the bit stream should be the same in 4K.
Accordingly, nobody said that upscaling and downscaling were lossless operations, so you would in fact lose quality.

One exception is deep learning superresolution which "invents" information pulled from a historical database.

>Video streams don't have the concept of resolution
false
>beyond the obvious need to set an upper limit on the amount of bitrate retained before diminishing returns sets in.
... what?
>If it's mastered in 2K, the bit stream should be the same in 4K.
no it shouldn't
>Accordingly, nobody said that upscaling and downscaling were lossless operations
they generally are. downscaling loses information, but that's considered subsampling, rather than lossy compression
>One exception is deep learning superresolution which "invents" information pulled from a historical database.
it makes educated guesses, not inventions

I know most streaming services let you download content now, never tried it but I imagine that would be better quality?

what a god damn disaster,
the worst part is people are gonna buy into it

Every advancement in technology is instantly nullified by exploitative services of no value. Welcome back to dial-up, where you can't play an online game while dad watches Trailer Park Boys and your sister streams 4K Disney shows at the same time. Net neutrality is a massive scam for this reason.

I'll give nothing for its service

I wanna try it for a month, not one to jump on "hype trains" but it's Disney so I'm assuming the experience will be at least 2x better than Netflix or Hulu

which model drive? i wanna do this in the future

That's QHD actually

>retards implying anyone but the normies is going to pay for this shit

You could throttle Disney+ if it's such a problem. It'll just fall back to a lower resolution.

Because on top of their 4k masters possibly being upscaled from 2k, current bandwidth limitations mean at most you'll be watching at around 20 Mbps, the same average bitrate you get from a standard 1080p bluray, you fucking sped.
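The 20 Mbps point is easiest to see as bits per pixel; a rough Python sketch (24 fps assumed, codec efficiency differences ignored):

```python
# How thin a fixed bitrate gets spread as resolution goes up.
def bits_per_pixel(mbps: float, width: int, height: int, fps: int = 24) -> float:
    return mbps * 1e6 / (width * height * fps)

for label, (w, h) in [("1080p bluray", (1920, 1080)),
                      ("2160p stream", (3840, 2160))]:
    print(f"{label}: {bits_per_pixel(20, w, h):.3f} bits/pixel at 20 Mbps")
```

At the same 20 Mbps, the 2160p stream gets exactly a quarter of the bits per pixel; HEVC claws some of that back versus AVC, but nowhere near 4x.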

funny thing is, 4k movies are mostly shot in 1080p with the exception of some scenes.
the majority of scenes are upscaled and brushed up in post-production to make it look passable.
t. sfx guy

This is GREAT. All these streaming services pushing out pozzed content will now make less profit overall since they compete against each other.

Peak söy

They're actually generally shot at a much higher res, but they're MASTERED at DCI 2k for cinema release; if it gets an IMAX release, it might have a separate higher-res IMAX master.

Most Disney/Marvel films for example are shot at digital Arri 6.5k res.

The digital effects however are rendered at 2k for the theatrical master.

> 4k is standard resolution supported by all modern GPUs
How many people have a "modern GPU" though? Anyone who's not a gamer is probably running intel HD graphics or some similar shit. Not to mention 4k monitors are literally 2% of actively used monitors. Not exactly what I'd call "standard"

fun fact no.2, there are no macs involved in the sfx pipeline.
monkeys work on windows workstations and the scenes are processed on linux server farms

I'm not talking about main character shots.
back layers and 'filler' scenes are basically shot with the equivalent of smartphone cameras compared to the ones you're talking about.
and they make the majority of the movie

intel iGPUs have been capable of 4k DRM playback since Skylake.

>I'm not talking about main character shots.
>back layers and 'filler' scenes are basically shot with the equivalent of smartphone cameras compared to the ones you're talking about.
>and they make the majority of the movie
lmao you're so full of shit, i'm talking about big budget marvel movies, they're not using anything but Arri 6.5k pretty much the whole fucking way since 75% of it is green screen or similar

sure, if you want to splurge on upscale, I won't stop you, fagget

this is a surprisingly dependable resource listing the "legitimacy" of many, many 4K releases:
4kmedia.org/real-or-fake-4k/

what the fuck are you even talking about?

I'm pissed at disney because they master their UHD blurays with a 2k theatrical master instead of using their 6.5k arri footage and mastering at 4k.

>Bad Boys 2 – Real 4K, HDR
>Bad Boys – Real 4K, HDR

that's all i needed

The only relevant post in this entire shit thread.

not necessarily. but even so, the possibility of 4K SDR content for those without an HDR screen is enticing too

>claiming movies are shot in 1080p when it's easy to look up what cameras are used in big budget pedowood flicks
>usually shot in resolutions between 3.5k (i.e. fury road) and 6k
>"sfx guy"
Either you're full of shit, or you're so demonstrably fucking bad at your "job" that you think the lower res files used in offline editing and cutting together rough cuts are what the cameras shot at.

Lmao, what a fucking retard.

100% this.

They will all claw for our money now.

At this point we need some labelling to separate true 4K from bs 4K.

It's still worth saying.
4k at a shit bitrate is oftentimes worse than a good 1080p encode upscaled by your 4k TV.

Get rid of "#k" meme naming and just continue to name based on vertical resolution (1080p, 1440p, 2160p, etc)

Marketing depts will never drop it. Them always listing it on the back of the box is the best we can hope for...

Luckily genuinely good content worth buying is rare so.

Would you rather download a 10gb 2160p release or a 10gb 1080p release for a 4K TV?

1080p, assuming the bitrates between them are the same. A 4k movie compressed to 10 gigs will look like shit.
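Rough math on the 10gb question (runtime assumed ~2 h; audio tracks ignored):

```python
# Same file size and runtime means the same average bitrate either way;
# only the pixel count it has to cover changes.
size_gb = 10
runtime_s = 2 * 60 * 60                      # assumed feature length

avg_mbps = size_gb * 8 * 1000 / runtime_s    # using 1 GB = 1000 MB
pixel_ratio = (3840 * 2160) / (1920 * 1080)

print(f"average bitrate: {avg_mbps:.1f} Mbps")        # ~11.1 Mbps
print(f"pixels to cover: {pixel_ratio:.0f}x more at 2160p")
```

~11 Mbps is already below a typical 1080p bluray average; stretched over 4x the pixels it's starvation territory, which is the point.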

/thread

> why can't a 25 year old proprietary distributed rendering suite work in 2019
oh my sweet summer child

Yes it's going to be worse and people need to be told

The 4K rips with low bitrate often come with better sound and HDR, though.

great, so even LESS bitrate for your actual video file.

No one needs to be told that, this is Jow Forums not /b/.
The only ones who need to say it are the retards who like to feel important by stating the obvious while implying you don't know what they're talking about; you retards are literally the tipping-fedora meme in word form.

>Comparing a 1080p BD to 4K HEVC streaming.

Compression has a diminishing effect at higher resolutions, h265 is better than h264, the 4K streams still come with better sound and HDR, and 4K streams will still get more bandwidth than 1080p ones.

Then again this is Jow Forums, most of you are probably on used thinkpads and think everything that isn't h264 720p is a meme.

Looks bretty good for 9.5gb.

Attached: john.jpg (3840x1597, 1.98M)