Movies and gaming stuck at low FPS

They profit more from still pictures than from videos when pushing marketing at kids. After all, if they market with videos, they can always cheat and render elsewhere.

I also blame the movie industry. "Cinematic low FPS" is literally the same as the lunacy of "Vinyl sounds better than digital" that still exists.
Luckily that started dying a few years ago.
If movie makers weren't hipster pieces of shit they'd strive for 240FPS at 16K.
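For scale, here's the raw data rate that 240FPS at 16K would imply. A back-of-envelope sketch; the 15360x8640 frame size and 10-bit 4:2:0 sampling are my assumptions, not anything from a real spec sheet:

```python
# Back-of-envelope: uncompressed data rate of 16K video at 240FPS.
# Assumptions (mine): 16K = 15360x8640, 10-bit 4:2:0 chroma subsampling,
# i.e. 15 bits/pixel on average (10-bit luma + chroma at quarter resolution).
width, height, fps = 15360, 8640, 240
bits_per_pixel = 15

pixels_per_frame = width * height                     # ~132.7 megapixels
bytes_per_frame = pixels_per_frame * bits_per_pixel / 8
bytes_per_second = bytes_per_frame * fps

print(f"{bytes_per_second / 1e9:.1f} GB/s uncompressed")  # ~59.7 GB/s
```

So roughly 60 GB/s before any compression, which is part of why nobody ships it.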



dumb retard

Freshly pressed vinyl, sealed in a vacuum and played on a highly specialized turntable, sounds infinitely better than digital, even 1-bit SACD. It's objectively true due to analog's superior resolution, and subjectively shown by many double-blind studies that consistently found a natural preference for vacuum-sealed vinyl over any digital format.

This being said, the human eye can pick out a single different frame from a series of frames at up to 200fps. Just as the human ear has no limiting factor that makes it unable to tell vinyl from digital, there is no limiting factor in the eye that makes it unable to see at 240fps.

> Freshly pressed Vinyl sealed in a vacuum and
Fuck off retard. Hipsters like you are what holds technology back.

The intermediate source is digital, you brainlet.

nope, the only reason vinyl sounds more pleasing than CD is the LACK of quality vinyl imposes: the analogue warmth that creates the imperfections and texture we all love. The turntable output can be digitized into 16-bit PCM and it's absolutely impossible for a human to tell the difference
wiki.hydrogenaud.io/index.php?title=Myths_(Vinyl)
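On the 16-bit PCM point: the theoretical quantization SNR of N-bit PCM is roughly 6.02N + 1.76 dB (standard textbook formula); the vinyl dynamic-range figure below is a ballpark assumption on my part, not a measurement:

```python
def pcm_snr_db(bits: int) -> float:
    # Theoretical quantization SNR of ideal N-bit PCM for a full-scale sine.
    return 6.02 * bits + 1.76

print(f"16-bit PCM: {pcm_snr_db(16):.1f} dB")  # ~98.1 dB
# A very good vinyl pressing manages maybe 60-70 dB of dynamic range
# (ballpark assumption), so 16-bit PCM has headroom to spare.
```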

idk about games, but I don't see why cinema should use anything higher than 24fps. 24fps has been the industry standard forever, and it wasn't because Hollywood wanted to save money on film.

Film makers commonly use 35mm film instead of a digital camera. This is because film, being an analog medium, can record at resolutions that exceed most 4K and even 8K digital pictures. IMAX, before it was just a buzzword, commonly shot its films on 65mm film, which has a resolving power exceeding 16K and could be scanned at 16K to produce an image sharper and clearer than one shot in 16K native digital.
The same is true for vinyl, which is why vinyl remains king of quality standards, as long as it's kept in a perfect vacuum and played by a specialized turntable to prevent dust pops. Magnetic tape is about the same, with some stock providing audio quality equal to a computer outputting at 384kHz/64-bit.

> I don't see why cinema should use anything higher than 24fps, 24fps have been the industry standard
Who the fuck gives a shit what is a standard?
Vinyl was the standard for decades. That's exactly why you still have morons believing it's better, which is ludicrously stupid when the intermediate source that pressed those records was digital to begin with (nowadays).

For modern vinyl sure, but for decades it was based on a magnetic master, which was superior on all fronts to any form of audio technology we have today. There just isn't any evidence that people can pick up on the difference besides some double-blind studies done by "golden ears."

> Film makers commonly use 35mm film
grandpa, that hasn't been true since the early 2000s, because film is more expensive in most cases.

Also, if you weren't clueless you'd know about 70mm. It's why 2001: A Space Odyssey looks like it's from the mid-1980s (it's from the fucking 60s).

Generally, hipsters like you should shut the fuck up. Vinyl and low FPS are relics that have ONLY disadvantages over high resolution and rate.

Fuck off and let the world progress.

But the human ear can't hear ultrasound, so your whole argument is null. Also, during the recording, mixing and mastering process the ultrasonic elements are (or at least should be) lowpassed to cut unwanted ultrasonic content that can generate distortion.
Vinyl DOES sound better for most types of music though. Not saying it's technically better: digital can reproduce a 1:1 perfect copy, superior to vinyl in every way, but the warm analogue character vinyl has makes it more pleasing to listen to than sterile digital recordings.
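The lowpass step mentioned above is a few lines of code: a windowed-sinc FIR with a 20kHz cutoff at 96kHz sampling passes a 1kHz tone nearly untouched and crushes a 30kHz "ultrasonic" tone. A minimal sketch; the filter length and cutoff are my choices, not anything from a real mastering chain:

```python
import math

def lowpass_kernel(cutoff_hz, fs, taps=101):
    # Windowed-sinc (Hamming) lowpass FIR kernel, DC gain normalized to 1.
    fc = cutoff_hz / fs
    m = taps - 1
    h = []
    for n in range(taps):
        x = n - m / 2
        s = 2 * fc if x == 0 else math.sin(2 * math.pi * fc * x) / (math.pi * x)
        h.append(s * (0.54 - 0.46 * math.cos(2 * math.pi * n / m)))
    g = sum(h)
    return [v / g for v in h]

def filter_signal(x, h):
    # Direct-form FIR convolution (slow but dependency-free).
    return [sum(h[k] * x[i - k] for k in range(len(h)) if 0 <= i - k < len(x))
            for i in range(len(x))]

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

fs, n = 96_000, 4000
audible = [math.sin(2 * math.pi * 1_000 * i / fs) for i in range(n)]
ultra = [math.sin(2 * math.pi * 30_000 * i / fs) for i in range(n)]

h = lowpass_kernel(20_000, fs)
steady = slice(200, n - 200)  # skip filter edge transients
aud_out = filter_signal(audible, h)[steady]
ult_out = filter_signal(ultra, h)[steady]

print(f"1kHz kept:  {rms(aud_out) / rms(audible[steady]):.3f}")  # ~1.0
print(f"30kHz kept: {rms(ult_out) / rms(ultra[steady]):.4f}")    # near 0
```

A real chain would use a properly designed filter, but the principle is the same: anything above the cutoff simply never makes it to the master.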

You're fucking stupid if you think the 1960s Saturn V that got us to the moon was inferior to a 1990s shuttle that brought us to a station 4 times and burst into flames.

Anything higher than 24fps and your movies start to look like soap operas and wildlife documentaries.
There's an entire science behind how humans perceive different shutter speeds and frametimes, and what feelings they induce.
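Most of that science boils down to the 180-degree shutter rule: exposure time per frame = (shutter angle / 360) / fps. The function below is just that arithmetic, nothing more:

```python
def exposure_time(fps: float, shutter_angle: float = 180.0) -> float:
    # 180-degree shutter rule: each frame is exposed for half its duration.
    return (shutter_angle / 360.0) / fps

print(f"24fps / 180 deg: 1/{1 / exposure_time(24):.0f} s")  # 1/48 s
print(f"60fps / 180 deg: 1/{1 / exposure_time(60):.0f} s")  # 1/120 s
# Less exposure per frame = less motion blur, which is part of why
# high-FPS footage "feels" different even at the same shutter angle.
```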

> soap operas
Fuck off hipsters. High FPS is exactly what makes something draw you in more, because it's more realistic.

No, high FPS and resolution are overstimulating. It's bad enough that everything is overanimated CGI nowadays and there's always something twitching around or exploding on screen.
You can't convey drama at 240FPS 16K. You need to draw out the moment, convey weight with dark colours and stillness with slow shutter speeds.

Also, this!
I was going to comment on this but didn't, because I figured they'd say "just because you have more FPS doesn't mean you can't still use slow shutter speeds".
You know the human brain expects to see motion blur, right? Otherwise it looks unnatural. Just wave your hand in front of your eyes: what do you see?
Either way, it's not a good argument, because you can still use slow shutters at high fps.

I'm very aware of 70mm, but it was uncommon; I was only referring to popular formats. And yes, filmmakers used 35mm because it was cheaper. Still, you can keep scanning a 35mm negative at 8K to this day and get consistently better results from higher-resolution scans, while early digital adopters are stuck at the crap resolutions they shot their movies in, and that content is forever stuck looking like crap. You don't understand quality if you can't at least respect analog. I love all things quality, I hate 24fps film, and I think it's absolutely retarded. You can't show me a single digital image or sound that is somehow superior to the high end of analog, because analog is still better in so many ways.
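The "scan 35mm higher forever" claim has a ceiling you can estimate from grain resolving power. Both numbers below are ballpark assumptions on my part: roughly 80 line pairs/mm for typical color negative stock, and a ~24.9x18.7mm Super 35 gate:

```python
# Rough megapixel equivalent of a 35mm film frame from resolving power.
# Assumptions (mine): 80 lp/mm resolving power (~2 pixels per line pair),
# Super 35 gate of 24.89 x 18.66 mm. Both are ballpark figures.
lp_per_mm = 80
px_per_mm = 2 * lp_per_mm
w_mm, h_mm = 24.89, 18.66

w_px = w_mm * px_per_mm
h_px = h_mm * px_per_mm
megapixels = w_px * h_px / 1e6

print(f"~{w_px:.0f} x {h_px:.0f} = {megapixels:.0f} MP")
# Roughly 12 MP: between 4K (~8.8 MP) and 6K, nowhere near 16K's ~133 MP.
```

So scanning 35mm at 8K mostly resolves sharper grain, not more picture.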

You know, the human eye can't really see that much detail. Past a certain sharpness it doesn't matter; the extra detail is lost on your retina anyway.

>If movie makers weren't hipster pieces of shit they'd strive for 240FPS at 16K.
I highly suggest you watch The Hobbit in HFR ("high frame rate" which is something like 48fps)

>The human ear can't hear ultrasounds
>The human eye can't see more than 24fps
>The human eye can't see past 1080p
>The human brain can't tell the difference between cgi and reality anymore

I've heard these arguments time and time again. Fuck off with that shit, it's not true; there isn't some hard limit in your eyes or ears. Sure, ultrasound isn't directly perceivable, but by the first, second, third bounce it's distorted into audible frequencies that make up the many imperfections of background noise, filling the room with a more accurate picture of what can be heard. Same for ultra-low frequencies.
Higher resolution is picked up close-up, but also as an indescribable sharpening of the image that can be noticed even at a distance, and it takes care of the many artifacts that persist at lower resolutions, such as aliasing, noise profiles, and color accuracy.
Higher framerate is what the world runs at when you look around. The soap opera effect, as it's commonly known, is merely an artifact of poor frame interpolation; real 60fps content looks better and smoother, and also reduces aliasing artifacts and improves overall perceived accuracy.
CGI will always look fake unless they use real voxels, very detailed material science, and perfectly accurate scene-matched lighting, which is impossible because it couldn't account for the many unintended light scatters placed in a scene.
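The "poor frame interpolation" point is easy to demo: the cheapest interpolator just averages neighbouring frames, which turns one moving object into two ghosts. A 1-D toy sketch, nothing like a real motion-compensated interpolator:

```python
# Toy demo of naive frame interpolation: a bright "pixel" moves along a
# 1-D strip. Blending two frames yields two half-bright ghosts instead
# of one object at the halfway position.
def frame(pos, width=9):
    return [1.0 if i == pos else 0.0 for i in range(width)]

a = frame(2)  # object at index 2
b = frame(6)  # object at index 6

blended = [(x + y) / 2 for x, y in zip(a, b)]  # naive "interpolated" frame
true_mid = frame(4)                            # what motion tracking would give

print(blended)   # half-intensity ghosts at indices 2 and 6
print(true_mid)  # single full-intensity object at index 4
```

Real interpolators estimate motion vectors to avoid exactly this, and their failures on complex motion are where the soap-opera artifacts come from.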

See

No, you really can't hear ultrasound. ~20kHz is all you get, and it's downhill from there with age. Whatever you do hear in that range from ultrasonics is caused by either the aliasing of your equipment or harmonic resonance.

No, really, you can: it's down to 12-13kHz by the time you catch the bounce, as it's distorted against your walls and by your speakers.

When fast wave go through air and hit soft thing wave slow down.

The wave's amplitude is lowered, not its frequency. durr
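For the record: when a wave crosses into a slower medium, its speed and wavelength change while the frequency stays fixed, since v = f * wavelength. Quick arithmetic sketch (the "soft material" speed is a made-up illustrative number):

```python
# A wave crossing media: frequency is set by the source; speed and
# wavelength change together via v = f * wavelength.
f = 1_000.0      # Hz, fixed by whatever is vibrating
v_air = 343.0    # m/s, speed of sound in air at ~20 C
v_soft = 100.0   # m/s, made-up "soft thing" for illustration

for name, v in [("air", v_air), ("soft material", v_soft)]:
    print(f"{name}: v = {v} m/s, wavelength = {v / f:.3f} m, f = {f} Hz")
# Same 1000 Hz in both media; only the wavelength shrinks in the slow one.
```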

>sure ultrasounds aren't perceivable, but come first, second, third bounce they are distorted into audible frequencies that make up the many imperfections of background noise that fill the room with a more accurate picture of what can be heard.
You know, formal double-blind tests were run and showed that humans can't tell the difference.
>indescribable sharpening of the image that can be picked up even at a large distance, the inclusion of the effect also takes care of the many artifacts that can persist at lower resolutions such as aliasing, noise profiles, and color accuracy.
Aliasing is solved with anti-aliasing filtering, aka softening the image a bit during capture.
Noise isn't an issue on high-end cinema equipment; also, bigger pixels = less noise.
And the eyes are more sensitive to luma than to color: if luma detail is beyond the limit of the human eye, then color detail is several times beyond it.
>The soap opera effect as it is commonly known is merely an artifact caused by poor frame interpolation, real 60fps content looks better and more smooth, as well as reducing aliasing artifacts and overall perceived accuracy.
The brain needs motion blur to operate at its best. Your eyes have motion blur; the moment you take it away you're not reproducing what the human eye sees, but something the brain isn't used to coping with.
>CGI will always look fake
Yeah CGI looks like shit, we both agree on that
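The luma-vs-colour sensitivity point is exactly what chroma subsampling exploits: 4:2:0 keeps luma at full resolution and chroma at quarter resolution. The standard bookkeeping at 8 bits per sample:

```python
# Average bits per pixel for 8-bit video at common chroma subsampling schemes.
# 4:4:4 = full-resolution chroma, 4:2:0 = chroma at quarter resolution.
def bits_per_pixel(scheme: str, bit_depth: int = 8) -> float:
    chroma_fraction = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[scheme]
    # 1 luma sample per pixel + 2 chroma planes at reduced resolution.
    return bit_depth * (1 + 2 * chroma_fraction)

for s in ("4:4:4", "4:2:2", "4:2:0"):
    print(f"{s}: {bits_per_pixel(s):.0f} bits/pixel")
# 4:2:0 halves the raw size (12 vs 24 bits/pixel) and viewers rarely notice,
# which is the practical payoff of the eye's weak colour resolution.
```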

That's like saying you can't hear a jet's turbines because the sound coming directly off them is at too high a frequency, with air molecules being pushed out at a rate exceeding 20,000 cycles per second. Obviously you can hear them, because once the sound bounces around enough it falls within audible range.

>blur the image
Jesus fuck, no, just stop.

>color sampling exceeds the human eye
again, where is your data on this? Which part of the brain is keeping you from seeing colors more or less accurately?

>bigger pixels means less noise
Not when the thing is shot at that lower resolution, and that's all we're talking about; nobody has a 16K TV, and probably no one ever will, because GPUs will hit the silicon wall well before that happens.
Regardless, I'd rather have tiny noise than blurry noise.

>human eye needs motion blur
Why would the brain not produce motion blur for an image running at 240fps, just as it does for real life running at effectively infinite fps? I don't see your point at all, and it really seems like you've never seen high-framerate video that wasn't made with interpolation.
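You can even fake the eye's motion blur numerically: render at a high rate, then average subframes down to the target rate. A moving 1-D dot averaged over 10 subframes comes out as a smear, which is all camera motion blur is (toy sketch, all numbers chosen for illustration):

```python
# Toy motion blur: a dot moves 1 cell per 240fps subframe; averaging every
# 10 subframes gives one 24fps frame whose "exposure" covers the whole path.
width = 12

def subframe(pos):
    return [1.0 if i == pos else 0.0 for i in range(width)]

subframes = [subframe(p) for p in range(10)]   # dot sweeps cells 0..9
blurred = [sum(col) / len(subframes) for col in zip(*subframes)]

print(blurred)  # 0.1 in cells 0..9: a uniform streak, i.e. motion blur
```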

Wrong, it affects amplitude and total harmonic distortion. Harmonic as in frequency. I believe it was Tesla who showed that while attempting to discover the harmonic resonance of different solids.

>Jesus fuck, no, just stop.
Why, you can't see the difference anyway
>color sampling exceeds the human eye
>again, where is your data on this, which part of the brain is limiting you from being able to see colors less or more accurately
I wasn't talking about color sampling but color resolution. idk if sRGB is good enough a color space; it probably is, I guess I could read up on it. But we were talking about sharpness, detail in the spatial domain, the number of pixels in a digital video, where the human eye is several times more sensitive to luma than to color.
>Regardless, I'd rather have tiny noise than blury noise.
Yeah, but again, past a certain resolution the individual pixels or grains are so small that you can't really see them, so further increasing the resolution is useless. And noise is only a problem with low-end equipment.
>Why would it not produce motion blur for an images running at 240fps than if it was in real life running at infinity fps? I don't see your point at all, and it really seems to me like you have never seen a high framerate video that was not made using interpolation.
Good point, you may be right on that. I guess with ultra-fast fps motion blur wouldn't be needed (except for a relaxed cinematic feel), but it sure would still be needed at 60fps or so.

>The Hobbit in HFR
interpolated garbage

If you don't think analogue-sourced audio is better than digital you are an idiot hiding from facts

Analogue will always be better

>I download vinyl rips nuthing is lost

Imagine being this dumb

>I can hear ultrasounds

Imagine being this mentally retarded

>dude why don't everyone film in 144Hz


that magnetic master was simply digitised at high resolution, making it invulnerable to wear and tear, something that's literally impossible on physical media. You need to get your head out of your ass.

I get that a racing game or a competitive shooty mc shootface has to be 60fps+, but why other games? There's no real point, except forcing you to buy even more expensive hardware to run it.

>high FPS and resolution are overstimulating
BAHAHAHA HOLY SHIT