Nonblurry integer-ratio scaling

Why is this not implemented in graphics drivers yet? I was about to buy a 4k monitor and wanted to check that 1080p would indeed scale perfectly as you would expect but apparently not. I'm appalled tbqh.

tanalin.com/en/articles/lossless-scaling/

Attached: main-bilinear-2x.png (488x560, 69K)


Heh?
You want to scale the whole desktop or just video?
Because there's no reason to scale desktop, you're not supposed to either, you are supposed to increase DPI or else there's no point in buying 4K.
And any decent video player like mpv or mpc gives you tons of scaling algorithms to choose from.

I want to scale games and videos. I haven't been able to find settings in mpv or mpc that do what I want. "nearest neighbor" scaling in mpc for example just makes everything blocky without actually 4xing the right pixels.

tanalin.com/en/projects/integer-scaler/
This program does a good job for windowed applications like mpv, but it doesn't work with Linux, which I was planning on using with the new monitor
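For what it's worth, what integer-ratio scaling actually does is conceptually trivial: every source pixel becomes an exact NxN block, no blending. A minimal numpy sketch (my own illustration, not the IntegerScaler program's actual code):

```python
import numpy as np

def integer_scale(img, factor):
    """Upscale by an exact integer factor: each source pixel becomes a
    factor x factor block of identical pixels, so nothing is blended."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

# A 2x2 "image" with a distinct value per pixel.
src = np.array([[1, 2],
                [3, 4]])
dst = integer_scale(src, 2)
print(dst)
# [[1 1 2 2]
#  [1 1 2 2]
#  [3 3 4 4]
#  [3 3 4 4]]
```

Every output value is an exact copy of some input value, which is the whole point: no interpolation artifacts, just bigger pixels.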

>you are supposed to increase DPI or else there's no point in buying 4K.
Ideally this would be true but the only OS that does this well in practice is MacOS. Windows and Linux both have tons of problem applications that do not work well on a high PPI screen.

Attached: 1523771016861.jpg (5120x2880, 1.99M)

Nearest neighbor scaling of video is shit. You WILL notice the blockiness of the lower resolution being displayed on a much higher resolution, even with lossless, integer scaling.

because you're not supposed to run LCD monitors at a resolution that isnt their native one

Not OP but that's a shitty excuse. Do you also run emulators without any scaling? Enjoy your 4cm x 3cm image.

Attached: 1525309673303.png (3840x2160, 38K)

I run emulators at my screens native resolution and the emulator scales the graphics
that's how you do it in the modern era

It's still objectively the best you can do if it's done properly. I just used the windows integer-scaler program I mentioned to play a 312x240 video on my 3000x2000 display and while the blockiness was apparent, it was clearly the most detailed view I've had of that video. The same video was a mess using nearest neighbor with mpc.

You are running the image at a non-native resolution and blurring it. Even at 4k you will notice artifacts due to the fact you cannot evenly fit an SNES/[console] frame into a 4k frame.

Attached: 1539056695288.gif (230x230, 1.54M)
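The "doesn't fit evenly" point is just arithmetic. Assuming the common 256x224 SNES active frame (other consoles differ), the largest integer factor that fits a 4K frame in both dimensions:

```python
# Why an SNES frame can't fill a 4K frame evenly (assumes 256x224 source).
SRC_W, SRC_H = 256, 224      # SNES active frame
DST_W, DST_H = 3840, 2160    # 4K UHD

# Largest integer factor that still fits in both dimensions.
factor = min(DST_W // SRC_W, DST_H // SRC_H)
scaled = (SRC_W * factor, SRC_H * factor)
borders = (DST_W - scaled[0], DST_H - scaled[1])

print(factor)   # 9  (15x would fit horizontally, but only 9x vertically)
print(scaled)   # (2304, 2016)
print(borders)  # (1536, 144) pixels of black border left over
```

So integer scaling gives you a perfectly sharp 9x image with black bars; stretching further means uneven pixels.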

>It's still objectively the best you can do if it's done properly.
It's not. With proper upscaling you will get an image that isn't blocky or blurry. Of course it's not magically going to have more detail, but the point is to remove the blockiness of the lower resolution without affecting everything else. Since you have 400% more pixels to work with upscaling filters don't really harm the image at all.

Many of us are waiting for this. It'll come soon, 100%. In the meantime you have the 'Lossless Scaling' program on Steam, or you can increase sharpness on your monitor/TV.

emulators can upscale the game image using nearest neighbor interpolation

OP wants to be able to output 1080p as 4K though. So it's not as if he wants to run his monitor at 1080p, as the monitor blurs the fuck out of it.

I'm concerned that it has been at least 4 years since people started raising this. Makes it seem unlikely that anyone with influence starts caring soon.

i havent seen a game that doesnt let you change the resolution

Outputting a game at 1080p on a 4k display still almost always results in a bilinear blur though, even though you could just make 1 pixel into 4 and fit the image perfectly. I think that's what OP is looking for here.

well you change the game resolution to 4k so it doesnt blur
if you just want to upscale lower res images why are you even buying a hi res monitor

High PPI is important for text and GUI elements. It doesn't add much to video games, video, or photos. 120fps+ is more important than 4k for vidya.

It's been much more than that. But these things just happen. There are a bunch of things I thought I'd never see, like multi-millionaires simultaneous multi-projection, slider for rendering at lower resolution but keeping the UI at full resolution, such a large VR support, frame interpolation option on almost every TV, ULMB techniques almost everywhere...

Attached: 1548001728951.jpg (1010x897, 68K)

*multi-monitor simultaneous multi-projection

>It doesn't add much to video games
Oh it does add a lot when playing at large screens/TVs or sitting close to them.

If people can stand looking at the blockiness of a 1080p monitor then why would it be any worse on a 4k of the same size? Like you said, you can't magically get more detail. In my tests with the windows program so far I've found I prefer the crisp, correctly scaled videos, despite the sharpness.

>scale perfectly
Nearest neighbor is only "perfect" scaling if the source image you're scaling is pixel graphics, like the one you posted. Everything else will look like aliased garbage if scaled with nearest neighbor. If you're scaling video, much, much better algorithms exist than nearest neighbor. If you're running a game you should run it at native resolution where the image quality will be vastly superior to 1080p anyway.

Because you have 4 times as many pixels, so where you used to have the substrate between pixels, you now have a solid block of color.
If you have a 4k monitor just open GIMP or Photoshop, take a 1080p frame, and upscale it with nearest neighbor and then another copy with something like bicubic. You'll see how much shittier 1080p looks on a 4k screen than on a native 1080p one.
Pic related. It looks especially bad on text, even if you disable subpixel hinting, which of course doesn't work in this situation anyway.
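You can rerun that GIMP experiment numerically. A toy 1-D version (my own numpy sketch, not what GIMP actually does): a hard black/white edge, upscaled 2x with nearest neighbor vs. linear interpolation (the 1-D analogue of bilinear):

```python
import numpy as np

# A hard black/white edge, as a row of pixel values.
row = np.array([0, 0, 255, 255], dtype=float)
n = len(row)

# Nearest neighbor at exactly 2x: each destination sample maps back
# to one source pixel, so the edge stays perfectly hard.
idx = np.arange(2 * n) // 2
nearest = row[idx]

# Linear interpolation between source samples: intermediate gray
# values appear around the edge -- that's the "bilinear blur".
linear = np.interp(np.linspace(0, n - 1, 2 * n), np.arange(n), row)

print(nearest)  # [  0.   0.   0.   0. 255. 255. 255. 255.]
print(linear)   # contains gray values strictly between 0 and 255
```

Nearest keeps only the original values; linear invents in-between grays, which is exactly what makes upscaled text look soft.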

Forgot my pic.

Attached: 1539690941310.jpg (2245x1549, 1020K)

1080p 22-24" 1x
1440p 30-32" 1x
4K 44-48" 1x
4K 22-24" 2x
4K 15" 3x
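The logic behind pairings like those is just pixel density. A quick check (panel sizes from the list above, the helper function is mine):

```python
import math

def ppi(w, h, diagonal_inches):
    """Pixels per inch of a w x h panel with the given diagonal."""
    return math.hypot(w, h) / diagonal_inches

p1080 = ppi(1920, 1080, 23)   # ~95.8 PPI at 23"
p4k   = ppi(3840, 2160, 23)   # ~191.6 PPI at 23"

# Same diagonal, both dimensions doubled: exactly 2x the density,
# which is why 4K at 22-24" maps cleanly onto 2x scaling.
print(round(p1080, 1), round(p4k, 1))
```

That's also why 4K at 44-48" is listed at 1x: halving the density brings it back to roughly the same ~96 PPI as 1080p at 22-24".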

your only option is to not be a consumerist retard

In my opinion this just seems too simple compared to those things. For example manufacturers would rather pretend that they can make your ordinary 1080p content look "better" on a 4k tv so people will buy it even though they have nothing 4k to watch on it. Lossless 1080p? Yuck, I want "4k"!

would you rather have blurry shit or pixelated shit?
you lose either way.

Just use a CRT

Reminder that 4K allows you to emulate the phosphors of a CRT.

Attached: 1540902316520.png (2304x2016, 1.26M)

It would look exactly the same as a 1080p monitor of the same size. Manufacturers are just dumb.

>It would look exactly the same as a 1080p monitor of the same size.
t. Doesn't own a 4k monitor

Are you retarded? If you have a 32" monitor at 1080p and the other at 4k using integer scale they would look the same, since the pixel density (virtual pixels in the case of 4k) would be the same.

You need at least 8k to do it accurately tho, but 4k looks good.

If everything scaled correctly it would
I can get my TV to scale 1080p right with a simple 1 to 4 pixel map if I give it a 120hz signal.

you want this crap?

Attached: .png (585x453, 67K)

How tho? Does the bilinear scaler just stop working at 120hz?

It would only look like that if you had something like a 100" screen. At lower densities it wouldn't be noticeable.

>Windows and Linux both have tons of problem applications that do not work well on a high PPI screen
funny you should mention that, because the reason I switched to Linux was its flawless scaling on my XPS13 with 3200x1800 display, at least with KDE.
But afaik most modern DEs work well with integer scaling.

You are not actually pushing for that shitty integer scaling to be implemented, are you?

Read the post it links to. I'm saying 1080p integer scaled to 4k doesn't look the same (or as good) as 1080p on a 1080p screen.

Dunno but the "blurry" pic you show looks better because it actually uses the smaller pixels to give you smoother fonts. I mean, if you're a bing bong wahoo muh retro gaym onionman then sure, you need to see your epic pixels, but in any other case it just looks like shit.

>4k
>pixels visible at this distance
That's some 40+ inch TV, for sure. If only you knew how bad big 1080p TVs look you wouldn't even make this comparison.

The image was supposed to be 4k vs 1080p upscale. Either way the bottom is a perfect example of how you can see the blockiness of aliasing on 4k where it wouldn't be apparent on 1080p.

Even at 1 meter on a 27" screen you can see the difference between 1080p on 4k screen and 1080p on a 1080p screen.
Integer upscaling 1080p to 4k doesn't look the same as 1080p on a 1080p screen.

They are agreeing with you, you mong.

>your only option is to not be a consumerist retard
You may be right.

Is that really the same kind of scaling you get with media though?

Even if it looks subjectively worse under some conditions, I still think it should be a basic option. There are certainly occasions where 1 to 1 scaling is best.

Attached: cropped scaling comparison including zoomed original 720p.png (2731x1566, 3.18M)

Looks a lot like my 42" TV with the camera at 15cm of distance. The difference is that it wouldn't look like a rainbow mess when you are watching something closer to it.

Attached: IMG_20190203_162906_1.jpg (4160x3120, 3.66M)

HI TARDS I WANT TO EXTRACT 4x THE INFORMATION OUT OF MY ORIGINAL SIGNAL USING INTERPOLATION BUT I WANT 0 ARTIFACTS I'M SURE THE LAWS OF PHYSICS SUPPORT MY USECASE?

I feel like it depends a lot on the subpixel structure of the display. There are many that wouldn't allow for clean 4x integer scaling.

X1 > X3 > X2. Brainlets will disagree

NES >

>NES is greater than nothing
I guess you're technically right

>Even at 4k you will notice artifacts due to the fact you cannot evenly fit an SNES/[console] frame into a 4k frame.
hence integer scale

yeah but why would you want that?

There's tech that smoothes those sharp pixel edges without blurring anything. It's called anti aliasing

hurrrrrrrrrrr

it's because LCD monitors have a fixed native resolution. CRTs don't have this problem.

The absolute state of Jow Forums when multiple faggots ITT don't understand why integer scaling is needed, either in the monitor's built-in upscaling or in the graphics drivers. It's a long-standing issue; people have been pestering monitor manufacturers and graphics-driver writers about it since at least 2014 (when I first trialed 4K and realized nobody with a clue tests it).

based post, rarely do monitor posters have a clue on ppi & scaling issues. that said, OP isn't stupid in assuming a 4K @ 1x should be able to gracefully degrade to 1080p @ 1/2x with no quality loss using nearest-neighbor.

>underage, fell for memes, doesnt into a shadow mask

Yeah mean, i only play antialiased SMW these days.

No idea, but plenty of /vr/ posters love filters.

See

>(You)
>See
but im 18 and my CRT is a trinitron.

Your Trinitron is still colored via an aperture grille with discrete slots. This stupid fucking meme about CRTs not having discrete resolutions needs to end.

well i mean it is still better than an LCD for using different resolutions.

>4K allows you to emulate the phosphors of a CRT.

You really need at least 8k/4320p to do a decent job if you take into account that the glow shape actually grows with subpixel brightness, and if you actually want to emulate separated RGB subpixels, though that of course costs brightness.

The biggest thing needed though is strobing, which is extremely tough to go back to with 60 Hz/fps content.

but why would anyone want to emulate imperfections?

muh soulless vs. SOUL

The only reason why it would be "better" is due to dithering, which naturally comes with the shadow mask.

>not using based waifu2x machine learning super resolution

>It's still objectively the best you can do if it's done properly.
For video? No, no it isn't. NGU in madVR is vastly superior and the neural network upscalers that mpv has are vastly superior as well. Nearest neighbor isn't really good for anything other than content which is inherently aliased/blocky and has no AA applied, such as upscaling old video games and getting nice and sharp blocks for each pixel in the original.

There is. It's literally called integer scaling.

>ubotnet

But what allows me to emulate the 0ms latency?

g-get oled

nothing
best option latency-wise is zsnes (at least for snes games, not kidding). Get a flashcart for a real snes instead.

I have just modded my Wii and now use my good, old CRT.
In my opinion, this is the ideal emulation machine... well, it would be if it could emulate the N64. For all games, I mean. Not just the few that work on the VC / are injectable.

>Because there's no reason to scale desktop, you're not supposed to either, you are supposed to increase DPI or else there's no point in buying 4K.
What If I want to have desktop and video viewing in 4k and gaming at 1440p or 1080p to get higher performance?

Attached: 67__.jpg (568x465, 129K)

It's simple: Give up gaming!
Your life will improve.

You're not making any sales with that kind of talk

Get 2 monitors and two PCs. One set for working/video also optimized for low power consumption and low noise with a 4k monitor, and the other for gaming with a 1080p freesync monitor and the biggest power succ you can afford.

Because the games were designed for it

>Get 2 monitors and two PCs.
think of the environment man, jesus, whats the matter with you?!

you can invest in higher end parts too when you can get them in same package

>think of the environment man, jesus, whats the matter with you?!
why should we care? its also not like the shit thats being put back into the environment just came from nowhere, its always been here.

are you performing the upscaling on the GPU or the Display?

If you're on an nvidia check the control panel and under "desktop size and position". it should definiely be possible to run 1080p with integer 2x scaling on most 4k monitors.

"get 2 PCs" is just all around shitty suggestion, at most get one PC with high end parts and 2 high end displays for same price

Are you guitarted m8? I assume that most of the time you don't gayme or watch 4k ultra high bitrate bullshit, but work/browse the 4chins/wank. So getting a power optimized system for these tasks is actually better for the environment, in particular since you can get used pc parts for that one.

>So getting a power optimized system for these tasks is actually better for the environment, in particular since you can get used pc parts for that one.
Newer computers get more done with less waste heat and usually have better power saving. My Q6600 is like an electric space heater, while my 2500k feels like it's not even on at idle.

Yes, using a 12 y/o cpu is not a power optimized solution. Meanwhile, you can get an optiplex + decent gpu for 150 bucks used that will be both power efficient and good enough for work & watching animu.

Nearest neighbour works exactly like this when the scaling factor is an integer.
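A quick way to see that (the function and its flooring convention are my own sketch; real scalers may round slightly differently): the general nearest-neighbor index map collapses to pure pixel replication exactly when the destination size is an integer multiple of the source.

```python
import numpy as np

def nn_indices(src, dst):
    """Source pixel index chosen for each destination sample
    under a simple flooring nearest-neighbor map."""
    return (np.arange(dst) * src) // dst

# Exact 2x: every source pixel is used exactly twice -- pure replication.
print(nn_indices(4, 8))   # [0 0 1 1 2 2 3 3]

# Non-integer 1.5x: some pixels get doubled, others don't -- uneven blocks.
print(nn_indices(4, 6))   # [0 0 1 2 2 3]
```

That uneven doubling at non-integer ratios is exactly the "blocky but wrong pixels" effect complained about earlier in the thread.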

They wouldn't really look the same due to different pixel pitch and subpixel rendering techniques. But they would look close.

Retroarch with runahead enabled actually has less latency than original hardware on a CRT now.

>Retroarch

Attached: die fugger.jpg (801x525, 77K)

What is the alternative that also has runahead?

you're a faggot tbqh

Attached: retrogamesvisuals.png (1328x360, 1.39M)

It does it seems
I can get it back to the more blurry scaling in 120hz by switching to some other picture mode and then back to Game mode, but it looks really meh

the first image is objectively how they looked; hook a game that looks like that to a CRT and you'll get the effect in the last image

Not on the NES because it had an uneven scanline length resulting in a sawtooth effect on vertical edges. Also add in composite artifacts like dot crawl and colour bleed.

Attached: NES%20Open.jpg (768x576, 109K)