If your GPU doesn't have integer scaling technology, it's shit

Attached: gamescom-2019-geforce-game-ready-driver-integer-scaling-ftl.png (1801x1347, 1.35M)

twitter.com/IntelGraphics/status/1167622125412392960

Attached: Intel.jpg (1080x607, 124K)

buy an ad

>literally nearest-neighbor interpolation
Any GPU from 1995 and up supports that shit, Jensen.

For some dumb fucking reason, this is actually not true.
Though with Intel doing it and Jensen promoting it as his latest technological breakthrough, it's only a matter of time before AMD adds it as well.

But it is
You can easily do it with pretty much any RetroArch build from the last decade or so, as with most emulators, on pretty much any GPU dating back to the Intel 845G and the Trident offerings. I did it back then, but jumped to true 240p on a multisync monitor some years ago.
Drivers have never had the option since it looks like shit on 3D content, and emulation was seen as a grey area by the mainstream until quite recently

Old games and 2D indie trash are shit. Get with the times, grandpa.

It's not implemented at the GPU driver level though. Go turn your monitor resolution down below native and look how blurry it is.

You can run a game at 1080p on a 4k monitor and have it integer scale instead of using the shitty upscale options usually available in the GPU settings

>GL_NEAREST and D3DX10_FILTER_NONE do not exist
But it is true, and always has been. You are just too dumb to understand that the most basic form of interpolation (basically none) is even a thing, and that's why you retards are so easy to swindle into buying "new" shit like a proprietary raytracing "implementation" (if you can even call a broken-ass 1spp that).

>D3DX10_FILTER_POINT*
Wrong flag, my DX shit isn't what it used to be. The point still stands tho.
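
Since apparently this needs spelling out, here is all it takes in plain OpenGL to get unfiltered point sampling on literally any card. Rough sketch only; frame_tex is a made-up name for wherever your low-res frame already lives:

/* point/nearest sampling on an existing GL texture -- no new "technology" required */
#include <GL/gl.h>

void use_point_sampling(GLuint frame_tex)   /* frame_tex: your low-res frame (made-up name) */
{
    glBindTexture(GL_TEXTURE_2D, frame_tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    /* now draw your fullscreen (or integer-sized) quad with it and nothing gets blended */
}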

Yes it is, any GPU that supports OpenGL and DirectDraw must support that mode. Take your shitty damage control elsewhere, Jensen, perhaps reddit will fall for your shit.

But it's not fucking implemented in GPU drivers

Also Intel implemented it first, you actual moron
software.intel.com/en-us/articles/integer-scaling-support-on-intel-graphics

Integer scaling is not fucking "Nearest Interpolation" you actual chud

Wrong, Intel didn't implement it first

Nvidia implemented it first and released it first at Gamescom, 2 weeks before Intel

software.intel.com/en-us/articles/integer-scaling-support-on-intel-graphics

Intel announced it back in July

This guy is right

Why are idiots arguing that this doesn't exist on video cards when they've clearly never programmed any graphics before in their life

>it's not fucking implemented
Yes it is, otherwise those devices would not pass certification as a DirectX/OpenGL device.
>Integer scaling is not fucking "Nearest Interpolation"
Yes it is, and there is nothing you can come up with or say that will change that. Go ahead and try as hard as you can.
You fucked up again, Jensen, good job.

Again, they didn't release it first or implement it first

Nvidia beat them to it 2 weeks earlier

This is a GPU-driver-level implementation of the integer scaling often seen in emulators, which IS new. A pixel game that runs at 320x240 or something could be upscaled this way. It's literally just a neat small feature.

Are you guys so caught up in calling everybody shills that you're going to make this out to be some kind of conspiracy?

Attached: D-rl3LuUwAUL9H4.jpg (1240x1239, 162K)

>>Integer scaling is not fucking "Nearest Interpolation"
>Yes it is, and there is nothing you can come up with or say that will change that. Go ahead and try as hard as you can.

Any fucking /v/ kiddie who's fucked around with an emulator could tell you that's wrong

enjoy your fucked up pixel ratios

Again, that is nearest-neighbor interpolation, which all GPUs support.
Just because someone is paying you to write about something you know nothing about does not mean that it's true.

>/v/ kiddie and (You)
vs
>API documentation and past development experience
Who would win?

one, integer scaling is nearest neighbor, but nearest neighbor isn't integer scaling
integer scaling only scales by whole multiples, so it keeps the aspect ratio and doesn't stretch anything (see the sketch below)

two, it's literally just GPU drivers now enabling the option for any game, not the shill conspiracy you're making it out to be

Attached: stretched vs nonstretched.png (1032x512, 24K)
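
For anyone who actually cares what the driver has to compute, it's about ten lines. Rough sketch off the top of my head, my own names and nobody's actual driver code: take the biggest whole-number multiplier that fits, center the result, fill the rest with black.

/* integer scaling: largest whole-number multiple that fits, centered, letterboxed */
typedef struct { int x, y, w, h; } rect_t;

static rect_t integer_scale(int src_w, int src_h, int screen_w, int screen_h)
{
    int kx = screen_w / src_w;          /* integer division = floor for positive sizes */
    int ky = screen_h / src_h;
    int k  = kx < ky ? kx : ky;
    if (k < 1) k = 1;                   /* source bigger than the screen: leave it 1:1 */
    rect_t r;
    r.w = src_w * k;
    r.h = src_h * k;
    r.x = (screen_w - r.w) / 2;         /* black bars left/right */
    r.y = (screen_h - r.h) / 2;         /* black bars top/bottom */
    return r;
}

320x240 on a 1080p screen gives k = 4, so a 1280x960 rect at (320, 60) with black everywhere else. Blit into that rect with nearest filtering and that's the entire "technology".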

have you chuds seriously never run a game at 640x480 on a laptop screen

Does integer scaling work at non-integer ratios? Because it means precisely dick if it doesn't. Any PC game in existence could render shit at any scale (including integer ones), and disabling linear interpolation then produces identical results to this hot new meme snake-oil feature.

>different aspect ratio from source to render target texture dot pee en gee
You need to come up with better damage control, Jensen. Or are you seriously saying that rendering a 4:3 source into a 16:9 target is going to magically make up the missing pixel data on the sides, like you claimed with DLSS? LOL

software.intel.com/en-us/articles/integer-scaling-support-on-intel-graphics

Here you go, go argue with Intel's writeup and examples

What the fuck are you even talking about anymore?

Attached: 1024x768_IS.jpg (3840x2160, 785K)

imagine being such a fanboy you see the word "turing" and argue it doesn't exist because AMD doesn't have it yet

>t. out of copy/script
Anyone who knows even a minimal amount about rendering is able to understand what was written, so maybe ask your handler for more damage control material, or is he not online yet?

Where the fuck are you getting the part about adding 16:9 to a 4:3 image?

>Does integer scaling work at non-integer ratios

no because that's the point

Attached: 1563529093644.jpg (434x442, 40K)

I have eyes and am not being paid to ignore the missing parts of the image on the right side. Or are you going to post some damage control that it's an artifact and I'm actually blind, next? Stop posting, Jensen, it will be better than this whole pathetic shit you're pulling.

That's great you have eyes, now use them to read your sentences before you post.

>fixing a problem that doesn't exist

we already had integer scaling since direct x 7, thanks idiots

Then that's a retarded meme feature. Anyone who has ever written a GPU-accelerated graphics program knows this has existed for some 30 years; treating it as a new GPU feature takes being beyond stupid - but I guess that's their target audience.

>1.0 == 1.333333333333333333333
This is exactly how dumb you are (I'm counting literally 3 posts where you claim this). You got any objections to that too?

Yeah, and now it's a GPU-driver-level option accessible to any idiot in the monitor scaling settings, not something done on a game-by-game, software-by-software basis. What's the actual issue?

Seriously, just set your fucking monitor to 1280x720 right now and you'll see blur. This is literally just a GPU option to
>1 turn off the blur
>2 if possible scale at integer level so as to reduce stretching

>integer scaling
>the single simplest way to scale something
They really scrape the bottom of the barrel now, don't they.

are /v/idiots seriously so retarded that they gulp down a rebranded feature supported by everything since the 90s?
and i thought the rgb meme was the lowest point of that particular crowd

> OP: if your gpu doesn't support integer scaling as a scaling option it's shit

>What the fuck, NVIDIA DIDNT INVENT INTEGER SCALING, btw I don't have this option in my monitor scaling settings why does that matter?

Attached: Ikasuze-Kobayashi-san_09-05-19.jpg (600x338, 164K)

Quote them

It would be a nice feature if it could upscale content to any size without blur, using its meme AI engine or whatever the fuck. But it's literally just integer scaling, so it's basically worthless. Games with small-resolution content do their own pixel-perfect upscaling, and games with large windows won't scale at all. It's just a marketing gimmick grasping at straws to appear useful. It's an insult to customers' intelligence, what the fuck is there to be excited about?

This guy is so sad and pathetic, just like Apple fans boasting about new features that were present on Android years before.

Why the fuck would you AI upscale pixel games, that's the most retarded thing I've ever heard

>Games with small resolutions do their own scaling
Modern pixel games? Some.
And some literally fucking don't, and there are still old games that ran at 640x480 and need patches or scaling software

Why the fuck is having another driver level option a _bad_ thing, exactly?

Great post. Thanks Kanye. Very cool.

Because it's a feature of shit quality, an equivalent of 90s technology that still exists. As I said, fullscreen upscaling without blur would be nice; maybe they could use their AI to get around the blurriness from misaligned pixel rows. But that's not what this is, yet it's heralded as Half-Life's third installment.
>some
All.

>more pathetic damage control
Every single post by you after the one with the picture comparing 1:1 to 4:3. You just added another one to the list, thanks (?). So I take it your handler is still not online to get you a new copy script. I wonder what you guys will come up with next to try and make this look like something new.

every gpu since the late 90s supports nearest-neighbour interpolation, you schizo /v/idiot, and it was "invented" before your grandparents were born

>All
Some. RPG maker games come to mind. Especially older engines. This is a driver level implementation for _all_ pixel games, not just new ones aware of these techniques

>An equivalent of 90's technology that still exists
You still haven't answered why having the option is _bad_ lol

>Yet heralded as half life's third installment
By whom? Why do you always throw in some random nonsense about diversity hires or weird analogies like this?

Using AI Upscaling, which adds pixels, to prevent artifacts, is so fucking dumb I'm not even going to touch that topic, btw

Show me where to turn it on in your AMD monitor settings :^)

>AMD
>monitor
take your meds, schizo
it's glTexParameter() and it works with all vendors

Congratulations, you don't even know what the thread is about and just want to dickwave that you wrote 3D graphics once

Waifu2x is an AI upscaler, hence your opinion about this is shit. If a game has a hard-coded resolution then it's not even considered a game, it's an /agdg/ abortion at best.

I didn't say having this feature was bad, nor that the feature itself is bad. I said it was retarded, the way it's marketed is retarded, and everyone who thinks it's hot new shit is retarded. And considering it's the year 2019, it's sad and pathetic that THIS is what they brag about.

how do you think dolphin does upscaling, /v/tard

>Retards who don't know the difference between NN and IS

Attached: 1568324714754.gif (480x358, 1.16M)

>The way it's marketed is retarded and the way they brag is retarded
You mean those announcements that show off what the feature does? What's with the insecurities?

Jesus Christ, read the Intel write-up where they specifically mention that emulators usually already build this in

You still haven't explained why having it as a driver option is bad, just that it's not A BRAND NEW RENDERING TECHNIQUE (coincidentally unavailable on my GPU of choice, but that has nothing to do with why I'm so mad, honestly)

This man speaks the truth. That other dude is just mad that proper integer scaling isn't RetroArch-exclusive anymore. I for one welcome this, since it will help make 4K even crisper with content that wasn't made for it, so basically everything.

Now run literally any modern 3D game at resolutions that aren't native to your monitor and enjoy blur

Oh god, you're seriously trying to suggest running games through Waifu2x

Why not just implement HQ2X or the entire Retroarch shading stack while we're at it with bad ideas

I for one would love to run Windows 10 through CRT Royale thank you very much

you're not thinking far enough

Attached: maxresdefault.jpg (1280x720, 71K)

Isnt NN old as shit?

Attached: 0_Sk18h9op6uK9EpT8.png (650x378, 54K)

>I for one would love to run Windows 10 through CRT Royale thank you very much
I was thinking this often when running old games. I wish RetroArch had a desktop overlay mode so that I could use some simple scaling on old games and get good CRT shaders. libretro DOSBox is simply not up to the standalone version, and don't even talk about all the Win98-exclusive shit that just looks bad on an LCD.

why do you need "turing integer scaling" when it looks worse than nearest neighbor scaling

Retroarch / CRT filters do exist for SweetFX / ReShade

Can't this be handled in shader code?

so how is 'integer scaling' any different from 'nearest neighbor scaling' besides integer scaling just rounding to the nearest whole integer? this is a pretty obvious semantic ploy

so it is just retards falling for the most basic of marketing puffery. gotcha

>integer scaling "technology"
>literally just rendering smaller texture to larger monitor with nearest neighbor filtering
lmao Jow Forums is so dumb

>Retroarch / CRT filters do exist for SweetFX / ReShade
I'm kind of a Linux guy, so I would have to fiddle long and hard to make those injectors work with Wine. Just imagine the braindump I'd have to do to make them properly compatible with all of this. dx

I love how you fucking retards run into a word you don't know and think it's a new marketing term

>besides integer scaling just round to the nearest whole integer?
That's the whole point of integer scaling and why it isn't called nearest neighbor, you ape.

>it's a new marketing term
correct. glad we agree.

Most games I'd want to play with integer scaling I already can, because it's an option in the emulator. I don't play indie pixelshit, especially ones that don't implement it themselves.

What do you think integer scaling is

because it's a transparently low-effort and lazy feature being presented as some kind of new technology. i don't blame them, because re-branding entry-level computer graphics algorithms with limited application as TURING SCALING is zero effort and gets people like you to defend them, but it just cements the fact that nvidia leans really heavily on disingenuous marketing gimmicks

It's called Turing scaling because they've only implemented it on Turing drivers

what does the rounding even matter? when you are sampling from a pixel grid with nearest neighbor you are basically rounding to the nearest pixel anyway.
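
>what does the rounding even matter?
Because at a fractional ratio plain NN gives you uneven fat pixels, which is exactly what the integer constraint avoids. Throwaway toy program if you want to check it yourself (my own numbers, nothing vendor specific): map every output column back to a source column with the usual NN formula and count how many screen columns each source column ends up covering.

/* why fractional-ratio NN looks uneven: screen columns per source column at ~2.67x */
#include <stdio.h>

int main(void)
{
    const int src_w = 320, dst_w = 854;     /* 320 source columns stretched to 854 */
    int count[320] = {0};

    for (int dx = 0; dx < dst_w; dx++) {
        int sx = dx * src_w / dst_w;        /* standard nearest-neighbor source index */
        count[sx]++;
    }

    /* prints a mix of 2s and 3s: neighboring fat pixels end up different widths.
       at an exact integer ratio every count comes out identical. */
    for (int sx = 0; sx < 16; sx++)
        printf("src col %2d -> %d screen columns\n", sx, count[sx]);
    return 0;
}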

Intel calls it 'Retro Scaling' but apparently it's only an issue when Nvidia does it

(Again, it just so happens AMD doesn't have this but that has nothing to do with why you're so mad, honest)

It is a purely software implementation at this point. Nvidia just found another way to harness "RTX cores" for graphical workloads that is arguably more useful for gaymers.

but it's the exact same algorithm, it just limits the inputs. you do understand how stupid it is to come up with an entirely new name for something that already exists, right?

>Using RTX cores to change the texture filtering mode
what will they think of next?

Go take it up with whomever invented the term integer

They are throwing shit at the wall at this point.

They are hoping something sticks, because RTX cores are a long-term gambit to make discrete GPUs relevant to the masses. Just like pixel and vertex shaders were back in the early 2000s.

>the term 'integer' was invented with integer scaling technology
/v/tards

You're so fucking dense, holy shit

you're strawmanning pretty hard and it makes you difficult to take seriously

intel doing it makes it even more obvious it's a marketing gimmick because their computer graphics department is woefully behind the rest of the industry and they have always had to rely on disingenuous marketing and feature design to convince consumers they have a viable product.

I was being sarcastic, why would they use RTX cores to do something video cards have been doing for decades?

nearest neighbor blurs shit if it isn't scaled properly, which it isn't if you don't actively prevent it from scaling to the whole screen. if you don't have black bars on every side of your picture then you have shit scaling and need to invest in an integer scaling algo.
It isn't, see above. your nearest neighbor is snake oil if it isn't combined with integer scaling.

says the retard who says we should take issue with the inventor of the term "integer" as though it's at all relevant here

you're just being reductive to the point of being obtuse, for the sole purpose of chasing a semantic distinction that only exists so nvidia can lie to your face. i'm just disappointed, honestly

Or maybe it's just called retro scaling because it's a neat scaling feature for retro games

>nearest neighbor blurs shit
>need to invest in an integer scaling algo

Attached: asd.jpg (644x500, 39K)

yes, that is why they slapped a brand name on it, because it otherwise doesn't have any useful applications

Nvidia did not invent the term "integer scaling" and it's even the term Intel uses, so that whole post doesn't make sense

I'm making fun of you, because you're retarded and think the term 'integer Scaling' is new

Would you prefer "Non Fractional Nearest Neighbor Interpolation"

If Nvidia and Intel support it, that's not a brand name or gimmick, that's a standard they're behind on

NN is "non-fractional" retard. It rounds to the nearest pixel just like "integer scaling technology"
>nearest neighbor blurs shit if it isn't scaled properly,
jesus christ dude, just leave the thread

it's just nearest neighbor interpolation. you can put lipstick on a pig all you want, but at the end of the day this is still the exact same algorithm that has been in use since before computers were even able to render graphics in real time

software.intel.com/en-us/articles/integer-scaling-support-on-intel-graphics

Argue it with Intel's literal definition of the differences between the two

Marketing reasons, and it reduces overhead (MAH INPUT LAG! types)

See
nearest neighbor is worthless without integer scaling saving its ass.

>Integer scaling (IS) is a nearest-neighbor upscaling technique that simply scales up the existing pixels by an integer (i.e., whole number) multiplier.
This is a fucking joke. "Integer scaling" as per their definition is a subset of nearest neighbor. Why is this suddenly a feature of graphics cards when we've been able to do this forever?
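
Intel's quoted definition in one loop, since it's apparently hard to picture. Minimal CPU-side sketch (assumed 32-bit RGBA pixels, my own function name, nothing vendor specific): every source pixel just becomes a k-by-k block.

/* "scales up the existing pixels by an integer multiplier" -- literally this */
#include <stdint.h>

static void scale_by_k(const uint32_t *src, int src_w, int src_h,
                       uint32_t *dst, int k)    /* dst must be (src_w*k) x (src_h*k) */
{
    int dst_w = src_w * k;
    for (int y = 0; y < src_h; y++)
        for (int x = 0; x < src_w; x++) {
            uint32_t px = src[y * src_w + x];
            for (int dy = 0; dy < k; dy++)      /* each source pixel -> one k x k block */
                for (int dx = 0; dx < k; dx++)
                    dst[(y * k + dy) * dst_w + (x * k + dx)] = px;
        }
}

The only thing that's actually new is the driver doing this on the display path for any game, instead of every emulator shipping its own copy.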

Read the fucking thread, Jesus Christ.
This is a driver level option for all games, that's literally it, you fucking turd

Are you upset because coincidentally, AMD doesn't have this option?