Tiling autism

So, I've been reading about the clever ways they used to fit games into the NES's limited resources, namely tiling: breaking graphics into 8x8 pixel tiles, combining those tiles into bigger tiles, reusing the same tiles, etc.
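To illustrate what I mean, a rough sketch in C (all the names, sizes, and buffers here are made up for illustration, not taken from any real game):

#include <stdint.h>

#define TILE_W 8
#define TILE_H 8
#define MAP_W 32
#define MAP_H 30

/* each map entry is just an index into the tile set */
static uint8_t tilemap[MAP_H][MAP_W];
/* one copy of each unique 8x8 tile, one palette index per pixel */
static uint8_t tiles[256][TILE_H][TILE_W];
static uint8_t screen[MAP_H * TILE_H][MAP_W * TILE_W];

/* a raw 256x240 screen at 1 byte/pixel is 61,440 bytes;
   the map itself is only 32*30 = 960 bytes plus the unique tiles */
static void draw_tilemap(void)
{
    for (int ty = 0; ty < MAP_H; ty++)
        for (int tx = 0; tx < MAP_W; tx++) {
            uint8_t (*t)[TILE_W] = tiles[tilemap[ty][tx]];
            for (int y = 0; y < TILE_H; y++)
                for (int x = 0; x < TILE_W; x++)
                    screen[ty * TILE_H + y][tx * TILE_W + x] = t[y][x];
        }
}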

As far as I understand, the main advantage is huge memory savings. But does it also save processor resources?

For example, let's say we have a game like Diablo or Fallout, with 2D graphics. Would there be any advantage to using the same methods? Like being able to run it on a 386 processor at an adequate framerate or something.

Attached: serveimage.jpg (600x600, 373K)

dwarf fortress and fallout already use those techniques

dwarf fortress uses text, so it doesn't really count

You're a fucking moron

Stupid gamer.

I am aware that there are tilesets

>But does it also save processor resources?
No, unless you count caching improvements from smaller memory chunks, which older processors like the 386 don't have anyway.

Attached: symbos cpc.gif (640x400, 22K)

Gaymerfaggot incels need to be rounded up a shot just like we did nazis and commies.

Were there any tricks to go easy on the CPU and GPU back then?

ASCII IS a tileset

Well, yes, but it's kind of built into the existing OS, so the program itself doesn't need external tiles.

>reee you need to be shot because you're doing something I don't like and that doesn't affect me

How does garbage like you existing not affect anyone? You are a parasite and an incel, a waste of this world's resources that even Africans deserve more than yourself. You are indeed an issue that affects everyone, one that needs to be dumped in a ditch.

the default tileset for dwarf fortress is a bitmap image stored in the game's directory

Yeah. Using the CPU's registers efficiently (in creative ways) and using undocumented instructions. Using custom color palettes that best represent the scene's color shades. Giving the illusion of three-dimensionality by making lines converge at a vanishing point. Using pre-rendered models at different angles. Limiting the number of sprites in the scene. You had limited memory too, so it's not like what you mentioned in the OP (tiling sprites) wasn't essential as well.
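The palette trick in code form would look something like this (just a sketch; the names and color values are made up): the frame is stored as small indices and only expanded to real colors on the way out.

#include <stdint.h>

/* 16-entry palette tuned for the scene, 0x00RRGGBB; values are arbitrary */
static const uint32_t palette[16] = {
    0x000000, 0x1d2b53, 0x7e2553, 0x008751
    /* ...remaining entries... */
};

/* expand one row of palette indices into full 32-bit pixels */
static void expand_row(const uint8_t *indices, uint32_t *out, int n)
{
    for (int i = 0; i < n; i++)
        out[i] = palette[indices[i] & 0x0f];
}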

If you want to learn about this stuff, look into the demoscene. They try to push machines to the limits of what they can do.

DF has the OPTION to be rendered as actual text/ASCII.
By default it uses an "ASCII" tileset saved as a .bmp.
>pic related

I probably will. I'm just curious how reasonable it is to apply similar methods in modern apps.

>pic related
>no pic

Watch out, we got a nihilist here! Hide your waifus!

Absolutely disgusting.

Attached: 91.jpg (600x600, 149K)

the main reason why the NES could handle the graphics it could was due to hardware-accelerated graphics, basically, it had a 'gpu'
not a whole lot like modern ones, but it did things like screen scrolling and sprites, which took a huge load off the main cpu
for example, you could load up a tilemap larger than the screen, then just have the cpu flip a few bits to move the viewport around, rather than redrawing the whole screen using the cpu each time (like what PC games at the time had to do)
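roughly what that looks like from the cpu's side (a c sketch in cc65 style; $2002 and $2005 are the actual ppu status/scroll registers on the NES, the rest is simplified):

#include <stdint.h>

#define PPUSTATUS (*(volatile uint8_t *)0x2002) /* reading it resets the scroll write latch */
#define PPUSCROLL (*(volatile uint8_t *)0x2005) /* write x, then y */

/* move the viewport; the ppu redraws from the nametable in vram,
   the cpu never touches a single pixel. in a real game you'd do
   this during vblank, e.g. in the nmi handler. */
void set_scroll(uint8_t x, uint8_t y)
{
    (void)PPUSTATUS;
    PPUSCROLL = x;
    PPUSCROLL = y;
}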

So basically, if I want to go full PC optimisation autismo, I'd better take notes from Carmack's engines?

there's a million and one ways to optimize something as complex as a video game
it really depends on how much time you have
perhaps find some reading material about how demoscene prods are made, they aren't games, but they're even better, often showing off all kinds of fuckery to make things do what you wouldn't imagine possible

I want to get into programming, but I've only just started reading K&R. I like the idea of pushing hardware to its limits, so I just wanted to spark a general discussion, since I'm still too inexperienced to fully understand and comment on those topics.

unfortunately, a lot of what you'd see on platforms like NES and DOS can't be done on modern systems
back in those days, you had total control over the system; your game basically sat where the kernel does today, and on fixed platforms like the NES you could, if you wanted, manually manage things right down to individual clock cycles
on modern systems, your game is at the top of a large stack of layers, you have to ask the OS to handle basically anything besides the internal game logic (the parts that aren't IO/sound/graphics/etc), so there's only so much you can do

Use TempleOS

Oh, yeah, no. Modern stuff is completely different.
Nowadays you're just trying to have an idea of what each OpenGL API call and each language/library primitive will cost you in terms of processing power and IO, and porting things from software rendering to whatever graphics API you're using; it has almost nothing to do with the lower-level stuff these guys are doing.
You would be better off reading about how modern GPUs and their APIs work, how to multithread as much work as possible, and how to keep things in the CPU cache for tight loops. But as a game dev you won't have time for these things anyway; that's just for the couple of people on the team who develop the engine.
But if you're interested just for the sake of it, read about SIMD and CUDA; 90% of graphics is actually implemented on top of that.
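To give an idea of what SIMD means in practice, here's a toy example with SSE intrinsics (assumes an x86 CPU and n being a multiple of 4; not from any real engine): the same add runs on four floats per instruction.

#include <xmmintrin.h>

/* add two float arrays four lanes at a time */
void add4(const float *a, const float *b, float *out, int n)
{
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_add_ps(va, vb));
    }
}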
Nah, modern demoscene stuff is almost 100% focused on compressing cool stuff into the least disk space possible. They don't actually pay that much attention to graphics-API-level performance; all the effort goes into compression and real-time procedural generation. Different constraints than for a game.

Attached: symbos-msx-os6.gif (512x424, 33K)

First sane advice.

Jow Forums is just full of CIA niggers.

i wasn't referring to modern demoscene stuff, but to stuff made for NES/DOS, since that's where the conversation was

Actually, I think the commies got a pass. Communism was never put on trial.

Except it didn't. Romania was the most public. The rest didn't televise. The short period post gov overthrow was a legal killing and revenge party.

I wonder if there's a formula to calculate how fast each thread on Jow Forums will drift into Jow Forums

Is that really a Bill Gates quote? I've never heard it.