How did they program graphics on DOS without OpenGL?

Attached: strip-poker-professional.gif (640x480, 116K)

Other urls found in this thread:

en.wikipedia.org/wiki/Adaptive_tile_refresh
github.com/keendreams/keen
uridiumauthor.blogspot.com/2017/12/scrolling-on-amiga.html
youtu.be/N8elxpSu9pw
delorie.com/djgpp/doc/ug/graphics/vga.html
youtube.com/watch?v=iw17c70uJes
youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab
github.com/ssloy/tinyrenderer/wiki
youtube.com/watch?v=_yJ5M3BY2Ts
twitter.com/SFWRedditImages

pixel by pixel

But what pushes said pixel to screen

Attached: Dog-Puzzled.jpg (350x350, 20K)

You can write directly to the framebuffer.

memcpy()
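For mode 13h (320x200, 256 colors) that really is the whole story. A minimal C sketch of the idea; here `vga` is an ordinary array standing in for the real framebuffer at segment A000h, so the sketch runs anywhere:

```c
#include <stdint.h>
#include <string.h>

#define SCREEN_W 320
#define SCREEN_H 200

/* On real DOS this would point at video memory:
 *   uint8_t *vga = (uint8_t *)0xA0000;   (or MK_FP(0xA000, 0) in 16-bit)
 * Here it is an ordinary array so the sketch runs anywhere. */
static uint8_t vga[SCREEN_W * SCREEN_H];

/* Plot one pixel: mode 13h is linear, one byte per pixel. */
static void put_pixel(int x, int y, uint8_t color)
{
    vga[y * SCREEN_W + x] = color;
}

/* The memcpy() mentioned above: blast a whole back buffer at once. */
static void flip(const uint8_t *backbuffer)
{
    memcpy(vga, backbuffer, sizeof vga);
}
```

Most games drew into `backbuffer` and copied it over in one go, rather than poking the screen pixel by pixel.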

But how do you synchronize with the CRT? You're telling me you do that software side?

you did everything software side

You had two choices:
1.) Time it manually, typically by writing assembly, knowing how many clock cycles each instruction takes, and making sure you stay synchronized with the display hardware. This isn't very resilient to increasing clock speeds - the DOS world learned this as soon as 286s and turbo 8 MHz XT clones came out around 1984 - so it wasn't much used after that. It lasted many years longer in console programming.
2.) Punt and don't do it at all. A lot of 1980s monitors had really slow refresh rates by today's standards and papered over it with a high-persistence phosphor. Back then you didn't have fast action games that would make what we now call ghosting a problem. Everything was slower then anyway; a bit of a hitch in redrawing the screen went unnoticed in a lot of circumstances.
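For completeness, on VGA the usual later technique was neither of these: you polled the Input Status #1 register (port 3DAh), whose bit 3 is set during vertical retrace, and drew right after it flipped. A hedged C sketch of the polling loop; `read_status` here is a simulation standing in for an actual `inp(0x3DA)` port read, so the logic runs anywhere:

```c
#include <stdint.h>

#define VRETRACE_BIT 0x08  /* bit 3 of VGA Input Status #1 (port 3DAh) */

/* Simulated status register so the sketch runs anywhere; on real DOS
 * hardware this would simply be:  return inp(0x3DA);  */
static int ticks = 0;
static uint8_t read_status(void)
{
    ++ticks;
    return (ticks % 4 == 0) ? VRETRACE_BIT : 0;
}

/* Wait for the start of the next vertical retrace: first let any
 * retrace already in progress finish, then spin until a new one
 * begins. Drawing right after this returns avoids tearing. */
static void wait_vsync(void)
{
    while (read_status() & VRETRACE_BIT)    /* leave current retrace */
        ;
    while (!(read_status() & VRETRACE_BIT)) /* wait for the next one */
        ;
}
```

The two-loop shape matters: if you only waited for the bit to be set, you could return in the middle of a retrace that was already underway.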

so what about high-speed action games on 386 and 486s?

Wonder what these thots are doing today

>high-speed action
Carmack invented smooth scrolling for DOS; before his tricks, scrolling was choppy

en.wikipedia.org/wiki/Adaptive_tile_refresh
github.com/keendreams/keen
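The core trick behind adaptive tile refresh, redrawing only the tiles that changed instead of the whole screen, can be sketched in a few lines of C. Names and sizes here are illustrative, not Keen's actual code:

```c
#include <stdint.h>

#define MAP_W 16
#define MAP_H 16

static uint8_t tiles[MAP_H][MAP_W];  /* tile indices on screen now */
static uint8_t dirty[MAP_H][MAP_W];  /* 1 = needs redrawing */
static int blits;                    /* count of tile copies done */

/* Change a tile and mark it dirty instead of redrawing immediately. */
static void set_tile(int x, int y, uint8_t t)
{
    if (tiles[y][x] != t) {
        tiles[y][x] = t;
        dirty[y][x] = 1;
    }
}

/* Once per frame: redraw only the dirty tiles. On an EGA card the
 * copy would be a blit into video memory; counted here instead. */
static void refresh(void)
{
    for (int y = 0; y < MAP_H; ++y)
        for (int x = 0; x < MAP_W; ++x)
            if (dirty[y][x]) {
                ++blits;    /* stand-in for blitting one tile */
                dirty[y][x] = 0;
            }
}
```

When only a couple of tiles change per frame, this turns a full-screen redraw into a handful of small copies, which is what made smooth scrolling feasible on slow buses.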

Abuse oxy during night shifts in Walmart Inc.

Attached: 20170416135835_1-100718559-orig.jpg (1920x1080, 303K)

You may be interested in a book called 'Racing the Beam'.

that was early in the dos era, by the end of the dos era they had smooth scrolling VGA games

can I get an answer without reading a book

Why would you need to sync with the CRT?
You write to the framebuffer, and the display adapter takes care of putting that image onto the display.
You couldn't time the display via the CPU clock or whatever, because after the mid-80s clock speeds were no longer uniform.

What era are you asking about?

>Why would you need to sync with the CRT?
You need to do it for speed tricks and to save CPU time. It's common in Amiga games.

On old consoles hooked up to CRTs, the game writes color values for a given pixel/line/location to a piece of memory or a register, which is read by the hardware controlling the phosphor beam as it sweeps across the screen, one horizontal line at a time.
The value of that memory at any given moment determines which color to show at that moment (i.e. at that pixel).
This process is very fast hence you see a solid image rather than lines being generated.

That's a very simple summary and therefore inaccurate, but it should give you a vague idea.

I guess I was unaware just how slow CPUs of the time were.
Wouldn't the video card take care of drawing to the screen? I didn't think the CPU drove the display since the very early era.

The only time you really had to sync in software was with some CGA cards that had video RAM that couldn't be read and written at the same time, and therefore would produce "snow" unless you made sure you only wrote during hsync/vsync.
After that, skipping the sync led at worst to screen tearing and associated color artifacts (from being caught in the middle of writing one bitplane to the framebuffer).

They weren't purely software driven by the late '80s to early '90s; consoles and computers had blitter chips that helped with block transfers and the like. Demoscene developers sometimes have good blogs about how early graphics worked
uridiumauthor.blogspot.com/2017/12/scrolling-on-amiga.html

early computers didn't have video cards, how new are you
video cards got introduced to display 3D graphics

Like this:
youtu.be/N8elxpSu9pw

OK they weren't always cards you dingus, but many PCs shipped with only a keyboard port on the motherboard. IO and everything else had to be added in the form of an expansion card.
They used a dedicated video chip of some sort to run the display. The displays were not directly driven by the CPU, that was only on super low end early computers like the ZX Spectrum.
3D accelerator cards are different.

Don't go spouting bullshit and calling people new if you don't have the right information.

You don't have the right information.
The CPU did the drawing, at least on PCs.

With Glide of course.

God, I love code monkeys trying to grasp DOS or anything else old.

You used the chipset to generate graphics on an Amiga though. Unless you used chunky-to-planar, but then you wouldn't need to do that either, since the benefit would be minuscule: you'd be using higher-end CPUs already, or even RTG.

probably getting a big hard hug from their grandsons and granddaughters while enjoying the life of retirement with their significant other.

Option 3: Write highly hardware-dependent code for each popular GPU adapter from the time and let users choose their adapter through setup.exe
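In practice that option looked like a table of per-adapter routines chosen once at startup and called through function pointers ever after. A hedged C sketch of the pattern; the adapter names and offset math here are illustrative, not any real driver's code:

```c
#include <stdint.h>

/* A tiny "driver" interface: each adapter supplies its own routines. */
struct video_driver {
    const char *name;
    void (*set_pixel)(int x, int y, uint8_t color);
};

static uint8_t fake_vram[64000];  /* stand-in for video memory */
static int last_offset = -1;

/* Mode 13h style: linear framebuffer, 320 bytes per row. */
static void vga_set_pixel(int x, int y, uint8_t c)
{
    last_offset = y * 320 + x;
    fake_vram[last_offset] = c;
}

/* Planar EGA layouts pack 8 pixels per byte per plane; the real
 * routine would also select planes via I/O ports. Stubbed here. */
static void ega_set_pixel(int x, int y, uint8_t c)
{
    last_offset = (y * 640 + x) / 8;  /* illustrative only */
    (void)c;
}

static const struct video_driver drivers[] = {
    { "VGA", vga_set_pixel },
    { "EGA", ega_set_pixel },
};

/* setup.exe would store the user's choice; the game just calls
 * through the pointer from then on. */
static const struct video_driver *current = &drivers[0];
```

The game code only ever calls `current->set_pixel(...)`, so supporting a new adapter means adding one more row to the table.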

This thread is horrifying.

t. CSist

>256KB display adapter running on an 8-bit ISA bus
Yeah, no such thing as video cards at all.

In most cases you would use a display card to output the video, but it was software that did the drawing. Prior to 286 machines, it was indeed done entirely by the CPU.

If people are saying something wrong, please do share the correct knowledge.

It is hard to fix things when you don't know that they are broken.

how do you program the pixels?

Attached: 1530939938832.png (222x249, 47K)

I never got to figure what the fuck is the right speed to play Halloween Harry at.

Just read this: delorie.com/djgpp/doc/ug/graphics/vga.html

interrupts

Attached: 1554754809016.png (464x450, 59K)

As I recall there was some kind of interrupt you could subscribe to (set your own handler for), which you could then use to swap buffers or whatever. Something like that.

youtube.com/watch?v=iw17c70uJes

*runs on your 386/486*

this book was THE bible for the longest time

Attached: 4162348.jpg (318x392, 101K)

Do everything manually.
You can control the color of every pixel, so you can paint anything you want.

>Includes Super VGA
fkn zoomers

If you actually want to learn how to do these things, Bisqwit is a terrible channel. He NEVER explains what he does and why he does it.

The only thing I can recommend to people who want to learn graphics programming is to take a course in linear algebra. Here, there's even an amazing channel that actually teaches shit like this with great visuals instead of just playing 200x sped-up footage of him writing code:
youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab

Then just do an OpenGL or WebGL tutorial. In my opinion, WebGL is much easier to get started with since you have everything in your browser, while with OpenGL, you need a shitton of boilerplate to even instantiate a window and there are thousands of deprecated tutorials that still teach immediate mode.

If you want to learn software rendering, instead of doing the OpenGL tutorial, I'd recommend github.com/ssloy/tinyrenderer/wiki

kek

I was about to post this.
This one is also cool:
youtube.com/watch?v=_yJ5M3BY2Ts

u can use sfml, sdl2 or even glfw to create an easy window so you can focus on the gfx programming. they all support modern opengl

It always cracks me up when they call an entertainment product "professional", like PlayStation pro.

There are professional video game players, just like there are professional football players

>github.com/ssloy/tinyrenderer/wiki
Mah nigga, that dude is amazing. Love his projects

Why would they do that? Then they can't act like a bunch of elitist assholes when you get something wrong.

Attached: keen story bro.jpg (256x256, 17K)

mov ax, 0a000h
mov fs, ax
mov byte [fs:0000h], ??h
Where ??h is some pixel value, and assuming you've already set a graphics mode through the VGA BIOS (fs needs a 386 or later; on older CPUs you'd load es instead). It's that simple. SVGA+ modes are a slightly different matter, as graphics programming for them is more complicated and usually involves running the processor in protected mode.

soul

why are there two cursors

Real gamers use - nay, need - more than one cursor.

Compare these non-nude models to the non-nude models that are on Instagram now. It's like a whole different world

what kind of retarded bullshit is this
of course there were video cards, first on ISA then posh ones on VLB, etc
all years before hardware 3d acceleration became a thing

Western women have been getting more shallow and worthless, ironic given how they constantly shriek about people treating them as such. Really makes you go celibate.

First you determine individual bit values at a basic sprite scale of 8x8 pixels: 1 is on, 0 is off, to denote values for pixel segments (two 8x8 sprites are shown side by side below).
11111111 11100111
11000011 01000010
10000000 01000010
11111000 01000010
10000000 01000010
10000000 01000010
10000000 01000010
11000000 00111100

Next, we determine which color value, tied to a built-in color register, we want to use.
These values are hexadecimal and index the hardware's palette entries.
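Expanding such a row of bits into pixels is a few lines of C. This sketch uses the left-hand 8x8 sprite from the rows above; treating clear bits as transparent (left untouched) is an assumption for illustration:

```c
#include <stdint.h>

/* The left 8x8 sprite from the rows above, one byte per row,
 * most significant bit = leftmost pixel. */
static const uint8_t sprite[8] = {
    0xFF,  /* 11111111 */
    0xC3,  /* 11000011 */
    0x80,  /* 10000000 */
    0xF8,  /* 11111000 */
    0x80,  /* 10000000 */
    0x80,  /* 10000000 */
    0x80,  /* 10000000 */
    0xC0   /* 11000000 */
};

/* Expand one row of 1-bit pixels into 8 color bytes: set bits take
 * `color`, clear bits are left untouched (treated as transparent). */
static void draw_row(uint8_t row, uint8_t *dest, uint8_t color)
{
    for (int x = 0; x < 8; ++x)
        if (row & (0x80 >> x))
            dest[x] = color;
}
```

Calling `draw_row` for each of the 8 rows, stepping `dest` by the screen width each time, stamps the whole sprite into a framebuffer.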

you can do literally anything

Has anyone attempted to create a dual mouse system?

About a decade ago I worked on a multiuser touch screen, which basically amounted to a screen that relayed touches as mouse positions to Windows. There is barely anything out there for multiple mice. I honestly thought more people would have given it a shot, even if it is a giant pain in the ass.