pjmlp 21 hours ago

So basically going back to the old days of Amiga and Atari, in a certain sense, when PCs could only display text.

goku12 19 hours ago | parent | next [-]

I'm not familiar with that history. Could you elaborate?

pjmlp 18 hours ago | parent | next [-]

In the home computer universe, those machines were the first to have a programmable graphics unit that did more than paste the framebuffer onto the screen.

While the PCs were still displaying text, or, if you were lucky to own a Hercules card, gray text, or maybe a CGA one with 4 colours, the Amigas, which I am more comfortable with, were doing this in the mid-80's:

https://www.youtube.com/watch?v=x7Px-ZkObTo

https://www.youtube.com/watch?v=-ga41edXw3A

The original Amiga 1000 had on its motherboard (later reduced to fit into the Amiga 500) a Motorola 68000 CPU, a programmable sound chip with DMA channels (Paula), and a programmable blitter chip (Agnus, an early GPU of sorts).

You would build the audio data or the graphics instructions in RAM for the respective chip, set the DMA parameters, and let them loose.
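
To make that concrete, here is a rough sketch in C of the Copper side of it (the Copper is the co-processor inside Agnus that executes such display lists). The register offsets below are the documented custom chip ones; the one assumption is that the list ends up in chip RAM, where the custom chips can see it:

    #include <stdint.h>

    /* Amiga custom chip registers live at 0xDFF000; offsets below are
       the documented ones. The Copper list itself must be in chip RAM. */
    #define CUSTOM   ((volatile uint16_t *)0xDFF000)
    #define COLOR00  0x180              /* background colour register    */

    static uint16_t copperlist[] = {
        COLOR00, 0x0005,                /* MOVE: dark blue background    */
        0x640F,  0xFFFE,                /* WAIT: beam reaches line 0x64  */
        COLOR00, 0x0F00,                /* MOVE: switch background to red */
        0xFFFF,  0xFFFE,                /* WAIT for an impossible beam   */
    };                                  /* position: terminates the list */

    void start_copper(void)
    {
        uint32_t list = (uint32_t)(uintptr_t)copperlist;

        CUSTOM[0x080 / 2] = list >> 16;     /* COP1LCH: list addr, high  */
        CUSTOM[0x082 / 2] = list & 0xFFFF;  /* COP1LCL: list addr, low   */
        CUSTOM[0x088 / 2] = 0;              /* COPJMP1: strobe/restart   */
        CUSTOM[0x096 / 2] = 0x8280;         /* DMACON: SET|DMAEN|COPEN   */
    }

The CPU sets this up once; from then on Agnus re-runs the list in sync with the video beam every frame, with no further CPU involvement.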

goku12 15 hours ago | parent | next [-]

Thanks! Early computing history is very interesting (I know that this wasn't the earliest). It also sometimes explains certain odd design decisions that are still followed today.

nnevatie 17 hours ago | parent | prev [-]

Hey! I had an Amiga 1000 back in the day - it was simply awesome.

estimator7292 13 hours ago | parent | prev [-]

In the olden days we didn't have GPUs, we had "CRT controllers".

What it offered you was a page of memory where each byte value mapped to a character in ROM. You feed in your text and the controller fetches the character pixels and puts them on the display. Later we got ASCII box drawing characters. Then we got sprite systems like the NES, where the Picture Processing Unit handles loading pixels and moving sprites around the screen.
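
For a feel of what that looked like from the programmer's side, here is a minimal sketch, assuming direct access to the PC colour text buffer at 0xB8000 (e.g. from DOS or a freestanding kernel). Each cell is a character byte plus an attribute byte; the CRT controller does all the pixel work against the font ROM:

    #include <stdint.h>

    #define TEXT_BUF  ((volatile uint16_t *)0xB8000)  /* text mode 3 */
    #define COLUMNS   80

    void put_char(int row, int col, char c, uint8_t attr)
    {
        /* The CRT controller fetches this cell every frame and looks
           the character's pixels up in the font ROM; we never touch
           pixels ourselves. Low byte: character. High byte: colours. */
        TEXT_BUF[row * COLUMNS + col] = ((uint16_t)attr << 8) | (uint8_t)c;
    }

    void put_string(int row, int col, const char *s)
    {
        while (*s)
            put_char(row, col++, *s++, 0x07);  /* grey on black */
    }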

Eventually we moved on to raw framebuffers. You got a big chunk of memory and you drew the pixels yourself. The hardware only handled swapping the framebuffers and scanning the result out to the physical display.
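
In code, that era boils down to something like this. The framebuffer struct here is hypothetical, standing in for whatever base address and pitch the video mode hands you:

    #include <stdint.h>

    struct framebuffer {
        volatile uint32_t *pixels;  /* base of the mapped framebuffer    */
        int width, height;
        int pitch;                  /* pixels per row (may exceed width) */
    };

    static void put_pixel(struct framebuffer *fb, int x, int y,
                          uint32_t argb)
    {
        if (x >= 0 && x < fb->width && y >= 0 && y < fb->height)
            fb->pixels[y * fb->pitch + x] = argb;
    }

    /* Software rendering meant loops like this for every primitive: */
    static void fill_rect(struct framebuffer *fb, int x, int y,
                          int w, int h, uint32_t argb)
    {
        for (int j = y; j < y + h; j++)
            for (int i = x; i < x + w; i++)
                put_pixel(fb, i, j, argb);
    }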

Along the way we slowly got more features like defining a triangle, its texture, and how to move it, instead of doing it all in software.
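
The fixed-function OpenGL 1.x API is one illustration of that step (just an example, not the only such API of the era): you describe a textured triangle and a transform, and the rasterisation that used to be software loops happens in the driver and hardware. The "tex" argument is assumed to be a texture object created elsewhere, with a GL context already current:

    #include <GL/gl.h>

    void draw_triangle(unsigned int tex, float angle)
    {
        glBindTexture(GL_TEXTURE_2D, tex);    /* "its texture"        */
        glEnable(GL_TEXTURE_2D);

        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glRotatef(angle, 0.0f, 0.0f, 1.0f);   /* "how to move it"     */

        glBegin(GL_TRIANGLES);                /* "defining a triangle" */
        glTexCoord2f(0.0f, 0.0f); glVertex3f(-0.5f, -0.5f, 0.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex3f( 0.5f, -0.5f, 0.0f);
        glTexCoord2f(0.5f, 1.0f); glVertex3f( 0.0f,  0.5f, 0.0f);
        glEnd();
    }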

Up until the 90s when the modern concept of a GPU coalesced, we were mainly pushing pixels by hand onto the screen. Wild times.

The history of display processing is obviously a lot more nuanced than that; it's pretty interesting if that's your kind of thing.

pjmlp 12 hours ago | parent [-]

Small addendum: there was already stuff like the TMS34010 in the 1980's, just not at home.

cmrdporcupine 12 hours ago | parent | prev [-]

Those machines multiplexed the bus to split access to memory, because RAM speeds were competitive with or faster than the CPU bus speed. The CPU and VDP "shared" the memory, but only because CPUs were slow enough to make that possible.

We have had the opposite problem for 35+ years at this point. The newer-architecture machines, like the Apple machines, the GB10, and the AI 395+, do share memory between GPU and CPU, but in a different way, I believe.

I'd argue that with memory suddenly becoming much more expensive, we'll probably see the opposite trend. I'm going to get me one of these GB10 or Strix Halo machines ASAP, because I think with RAM prices skyrocketing we won't be seeing more of this kind of thing in the consumer market for a long time. Or at least, prices won't be dropping any time soon.

pjmlp 11 hours ago | parent [-]

You are right, hence my "in a certain sense"; I was too lazy to point out the differences between a motherboard having everything on it without a pluggable graphics unit [0], and having everything now inside a single chip.

[0] - Not fully correct, as there are/were expansion cards that take over the bus, thus replacing one of the said chips, in the Amiga's case.