gdevic 4 hours ago
The core question: how did HP's scientific calculators actually work at the gate level? That rabbit hole led to building one from scratch.

The architectural decision everything else follows from: a decimal calculator should store numbers as BCD — one decimal digit per 4-bit nibble. A standard byte-oriented CPU (Z80, 6502) fights that layout constantly, so I designed a small custom CPU in Verilog where 4 bits is the natural data width and memory is nibble-addressable.

What the project covers:

- Custom CPU: Harvard architecture, 12-bit ISA, 8-state execution FSM, hardware stack guard with a FAULT state for microcode debugging
- CORDIC for trig functions, verified to 14 significant digits
- Two-pass assembler in Python (~700 lines)
- Verilator + Qt framework: the same Verilog source runs in simulation, as a desktop GUI debugger, as WebAssembly, and on real hardware
- Scripting language on top of the microcode for adding functions without touching hardware
- Custom PCB (EasyEDA/JLCPCB), battery, charging circuit

Write-up: https://baltazarstudios.com
Hackaday: https://hackaday.com/2026/05/13/build-the-cpu-then-build-the...
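For readers unfamiliar with CORDIC: the idea is to compute sin/cos with nothing but shifts, adds, and a small arctangent table, which is why it suits a tiny CPU with no multiplier. A minimal rotation-mode sketch in floating-point Python follows; the function name, iteration count, and use of floats are my own illustration (the actual project runs fixed-point BCD microcode):

```python
import math

# Iteration count and arctangent table for rotation-mode CORDIC.
# (Hypothetical parameters for illustration; the real microcode
# works in fixed-point BCD, not float64.)
N = 50
ATAN = [math.atan(2.0 ** -i) for i in range(N)]

# Pre-scale by the CORDIC gain so the result comes out normalized.
K = 1.0
for i in range(N):
    K /= math.sqrt(1.0 + 2.0 ** (-2 * i))

def cordic_sin_cos(theta):
    """Return (sin, cos) of theta (radians, |theta| <= pi/2) using
    only shift-and-add style rotations plus a table lookup."""
    x, y, z = K, 0.0, theta
    for i in range(N):
        d = 1.0 if z >= 0.0 else -1.0       # rotate toward z = 0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * ATAN[i]
    return y, x                              # (sin, cos)
```

Each iteration halves the remaining angle error, so ~50 iterations are plenty at double precision; a BCD implementation targeting 14 significant digits would instead iterate until the residual angle underflows its digit width.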
VLM 3 hours ago | parent
Ironically, the Z80 is a nibble ALU. That's why it's so slow compared to the competition: an 8-bit add on a "2 MHz" Z80 takes as much clock time as an 8-bit add on a "1 MHz" 6809.