spacechild1 4 days ago
I'm building 2D computer game environments for musical performances and interactive installations. I ended up developing my own little engine. The current iteration is based on:

- SDL2 for input and window management
- OpenGL for rendering
- libpd for sound generation and processing
- RtAudio for audio I/O
- sol2 for Lua bindings
- Qt6 for the tile map editor

I'm not targeting the web, though.

> The audio engine needs to be DAW-level, but synchronized to graphical elements

Are the sounds generated from game events? In that case, time synchronization between the game clock and the audio clock is indeed an interesting problem with multiple solutions, each with its own trade-offs. If the game only acts as the user interface, you can treat it like a regular DAW.
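One way the game-clock/audio-clock synchronization can be approached is to translate each game event's timestamp into a frame offset within the next audio buffer, so the sound starts sample-accurately rather than at the buffer boundary. Below is a minimal sketch of that idea; `SampleClock` and its members are hypothetical names for illustration, not part of libpd or RtAudio, and it assumes the event time has already been translated from game time into audio time.

```cpp
#include <cstdint>
#include <cmath>

// Hypothetical helper: tracks audio time in frames and converts an
// event timestamp (seconds, on the audio timeline) into a frame
// offset inside the upcoming buffer.
struct SampleClock {
    double sampleRate;       // e.g. 48000.0
    uint64_t framesRendered; // frames handed to the device so far

    // Current audio time in seconds.
    double now() const { return framesRendered / sampleRate; }

    // Frame offset for an event scheduled at `eventTime` seconds.
    // Events already in the past are clamped to offset 0, i.e.
    // they trigger at the start of the next buffer.
    int64_t offsetInNextBuffer(double eventTime) const {
        double dt = eventTime - now();
        if (dt < 0.0) dt = 0.0;
        return static_cast<int64_t>(std::llround(dt * sampleRate));
    }
};
```

Inside the audio callback you would then render up to that offset, trigger the event, and render the remainder of the buffer. The trade-off is the usual one: smaller buffers reduce worst-case latency between a game event and its sound, at the cost of more callback overhead.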
aaroninsf 4 days ago | parent
Thanks for this!

I am torn about backing off, focusing first on getting the UX right (which is really "the thing"), and just emitting a MIDI MPE stream. That obviously has some other benefits, but there were things I was keen on having low-level control over... though maybe I can get over that.

I'm also ambivalent about the target: really, the iPad is the ideal and canonical target, in terms of what it can do and the UX it affords... but I have this possibly unfounded sense that the amount a coding copilot can assist me is somewhat limited inside the Apple ecosystem... :/

EDIT: Oh, and is your engine something I can look at (or license, or...)? :)