namuol · 3 hours ago
Programming in general is about converting something you understand into something a computer understands, and making sure the computer can execute it fast enough. That is already hard enough, but GPU programming (at least in its current state) is an order of magnitude worse in my experience. There are tons of ways to get tripped up, endless trivial or arbitrary things you need to know or do, a seemingly bottomless pit of abstraction that hides countless bugs and performance pitfalls, hardware disparity, software/platform disparity, etc. Oh right, and a near-complete lack of debugging tooling. What little tooling there is only ever works on one GPU backend, or one OS, or one software stack. I'm by no means an expert, but I feel our GPU programming "developer experience" standards are woefully out of touch, and the community seems happy to keep it that way.
anikom15 · 2 hours ago · parent
OpenGL and pre-12 DirectX were attempts at unifying graphics programming in an abstract way. It turned out that trying to abstract away what the low-level hardware was doing was more harmful than beneficial.
| ||||||||