spenrose | 4 days ago
Look at your examples. Translation is a closed domain; the LLM is loaded with all the data and can traverse it. Book and music album covers _don't matter_ and have always been arbitrary reworkings of previous ideas. (Not sure what "ebook reading" means in this context.) Math, where LLMs also excel, is a domain full of internal mappings.

I found your post "Coding with LLMs in the summer of 2025 (an update)" very insightful. LLMs are memory extensions and cognitive aids that provide several valuable primitives: finding connections adjacent to your understanding, filling in boilerplate, and offloading your mental mapping needs. But there remains a chasm between those abilities and much real work.