dnhkng 10 hours ago
Yes, that's true. But that points back to the main idea: the model has learnt to transform Base64 into a form it can already use in its 'regular' thinking structures. The alternative is an entire parallel structure just for Base64, which, based on my 'chats' with LLMs in that format, seems implausible; it acts like the regular model. And if there is a 'translation' organ in the model, why not math or emotion-processing organs too? That's what I set out to find, and it's illustrated in the heatmaps. Also, any writing tips from the Master blogger himself? Huge fan (squeal!)
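For readers who haven't tried it, 'chatting in Base64' just means encoding your prompt before sending it and decoding the model's reply. A minimal sketch (the `send_to_llm` call is a hypothetical placeholder for whatever API you use):

```python
import base64

def encode_prompt(text: str) -> str:
    # Encode a UTF-8 prompt as a Base64 ASCII string.
    return base64.b64encode(text.encode("utf-8")).decode("ascii")

def decode_reply(b64: str) -> str:
    # Decode a Base64 reply back to readable text.
    return base64.b64decode(b64).decode("utf-8")

prompt_b64 = encode_prompt("What is 2 + 2?")
print(prompt_b64)
# reply_b64 = send_to_llm(prompt_b64)  # hypothetical API call
# print(decode_reply(reply_b64))
```

If the model really had a separate parallel structure for Base64, you'd expect its behaviour in these chats to diverge from plain-text chats; in practice it doesn't.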