amelius 6 days ago:
If you want to make it more human-explainable, ditch the tokenizer entirely and feed the model raw characters. Then there is nothing left to explain.
sigmoid10 6 days ago:
Then you need at least 4x the compute to achieve the same results as the state of the art: a typical BPE tokenizer packs roughly four characters of English into each token, so a character-level model has to process sequences about 4x as long. Meaning that if I can train my frontier model with a normal tokenizer in 3 months, it will take you a year. When major releases across all competing providers are measured in months, there's simply no incentive to do that just to capture these edge cases.
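A quick way to sanity-check the ~4x figure (a minimal sketch, not from the thread; it assumes OpenAI's tiktoken library, with the cl100k_base encoding standing in as a representative BPE tokenizer):

```python
# Sketch: compare character-level vs BPE sequence lengths.
# Assumes the `tiktoken` library (pip install tiktoken); cl100k_base
# is used as a representative BPE encoding, not the only choice.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

text = (
    "Then you need at least 4x the compute to achieve "
    "the same results as the state of the art."
)

n_chars = len(text)               # sequence length for a character-level model
n_tokens = len(enc.encode(text))  # sequence length after BPE tokenization

# English prose typically lands around 4 characters per token,
# so the character-level sequence is roughly 4x longer.
print(f"{n_chars} chars vs {n_tokens} tokens "
      f"(ratio ~{n_chars / n_tokens:.1f}x)")
```

And since self-attention cost grows quadratically with sequence length, a 4x longer input can cost well over 4x the FLOPs, which is why "at least 4x" is, if anything, conservative.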
| ||||||||