▲ zozbot234 3 hours ago
COBOL is the perfect language for LLMs because it looks just like the English text they were trained on to begin with.
▲ cromka 3 hours ago | parent | next [-]
It's quite the contrary: the less interpretive the language, the better. And no, LLMs were not trained on English to begin with, and they don't perform best in English.
| |||||||||||||||||||||||||||||
▲ Insanity 2 hours ago | parent | prev | next [-]
That’s not how it works. Being trained on a ton of human text doesn’t mean you can complete the next token for a program that needs to be logically coherent. Imagine all your training data is Reddit threads and I ask you what follows “goto”; how would Reddit help you? The opposite is likely true: there isn’t much publicly available COBOL code compared to, e.g., React, so an LLM will perform worse on it.
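The point about coverage can be made concrete with a toy bigram model (my own illustration, not anything from the thread): next-token prediction boils down to "what usually follows this token in the training data", so a token the corpus never contains yields no signal at all.

```python
from collections import Counter, defaultdict

# Tiny made-up corpus standing in for "all your data is Reddit threads".
corpus = "please share the link please share the code".split()

# Count which token follows which (a bigram table).
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict(token):
    """Most frequent continuation seen in training, or None if unseen."""
    counts = bigrams.get(token)
    return counts.most_common(1)[0][0] if counts else None

print(predict("share"))  # "the": well attested in the corpus
print(predict("goto"))   # None: no COBOL-style tokens in the corpus
```

Real LLMs generalize far beyond raw bigram counts, but the underlying dependence on what the training distribution actually contains is the same.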
▲ keyle 3 hours ago | parent | prev [-]
The required context window grows, though.