joegibbs 4 hours ago
Would it make more sense to instead train a model that tokenises the syntax of languages differently, so that whitespace isn't counted, keywords are each a single token, and so on?
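Something like this already exists for individual languages: Python's stdlib `tokenize` module, for instance, emits one token per keyword, identifier, and operator, regardless of how much whitespace sits between them. A rough sketch (the filter dropping `INDENT`/`NEWLINE` tokens is a simplification, since indentation is significant in Python):

```python
import io
import tokenize

# Two sources that differ only in (non-significant) whitespace
src = "def   add(a,  b):\n    return a + b\n"

# Keep only the "meaningful" tokens; note this deliberately drops
# INDENT/NEWLINE, which a real Python tokenizer could not ignore.
toks = [
    t.string
    for t in tokenize.generate_tokens(io.StringIO(src).readline)
    if t.type in (tokenize.NAME, tokenize.OP, tokenize.NUMBER)
]

print(toks)
# 'def', 'add', '(', 'a', ',', 'b', ')', ':', 'return', 'a', '+', 'b'
```

The extra spaces in the source never reach the token stream, which is roughly the property the comment is asking for: the model would see one symbol per syntactic unit instead of a whitespace-sensitive character soup.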
__MatrixMan__ 4 hours ago
After watching models struggle with string replacement in files, I've started to wonder if they'd be better off making those alterations in a Lisp, where it's normal to manipulate code not as a string but as a syntax tree.
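You don't need a Lisp for this; any language with an AST library supports the same move. A rough sketch in Python using the stdlib `ast` module (the `foo`/`bar` names are made up for illustration): rename a function by transforming the tree, so occurrences inside string literals, which a naive string replace would clobber, are left alone.

```python
import ast

# Source with both a real call to `foo` and the word "foo" in a string
source = 'print(foo(1))\nmsg = "call foo here"\n'

class RenameFoo(ast.NodeTransformer):
    """Rewrite every identifier `foo` to `bar` at the syntax-tree level."""
    def visit_Name(self, node):
        if node.id == "foo":
            # Only genuine Name nodes match; string contents are Constants
            return ast.copy_location(ast.Name(id="bar", ctx=node.ctx), node)
        return node

tree = ast.parse(source)
new_source = ast.unparse(RenameFoo().visit(tree))  # unparse: Python 3.9+

print(new_source)
# The call becomes bar(1); the string literal still says "call foo here"
```

This is the advantage tree-based editing gives over `str.replace`: the edit targets a syntactic role (an identifier) rather than a byte pattern, so it can't accidentally rewrite comments, strings, or partial matches.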