lmm 2 hours ago

That might be an argument for not using a novel homebrew programming language. But it's not an argument against, like, any top-100 or even top-1000 programming language, which will be adequately represented in the training data.

ambicapter an hour ago

It is if more training data results in better performance, in which case GP will continue to use the language that is likely to have the most training data available.

lmm an hour ago

> It is if more training data results in better performance.

Sure. But given the analogy with machine translation systems, it seems far more likely that larger volumes of training data bring diminishing returns.