avadodin 4 hours ago
I don't think the LLMs are to blame here. Not yet, at least. This is caused by people active in English-speaking communities translating memes literally and spreading them as-is in their native-language communities. As a meme spreads, monolingual speakers begin using the same format, and eventually they reference it offline.
gyomu 3 minutes ago | parent | next
> This is caused by people active in English-speaking communities translating memes literally and spreading them in their native language communities as-is.

You're right that memes are being translated literally and spreading, but in terms of pure volume, LLMs doing the translation dwarf humans doing it. Building a bot that picks up whatever posts are trending on Reddit/imgur/etc., automatically translates them, and posts them in target languages on social networks is an easy way to accumulate likes and followers; those high-reach accounts then get used to push whatever makes money.
int_19h an hour ago | parent | prev
Not really. I have observed the same thing in Russian, and no, it's not limited to expressions that were translated literally. That said, SOTA models have gotten much better at this kind of thing. With the right prompting, they can write in a way that's indistinguishable from a native speaker, colloquialisms and all. But SOTA models are also expensive; most automated translation is done with something far cheaper and worse.