daxfohl 7 hours ago

Interesting, IIUC the transformer architecture / attention mechanism were initially designed for use in the language translation domain. Maybe after peeling back a few layers, that's still all they're really doing.
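For reference, the core operation that translation-era design introduced is scaled dot-product attention. A minimal NumPy sketch of just that step (toy shapes and names are mine, not from the paper):

    import numpy as np

    def softmax(x, axis=-1):
        x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def scaled_dot_product_attention(Q, K, V):
        # Q, K, V: (seq_len, d_k) arrays of query/key/value vectors
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)      # how strongly each token attends to each other token
        weights = softmax(scores, axis=-1)   # each row sums to 1
        return weights @ V                   # weighted mix of value vectors

    # toy example: 3 tokens with 4-dimensional embeddings
    rng = np.random.default_rng(0)
    Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
    print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 4)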

nathan_compton 6 hours ago

This has long been how I've explained LLMs to non-technical people: text transformation engines. To some extent, many common, tedious activities basically amount to transforming text from one well-known form into another (even some kinds of reasoning are this), and so LLMs are very useful. But they just transform text between well-known forms.
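To make that concrete, a lot of LLM "applications" reduce to a function from one text form to another. A rough Python sketch of that framing; call_llm is a hypothetical stand-in for whatever client you actually use, not a real API:

    from typing import Callable

    def call_llm(prompt: str) -> str:
        # placeholder: wire this up to your provider of choice
        raise NotImplementedError

    def transform(text: str, source_form: str, target_form: str,
                  llm: Callable[[str], str] = call_llm) -> str:
        # The whole "application" is a prompt naming the two well-known forms.
        prompt = (
            f"Rewrite the following {source_form} as {target_form}. "
            f"Preserve the facts; change only the form.\n\n{text}"
        )
        return llm(prompt)

    # e.g. transform(notes, "rough meeting notes", "a bulleted status update")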