| ▲ | athrowaway3z 10 days ago |
> If you’re using an LLM to spit out text for you, they’re not your thoughts

The thoughts I put into a text are mostly independent of the sentences or _language_ they're written in. Not completely independent, but to claim thoughts are completely dependent on the text (and thus the language) is nonsense.

> Might as well just give people your prompt.

What would be the value of seeing a dozen diffs? By the same logic, should we also include every draft?
| ▲ | mrguyorama 10 days ago |
> The thoughts I put into a text are mostly independent of the sentences or _language_ they're written in.

Not even true! Turning your thoughts into words is a very important and very human part of writing. That's where you choose which ambiguities to leave and which to remove, what implicit shared context is assumed, such important things as tone, and all sorts of other unconscious choices that matter in writing. If you can't even make those choices, why would I read you? If you think making those choices is unimportant, why would I think you have something important to say?

Uneducated or unsophisticated people seem to vastly underestimate what expertise even is, or just how much they don't know, which is why, for example, LLMs can write better than most fanfic writers. But that bar is on the damn floor, and most people don't want to consume fanfic-level writing about things they are not fanatical about.

There's this weird and fundamental misconception in pro-AI circles that context-free "information" is somehow possible, as if you can extract "knowledge" from text, "distill" a document, and reduce its meaning to a few simple sentences. There's this insane belief that you can meaningfully shrink a text and keep its information. If you reduce "Lord of the Flies" to something like "children shouldn't run a community", you've lost immense amounts of information. That is not a good thing. You are missing so much nuance, context, and meaning, as well as more superficial (but no less important!) things like the very experience of reading the text.

Consider that state-of-the-art text compression algorithms can reduce text to about 1/10th of its original size. If you "summarize" a text or "reduce it to its main points" and shrink it by more than that, do you really think you are not losing massive amounts of information, context, or meaning?
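You can put rough numbers on that yourself. A minimal sketch using Python's standard lzma module (the file path is whatever you feed it, and the ratios in the comments are ballpark figures for English prose, not measurements from this thread):

    # Minimal sketch of the lossless-compression bound: run on any plain-text
    # file (e.g. a public-domain novel) to see how far a general-purpose
    # compressor can shrink real prose without losing a single bit.
    import lzma
    import sys

    data = open(sys.argv[1], "rb").read()   # path supplied by the user
    packed = lzma.compress(data, preset=9)

    ratio = len(data) / len(packed)
    print(f"{len(data)} bytes -> {len(packed)} bytes ({ratio:.2f}x)")
    # Ballpark: ordinary English prose lands around 3-4x with lzma; heavily
    # tuned text models get closer to ~10x. A "summary" 50x shorter than its
    # source is therefore discarding information, not just redundancy.

Lossless compression is the ceiling: anything shorter than that ratio is, by construction, throwing information away.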
| ▲ | y0eswddl 8 days ago |
The language we use actually very much dictates the way we think. For instance, there's a tribe that describes directions using only the cardinal directions, and as such they have no words for, nor mental concept of, "left" and "right". Unsurprisingly, they're all much more proficient at navigation and have a better general sense of direction than the average human, because of the way they have to think about directions when just talking to each other.

===

This is also why the best translators don't just do a word-for-word replacement but have to think through cultural context and ideology on both sides of the conversation in order to produce a more coherent translation.

What language you use absolutely dictates how and what you think, as well as what particular message is conveyed.