smithza 3 hours ago

Please read this incredible book review (the book is All Things Are Full of Gods by David Bentley Hart). It is the kind of philosophy everyone is looking past: syntactic vs. informational determinacy. An LLM is designed to produce copy that is syntactically determinate (it is a complex set of statistical functions), whereas the best human prose does the opposite: it does not converge on syntactic determinacy (see the quote below) but on informational determinacy. The plot resolves as the reader's knowledge grows from abstraction and ignorance to empathy, insight, and anticipation.
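To make the "complex set of statistical functions" point concrete, here is a toy next-word sampler in Python. The corpus and names are invented for illustration, and a real LLM models tokens with far longer contexts and a neural network rather than raw counts, but the principle is the same: emit whatever is statistically likely to follow, with no regard for meaning.

```python
import random
from collections import Counter, defaultdict

# Invented toy corpus; stands in for the training data.
corpus = "the cat sat on the mat and the cat sat".split()

# Count how often each word is followed by each other word.
follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

def next_word(word: str, rng: random.Random) -> str:
    """Sample the next word in proportion to observed frequencies."""
    counts = follows[word]
    return rng.choices(list(counts), weights=list(counts.values()))[0]

rng = random.Random(0)
text = ["the"]
for _ in range(5):
    text.append(next_word(text[-1], rng))
print(" ".join(text))  # statistically plausible, semantically empty
```

Every bigram the sampler emits was seen in the corpus, so the output is always locally well-formed; that local plausibility is what "syntactic determinacy" buys you, and it is all this machine has.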

https://www.thenewatlantis.com/publications/one-to-zero

  Semantic information, you see, obeys a contrary calculus to that of physical bits. As it increases in determinacy, so its syntactical form increases in indeterminacy; the more exact and intentionally informed semantic information is, the more aperiodic and syntactically random its physical transmission becomes, and the more it eludes compression. I mean, the text of Anna Karenina is, from a purely quantitative vantage of its alphabetic sequences, utterly random; no algorithm could possibly be generated — at least, none that’s conceivable — that could reproduce it. And yet, at the semantic level, the richness and determinacy of the content of the book increases with each aperiodic arrangement of letters and words into coherent meaning.
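The compression claim in the quote can be crudely probed with Python's zlib: perfectly periodic text collapses to almost nothing, while statistically random bytes barely shrink at all, and real prose sits between the two extremes. A minimal sketch (the sample lengths and the seed are arbitrary choices of mine):

```python
import random
import zlib

def ratio(data: bytes) -> float:
    """Compressed size divided by original size; lower = more compressible."""
    return len(zlib.compress(data, 9)) / len(data)

periodic = b"ab" * 5_000  # fully periodic: 10,000 bytes of pure pattern
rng = random.Random(0)
noise = bytes(rng.randrange(256) for _ in range(10_000))  # uniform noise

print(ratio(periodic))  # tiny: the repetition compresses away
print(ratio(noise))     # about 1.0: aperiodic data eludes compression
```

This is only the quantitative half of Hart's point, of course: zlib sees Anna Karenina and noise as similarly hard to compress, and it is precisely that gap between statistical randomness and semantic richness that the reviewer is gesturing at.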
Edit: add-on

In other words, it is impossible for an LLM (or monkeys at keyboards [0]) to recreate Tolstoy, because of the unique role our minds play in writing. The verb "writing" hardly applies to an LLM once we consider the function it is actually performing.

[0] https://libraryofbabel.info