eru 7 hours ago

No. That's wrong. LLMs don't output the highest-probability token: they do random sampling.

storus 7 hours ago | parent | next [-]

This was obviously a simplification which holds for zero temperature. Top-p sampling will add some randomness, but the probability of unexpected longer sequences goes to zero asymptotically, and pretty quickly.
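A minimal sketch of the two decoding regimes being contrasted, using an invented toy distribution (real LLM vocabularies have ~100k tokens). At temperature zero, decoding collapses to the argmax; top-p (nucleus) sampling keeps only the smallest set of tokens whose cumulative probability reaches p and samples from that, so low-probability tokens are cut off, and the chance of a long run of unlikely tokens decays geometrically:

```python
import random

# Hypothetical toy next-token distribution (values invented for illustration).
probs = {"the": 0.70, "a": 0.20, "zebra": 0.08, "qux": 0.02}

def greedy(probs):
    """Temperature-zero decoding: always pick the argmax token."""
    return max(probs, key=probs.get)

def top_p_sample(probs, p=0.9, rng=random):
    """Nucleus (top-p) sampling: keep the smallest set of tokens whose
    cumulative probability reaches p, then sample from that set."""
    items = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    nucleus, total = [], 0.0
    for tok, pr in items:
        nucleus.append((tok, pr))
        total += pr
        if total >= p:
            break
    return rng.choices([t for t, _ in nucleus],
                       weights=[pr for _, pr in nucleus])[0]

print(greedy(probs))  # 'the'

# Probability that an "unexpected" 0.08-probability token is chosen
# n steps in a row decays geometrically: 0.08**n.
for n in (1, 5, 10):
    print(n, 0.08 ** n)
```

With p=0.85 here, only "the" and "a" survive the cutoff, so "zebra" and "qux" can never be sampled at all; that truncation is why long surprising continuations become vanishingly rare.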

eru 2 hours ago | parent [-]

I'm not sure what the point is?

A bog standard random number generator or even a flipping coin can produce novel output at will. That's a weird thing to fault LLMs for? Novelty is easy!

See also how genetic algorithms and reinforcement learning constantly solve problems in novel and unexpected ways. Compare also antibiotic resistance in the real world.

You don't need smarts for novelty.

Where I see the problem is producing output that's both high quality _and_ novel, on command, to solve the user's problem.
