eru 2 hours ago
I'm not sure what the point is. A bog-standard random number generator, or even a coin flip, can produce novel output at will, so that's a weird thing to fault LLMs for. Novelty is easy! See also how genetic algorithms and reinforcement learning constantly solve problems in novel and unexpected ways, or compare antibiotic resistance in the real world. You don't need smarts for novelty. The real problem is producing output that's both high quality _and_ novel, on command, to solve the user's problem.
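To make the "novelty is easy" point concrete, here's a throwaway sketch (not from the original comment, just an illustration): a short random string over lowercase letters is, with overwhelming probability, a sequence nobody has ever produced before.

```python
import random
import string

def novel_string(length=40):
    # 26^40 possible strings -- far more than have ever been written,
    # so any draw is almost surely "novel" in the strict sense.
    return "".join(random.choices(string.ascii_lowercase, k=length))

print(novel_string())
```

Trivially novel, and trivially useless: the hard part is novelty that also solves the problem at hand.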