jkubicek | 2 days ago

> In the same way, LLMs should speak to us in our favored format - in images, infographics, slides, whiteboards, animations/videos, web apps, etc.

If you think every Electron app out there re-inventing application UX from scratch is bad, wait until LLMs are generating their own custom UX for every single action, for every user, on every device. What does command-W do in this app? It's literally impossible to predict; try it and see!
johnfn | 2 days ago

On the other side of the spectrum, I see some of the latest agents, like Codex, take care to get accessibility right -- something not even many humans bother to do.
Aiisnotabubble | 2 days ago

But that's exactly the problem an LLM solves. It's the best UI ever: it understands many languages and abstract concepts. It won't be necessary at all to have LLMs generate random UIs. I'm not a native English speaker; I sometimes just throw in a German word and it just works.
tim333 | 2 days ago

> our favored format - in images, infographics, slides, whiteboards, animations/videos, web apps, etc

If you look at how humans actually communicate, I'd guess #1 is text/speech and #2 is pictures.