Gareth321 | 5 days ago
If executed well, I think this could remove a lot of friction from the process. I can certainly unlock my phone and hold it in one hand while I prep and cook, but that's annoying. If my glasses could monitor my progress and tell me what to do next while I'm doing it, that would be far more convenient. It's clearly not there yet, but in a few years I have no doubt it will be. And this is just the start. Once they have screens, they'll be able to offer AR. Imagine working on electronics or a car with the instructions overlaid on the lenses while the AI provides verbal guidance.