▲ | manoweb 10 hours ago |
Your usage of Siri today (probably on an old version of iOS) frankly has nothing to do with the article we are discussing. Sorry to say this, but it is going to take time. Comparing the performance of ChatGPT running in a big data center with a model running locally on a phone... give it a few years.
▲ | ninkendo 9 hours ago | parent | next [-]
People have been giving Siri a few years for a decade now. Siri used to run in a data center (and still does for older hardware and things like HomePods), and it has never supported compound queries. Siri needs to be taken out back and shot.

The problem with "upgrading" it is the pull to maintain backwards compatibility for every little thing Siri did, which leads them to try to incorporate existing Siri functionality (and existing Siri engineers) alongside any LLM. That leads to disaster: none of it works, and it all just got slower.

They've been trying to build an LLM-assisted Siri for years now, and it's the most public-facing failure the company has had in a while. Time to start over.
| ||||||||||||||
▲ | lxgr 7 hours ago | parent | prev [-]
> Your usage of Siri today (probably on an old version of iOS) frankly has nothing to do with the article we are discussing.

Yes, but isn't that infuriating? The technology exists! It even exists, as evidenced by this article, inside the same company that provides Siri!

At least I feel that way every time I interact with it – or, for that matter, with my Google Home speaker, ironically made and operated by the company that invented transformer networks.