▲ RandomGerm4n 8 hours ago
Users have the right to modify any app running on their own device. Software security should never depend on the user having no control over their own device. Smartphones are essentially just regular computers, and on them you can use a debugger and do whatever you want. Viewing smartphones as closed systems like game consoles, where you need the manufacturer's permission for everything, only leads us into the dystopia that Richard Stallman described as early as 1997 in his short story "The Right to Read".
▲ viktorcode 2 hours ago | parent
For this to become a dystopia, people would have to be forced to use locked-down smartphones. In reality, you buy the one that suits your needs and don't impose your design decisions on the smartphones other people use.
▲ Avamander 6 hours ago | parent
Once SafetyNet came to Android a decade ago, the trend became clear: these freedoms are going to be heavily restricted. Because how do you make sure it's the user making those modifications, willingly and well informed? That it's not a malicious actor, or a user being socially engineered or phished? That's incredibly difficult compared to the current alternative. If it's not a software root of trust providing an attestable environment, as on Android or iOS, it's going to be a hardware root of trust providing an attestable hardware environment, like SGX. I can predict no other practical avenue being taken. Unless the orangutan really forces a demonstration of how untrustworthy these environments can be, and a lot of money and effort is spent.
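The attestation idea described above can be illustrated with a toy sketch. This is not the real SafetyNet, Play Integrity, or SGX protocol (those use asymmetric keys, certificate chains to a manufacturer root, and signed measurements); all names here are invented, and the HMAC stands in for a signature by a hardware-protected key. The core loop is the same, though: the verifier sends a fresh nonce, the device returns a signed claim about its own state, and the verifier checks the signature and the nonce before trusting the claim.

```python
# Toy model of remote attestation (hypothetical names, simplified crypto):
# a key the user cannot read signs a statement about the device's state,
# so a remote verifier can distinguish "user modified willingly" from
# "state silently tampered with" only by refusing both.
import hashlib
import hmac
import json
import os

# Stand-in for a key provisioned into hardware at manufacture time.
DEVICE_ROOT_KEY = b"factory-provisioned-secret"

def device_attest(nonce: bytes, bootloader_locked: bool) -> dict:
    """Device side: sign a claim about its own state (HMAC as a stand-in)."""
    claim = json.dumps({"nonce": nonce.hex(),
                        "bootloader_locked": bootloader_locked})
    sig = hmac.new(DEVICE_ROOT_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def verifier_check(report: dict, expected_nonce: bytes) -> bool:
    """Verifier side: check the signature, nonce freshness, and claimed state."""
    expected = hmac.new(DEVICE_ROOT_KEY, report["claim"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, report["sig"]):
        return False  # claim was forged or altered in transit
    claim = json.loads(report["claim"])
    # Stale nonces and unlocked bootloaders are both rejected.
    return claim["nonce"] == expected_nonce.hex() and claim["bootloader_locked"]

nonce = os.urandom(16)
assert verifier_check(device_attest(nonce, bootloader_locked=True), nonce)
assert not verifier_check(device_attest(nonce, bootloader_locked=False), nonce)
```

The last line is exactly the tension in the comment: an unlocked bootloader is indistinguishable, to the verifier, from a compromised one, so the modified device is locked out regardless of whether the modification was the owner's informed choice.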