| ▲ | ulrikrasmussen a day ago |
| We need regulation which defines any hardware device capable of running software developed by a third party (i.e. anyone other than the hardware manufacturer) as a general purpose computing device, and which forbids any such device from placing cryptographic or other restrictions on what software the user wants to execute. This pertains to all programmable components on the device, including low-level hardware controllers. These requirements should extend beyond the particular device: it must also be illegal for a commercial entity to enforce security schemes which involve remote attestation of the software stack on the client device, such that service providers can refuse to serve clients based on failing attestation. Service providers have other means of protecting themselves; taking away users' control of their own devices is a heavy-handed and unnecessarily draconian approach which ultimately only benefits the ad company that happens to make the software stack, since they also benefit from restricting what software users can run. Hypothetically, they might be interested in making it impossible to modify video players to skip ads. |
|
| ▲ | miki123211 a day ago | parent | next [-] |
| I agree, but I think three extra conditions would need to be added here. 1. Devices should be allowed to display a different logo at boot time depending on whether the software is manufacturer-approved or not. That way, if somebody sells you a used device with flashed firmware that steals all your financial data, you have a way to know. 2. Going from approved to unapproved firmware should result in a full device wipe, Chromebook style. Possibly with a three-day cooldown. Those aren't too much of an obstacle for a true tinkerer who knows what they're doing, but they make it harder to social-engineer people into installing a firmware of the attacker's choosing. 3. Users should have the ability to opt themselves into cryptographic protection, either on the original or modified firmware, for anti-theft reasons. Otherwise, devices become extremely attractive to steal. |
| |
▲ | xg15 a day ago | parent | next [-] | | > Devices should be allowed to display a different logo at boot time depending on whether the software is manufacturer-approved or not. Not sure how to phrase this legally, but please also add a provision against manufacturers making the "custom firmware" logo hideously ugly on purpose to discourage rooting - like e.g. Microsoft did for Surface tablets. > 3. Users should have the ability to opt themselves into cryptographic protection, either on the original or modified firmware, for anti-theft reasons. Full agreement here. I very much would like to keep the bootloader locked - just to my own keys, not the OEM's. | | |
| ▲ | harvey9 a day ago | parent [-] | | Someone with the motivation to install custom firmware would consider the bootsplash aesthetic a deal breaker? | | |
▲ | AshamedCaptain a day ago | parent | next [-] | | Yes -- a bootsplash showing "DANGER! YOUR SECURITY AT RISK! HACKERS CAN NOW STEAL YOUR GIRLFRIEND AND SHUFFLE YOUR PAIRS OF SOCKS!" in big bold red letters only because you enabled root to remove manufacturer malware (which if anything likely _increases_ your security) is a deal breaker, because it will frighten most users away from doing it. | |
| ▲ | xg15 a day ago | parent | prev [-] | | If you want to promote alternative bootloaders or OSes for wider, nontechnical audiences (like LineageOS etc), then absolutely. I think it's a difference in mindset whether you view custom firmware as a grudging exception for techies (with the understanding that "normal" people should have a device under full control of their respective vendor), or whether you want an open OS ecosystem for everyone. |
|
| |
| ▲ | xg15 a day ago | parent | prev | next [-] | | > Devices should be allowed to display a different logo at boot time depending on whether the software is manufacturer-approved or not. Another thought on that point: Why of all things is manufacturer approval so important? We know manufacturers often don't work for - or even work against - the interests of their end users. Manufacturer approval is not an indicator for security - as evidenced by the OP article. If anything, we need independent third parties that can vet manufacturer and third party software and can attach their own cryptographic signatures as approval. | |
| ▲ | gmueckl a day ago | parent | prev [-] | | 4. Apps with special security needs are allowed to detect whether a device is unlocked and can either disable themselves or go into a mode that shifts ALL related liability onto the user. It's not the bank's fault if the user disabled protections and some spyware logs the online banking password or something like that. | | |
▲ | Zak a day ago | parent | next [-] | | I'm pretty sure I'm against this. I could be convinced otherwise by documentation of significant fraud involving compromised devices (especially Android phones) that would have been stopped by a device attestation scheme. I should note Google has such an attestation scheme, and there are reliable defeats for it in most situations given root access. Apps have for some time been able to insist on hardware-backed attestation, which has not been defeated, but that isn't available for old devices, and almost none do so. If this had a meaningful impact on fraud, more apps would insist on the hardware-backed option; even Google doesn't - I used Google Pay contactless with LineageOS and root this week. I'm currently convinced it's primarily a corporate power grab; non-Google-approved Android won't be a consumer success if it doesn't run your banking app, and the copyright lobby loves anything that helps DRM. | | |
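The hardware-backed option mentioned above is, as far as I understand it, Android's hardware key attestation: an app asks the keystore to generate a key inside secure hardware and gets back a certificate chain, rooted in a Google key, that encodes the device's verified-boot state for a server to inspect. A minimal Kotlin sketch, assuming an Android app and a server-supplied nonce (the key alias and function name are made up for illustration):

    import android.security.keystore.KeyGenParameterSpec
    import android.security.keystore.KeyProperties
    import java.security.KeyPairGenerator
    import java.security.KeyStore
    import java.security.cert.Certificate

    // Generate a signing key in secure hardware and return its attestation
    // certificate chain. The chain is rooted in a Google key and encodes the
    // device's verified-boot state, which the relying server can then check.
    fun requestHardwareAttestation(serverNonce: ByteArray): List<Certificate> {
        val spec = KeyGenParameterSpec.Builder("attest_demo_key", KeyProperties.PURPOSE_SIGN)
            .setDigests(KeyProperties.DIGEST_SHA256)
            .setAttestationChallenge(serverNonce) // binds the chain to this request
            .build()
        val kpg = KeyPairGenerator.getInstance(KeyProperties.KEY_ALGORITHM_EC, "AndroidKeyStore")
        kpg.initialize(spec)
        kpg.generateKeyPair()
        val ks = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
        return ks.getCertificateChain("attest_demo_key").toList()
    }

Whether that chain (and the boot state it attests to) is acceptable is decided entirely by the remote server, which is exactly the power imbalance being discussed.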
▲ | ulrikrasmussen a day ago | parent [-] | | Also, online banking has been a thing for so long on PCs, which never had that kind of remote attestation. I also do not believe the security argument, but I believe that the banks believe it. | |
| ▲ | Zak a day ago | parent | next [-] | | I suspect the banks want to do checkbox-based compliance with regulators and insurers without any deep understanding of the underlying issues. | |
▲ | gmueckl a day ago | parent | prev [-] | | Online banking doesn't need remote attestation. Some additional locked-down hardware with its own minimal display is enough. My banks force me to use devices like those made by Kobil or ReinerSCT. |
|
| |
▲ | ulrikrasmussen a day ago | parent | prev | next [-] | | My bank app refuses to work on LineageOS, but I can use the web interface just fine, which has the exact same UI and functionality as the app. In both the native app and the web app I have to authorize any transactions using my national ID, which for me is a hardware token (the app for my national ID also refuses to run). Why is it somehow insecure to initiate this flow from a native app on LineageOS while it is not insecure to do the exact same thing via a browser on LineageOS? If the app can be compromised, so can the browser - the bank cannot trust all of its browser-based clients anyway. The web app has been running with this security model for decades on PCs, and it has been fine. The whole narrative about remote attestation being necessary to protect users is an evil lie in my opinion, but it is an effective lie which has convinced even knowledgeable IT professionals that taking away device ownership from users is somehow justified. | |
▲ | gmueckl a day ago | parent [-] | | A hardware device that doesn't confirm transaction details on its own locked-down display enables man-in-the-middle attacks. I have to use such devices with my bank card when banking online. |
| |
▲ | mmh0000 a day ago | parent | prev | next [-] | | It is the bank's fault if they allow non-reversible, weird, or large transactions without a secondary authorization capability. The bank's bad processes are not the end device's fault. |
▲ | xg15 a day ago | parent | prev | next [-] | | Yeah, nope. All apps have "special security needs" according to their manufacturers. Every app that relies on spying for revenue will use that to disable itself. (Or worse, actively malfunction - e.g. that banking app could switch into a special mode where it does transactions on its own that are not in the interest of the user. If the user has accepted all liability, there isn't much they could do against that.) I'm alright with limiting liability for an unlocked/customized phone (for things that happen from that phone) - but that's a legal/contractual thing. For that to work, it's enough for a judge to understand that the phone was customized at that time - it doesn't require the app to know. |
| ▲ | Dylan16807 a day ago | parent | prev [-] | | Screw that. I want nearly the opposite. I don't really own my device if apps will look at my ownership flag and refuse to run. We can talk about the consequences of spyware but definitely not a total liability shift. Also preventing root doesn't prevent spyware. |
|
|
|
| ▲ | Sophira a day ago | parent | prev | next [-] |
| While I agree in theory, this is never going to happen. There's too much DRM in use for it to work out. |
| |
▲ | jimjimwii a day ago | parent | next [-] | | Repeal and outlaw DRM. It was a mistake that violates everyone's constitutional rights. | |
| ▲ | mmh0000 a day ago | parent [-] | | “constitutional rights” Words written on toilet paper. Only thing that exists today are “billionaire rights”. | | |
| ▲ | reactordev a day ago | parent | next [-] | | Exactly. DRM isn’t going anywhere so long as copyrights exist. | | |
▲ | xg15 a day ago | parent [-] | | Not even that. Companies are already lobbying massively for selective enforcement of copyright so as not to harm the AI boom (immediate jail terms for individuals torrenting a movie, "it's a complex issue" for AI companies scraping the entire internet). But even the DRM that is already there often only uses copyright laws as suggestions. E.g. YouTube's takedown guidelines are defined through their TOS, not through the DMCA. |
| |
| ▲ | mensetmanusman a day ago | parent | prev [-] | | Are there billionaires in the room with us right now? |
|
| |
▲ | const_cast 19 hours ago | parent | prev | next [-] | | DRM can still stick around and be popular. For example, consider an Apple TV. They make the hardware and software, so it can be locked down under the proposed rules. Or a console. We might consider devices which are used for streaming or movies not to be general purpose computing devices. Which, historically, they haven't been. Watching copyrighted stuff on general purpose computers is a very new phenomenon, and it's still quite atypical IMO. |
▲ | AshamedCaptain a day ago | parent | prev | next [-] | | What there is, though, is a lot of people utterly convinced that this brings some security to end users. See the other messages in this thread. DRM is only a fraction of the problem. |
▲ | al_borland a day ago | parent | prev [-] | | For me, DRM is a barrier to legally purchasing digital media. I will buy an album from iTunes (no DRM), but I will not buy digital movies the same way. |
|
|
| ▲ | akoboldfrying a day ago | parent | prev [-] |
| > any such device is disallowed to put cryptographic or other restrictions on what software the user wants to execute Won't this also forbid virus scanners that quarantine files? > This pertains to all programmable components on the device, including low-level hardware controllers. I don't think it's reasonable to expect any manufacturer to uphold a warranty if making unlimited changes to the system is permitted. |
| |
▲ | fc417fc802 a day ago | parent | next [-] | | It wouldn't forbid shipping the device with a virus scanner. It would only forbid refusing the user control over what software does and does not run. There might be a couple of messy edge cases if applied at the software level, but I think it would work well. Applied at the hardware level it would be very clear-cut: it would simply outlaw technical measures taken to prevent the user from installing an arbitrary OS on the device. Regarding warranties, what's so difficult about flashing a stock image to a device being serviced? At least in the US, wasn't this already settled long ago by Magnuson-Moss? https://en.wikipedia.org/wiki/Magnuson%E2%80%93Moss_Warranty... | |
| ▲ | akoboldfrying 17 hours ago | parent [-] | | > what's so difficult about flashing a stock image to a device being serviced? Yes, I think that would cover most cases if we take it to its logical conclusion of wiping all device state (hard disk). OTOH, a few points: 1. I would accept the need to wipe the hard disk if I had messed with firmware or the OS, but not if a couple of keys on the keyboard had stopped working. This implies that (for me at least) a meaningful distinction remains between these two "levels" of warranty service. Do you agree? 2. Activities like overclocking or overvolting a CPU have the potential to cause lasting damage that can't be reversed by re-flashing. Under the policy you're suggesting, it would be illegal for manufacturers to offer users the option "You can pull this pin low to overclock outside the supported range, but you will void the warranty by doing so", and too expensive for them to endlessly replace parts damaged by these activities for free under warranty, so that consumer option, rare as it already is, would go away completely. 3. I still think there may be some devices that are impractical to completely re-flash. According to this 2021 Porsche article [0], modern cars contain 70-100 ECUs (microcontrollers), each of which will have its own flash/EEPROM. [0]: https://medium.com/next-level-german-engineering/porsche-fut... |
| |
▲ | afeuerstein a day ago | parent | prev | next [-] | | > Won't this also forbid virus scanners that quarantine files? Yes. If I really _want_ to execute malware on my device, I should be allowed to do so by disabling the antivirus or disregarding a warning. > I don't think it's reasonable to expect any manufacturer to uphold a warranty if making unlimited changes to the system is permitted It is very reasonable, and already the rule of law in "sane" jurisdictions, that manufacturer and mandated warranties are not voided by unrelated, reversible modifications to both hard- and software. | |
| ▲ | akoboldfrying 16 hours ago | parent [-] | | > Yes. If I really _want_ to execute malware on my device, I should be allowed to do so by disabling the antivirus or disregarding a warning. I agree. > already the rule of law in "sane" jurisdictions, that manufacturer and mandated warranties are not touched by unrelated, reversable modifications to both hard- and software. Do you have any examples of such jurisdictions? I think whether this is reasonable turns on how "reversible" is interpreted. If it means "reversible to factory settings", including wiping all built-in storage media, then it seems reasonable to me that manufacturers should support this (possibly modulo some extreme cases like cars that have dozens of CPUs). But I would not be happy with having my hard disk wiped if I sent in my laptop for repairs because a couple of keys stopped working, which tells me that (to me) there remain at least two classes of "problem that should be fixed for free under warranty by the manufacturer". |
| |
| ▲ | encom a day ago | parent | prev [-] | | >virus scanners You can (and should, imho) remove anti-virus software. |
|