kogepathic 9 hours ago

> What I am asking for: publish a basic GitHub repo with the hardware specs and connection protocols. Let the community build their own apps on top of it.

This concept works fine for the author's example of a kitchen scale, but fails when the device in question is something like a router that has secure boot with one key burned into e-fuses.

In that case we need both open software and a requirement that the manufacturer escrow signing keys with someone so that after EOL any software can be run.

Aurornis 8 hours ago | parent | next [-]

Forcing the release of signing keys would be a security disaster. The first person to grab the expired domain for the auto-update server of an IoT device now gets a free botnet.

The only real way to make devices securely re-usable with custom firmware is to require some explicit action signalling that the user wants to run 3rd-party firmware: a specific button-press sequence is enough. The point is that the user has to do something deliberate to acknowledge that 3rd-party software is being installed.

Forcing vendors to release their security mechanisms to the public and letting anyone sign firmware as the company is not what you want, though.

Retr0id 8 hours ago | parent | next [-]

The OTA firmware update keys ideally shouldn't be the same as the secure boot keys.

jcgl 25 minutes ago | parent [-]

…how do the updates get booted then?

bigfatkitten 10 minutes ago | parent [-]

ROM bootloader loads a second stage bootloader (e.g. [1]). The ROM bootloader verifies that the second stage loader is signed with keys fused into the MCU. The second stage bootloader in turn verifies application images shipped by the vendor, using a different set of keys.

When the vendor discontinues support for the device, they make available to their customers an optional signed update to the second-stage bootloader that allows any application image to run, not just images signed by the vendor. This update can only be installed via some sort of local interaction with the device, never automatically over the air.

Devices in the field running ordinary OEM firmware continue to be protected from malicious OTA updates. Customers who wish to unlock their devices also have the means to do so.

This is quite straightforward to do, but it needs to be considered when the device is designed. Regulation would be required to make sure that happens.

[1] https://www.trustedfirmware.org/projects/mcuboot/
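
To make that concrete, here is a minimal sketch of what the second-stage check could look like. Everything below is an illustrative assumption rather than MCUboot's actual API; the point is only that the "boot anything" path is gated on a one-way flag which the vendor's final signed update sets, and which can only be flipped with local interaction.

    /* Hypothetical second-stage bootloader check; all names are illustrative,
     * not MCUboot's real API. */
    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Assumed to be provided by the platform's crypto library. */
    bool sig_verify(const uint8_t *image, size_t len,
                    const uint8_t *sig, const uint8_t *pubkey);

    /* Vendor application-signing public key, baked into this second-stage
     * bootloader (which the ROM bootloader has already verified against the
     * keys fused into the MCU before handing over control). */
    extern const uint8_t vendor_app_pubkey[64];

    /* One-way flag set by the vendor's final, signed "EOL unlock" update.
     * That update is only installable through local interaction (e.g. a
     * button held during a USB/serial flash), never over the air. */
    extern const volatile bool eol_unlock_installed;

    bool app_image_is_bootable(const uint8_t *image, size_t len,
                               const uint8_t *sig)
    {
        if (eol_unlock_installed) {
            /* Post-EOL: the owner may boot any application image. */
            return true;
        }

        /* Normal operation: only vendor-signed application images boot. */
        return sig_verify(image, len, sig, vendor_app_pubkey);
    }

Devices that never install that final update keep behaving exactly as before, which is what keeps the fielded fleet protected.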

kogepathic 7 hours ago | parent | prev [-]

> Forcing the release of signing keys would be a security disaster. The first person to grab the expired domain for the auto-update server of an IoT device now gets a free botnet.

Have you seen the state of embedded device security? It is already an unmitigated disaster.

Since you bring up botnets: far more security vulnerabilities are exploited because a vendor EOLed support (or went bankrupt) and left bugs that cannot be fixed, since signed firmware is required or the source code was never provided, than because signing keys leaked and someone distributed malicious updates.

> Forcing vendors to release their security mechanisms to the public and letting anyone sign firmware as the company is not what you want, though.

Yes, it is what I want. I am perfectly aware of the potential downsides, and I still think what I am proposing is worth it. The product is already EOL. In our current era of enshittification, a vendor's pinky promise to implement a user bypass in their signed boot chain is not good enough. Look at the OtherOS controversy on the PS3 if you want an example of this in practice, or Samsung removing bootloader unlocking in their One UI 8.0 update.

> The only real way to make devices securely re-usable with custom firmware is to require some explicit action signalling that the user wants to run 3rd-party firmware: a specific button-press sequence is enough. The point is that the user has to do something deliberate to acknowledge that 3rd-party software is being installed.

The vendor has implemented an internal pad on the laser-welded, weather-sealed, IP-rated smartwatch that must be shorted to disable secure boot. Opening the device to access it will essentially destroy it, but we preserved the vendor's secure boot signing keys, so mission accomplished!

IgorPartola 6 hours ago | parent | next [-]

But you can still do both. Put a key into escrow that unlocks the device fully, but make the key usable only if the device is physically manipulated. This could mean holding down a button as it boots up to put it into “enter the unlock key” mode. The mode is useless until the key is published, and the key is useless without physical access to the device. And you don’t need to open anything; this could be a purely software thing. As long as you can somehow externally communicate with the device via a button, Bluetooth, Ethernet, etc., you can create a system that would allow this. Hell, you could use a magnet to trigger it.

I agree that devices shouldn’t be locked by the manufacturer AND I think that silently unlocking all devices all at once could do harm.
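
For what it's worth, here is a minimal sketch of that boot-time check, assuming an asymmetric escrow key (public half baked into the firmware at manufacture, private half published after EOL). All function names are hypothetical, and binding the unlock to the device serial is an extra refinement on top of the idea above, not part of it:

    /* Hypothetical escrow-unlock path; all names are illustrative. */
    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Assumed platform hooks. */
    bool unlock_button_held_at_boot(void);                  /* physical presence */
    size_t read_unlock_signature(uint8_t *buf, size_t max); /* USB/BLE/serial */
    bool sig_verify(const uint8_t *msg, size_t len,
                    const uint8_t *sig, const uint8_t *pubkey);
    void persist_unlock_flag(void);                         /* one-way NV bit */

    /* Public half of the escrowed unlock key, baked in at manufacture;
     * useless until the private half is published after EOL. */
    extern const uint8_t escrow_unlock_pubkey[64];
    extern const uint8_t device_serial[16];

    void maybe_enter_unlock_mode(void)
    {
        /* No physical interaction, no unlock path: this can never be
         * triggered silently over the air. */
        if (!unlock_button_held_at_boot())
            return;

        uint8_t sig[64]; /* signature over this device's serial */
        if (read_unlock_signature(sig, sizeof sig) != sizeof sig)
            return;

        /* Requiring a signature over *this* unit's serial means the published
         * key still cannot unlock a whole fleet in one shot. */
        if (sig_verify(device_serial, sizeof device_serial,
                       sig, escrow_unlock_pubkey))
            persist_unlock_flag(); /* from now on, boot any image */
    }

Until the escrow key is published the mode verifies nothing useful, and once it is published it still requires someone physically at the device, which avoids the silent fleet-wide unlock.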

Aurornis 3 hours ago | parent | prev [-]

> Have you seen the state of embedded device security? It is already an unmitigated disaster.

If security were an unmitigated disaster on every device, then it would be trivial to root them all and install your own software, wouldn’t it?

razighter777 8 hours ago | parent | prev | next [-]

How about just allowing key enrollment with a physical button?

kogepathic 8 hours ago | parent [-]

This is very much not an option on most embedded devices. They allow one key to be burned once.

IIRC, a certain Marvell SoC datasheet says multiple key slots are supported, but the boot ROM only supports reading the first entry (so really, only one key is supported).

nullpoint420 4 hours ago | parent [-]

Unless it becomes law and the hardware makers adapt.

realusername 5 hours ago | parent | prev [-]

Locked bootloaders should just be completely forbidden, even for brand-new devices. Hardware and phone owners have the right to make any change they see fit on their device, whether or not the manufacturer thinks it's OK.

goku12 4 hours ago | parent [-]

I agree with you fully on this. Unfortunately, the odds are stacked heavily against us. It's not just the manufacturers who resort to these underhanded profiteering tactics; even the regulatory agencies are in favor of locking down the firmware.

Their argument is that unlocked firmware would allow us to override regulatory restrictions like RF output power limits or the IMEI. That argument has some merit. However, my opinion is that such restrictions should be implemented as hardware interlocks that cannot be changed through software, leaving us free to change the software as we like. Sadly, both the manufacturers and the regulatory agencies tend to completely ignore that solution, so that they can retain their excess control.

allreduce 14 minutes ago | parent | next [-]

It's trivially easy to break those restrictions with off the shelf SDR hardware you can buy rather cheaply.

Locking people out of their phone doesn't raise the skill or effort bar much, as there would presumably still be software restrictions in place.

realusername 2 hours ago | parent | prev [-]

I've always found this claim completely bogus: you can always do something illegal with your phone, and there's no way to prevent everything with software.

Preventing that is the job of law enforcement and the justice system in general; in this argument, a hardware manufacturer substitutes itself into that role, and once we put it that way, the overreach is clear. Manufacturers aren't public entities with the authority to make such decisions.