romaniv 2 hours ago

I still hope that one of these days people in general will realize that executable signing and SecureBoot are specifically designed for controlling what a normal person can run, rather than for anything resembling real security. The premises of either of those "mitigations" make absolutely no sense for personal computers.

arcfour 6 minutes ago | parent | next [-]

I strongly disagree on the Secure Boot front. It's necessary for FDE to have any sort of practical security, it reduces malicious/vulnerable driver abuse (making it nontrivial), bootkits are a security nightmare and would otherwise be much more common in malware typical users encounter, and ultimately the user can control their secure boot setup and enroll their own keys if they wish.
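To make the FDE point concrete, here is a toy sketch (plain Python, stdlib only; the "PCR" is a simulated register, not a real TPM) of the measure-and-extend chain that lets a TPM refuse to unseal a disk key once any boot component changes:

    import hashlib

    def extend(pcr: bytes, component: bytes) -> bytes:
        # TPM-style extend: new PCR = SHA-256(old PCR || SHA-256(component)).
        # Order-dependent and one-way, so a change anywhere in the
        # boot chain yields a different final value.
        return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

    pcr = b"\x00" * 32  # PCRs start zeroed at power-on
    for stage in [b"firmware", b"bootloader", b"kernel"]:
        pcr = extend(pcr, stage)
    sealed_against = pcr  # the disk key is sealed to this value

    # Swap in a tampered bootloader: the chain measures differently,
    # so the TPM would never release the FDE key.
    pcr = b"\x00" * 32
    for stage in [b"firmware", b"evil bootloader", b"kernel"]:
        pcr = extend(pcr, stage)
    assert pcr != sealed_against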

Does that mean that Microsoft doesn't also use it as a form of control? Of course not. But conflating "Secure Boot can be used for platform control" with "Secure Boot provides no security" is a non-sequitur.

kelseyfrog 4 minutes ago | parent [-]

Anything that restricts user freedom is entirely bad, even if that freedom comes at the expense of security.

astrobe_ 2 hours ago | parent | prev | next [-]

I don't know about executable signing, but in the embedded world SecureBoot is also used to serve the customer, i.e. to provide guarantees to the customer that the firmware of the device they receive has not been tampered with at some point in the supply chain.

201984 2 hours ago | parent | next [-]

And what if that customer wants to run their own firmware, e.g. after the manufacturer goes out of business? "Security" in this case conveniently prevents that.

hhh 18 minutes ago | parent | next [-]

you click the box to turn off secure boot

gjsman-1000 2 hours ago | parent | prev [-]

Tradeoffs. Which is more likely here?

1. A customer wants to run their own firmware, or

2. Someone malicious close to the customer, say an angry ex, tampers with their device and uses the lack of Secure Boot to modify the OS to hide all traces of a tracker's existence, or

3. A malicious piece of firmware uses the lack of Secure Boot to modify the boot partition to ensure the malware loads before the OS, thereby permanently disabling all ability for the system to repair itself from within itself

Apple uses #2 and #3 in their own arguments. If your Mac gets hacked, that's bad. If your iPhone gets hacked, that's your life, and your precise location, at all times.

samlinnfer 2 hours ago | parent | next [-]

1. P(someone wants to run their own firmware)

2. P(someone wants to run their own firmware) * P(this person is malicious) * P(this person implants this firmware on someone else’s computer)

3. The firmware doesn’t install itself

Yeah, I think 2 and 3 are vastly less likely, and strictly lower, than 1.
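Spelled out (with W = wants to run their own firmware, M = is malicious, I = implants it on someone else's device), the inequality being claimed is:

    P(2) = P(W) \cdot P(M \mid W) \cdot P(I \mid W, M) \le P(W) = P(1)

since each conditional factor is at most 1.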

the__alchemist 4 minutes ago | parent | next [-]

I encourage you to re-evaluate this. How many devices do you own (or have you owned) that have a microcontroller? (This includes all your appliances, your clocks, and many other things you own that use electricity.) How many of these have you reflashed with custom firmware?

Now imagine any of your friends, family, or colleagues (including the non-programmers/hackers/embedded engineers). What would their answers be?

mikestew 41 minutes ago | parent | prev | next [-]

As an embedded programmer in my former life, the number of customers that had the capability of running their own firmware, let alone the number that actually would, rapidly approaches zero. Like it or not, what customers bought was an appliance, not a general purpose computer.

itsdesmond an hour ago | parent | prev | next [-]

This guy thinks that if you rephrase an argument but put some symbols around it you’ve refuted it statistically.

P(robably not)

samlinnfer an hour ago | parent [-]

The argument is that P(customer wants to run their own firmware) cancels out, and 2 and 3 reduce to the raw probability of you being on the receiving end of an evil maid attack. If you think that probability is high, a locked bootloader won't save you.

FabHK 6 minutes ago | parent [-]

Very neat, but 1) is not really P(customer wants to run their own firmware), but P(customer wants to run their own firmware on their own device).

So the first factor in 2) is NOT the same as the probability in 1), and it is quite conceivable that 2) is indeed higher than 1) (which is what your pseudo-statistical argument aimed, unsuccessfully, to refute).

philistine 2 hours ago | parent | prev | next [-]

As if the monetary gain of 2 and 3 never entered the picture. Malicious actors want 2 and 3 to make money off you! No one can make reasonable amounts of money off 1.

gjsman-1000 2 hours ago | parent | prev | next [-]

On Android, according to the Coalition Against Stalkerware, there are over 1 million victims every year of spyware deliberately placed on an unlocked device by someone malicious close to the victim.

#2 is WAY more likely than #1. And that's on Android, which still has some protections even with a sideloaded APK (deeply nested, but still detectable if you look at the right settings panels).

As for #3: the point is that it's a virus. You start with a WebKit bug and get into the kernel from there (it sometimes happens); but this time, instead of a software update fixing it, your device is owned forever. It literally cannot be trusted again without a full DFU wipe.

samlinnfer 2 hours ago | parent [-]

And where are the comparison stats for people who run their own firmware and are not running stalkerware? You don't need firmware access to install malware on Android, so how many of those stalkerware victims would actually have been saved by a locked bootloader?

gjsman-1000 2 hours ago | parent [-]

The entirety of GrapheneOS is about 200K downloads per update. Malicious use therefore outnumbers it by roughly 5:1.

> You don't need firmware access to install malware on Android, so how many of those stalkerware victims would actually have been saved by a locked bootloader?

With a locked bootloader, the underlying OS is intact, meaning that the privileges of the spyware (if you look in the right settings panel) can easily be detected, revoked, and removed. If the OS could be tampered with, you bet your wallet the spyware would immediately patch the settings system, and the OS as a whole, to hide all traces.

samlinnfer an hour ago | parent | next [-]

Assume we accept your premise that the most popular custom firmware for Android is stalkerware (I don't). This is, of course, firmware-level malware, which of course acts as a rootkit and is fully undetectable. How, pray tell, did the Coalition Against Stalkerware manage to detect such an undetectable firmware-level rootkit on over 1 million Android devices?

kuschku an hour ago | parent | prev [-]

LineageOS alone has around 4 million active users. So malicious use is at most 1:4, not 5:1.
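Taking all of the figures cited above at face value:

    victims_per_year = 1_000_000   # Coalition Against Stalkerware figure cited above
    graphene_users   =   200_000   # downloads per update, per the claim above
    lineage_users    = 4_000_000   # active LineageOS users

    print(victims_per_year / graphene_users)  # 5.0  -> the claimed 5:1
    print(victims_per_year / lineage_users)   # 0.25 -> at most 1:4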

lazide an hour ago | parent | prev [-]

Clearly you've never met my exes (or a past employer). Not even being sarcastic this time.

dns_snek an hour ago | parent | prev [-]

#2 and #3 are fearmongering arguments and total horseshit, excuse the strong language.

Should either of those things happen, the bootloader puts up a big, bright, flashing yellow warning screen saying "Someone hacked your device!"

I use a Pixel device running GrapheneOS; the bootloader always pauses for ~5 seconds to warn me that the OS is not official.

root_axis 39 minutes ago | parent [-]

Yes. They're making the point that your flashing yellow warning is a good thing, and that it's helpful to the customer that a mechanism is in place to prevent it from being disabled by an attacker.

tosti 2 hours ago | parent | prev | next [-]

Computers should obey their owners. Any computer not doing that is broken.

ghighi7878 an hour ago | parent | next [-]

It's a simple solution to enable in law: force manufacturers to allow the owner of a computer to put any signing key in the BIOS.

We need this law. Once we have it, consumers can get the maximum benefit of secure boot without losing control.

cferry an hour ago | parent | prev | next [-]

I'd make the analogy with a company, because on that front ownership seems to matter a lot in the Western world. Imagine a company forced to accept unfaithful management appointed by one of its suppliers as a condition of using that supplier's products. Worse, said supplier also supplies every other business, and its products are not interoperable. How long before courts jumped in to stop this and give control back to the business owner?

wat10000 an hour ago | parent | prev [-]

This gets tricky. If I click on a link intending to view a picture of a cat, but instead it installs ransomware, is the computer obeying its owner or not? It did what I told it to do, but not at all what I wanted.

ghighi7878 an hour ago | parent [-]

We don't need to get philosophical here. You (the admin) can require you (the user) to input a password to signal to you (the admin) that something, ransomware included, should be installed when a link is clicked. That way no control is lost.

wat10000 35 minutes ago | parent [-]

What if the cat pictures are an app too? The computer can't require a password specifically for ransomware, only for software in general. The UI flow for cat-picture apps and for ransomware will be identical.

Galanwe an hour ago | parent | prev | next [-]

> i.e. to provide guarantees to the customer that the firmware of the device they receive has not been tampered with

The firmware of the device being a binary blob for the most part... Not like I trust it to begin with.

Whereas my open source Linux distribution requires me to disable SecureBoot.

What a world.

WhyNotHugo 22 minutes ago | parent | next [-]

You can set up custom SecureBoot keys on your firmware and configure Linux to boot using them.
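One common route is sbctl. A sketch only, assuming sbctl is installed, the firmware has been put into Secure Boot setup mode, and an Arch-style kernel path (adjust for your distro):

    import subprocess

    def run(*cmd: str) -> None:
        subprocess.run(cmd, check=True)

    run("sbctl", "create-keys")                 # generate your own PK/KEK/db keys
    run("sbctl", "enroll-keys", "--microsoft")  # enroll them, keeping MS keys for option ROMs
    run("sbctl", "sign", "-s", "/boot/vmlinuz-linux")  # sign the kernel, track it for re-signing
    run("sbctl", "verify")                      # confirm everything on the ESP is signed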

There are also plenty of folks combining this with a TPM and boot measurements.

The ugly part of SecureBoot is that all hardware comes with MS's keys, and lots of software assumes that you'll want MS in charge of your hardware security, but SecureBoot _can_ be used to serve the user.

Obviously there's hardware that's the exception to this, and I totally share your dislike of it.

repelsteeltje 29 minutes ago | parent | prev [-]

+1

An unsigned hash is plenty of a guard against tampering. The supply chain, and whatever secret sauce went into that firmware, comes down to trust anyway. Trust that the blob is well intentioned, trust that you downloaded it from the right URL and checked the right SHA, trust that the organization running the URL is sanctioned to do so by Microsoft...

Once all of that trust for every piece of software is concentrated in one organization, Microsoft, Apple or Google, it becomes totally meaningless.
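For reference, the "checked the right SHA" step amounts to nothing more than this (a minimal sketch; the filename and published digest are placeholders):

    import hashlib

    PUBLISHED_SHA256 = "..."  # the digest the vendor publishes -- itself a matter of trust

    with open("firmware.bin", "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    if digest != PUBLISHED_SHA256:
        raise SystemExit("firmware.bin does not match the published hash")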

mort96 an hour ago | parent | prev | next [-]

It's to serve the regulators. The Radio Equipment Directive essentially requires the use of secure boot for new devices.

petcat an hour ago | parent [-]

I happen to like knowing that my mobile device did not have a ring 0 backdoor installed before it left the factory in Asia. SecureBoot gives me that confidence.

mort96 27 minutes ago | parent [-]

No it doesn't? The factory programs in the secure boot public keys.

petcat 18 minutes ago | parent [-]

The public keys are provided by the developer: Google or Apple, for example. That's how they know that nothing was tampered with before the device left the factory.
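The mechanism being described is just signature verification against a baked-in public key. A toy sketch using the pypi cryptography package (real boot ROMs do this check in mask ROM, and the key pair here is generated on the spot rather than held by a vendor):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    vendor_key = Ed25519PrivateKey.generate()  # private half stays with Google/Apple
    baked_in_pubkey = vendor_key.public_key()  # burned into the device at design time

    firmware = b"stand-in firmware image"
    signature = vendor_key.sign(firmware)      # done at build time, not in the factory

    try:
        baked_in_pubkey.verify(signature, firmware)  # the boot ROM's check on every boot
        print("boot: image is vendor-signed")
    except InvalidSignature:
        print("refuse to boot: image was tampered with")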

asveikau 13 minutes ago | parent | prev | next [-]

Apple is also somewhat responsible for the attitude shift with the introduction of iOS. 20-25 years ago, a locked-down bootloader and only permitting signed code would have been seen by techies as dystopian. It's now quite normalized. They say it's about security, but it's always been about control.

Stallman tried to warn us with "tivoization".

pjmlp an hour ago | parent | prev [-]

If only people didn't install Ask Jeeves toolbars all over the place and then ask their grandson during the holidays to clean their computer.