impossiblefork 2 days ago

If you can't secure computers against state attackers, then you have to stop using computers and simply talk in places where there are no phones, computers, etc.

If you're afraid of directional microphones out in the woods, there are countermeasures for that too, but security is very possible even against the most well-funded attackers.

Furthermore, I don't think even internet-connected secure computers are so hard that they can't be built. Limit what you do, so that the program is short enough that you can afford to have theoretical guarantees-- maybe write it to run on a machine with a Harvard architecture, so a buffer overflow can't turn into code execution, and you can probably build one on an FPGA, even as a hobbyist.
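
To make that concrete, here's a toy model (in Rust, with a made-up three-instruction machine) of why the Harvard split matters: the program sits in its own read-only instruction memory and data in a separate array, so nothing written into data memory can ever become code. It's only a sketch of the idea, not a real design.

    // Toy Harvard-style machine: instruction memory and data memory are
    // physically separate, so writes to data can never modify the program.
    #[derive(Clone, Copy)]
    enum Instr {
        LoadImm { reg: usize, value: u8 }, // reg := value
        Store { reg: usize, addr: usize }, // data[addr] := reg
        Halt,
    }

    // Instruction memory: fixed at build time, never writable at run time.
    const PROGRAM: [Instr; 3] = [
        Instr::LoadImm { reg: 0, value: 42 },
        Instr::Store { reg: 0, addr: 7 },
        Instr::Halt,
    ];

    fn run() -> [u8; 16] {
        let mut data = [0u8; 16]; // data memory, separate from PROGRAM
        let mut regs = [0u8; 4];
        let mut pc = 0;
        loop {
            match PROGRAM[pc] {
                Instr::LoadImm { reg, value } => regs[reg] = value,
                Instr::Store { reg, addr } => data[addr % data.len()] = regs[reg],
                Instr::Halt => return data,
            }
            pc += 1;
        }
    }

    fn main() {
        assert_eq!(run()[7], 42);
    }

On an FPGA the same structure would be a small state machine over a ROM and a block RAM; the point is just that the whole thing is small enough to read in one sitting.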

State attackers aren't magic.

Yeul 2 days ago | parent | next [-]

I read once that when America refurbishes an embassy somewhere in the world they bring in their own construction company. Otherwise you end up with mics in the walls.

Used to think the Chinese were paranoid with their bans on iPhones and Teslas...

sgarland 2 days ago | parent | next [-]

Kind of. They're required (or agree to?) to use local labor at least in part, but there are American companies that manage the construction. My grandfather (a U.S. citizen) does security inspections for embassy construction, verifying that it's built to plan, that all materials are traceable to point of origin, etc.

hammock 2 days ago | parent [-]

Why not pay a local labor team to sit idle while you do the work yourself? Would be worth it

Nab443 2 days ago | parent | prev | next [-]

iPhones and Teslas would be overkill anyway: https://www.cryptomuseum.com/covert/bugs/thing/

impossiblefork 2 days ago | parent | prev [-]

Yeah, that seems completely unavoidable otherwise.

I've always seen it as pretty strange to carry around other people's computers or to use external services-- so I've always found it strange that any country other than the US allows people to use things like phones, Google Maps, etc.

I don't think one absolutely needs to make everything oneself, but I can't imagine it's sensible for everybody to use external services, so that so much information ends up in one place.

chgs 2 days ago | parent | prev [-]

Until state attackers pick up your developer's kids and bring them home from school, and then nicely ask him to put in a back door.

impossiblefork 2 days ago | parent [-]

But how would they know who the developer is? That's the neat part of not putting things where people can find them out.

Also, if you really keep it short, you can always check that he hasn't by reading it. You could also just never update it, and let it become ancient and well-tested.

Spooky23 2 days ago | parent [-]

Lots of espionage and surveillance within government and contractors.

Lots of body shop contractors are fake people anyway. Pretty easy to imagine placing a compromised person in a low sensitivity area, then moving laterally.

impossiblefork 2 days ago | parent [-]

But why would you hire consultants to solve core security problems?

Furthermore, surely it would just be one guy who knows OS and FPGA stuff and another guy to check it?

What I'm arguing for is that a sensible solution to security problems is to avoid complexity, so that things can be obviously secure.

Carefully defined interfaces that are clear, impossible to misinterpret, designed to be parsed and implemented without any fiddly parsing that can lead to difficulties, and small enough that someone can implement them in an afternoon. Combine that with a machine inherently robust to things like buffer overflows, such as a Harvard-architecture design, and it's easy even for a single engineer to program something like that up on an FPGA.
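
For concreteness, here's a hypothetical sketch of such an interface (names and format invented for illustration): every message is a fixed 12-byte frame, a 4-byte command followed by an 8-byte big-endian argument, so the entire parser is a few lines with nothing variable-length in it.

    const FRAME_LEN: usize = 12;

    #[derive(Debug, PartialEq)]
    enum Request {
        Read { address: u64 },
        Write { address: u64 },
    }

    // The whole wire protocol: a 4-byte ASCII command plus an 8-byte argument.
    // Anything that isn't exactly one of the known commands is rejected.
    fn parse_frame(frame: &[u8; FRAME_LEN]) -> Option<Request> {
        let command: &[u8; 4] = frame[..4].try_into().ok()?;
        let argument = u64::from_be_bytes(frame[4..12].try_into().ok()?);
        match command {
            b"READ" => Some(Request::Read { address: argument }),
            b"WRIT" => Some(Request::Write { address: argument }),
            _ => None,
        }
    }

    fn main() {
        let mut frame = [0u8; FRAME_LEN];
        frame[..4].copy_from_slice(b"READ");
        frame[4..].copy_from_slice(&0x1000u64.to_be_bytes());
        assert_eq!(parse_frame(&frame), Some(Request::Read { address: 0x1000 }));
    }

There's nothing here a reviewer could misread in an afternoon, which is the point.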

Spooky23 2 days ago | parent [-]

You don’t.

You hire them for other lower priority roles, but they are inside the firewall. Most large organizations have an immature zero trust environment.

Look at the Microsoft PKI breach. The adversary was able to compromise certificate services in a corporate dev environment and parlay that into access to US government mailboxes in a supposedly isolated cloud tenant. Microsoft has a world-class security practice. The average Fortune 1000 is toast.

stackskipton 2 days ago | parent | next [-]

The Microsoft PKI breach happened because they were not following world-class security practice. For some reason, the consumer environment could sign corporate-environment logins. Also, they acquired some company, and instead of issuing them new hardware to ensure it wasn't compromised, they just let them onto their network.

When you read the report, it was very clear that Microsoft wasn't doing "World Class Security Practice", they were taking shortcuts like everyone else does.

Spooky23 2 days ago | parent [-]

Yup. They fucked up pretty bad. How many places do you think are worse than them?

stackskipton 2 days ago | parent [-]

Probably all of them because no one loses money for bad InfoSec practices.

impossiblefork a day ago | parent | prev [-]

But Microsoft doesn't take this approach at all.

Their software is huge, with all sorts of things integrated into it, and there's no focus at all on keeping it small enough that one person can read it through with enough care to be assured it's secure.

They probably run their cloud stuff on processors that can reorder instructions and all sorts of things, whereas what I'm arguing for is simple computers, things that can run a text-only search engine and where the text editor is substantially simpler than nano.

Where you decide exactly what your requirements are and make a system which solves that problem and nothing else.
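
As a rough sketch of that kind of minimalism (everything here invented for illustration), the "text-only search engine" could be little more than a substring scan over plain-text lines, and nothing else:

    // A search "engine" reduced to its bare requirement: return the lines of a
    // plain-text corpus that contain the query. No ranking, no file formats,
    // no network code -- nothing to audit beyond a substring scan.
    fn search<'a>(corpus: &'a str, query: &str) -> Vec<&'a str> {
        corpus
            .lines()
            .filter(|line| line.contains(query))
            .collect()
    }

    fn main() {
        let corpus = "state attackers aren't magic\nsecurity through simplicity\n";
        for hit in search(corpus, "simplicity") {
            println!("{hit}");
        }
    }

Whether that's enough of a search engine is a requirements question, but the code itself is short enough to be read in full.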