nijave 11 hours ago

I was always under the impression security was a red herring and the real reason was control. Google wants to own the device and rent it to users with revocable terms the same way SaaS subscription software works. Locking down what can run is a key step in that process

browningstreet 11 hours ago | parent | next [-]

I worked at a bank on the backend, doing architecture and security, and I've posted this attestation here before: the sheer volume of fraud and fraud attempts across the whole network is astonishing. Our device fingerprinting and no-jailbreak rules weren't even close to an attempt at control. They were defense, based on network volume and hard losses.

A significant loss of customer identity data or funds was considered an existential threat, both to our customers and to our institution.

I'm not coming to Google's defense, but fraud is a big, heavy, violent force in critical infrastructure.

And our phones are a compelling surface area for attacks and identity thefts.

josephg 10 hours ago | parent | next [-]

I wish we had technical solutions that offered both. For example, a kernel like SeL4, which could directly run sandboxed applications, like banking apps. Apps run in this way could prove they are running in a sandbox.

Then also allow the kernel to run linux as a process, and run whatever you like there, however you want.

It's technically possible at the device level. The hard part seems to be UX. Do you show trusted and untrusted apps alongside one another? How do you teach users the difference?
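To make the "prove they are running in a sandbox" part concrete, here is a toy Python sketch of remote attestation. Everything in it (the key, the measurement strings) is hypothetical; real systems use hardware-backed asymmetric keys so the verifier never holds the device secret, but the shape of the check is the same:

```python
import hashlib
import hmac

# Hypothetical device key, provisioned in hardware at manufacture time
# and never exposed to apps. Real attestation uses an asymmetric keypair.
DEVICE_KEY = b"secret-provisioned-at-manufacture"

def attest(sandbox_measurement: bytes, nonce: bytes) -> bytes:
    """The trusted kernel MACs a measurement of the sandbox state plus a
    verifier-supplied nonce, proving the app runs in that exact sandbox."""
    return hmac.new(DEVICE_KEY, sandbox_measurement + nonce, hashlib.sha256).digest()

def verify(sandbox_measurement: bytes, nonce: bytes, report: bytes) -> bool:
    """A bank's server, knowing the expected measurement, checks the report."""
    expected = hmac.new(DEVICE_KEY, sandbox_measurement + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, report)

nonce = b"fresh-server-nonce"
report = attest(b"banking-app-in-sel4-sandbox", nonce)
assert verify(b"banking-app-in-sel4-sandbox", nonce, report)
assert not verify(b"tampered-environment", nonce, report)
```

The nonce prevents replaying an old report; the measurement ties the report to a specific sandbox configuration, so a banking app could refuse to run if the check fails.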

My piano teacher was recently scammed. The attackers took all the money in her bank account. As far as I could tell, they did it by convincing her to install some Android app on her phone and then grant that app accessibility permissions. That let the app remotely control other apps. Then they simply swapped over to her banking app and transferred all the money out. It's tricky, because obviously we want third-party accessibility applications. But if those permissions let applications escape their sandbox, that's trouble.

(She contacted the bank and the police, and they managed to reverse the transactions and get her her money back. But she was a mess for a few days.)

JuniperMesos 9 hours ago | parent | next [-]

> (She contacted the bank and the police, and they managed to reverse the transactions and get her her money back. But she was a mess for a few days.)

And this almost certainly means that the bank took a fraud-related monetary loss, because the regulatory framework that governs banks makes it difficult for them to refuse to return their customer's money on the grounds that it was actually your piano teacher's fault for being stupid with her bank app on her smartphone (also, even if it were legal to do so, doing this regularly would create a lot of bad press for the bank). And they're unlikely to recover the losses from the actual scammers.

Fraud losses are something that banks track internally and attempt to minimize when possible and when it doesn't trade-off against other goals they have, such as maintaining regulatory compliance or costing more money than the fraud does. This means that banks - really, any regulated financial institution at all that has a smartphone app - have a financial incentive to encourage Apple and Google to build functionality into their mass-market smartphone OSs that locks them down and makes it harder for attackers to scam ordinary, unsophisticated customers in this way. They have zero incentive to lobby to make smartphone platforms more open. And there's a lot more technically-unsophisticated users like your piano teacher than there are free-software-enthusiasts who care about their smartphone OS provider not locking down the OS.

I think this is a bad thing, but then I'm personally a free-software-enthusiast, not a technically-unsophisticated smartphone user.

josephg 8 hours ago | parent | next [-]

> And this almost certainly means that the bank took a fraud-related monetary loss, because the regulatory framework that governs banks makes it difficult for them to refuse to return their customer's money on the grounds that it was actually your piano teacher's fault for being stupid with her bank app on her smartphone

In which country? This happened in Australia. The rules are almost certainly different from the US.

SchemaLoad 9 hours ago | parent | prev | next [-]

For me the answer is separate devices. I have an iPhone which is locked down and secure; my banking and ID apps live on it, but I can't mod it however I want. Then I have a Steam Deck and a Raspberry Pi for entertainment and whatever else I want, and I can customise anything there. And if they get hacked, nothing of importance is exposed.

EvanAnderson 9 hours ago | parent | prev | next [-]

> For example, a kernel like SeL4, which could directly run sandboxed applications, like banking apps. Apps run in this way could prove they are running in a sandbox. ... Then also allow the kernel to run linux as a process, and run whatever you like there, however you want.

This won't work. It's turtles all the way down and it will just end up back where we are now.

More software will demand installation in the sandboxed enclave. Outside the enclave the owner of the device would be able to exert control over the software, and the software makers don't want device owners exerting control over the software (for "security", for copyright enforcement, or to prevent ad avoidance). The end user is the adversary as much as the scammer, if not more.

The problem at the root of this is the "right" some (entitled) developers / companies believe they have to control how end users run "their" software on devices that belong to the end users. If a developer wants that kind of control over the "experience", the software should run on a computer they own, simply using the end user's device as a "dumb terminal".

Those economics aren't as good, though. They'd have to pay for all their compute / storage / bandwidth, versus just using the end user's. So much cheaper to treat other people's devices like they're your own.

It's the same "privatize gains, socialize losses" story that's at the root of so many problems.

josephg 8 hours ago | parent [-]

Good point. I didn't think of that.

It may still be an improvement over the situation now though. At least something like this would let you run arbitrary software on the device. That software just wouldn't have "root", since whatever you run would be running in a separate container from the OS and banking apps and things.

It would also allow 3rd party app stores, since a 3rd party app store app could be a sandboxed application itself, and then it could in turn pass privileges to any applications it launches.

EvanAnderson 8 hours ago | parent [-]

It's what we have now.

I can run an emulator in the browser on my phone and run whatever software I want. The software inside that emulator doesn't get access to cool physical hardware features. It runs at a performance loss. It doesn't have direct network access. Second-class software.

josephg 7 hours ago | parent [-]

It's not what we have now, for the reasons you list: web software runs slowly and doesn't have access to the hardware.

SeL4 and similar sandboxing mechanisms run programs at full native speed. In the scheme I'm proposing, all software would be sandboxed using the same mechanism, including banking apps and third-party software. Everything can run fast and take full advantage of the hardware and all exposed APIs. Apps just can't mess with one another, so random programs can't mess with the banking app.

Some people in this thread have proposed using separate devices for secure computing (eg banking) and "hacking". That's probably the right thing in practice. But you could - at least technically - build a device that let you do both on top of SeL4. Just have different sandboxed contexts for each type of software. (And the root kernel would have to be trusted).

EvanAnderson 6 hours ago | parent [-]

I'm not familiar with SeL4 other than in the abstract sense that I know it's a verified kernel.

I interpreted your statement "Then also allow the kernel to run linux as a process, and run whatever you like there, however you want." as the Linux process being analogous to a VM. Invoking an emulator wasn't really the right analogy. Sorry about that.

For me it comes down to this:

As long as the root of trust in the device is controlled by the device owner, the copyright cartels, control-freak developers, companies who profit from end users viewing ads, and interests who would create "security" by removing user freedom (to get out of fraud liability) won't be satisfied.

Likewise, if the root of trust in the device isn't controlled by the device owner, then they're not really the device owner.

josephg 5 hours ago | parent [-]

Yes; I think that's the real impasse here. As I say, I think there is a middle ground where the device owner keeps the keys but programmers can run whatever software they want within sandboxes, including Linux. And sandboxes aren't just "an app": they could also nest, and contain third-party app stores and whatever wild stuff people want to make.
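The nesting idea is essentially a capability model: a sandbox can only grant a child a subset of what it holds itself. A toy Python model (all names here are illustrative, not any real seL4 API):

```python
class Sandbox:
    """Toy capability model: a sandbox holds a set of capabilities and
    can only delegate capabilities it actually possesses."""

    def __init__(self, name: str, capabilities: set[str]):
        self.name = name
        self.capabilities = frozenset(capabilities)

    def spawn(self, name: str, requested: set[str]) -> "Sandbox":
        # A child receives at most the intersection of what it asks for
        # and what the parent holds; privileges can shrink, never grow.
        granted = self.capabilities & frozenset(requested)
        return Sandbox(name, granted)

kernel = Sandbox("kernel", {"network", "display", "storage", "camera"})
store = kernel.spawn("third-party-store", {"network", "display", "storage"})
game = store.spawn("game", {"display", "camera"})  # store holds no camera cap

assert "display" in game.capabilities
assert "camera" not in game.capabilities  # can't exceed the parent
```

Under a rule like this, a third-party app store is just another sandbox, and nothing it launches can reach further than the store itself was allowed to.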

But a design like this might please nobody. Apple doesn't want 3rd party app stores. Or really hackers to do anything they don't approve of. And hackers want actual root.

dwaite 9 hours ago | parent | prev | next [-]

Yes, sandboxing is a technological protection, but once you have important data flowing we often don't have technological protections to prevent exfiltration and abuse. The global nature of the internet means that someone who publishes an app which abuses user expectations (e.g. uses accessibility to provide command and control to attackers) is often out of legal reach.

You also have so much grey area where things aren't actually illegal, such as gathering a massive amount of information on adults in the US via third-party cookies and ubiquitous third-party javascript.

That's why platforms created in the internet age are much more opinionated about the APIs they provide to apps, much more stringent about sandboxing, and try to push software installation through app stores that can restrict apps based on business policy, going beyond technological and legal limitations.

kllrnohj 9 hours ago | parent | prev | next [-]

The problem is it's quite easy to poke holes in a sandbox when you're outside the sandbox looking in, especially when the user is granting you special permissions they don't understand. These apps aren't doing things like manipulating the heap of the banking app, they are instead just taking advantage of useful but powerful features like screen mirroring to read what the app is rendering.

curt15 9 hours ago | parent | prev | next [-]

> As far as I could tell, they did it by convincing her to install some Android app on her phone and then grant that app accessibility permissions.

Did she make it through the non-Google Play app install flow?

nijave 9 hours ago | parent | prev [-]

Web browsers already handle sandboxing

curt15 7 hours ago | parent [-]

Don't know why this was downvoted. Some people prefer to access online services from the safety of a web browser sandbox rather than through an always-installed wrapper app.

gzread 10 hours ago | parent | prev | next [-]

Then don't issue an app. Issue people cards to pay with and let them come to the bank for weird transactions.

catdog an hour ago | parent | next [-]

You can even use the chip on the card together with a cheap hardware device to authorize the transactions made in the app. This has actually existed [1] for quite some time, but seems to be mostly limited to Germany. And this and other hardware-token systems are on the decline: banks increasingly use apps now, increasingly without any meaningful second factor, and without even offering better options. They want this and are fully to blame.

[1] https://en.wikipedia.org/wiki/Transaction_authentication_num... (This is a bit outdated, nowadays it works via QR codes instead of those flickering barcodes but the concept stays the same)
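The real chipTAN protocol is more involved, but the core idea is that the code is computed over the transaction details on a device the malware can't touch. A rough Python sketch (the key and code format are invented for illustration):

```python
import hashlib
import hmac

# Hypothetical symmetric key stored inside the bank card's chip;
# the bank holds the matching key server-side.
CARD_KEY = b"key-inside-the-bank-card-chip"

def tan_for(recipient: str, amount_cents: int) -> str:
    """Derive a short transaction code bound to recipient and amount."""
    msg = f"{recipient}:{amount_cents}".encode()
    return hmac.new(CARD_KEY, msg, hashlib.sha256).hexdigest()[:6]

# The card reader shows recipient and amount on its own trusted display;
# the user confirms and types the code into the (possibly compromised) app.
tan = tan_for("DE89 3704 0044 0532 0130 00", 5000)

# If phone malware silently rewrites the recipient, the bank's check fails,
# because the code the user entered was bound to the original details:
assert tan == tan_for("DE89 3704 0044 0532 0130 00", 5000)
assert tan != tan_for("attacker-account", 5000)
```

That binding is why a fully compromised phone still can't redirect the payment: the phone never sees a code valid for any transaction other than the one shown on the reader.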

quesera 10 hours ago | parent | prev | next [-]

That'd be great, if your goal was to hemorrhage customers.

drnick1 9 hours ago | parent | prev [-]

This, 100%. I don't understand why everything needs to be an app nowadays. Some things are best done in person and without technology. No, I won't install some shitty app that requests location and network access just to order lunch. If a venue doesn't provide a paper menu and accept cash, they have lost my custom.

pas 4 hours ago | parent [-]

Revolut seems to work without physical presence.

And the website and app of my bank with offices is ... how should I put it ... a bit Kafkaesque.

The obvious thing banks should be doing is putting fucking restrictions on these accounts by default and let people ask for exceptions.

And of course if regulations don't encourage them to pick social-engineering-proof defaults then things won't improve.

nijave 9 hours ago | parent | prev | next [-]

Yeah, I worked at a bank once. I was told that following policy and using dependencies with known vulnerabilities (so my ass was covered) was more important than actually making sure things were secure; it was someone else's problem to get that update through the layers of approval. Needless to say, I didn't last long.

ls612 10 hours ago | parent | prev [-]

Do you allow customers to log in to their account with a web browser on a windows machine?

daemin 8 hours ago | parent | prev [-]

What would happen to a normal person's phone if Google decided to revoke their Google account? Would the phone still function? Or is it "just" a matter of creating another Google account?