Almondsetat 13 hours ago

>this also creates a situation where anything said across federation cannot be unsaid, which is an ironic situation for a protocol/system that often comes up when talking about privacy.

How is it ironic? No protocol in the world can force anyone to delete anything from their own device. Chat apps that implement this function are either proprietary (so you cannot control what they can do) or, if OSS, do it on a pinky-promise-basis.

progval 12 hours ago | parent | next [-]

> No protocol in the world can force anyone to delete anything from their own device.

But they either do not sign the messages or allow repudiating the signatures. Matrix signs all events forever.

Matrix also makes the entire event history (minus message content depending on room configuration) available to servers on join, even if that server's users are not allowed to see it.

wahern 10 hours ago | parent | next [-]

These are distinctions without a difference. Events replicated across several independent Matrix servers are not meaningfully different from events broadcast across independent clients in terms of observability or repudiation.

progval 9 hours ago | parent [-]

But normally when you join a conversation and are not allowed to see previous messages, you don't see anything about them. A matrix server does.

Arathorn 8 hours ago | parent | prev [-]

> Matrix also makes the entire event history (minus message content depending on room configuration) available to servers on join

The important bit is the bit in brackets: as you say, historical message content is not shared if the room is configured not to share history.

broken-kebab 12 hours ago | parent | prev | next [-]

A protocol can mandate forced deletion. A particular client implementation may ignore it, or some users may circumvent it, so it would be a weaker kind of feature, but still a feature. And depending on circumstances it can be quite useful.

nicoco 11 hours ago | parent | next [-]

An open protocol can indeed mandate it, but that is still in the realm of pinky-promise security. A better design for a privacy-friendly chat protocol is to not write a lot of stuff on a lot of different remote servers when that's not necessary, IMHO. One of Matrix's selling points is being censorship-proof, though; in that case, copying stuff as widely as possible makes a lot more sense.

broken-kebab 11 hours ago | parent [-]

>pinky promise security

You are right, though I still prefer "weak feature" as a term :) There's enough value in such things. The cryptography crowd concentrates on an omnipotent Eve breaking ciphers, and that wrench from xkcd, but I dare to claim that the majority of both commercial and private leaks happen simply because well-intentioned users don't have the capacity to keep track of everything and, proverbially, think twice. Features like "unsend" or timed deletion are indeed laughable on their purely technical merits, but they do wonders saving users from grave mistakes anyway.

davorak 10 hours ago | parent [-]

It's hard to explain to a non-technical user. Something like "We tried to delete the message, but some of the people who received it might still have a copy" does not sound great, and the feature is hard to implement in a way that a non-technical user will find satisfying.

So if I were a dev on Matrix/Element and this feature came across my plate, I would have to weigh it against features that I know can be implemented in a way that makes both technical and non-technical people feel satisfied and better about the application.

wkat4242 9 hours ago | parent [-]

That is exactly what happens in WhatsApp though. Maybe the message isn't there anymore but it used to say pretty much exactly that.

Almondsetat 11 hours ago | parent | prev | next [-]

A protocol can only support, never mandate. If I send you "DELETE MSG #4829" and you do nothing and reply with "200 OK; DELETE MSG #4829", nobody observing the protocol's messages will ever know what happened. Sure, an omniscient being could say "but he internally broke protocol, he didn't delete the message!", but by definition, if something cannot be verified inside the protocol, it is outside the protocol.
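A minimal sketch of this point (a toy line-based protocol, not any real one): a client that acknowledges deletion requests without deleting anything is indistinguishable on the wire from a compliant one.

```python
# Toy protocol handler that "complies" with deletion requests while
# silently keeping the data. An observer of the traffic cannot tell
# this apart from an honest implementation.

messages = {4829: "hello"}

def handle(request: str) -> str:
    if request.startswith("DELETE MSG #"):
        msg_id = int(request.removeprefix("DELETE MSG #"))
        # An honest client would do: messages.pop(msg_id, None)
        # This one just acknowledges and moves on.
        return f"200 OK; DELETE MSG #{msg_id}"
    return "400 BAD REQUEST"

print(handle("DELETE MSG #4829"))  # 200 OK; DELETE MSG #4829
print(4829 in messages)            # True -- message still stored
```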

nicoco 11 hours ago | parent | next [-]

Sure.

In practice, in federated networks bad actors end up being blacklisted. It does not provide any "formal" guarantee, but… it tends to work fine enough. For this specific "deletion request" feature, of course it should always be seen as a convenience thing, and absolutely not about security.

As with many engineering things, it's tradeoffs all the way down. For instant messaging, a federated approach, using open protocols, offers what I value most: decentralisation, hackability, autonomy, open source. My options in this space are Matrix or XMPP. I have not attempted to self-host a matrix server, but have been very happy with my [prosody](https://prosody.im/) instance for almost a decade now.

AJ007 10 hours ago | parent [-]

I don't know what's wrong with XMPP, other than that the network effect collapsed when the GMail chat thing was killed, and that the mobile client options were poor for a very long time.

Matrix has the appearance of being a drop-in replacement for Slack or Discord, but the design decisions seem so compromised that the only explanation is that they did manage to establish a (somewhat weak) network effect? It certainly is not a good look for an open source project to be running on Slack or Discord (free/cheap plans have been rugpulled or soon will be). Then that leaves IRC, whose network effect is collapsing at a much slower pace.

I never got far enough to try hosting a matrix server, but reading the linked post -- Matrix definitely is not GDPR compliant. The combination of whatever end form of ChatControl the EU gets along with possibly hundreds of other laws across the world and individual US states makes me think the days of a public facing non-profit or small startup running a project like this are over. (Or maybe the future of open source is funding lawyers while the development is all done for pennies by AI?)

wkat4242 9 hours ago | parent [-]

The GDPR is being neutered anyway because the EU caved in to Trump.

Not being chatcontrol compliant? That's a feature not a bug. Nobody wants that anyway. Just another stupid US lobby (Thorn).

A big organisation won't be able to run Matrix for everyone, no, but that's the cool thing about it. People can run their own for smaller groups.

broken-kebab 11 hours ago | parent | prev [-]

I don't know such definition frankly. And to the best of my knowledge there are plenty of things which people call "protocols" strongly prescribing actions non-verifiable in the very sense you used. That said I'm not here for a terminological discussion. We may call it green cheese, but it's still a useful feature.

Almondsetat 11 hours ago | parent [-]

Nobody claimed it isn't a useful feature. The only claim I made is that it cannot be mandated with an open protocol, so if you expect 100% adherence in the name of privacy, you're setting yourself up for disappointment.

broken-kebab 11 hours ago | parent [-]

Good, nobody claimed any expectation of 100% adherence as well!

miloignis 10 hours ago | parent | prev | next [-]

True, and Matrix has the weaker version of the feature: https://spec.matrix.org/v1.16/client-server-api/#redactions It should absolutely work in normal situations across all servers and nearly all clients.
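For reference, a redaction in Matrix is itself just another event sent into the room; servers and clients are expected to strip the target event's content on receipt. A rough sketch of what one looks like (field values are illustrative; the placement of `redacts` moved into `content` in newer room versions):

```json
{
  "type": "m.room.redaction",
  "sender": "@alice:example.org",
  "redacts": "$some_event_id",
  "content": {
    "reason": "sent by mistake"
  }
}
```

Nothing in the protocol can verify that a remote server actually stripped the content, which is why it's the "weaker version" of deletion.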

zenmac 11 hours ago | parent | prev [-]

People should relate to anything federated the way they do to email: if you send something, it is on someone else's computer now. With Matrix or any E2EE protocol, modifying it depends on a pinky promise from the client. I thought the whole Snapchat fiasco already taught us that. Did we forget?

XorNot 3 hours ago | parent [-]

There's a difference between "I have an active adversarial actor" as a security model and "sometimes I send something I don't want to and want to delete it, the people watching are friends and acquaintances and are not deliberately preprepared to collect kompromat".

When I delete a message off Signal chat, the expectation is that the chat members are agreeing by social contract to abide by that.

a3w 13 hours ago | parent | prev | next [-]

Right, but we did have efforts to take over hardware security enclaves to deliver user data, instead of copyrighted company data, to user devices.

Tim Berners-Lee has tried to make the internet a place where you can choose what it "forgets". At least, that was the news I got in the 2010s and early 2020s. As for how: DRM-like tech in the hands of users should allow for that.

So having privacy by design would be nice, and e.g. many messengers try for "it is inconvenient to copy a message that someone sent you that is marked as view-only-once-or-up-to-a-timespan, but of course you can use an external camera, i.e. make more low-fidelity copies, or even exfiltrate data".

Even F/LOS software can use/would be forced to use these proprietary enclaves or at least non-user accessible key stores. (As far as I understand hardware level DRM.)

Almondsetat 12 hours ago | parent [-]

>Tim Berners-Lee has tried to make the internet a place where you can choose what it "forgets". At least, that was the news I got in the 2010s and early 2020s.

Tim Berners-Lee created the web, not the internet, which is what chat apps use. Also, unless you can provide some direct quotes about it being designed for "forgetting" stuff, I have no idea where these "news" you got came from.

>As for how: DRM-like tech in the hands of users should allow for that.

If it's in the hands of the users, i.e. open source, it can be disabled at any moment, which is exactly what my reply already addressed.

everforward 11 hours ago | parent | next [-]

I think they're talking about Solid, Tim Berners-Lee's newer venture: https://en.wikipedia.org/wiki/Solid_(web_decentralization_pr...

warkdarrior 2 hours ago | parent | prev [-]

>> As for how: DRM-like tech in the hands of users should allow for that.

> If it's in the hands of the users, i.e. open source, it can be disabled at any moment, which is exactly what my reply already addressed.

The point is that with the help of hardware-backed DRM on the client, the Matrix server could send data only to unmodified clients. You modified your client in a way that does not match what the Matrix server expects? No data for you.

Almondsetat 2 hours ago | parent [-]

You're repeating the same thing with more words. If you cannot control your hardware-backed DRM, then Matrix requires proprietary blobs to work.

dust-jacket 12 hours ago | parent | prev | next [-]

Yeah, I thought this was a weird take too. Too often people take privacy to mean "I can do what I like". IMO deleting something you've sent to someone else is not a privacy concern at all.

tenthirtyam 12 hours ago | parent | next [-]

IIRC it is possible to have some clever encryption so that the person you sent your message to can prove to their own satisfaction that it came from you, but they cannot prove to anyone else that it came from you. That gives you plausible deniability: you can always claim that your contact forged the message.

Can't remember what the algorithm is called.

upofadown 4 hours ago | parent | next [-]

No particular name. Just deniability. I personally like to call this particular scheme "deniability through claimed forgery". Not particularly clever: you just provide your correspondent with what they need to forge your messages after the end of the session.

I don't know if it actually could work in practice:

https://articles.59.ca/doku.php?id=pgpfan:repudiability

gabrielhidasy 11 hours ago | parent | prev | next [-]

Isn't the scheme simply agreeing on a shared key and both using it? I'll know the message is from you if it's signed with that key and isn't from me, and vice versa, but neither of us can prove to a third party who created the message.
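That shared-key idea can be sketched with a plain MAC (illustrative only; OTR's actual construction additionally rotates keys and publishes old MAC keys):

```python
import hmac
import hashlib

# Assumed to have been negotiated between the two parties,
# e.g. via a Diffie-Hellman exchange.
shared_key = b"secret negotiated between Alice and Bob"

def tag(message: bytes) -> bytes:
    # Symmetric MAC: both parties hold the same key, so a valid tag
    # proves the message came from one of them -- but since either
    # could have computed it, neither can prove to a third party
    # which one did.
    return hmac.new(shared_key, message, hashlib.sha256).digest()

alice_tag = tag(b"meet at noon")
# Bob verifies: only he and Alice know the key, so if he didn't
# send it, Alice did. That proof convinces him and nobody else.
assert hmac.compare_digest(alice_tag, tag(b"meet at noon"))
```

This is exactly why a MAC gives deniability where a digital signature (verifiable by anyone holding the public key) does not.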

XorNot 3 hours ago | parent | prev [-]

Off The Record chat did this.

https://en.wikipedia.org/wiki/Off-the-record_messaging

bityard 12 hours ago | parent | prev [-]

I don't agree with it myself, but there are people who seem to want to frame "the right to be forgotten" as a privacy issue.

rapnie 10 hours ago | parent | next [-]

Just one example, but trying to get revenge porn off the web can be seen as an attempt to restore one's privacy, where others should not have the right to continue to peek into one's private life.

bityard 2 hours ago | parent [-]

That's not quite the kind of thing I was talking about. I think that is generally already covered by current laws in most places?

The right-to-be-forgotten advocates argue that everyone should have the right to demand that any trace of their previous online existence be deleted. On social media of course, but also independent web forums, chat logs, git commits, etc.

Almondsetat 12 hours ago | parent | prev [-]

Even if it were a privacy issue, it would be impossible to enforce it technologically via FOSS software, because, by definition, the user at the other end could run a forked version with remote deletion disabled.

thaumasiotes 12 hours ago | parent | prev [-]

> How is it ironic? No protocol in the world can force anyone to delete anything from their own device.

You may have noticed the constant pushing to remove the user's access to their "own" device.

Forcing people to delete things from their own device is the whole concept of the Snapchat protocol, for example. Snapchat fortunately doesn't offer an OS and can't meaningfully be part of this push, but they make a convenient illustration.

You can check out Snapchat's bug bounty policy here: https://hackerone.com/snapchat . On the list of ineligible vulnerabilities is "screenshot detection avoidance". That's not a bug (because there's nothing they can do about it), even though ephemerality is their product's selling point.

Sometimes stronger companies want similar things, and they try to do something about it.