Vikshepa Blog

Mental Distractions

29 Jan 2022

Moxie and Ceglowski

I found this Twitter exchange about Telegram, between Moxie Marlinspike and Maciej Ceglowski, interesting. It is from December 2021. I had somehow seen Moxie's tweets earlier, but hadn't seen Ceglowski's replies, which brought up the practical example of Telegram's use during the Hong Kong protests.

Marlinspike is the man behind the Signal messenger. Ceglowski is the man behind the Pinboard social bookmarking service and an interesting writer on society, politics and the internet.

I have pulled their tweets out of Twitter and connected them.

Moxie Marlinspike:

It's amazing to me that after all this time, almost all media coverage of Telegram still refers to it as an "encrypted messenger."

Telegram has a lot of compelling features, but in terms of privacy and data collection, there is no worse choice. Here's how it actually works:

Telegram stores all your contacts, groups, media, and every message you've ever sent or received in plaintext on their servers. The app on your phone is just a "view" onto their servers, where the data actually lives.

Almost everything you see in the app, Telegram also sees

Here's a simple test: delete Telegram, install it on a brand new phone, and register with your number. You will immediately see all your conversation history, all of your contacts, all the media you've shared, all of your groups. How? It was all on their servers, in plaintext.

The confusion is that Telegram does allow you to create very limited "secret chats" (no groups, synchronous, no sync) that nominally do use e2ee, even if the security of the e2ee protocol they use is dubious.

There's no e2ee by default, but they talk about it like there is

FB Messenger also has an e2ee "secret chat" mode that is actually much less limited than Telegram's (and also uses a better e2ee protocol), but nobody would consider Messenger to be an "encrypted messenger."

FB Messenger and Telegram are built almost exactly the same way.

Some may feel okay letting Telegram have access to all of their data, msgs, images, contacts, groups, etc. because they "trust Telegram."

However, the point of an "encrypted messenger" should be that you don't have to trust anyone other than the ppl you're communicating with.

Actual privacy tech is not about trusting someone else w/ your data. It's about not having to. A msg you send should only be visible to you & recipient. A group's details should only be vis to the other members. Looking up your contacts should not reveal them to anyone else.

Privacy tech is really about making the tech consistent with the UI. But if Telegram's UI were consistent with the way the tech worked, every chat would be a group chat with everyone that works at Telegram + everyone that hacks Telegram + every gov that accesses Telegram, etc

For the folks writing about this space, my request is that when you write "encrypted messenger," it should at minimum mean an app where all messages are e2ee by default. Telegram and FB Messenger are built exactly the same way. Neither are "encrypted messengers."
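Marlinspike's "not having to trust anyone other than the people you're communicating with" has a concrete cryptographic basis: the endpoints agree on a key that the relaying server never learns. Here is a toy Diffie-Hellman sketch in Python of that idea, with deliberately tiny, insecure parameters chosen only for illustration; real messengers use vetted protocols (Signal's X3DH over elliptic curves, for instance), not anything like this.

```python
# Toy sketch of why e2ee removes the need to trust the server: the two
# endpoints derive a shared secret (Diffie-Hellman) that the relay never
# sees. Illustrative parameters only -- NOT secure sizes or a real protocol.
import secrets

p = 0xFFFFFFFB  # toy prime modulus (far too small for real use)
g = 5           # generator

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent, never sent
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent, never sent

A = pow(g, a, p)  # Alice's public value -- safe to relay via the server
B = pow(g, b, p)  # Bob's public value   -- safe to relay via the server

# Each side combines the other's public value with its own private exponent;
# both arrive at g^(ab) mod p, which the server cannot compute from A and B.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob  # same key, never transmitted
```

The server sees only `A` and `B`; recovering the shared key from those is the discrete logarithm problem, which is what makes "a message you send should only be visible to you and the recipient" achievable without trusting the middleman.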

Maciej Ceglowski:

There's a disconnect between critiques of Telegram and its practical use that has made me uneasy about joining technical pile-ons around how it's not really encrypted messaging. Let me use the example of Telegram use in the Hong Kong protests.

I arrived in Hong Kong with each hair standing individually on end because everyone was using Telegram, which of course stores every group chat server-side like Moxie says. It took me a while to understand why it was so popular despite this shortcoming.

One reason was the ability to have three scales of chat in one app—really enormous groups (tens of thousands of members) where you didn't have to share your identity, regular group chats, and one-on-one chats with people.

The one-on-one chats were popular because they could be set to an ephemeral mode, so that if a cop caught you and made you unlock your phone, you wouldn't get them in trouble. The huge supergroups were useful for organizing protest events and broadcasting information.

People were trying to avoid getting recognized in the moment, caught in the moment, or having to broadcast their identity to a huge group of strangers (HELLO I AM INTERESTED IN ATTENDING YOUR PROTEST), although this later turned out to be a huge hole in Telegram and caused a fuss

So the tradeoff was a mix of the app being usable and useful, safety in numbers, basic anonymity features in large groups, the ability to have massive supergroups, and disappearing chat. Compare this to Signal, where you saw everyone's phone number and it was buggy as hell

If the Chinese government wanted to come after you individually, you were screwed no matter what app you used. People brought phones to protests and that cell tower data was stored somewhere much easier for the PRC to obtain than even hacking Telegram.

The whole thing left me feeling far more confused about the role of E2E than I had been going in. Even today, if a state actor is seriously interested in you specifically, it's game over. Signal can keep your messages triple secret all it wants, but it doesn't really matter.

Either your device will be compromised, or the person you are having the triple-secret conversation with is a government agent to begin with, and not even wearing a secret decoder ring on each finger is going to help.

So I think the right way to think of Telegram is as an "encrypted enough" messenger, and for E2E purists to take a more careful look at why it is so widely used in protest movements, and why people find using "real" encrypted apps like Signal such a pain in the ass

The broader problem of ephemeral or spur-of-the-moment protest activity leaving a permanent data trail that can be forensically analyzed to target individuals many years after the fact is unsolved and poses a serious risk to dissent. But E2E is not the solution to it.

I feel like Moxie and a lot of end-to-end encryption purists fall into the same intellectual tarpit as the cryptocurrency people, which is that it should be possible to design technical systems that require zero trust, and that the benefits of these designs are self-evident

But a truly trustless system is inhuman, and you're going to get monstrous results if you try to impose it on human behavior. Homo encrypticus doesn't exist any more than homo economicus. We need to think more deeply about how to make these technologies serve people as they are

The most dangerous thing about social software systems today is that they impose consequences on everyday actions that are unbounded in severity and time. You can be fired today for a social media comment you made as a kid, you can have $100M stolen by plugging in a USB device.

Reducing the blast radius of normal human mistakes, dismantling the permanent record part of the surveillance economy, and not forcing people to make irrevocable lifetime decisions every time they use a phone are the only way out of this mess. That's not solvable with software.

Moxie (response):

Hey what do I know, maybe sending all of our plaintext data to a Russian oligarch & his associates to indelibly manage is the solution to online privacy.

I’m just saying that we shouldn’t call it an “encrypted messenger,” because it simply isn’t - any more than FB Messenger is.

Tags: privacy surveillance
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.