
TL;DR

  • Telegram adds reporting tools for illegal content shared in private discussions.
  • Previously the app only handled reports for public-facing content.
  • The policy shift comes just weeks after CEO Pavel Durov's arrest by French authorities.
Update: September 6, 2024 (2:21 PM ET): In a public statement shared on Telegram, co-founder and CEO Pavel Durov speaks out against the characterization of his company as "some sort of anarchic paradise," emphasizing the systems it has already implemented:

Establishing the right balance between privacy and security is not easy. You have to reconcile privacy laws with law enforcement requirements, and local laws with EU laws. You have to take into account technological limitations. As a platform, you want your processes to be consistent globally, while also ensuring they are not abused in countries with weak rule of law. We've been committed to engaging with regulators to find the right balance. Yes, we stand by our principles: our experience is shaped by our mission to protect our users in authoritarian regimes. But we've always been open to dialogue.

Original article: September 6, 2024 (1:33 PM ET): When you're looking for a secure way to communicate with people around the world, regardless of the mobile platform they use, Telegram is going to be on your short list. But the app's robust focus on privacy has also emerged as a bit of a liability, most recently culminating in French authorities arresting co-founder and CEO Pavel Durov late last month. That legal pressure appears to already be resulting in some changes to how the service operates, as Telegram gives users tools to report illegal content.

Telegram has quietly updated its FAQ page to reflect a pretty stark departure from its past policy for moderating what's communicated in private chats, as reported by CoinDesk (via Engadget). Previously, the approach was essentially hands-off:

    All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them.

But now the company's singing a different tune, making no such blanket dismissal and instead inviting users to engage with the app's reporting tools:

    All Telegram apps have 'Report' buttons that let you flag illegal content for our moderators — in just a few taps.

Maybe the most important distinction is what's no longer being said there. While Telegram had reporting tools before, they applied only to public-facing content, like bots, stickers, or channels themselves. The only censorship the app really embraced was going after discussions in support of terrorism. But now this policy shift clearly opens the door for users to report each other for what happens in their private chats.

Right now we're curious to see the extent to which this move might satisfy governments like France's, which have accused Telegram of, at the very least, complacency about illegal conduct being discussed and facilitated through the app. Reporting tools are one thing, but how will investigations initiated through them be handled, and to what extent could the app's end-to-end encryption curtail those efforts? The next few months should prove to be interesting ones for Durov and Telegram, to say the least.
