Telegram has recently been accused of failing to crack down on violent, extremist conversation in the aftermath of the January 6 attack on the US Capitol building. According to a recent report in The Washington Post, a non-profit has also sued Apple, demanding that the Cupertino-based giant remove Telegram from its App Store. In the lawsuit, the Coalition for a Safer Web, a nonpartisan group that advocates for technologies and policies to remove extremist content, alleged that Telegram hosts white supremacist, neo-Nazi, and other hateful content, and argued that such content puts Telegram in violation of Apple's terms of service for its app store.
As the non-profit called for Telegram to be removed from both Apple's and Google's app stores (a similar suit is planned against Google, the Washington Post report said), the messaging app's founder has put out a statement, shedding light on the company's efforts to moderate content and reiterating Telegram's stance on peaceful debate and protest. In his statement, Pavel Durov said that Telegram explicitly prohibits public calls for violence. "In the last 7 years, we've consistently enforced this rule globally, from Belarus and Iran to Thailand and Hong Kong. Civil movements all over the world rely on Telegram in order to stand up for human rights without resorting to inflicting harm," Durov said in the statement.
Further, he said that in early January, Telegram's moderation team started to receive an increased number of reports about US-related public activity on the platform, and that Telegram acted decisively by clamping down on US channels that advocated violence. Durov said that over the past week, moderators had blocked and shut down hundreds of public calls for violence that could have reached tens of thousands of subscribers.
Durov also said in his statement that while the US accounts for only two percent of Telegram's user base, the company is still watching the situation there closely.