Last week, we learned that Apple had abruptly removed the Telegram apps from its App Store, citing "inappropriate content". The news might have been easy to brush off at the time, given Apple's famously strict content policies, but as it turns out, the situation was far more serious than many would have guessed.
According to an email written by Phil Schiller and published by 9to5Mac, the company had learned that the apps contained child pornography. Apple acted swiftly: it removed the apps, alerted law enforcement and other authorities, and informed the developers so that they could take appropriate action. Naturally, the users who posted the content were promptly banned.
The Telegram apps - Telegram and Telegram X - returned to the App Store later that day.
Here's the full email, as posted by 9to5Mac:
The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).
The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.
We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk – child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.
I hope you appreciate the importance of our actions to not distribute apps on the App Store while they contain illegal content and to take swift action against anyone and any app involved in content that puts children at risk.
While Apple can be applauded for its quick action on this terrible matter, one really has to wonder where Google was. As of our report last week, the apps remained on the Google Play Store. Of course, Apple never clarified who provided the report of child pornography, and it's unclear whether Google received the same report, but it seems unlikely that the company was left entirely in the dark.