Apple removed messaging app Telegram from its App Store because some users were sharing images of child abuse.
The explanation came in an email from Phil Schiller, the Apple executive who oversees the App Store, which was published by Apple news site 9to5Mac.
The secure messaging app returned to the App Store within hours, with fixes in place to prevent the illegal content from being served to users, Telegram said.
Mr Schiller said users “who posted this horrible content” had been banned.
The email, which 9to5Mac said it had verified with Apple, read: “The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps.
“After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).”
Apple said it had put in place more controls to “keep this illegal activity from happening again”.
Telegram has been accused in the past of harbouring violent and extremist content on its platform. Its use was restricted in Iran in December after claims it had been used to organise four days of anti-establishment protests.
And in November, Afghanistan moved to ban the app in an effort to prevent the Taliban and other insurgent groups from using it.
The messaging app offers a high level of encryption and allows large group chats of up to 50,000 users. Its secret chat function lets messages self-destruct after they are sent.
Prime Minister Theresa May recently singled out the app as a place where criminals can hide their activities.
“No-one wants to be known as the terrorists’ platform or the first-choice app for paedophiles,” she said.