Social network Tumblr makes new changes to stay on Apple App Store

Microblogging and social networking website Tumblr, which has faced a years-long struggle for approval on the iOS App Store, says it has made fresh changes in order to remain on the Apple App Store.

In 2018, Tumblr's iOS app was taken down from the App Store under the child sexual abuse material (CSAM) policy.

A month later, the platform responded by banning all pornography and other sexually explicit content, resulting in a 29 per cent drop in monthly traffic.

Since then, the platform's web traffic has remained relatively stagnant, reports The Verge.

"In order for us to remain in Apple's App Store and for our Tumblr iOS app to be available, we needed to make changes that would help us be more compliant with their policies around sensitive content," Tumblr said in a recent blog post.

Many Tumblr users come to the platform to talk anonymously about their experiences.

The platform said that "for those of you who access Tumblr through our iOS app, we wanted to share that starting today you may see some differences for search terms and recommended content that can contain specific types of sensitive content".

"In order to comply with Apple's App Store Guidelines, we are having to adjust, in the near term, what you're able to access as it relates to potentially sensitive content while using the iOS app," said the platform.

To remain available on Apple's App Store, the company had to broaden its definition of sensitive content, and change how users can access such content, in order to comply with Apple's guidelines.

"We understand that, for some of you, these changes may be very frustrating - we understand that frustration and we are sorry for any disruption that these changes may cause," said Tumblr.

Apple's CSAM feature is intended to protect children from predators who use communication tools to recruit and exploit them.

It is part of a set of features that includes scanning users' iCloud Photos libraries for CSAM, a Communication Safety feature that warns children and their parents when they receive or send sexually explicit photos, and expanded CSAM guidance in Siri and Search.
