The App Store was created to be a safe and trusted place for users to get apps, and a great business opportunity for developers. Apple platforms and the apps you build have become important to many families, as children use our products and services to explore the digital world and communicate with family and friends. We hold apps for kids, and those with user-generated content and interactions, to the highest standards. To continue delivering safe experiences for families, we want to remind you about the tools, resources, and requirements that are in place to help keep users safe in your app.
Made for Kids
If you have an app that’s intended for kids, we encourage you to use the Kids category, which is designed for families to discover age-appropriate content and apps that meet higher standards to protect children’s data and offer added safeguards for purchases and permissions (e.g., camera and location access).
Your app’s age rating is integrated into our operating systems and works with parental control features, like Screen Time. Additionally, with Ask to Buy, when kids want to buy or download a new app or make an in-app purchase, they send a request to the family organizer. You can also use the Managed Settings framework to ensure the content in your app is appropriate for any content restrictions that may have been set by a parent. The Screen Time API is a powerful tool for parental control and productivity apps to help parents manage how children use their devices. Learn more about the tools we provide to help parents know, and feel good about, what kids are doing on their devices.
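To illustrate the Screen Time API, the sketch below shows how a parental-controls app might request Family Controls authorization and then apply restrictions through a `ManagedSettingsStore`. This is a minimal sketch, assuming an iOS 16+ app that has been granted the Family Controls entitlement; the two settings shown are just examples, and the `300` rating value is an assumed mapping to a 12+ age cap.

```swift
import FamilyControls   // authorization for Screen Time features
import ManagedSettings  // applying restriction settings

// Sketch: a parental-controls app requests authorization, then applies
// restrictions chosen by the parent. Calls fail without the
// Family Controls entitlement.
@MainActor
func applyParentalRestrictions() async {
    do {
        // On a child's device, a parent must approve this request.
        try await AuthorizationCenter.shared.requestAuthorization(for: .individual)
    } catch {
        print("Family Controls authorization denied: \(error)")
        return
    }

    let store = ManagedSettingsStore()
    store.media.denyExplicitContent = true  // block explicit media content
    store.appStore.maximumRating = 300      // assumed value for a 12+ age cap
}
```

The settings persist until your app clears them, so a real app would also offer the parent a way to revise or remove each restriction.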
Sensitive and inappropriate content
Apps with user-generated content and interactions must include a set of safeguards to protect users, including a method for filtering objectionable material before it is posted to the app, a mechanism to report offensive content and support timely responses to concerns, and the ability to block abusive users. Apps containing ads must include a way for users to report inappropriate or age-inappropriate ads.
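The three required safeguards can be sketched as a single moderation service. Everything below is a hypothetical placeholder, not an Apple API: the type name, the deny-list, and the in-memory storage stand in for whatever server-side moderation a real app would use.

```swift
import Foundation

// Illustrative sketch of the three required UGC safeguards:
// pre-post filtering, reporting, and user blocking.
struct ModerationService {
    // Placeholder deny-list; a real app would use a server-side filter.
    var blockedTerms: Set<String> = ["badword"]
    private(set) var blockedUsers: Set<String> = []
    private(set) var reports: [(postID: String, reason: String)] = []

    // Safeguard 1: filter objectionable material before it is posted.
    func isAllowed(_ post: String) -> Bool {
        let words = post.lowercased().split(separator: " ").map(String.init)
        return !words.contains { blockedTerms.contains($0) }
    }

    // Safeguard 2: let users report offensive content for timely review.
    mutating func report(postID: String, reason: String) {
        reports.append((postID: postID, reason: reason))
    }

    // Safeguard 3: let users block abusive users.
    mutating func block(userID: String) {
        blockedUsers.insert(userID)
    }

    // Posts from blocked users are hidden from the blocking user.
    func isVisible(postAuthor: String) -> Bool {
        !blockedUsers.contains(postAuthor)
    }
}
```

In practice each safeguard would be backed by a server and a human review queue; the point here is only that all three paths must exist and be reachable from the app's UI.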
iOS 17, iPadOS 17, macOS Sonoma, and watchOS 10 introduce the ability to detect and alert users to nudity in images and videos before displaying them onscreen. The Sensitive Content Analysis framework uses on-device technology to detect sensitive content in your app. Tailor your app experience to handle detected sensitive content appropriately for users who have Communication Safety or Sensitive Content Warning enabled.
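The detection flow can be sketched with the framework's `SCSensitivityAnalyzer` class. This assumes an app that has been granted the Sensitive Content Analysis entitlement; the blur-and-warn UI is left as comments, since it is your app's own.

```swift
import SensitiveContentAnalysis

// Sketch: check a received image before displaying it. Requires the
// com.apple.developer.sensitivecontentanalysis.client entitlement.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()
    // The policy is .disabled unless the user has Communication Safety
    // or Sensitive Content Warning turned on; skip analysis in that case.
    guard analyzer.analysisPolicy != .disabled else { return false }
    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        // true → blur the image and show a warning before revealing it.
        return analysis.isSensitive
    } catch {
        // On failure, fall back to your app's default handling.
        return false
    }
}
```

Because analysis runs entirely on device, the image never leaves the user's device as part of this check.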
Users have multiple ways to report issues with an app, like Report a Problem. Users can also communicate app feedback to other users and developers by writing reviews of their own, and can use Report a Concern on individual user reviews. You should closely monitor your user reviews to improve the safety of your app and to address concerns directly. Additionally, if you believe another app presents a trust or safety concern, or is in violation of our guidelines, you can share details so that Apple can investigate.
These user review tools are critical to informing the work we do to keep the App Store safe. Apple deploys a combination of machine learning, automation, and human review to monitor concerns related to abuse submitted via user reviews and Report a Problem. We monitor for topics of concern such as reports of fraud and scams, copycat violations, inappropriate content and advertising, privacy and safety concerns, objectionable content, and child exploitation. We use techniques such as semi-supervised Correlation Explanation (CorEx) models and Bidirectional Encoder Representations from Transformers (BERT)-based large language models trained specifically to recognize these topics. Flagged topics are then surfaced to our App Review team, who investigate the app further and take action if violations of our guidelines are found.
We believe we have a shared mission with you as developers to create a safe and trusted experience for families, and look forward to continuing that important work. Here are some resources that you may find helpful: