Apps aimed at children have been available since the inception of the App Store, but not all apps created for minors are safe to use. That is the main finding of a new study conducted by two child safety organizations. The report presents the results of a 24-hour review of 800 apps, and the findings are concerning.
The Heat Initiative and ParentsTogether Action study found that Apple’s App Store is a mass distributor of risky and inappropriate apps to children. Many apps have features that put children at risk of sexual abuse and exploitation, low self-esteem and poor body image, disordered eating, exposure to sexual and violent content, and more. Apple claims that the App Store is a safe place for children, but the study found that Apple takes no legal responsibility for the veracity of age ratings.
Of the 800 apps reviewed, more than 200 were identified as risky. These apps have a combined total of more than 550 million downloads. The researchers examined five categories of apps: chat, beauty, diet and weight loss, internet access, and gaming. They focused on apps with an age rating of 4+, 9+, or 12+; apps rated 17+ were excluded from the study.
The study revealed several startling findings. For example, it uncovered 25 chat apps that allowed strangers to communicate with minors. Other apps are designed to provide unfiltered internet access, bypassing the filters that homes and schools rely on to block restricted sites.
Elsewhere, several beauty and body-related apps encouraged fasting and setting starvation-level calorie goals. Others, including so-called "gaming" apps, encouraged users to share nude photos or centered on violent content.
Apple describes the App Store as a “trusted place” where users can safely discover and download apps. Apple conducts automated and manual checks on every app available in the App Store to ensure this safety. Additionally, the company has introduced safety features like Screen Time to protect underage users. However, with nearly 2 million apps hosted on the platform for devices like the iPad Air and iPhone 16, it would be impossible for Apple to identify every problematic app. As Apple states, “App Store security measures alone can never be perfect.”
On a personal note, my daughter recently turned 18, and I have allowed her to use apps since she was around 2. Over those 16 years, we discovered many apps that were marketed toward minors but were highly inappropriate. As she grew older and her curiosity increased, I found it more challenging to shield her from the more explicit content on the App Store. However, I'd like to think that, thanks to the tools Apple has provided over the years and my own parenting efforts, she was protected from most, though no doubt not all, of the negative influences.
The study's authors suggest several ways to improve the App Store situation, including adding third-party reviews, increasing the transparency of the App Store rating process, enforcing age ratings with accountability, and implementing more effective parental controls. These all sound like valuable solutions to consider, although some bad apples will no doubt remain even then.
The most effective way to protect minors from harmful content is for Apple to improve its process of identifying inappropriate apps before they receive approval. Apple should also collaborate with external organizations such as Heat Initiative and ParentsTogether Action to more rapidly identify and address any dangerous apps that may have slipped through the approval process. Even then, the best defense is engaged and informed parenting.
Parents can easily block the installation of new apps on their children's devices using existing tools such as Screen Time's content restrictions. They can also monitor the apps that have already been installed to ensure they are appropriate, which means briefly using the apps themselves to verify that they are safe.