Apps aimed at kids are often criticised for having a negative impact on psychological development as well as being a threat to privacy, while app creators are accused of trying to hook children in order to boost revenue.

The Campaign for a Commercial-Free Childhood (CCFC) in the US is one of the loudest voices taking a stand. For instance, it wanted Facebook to discontinue Messenger Kids, a service aimed at children under the age of 13.

It didn’t succeed, but that didn’t stop it from pointing out that young children are not ready to have social media accounts as they don’t know how to “navigate the complexities of online relationships”. It said apps can exacerbate problems including depression and unhealthy sleep habits.

Even YouTube Kids has come under fire for containing disturbing content related to school shootings and suicide. CCFC said the main concern is that parents often trust the app because it claims to make it “safer and simpler” for young people to browse videos.

Alarm bells
John Molloy, CEO at Zyalin Group, which offers parental control services, told Mobile World Live: “The biggest privacy and security risk for kids is that they usually have a carefree attitude to downloading apps, without evaluating the risks associated with what they are accepting as part of the terms and conditions.”

“It is naive to expect that kids will evaluate the permissions required and then decide not to use the app based on their evaluation. Therefore, it is highly likely that they are downloading apps that require them to provide permission to access their camera, their gallery, their microphone, their location and other private data just so that they can play a game on their phone,” he continued.

This is extremely worrying and should ring alarm bells for all parents whose children have access to a tablet or phone.

Developers will prioritise revenue
The troubles don’t end there, though. CCFC criticised YouTube’s practice of not only collecting user data, but also generating “significant profits from kid-targeted advertising”. It has also urged the US Federal Trade Commission (FTC) to investigate Android apps it believes trick kids into making in-app purchases and watching adverts.

This brings us to unscrupulous developers who are likely mining data for revenue and bombarding users with ads. Then there are those who try to get kids addicted so they will make in-app purchases and keep engagement metrics up.

Jenny Radesky, a developmental behavioural paediatrician at the University of Michigan, said: “The childhood app market is a wild west, with a lot of apps appearing more focused on making money than the child’s play experience. This has important implications for advertising regulation, the ethics of child app design, as well as how parents discern which children’s apps are worth downloading.”

In 2018, more than 200 psychologists wrote to the American Psychological Association to call attention to the “unethical practice of psychologists using hidden manipulation techniques to hook children on social media and video games”.

A case in point: Vox reported that a children’s gaming app called Doctor Kids interrupts gameplay with an ad inviting users to purchase a game for $1.99. “There’s a red X button to cancel the pop-up, but if the child clicks on it, the character on the screen shakes its head, looks sad, and even begins to cry.” In some other ads “the cancel button is nearly impossible to find”.

“This all seems like fair game when trying to make money off adult gamers. But with apps that deploy this strategy for young children who are only just beginning to discover the world and their surroundings, the playing field is completely skewed,” the news outlet noted.

That doesn’t mean kids’ apps have no advantages, of course. The UK Safer Internet Centre states educational apps that require children to follow objects and interact with them can help develop hand-eye coordination and improve their understanding of how to interact with systems, for example knowing that pressing the right button will produce the desired action.

It added that games help children learn to solve problems as they work their way through the challenges presented to get to the next level.

Accountability
So whose responsibility is it to make sure kids are gaining only the benefits of apps?

Molloy argues that while regulators need to make policies around these apps more robust, in the short term, the responsibility for protecting children often lands with parents. “If parents are willing to provide their kids with access to the internet, then they need to ensure they apply the necessary controls to minimise their kids’ risk of exposure to inappropriate content and privacy violation.”

He added online safety should be an ongoing conversation between parents and children, and parents need to know what their kids are doing online just as much as what they are doing offline.

Australian lifestyle magazine OffSpring suggested “a handful of good, ad-free learning apps for the toddler and preschooler crowd” can be found if parents search hard enough. It also quoted a parent as saying they have to teach their kids the concept that “there is no such thing as a free app”.

It explained that some parents deal with tantrums when they refuse to pay for in-app purchases, while others use services with parental controls that block ads and purchases, such as Amazon’s FreeTime Unlimited or iKydz (offered by Molloy’s company).

Vox stated Google and Apple should monitor their app stores because “big tech companies might very well be the only ones that can rein in this type of behaviour.”

But that doesn’t mean regulators don’t have a role to play, and the FTC does take notice. Last month it hit popular short-form video app TikTok with a $5.7 million fine for illegally collecting personal information from children. “This is the largest civil penalty ever obtained by the Commission in a children’s privacy case,” the FTC said.

While developers are unlikely to grow a conscience overnight and make apps safer and ad-free, such fines should serve as a deterrent.

What’s more, regulators need to tighten the rules around apps aimed at kids to ensure they are not collecting children’s data, requesting access to location or camera unless essential for the app to function, or manipulating children into spending more time or money on them. Doing so would help protect an entire generation from psychological problems and the dangers of being stalked, and save parents from going broke.

The editorial views expressed in this article are solely those of the author and do not necessarily reflect the views of the GSMA, its Members or Associate Members.