Apps marked safe for children on Google’s Play Store, some of which have been downloaded millions of times, contain violence and disturbing images, Wired reported.

The news outlet stated the problems include games laden with guns, knives, “gore” and gambling, with parental controls apparently having little impact.

Wired sent Google a sample of 36 games with content it deemed not in line with their Play Store age ratings, plus a further 16 featuring “dubious content and permissions”, some of which tracked location. Sixteen of these titles have since been removed or had their ratings and permissions revised.

Google hit back, with a representative telling Mobile World Live: “When we find that an app has violated our policies, we remove it from Google Play. We want children to be safe online and we work hard to help protect them.”

“Apps in our Designed for Families Programme must follow more stringent requirements including content and advertising restrictions, and provide a declaration that they comply with all applicable privacy laws. We take action on any policy violations that we find.”

However, Wired stated it’s not the apps in the Designed for Families scheme that were of concern, but rather those which don’t come under that banner.

Not strict enough
The publication raised concerns that Google does not have a stringent system for assigning ratings to games, noting Apple and Nintendo have someone manually review each title.

Google, on the other hand, lacks a “robust, human-monitored system”, with age ratings assigned automatically after developers complete a questionnaire.

The implication is unscrupulous developers may provide misleading answers to gain a child-friendly rating, and so boost downloads and earnings.

It’s not just violence that’s the problem, though. Apps such as a cosmetic surgery simulator, or those involving gambling with no actual payout, may be considered inappropriate for children, but because they don’t feature outright violence they are rated as suitable for all.

Earlier this year, Google-owned YouTube Kids came under fire for containing disturbing content related to school shootings and suicide.