Joni Lappalainen, CEO
Youth protection issues in gaming are on the table in many countries across the world. This was kicked off by the controversies around loot boxes. Belgium has already banned loot boxes altogether, and as a result Nintendo pulled Animal Crossing: Pocket Camp and Fire Emblem Heroes from the Belgian market last year. China mandated back in 2017 that game publishers have to disclose the contents of loot boxes: in the Chinese market it is now required to publish the probability rates for the items in the loot boxes of the Chinese version of a game.
Through these rules and laws politicians want to ensure a safe environment for players of all ages – but they will inevitably also create massive challenges for game developers and publishers.
So, what are the most important trends in youth protection that you might want to look out for and prepare for?
Firstly, I would like to draw attention to simulated gambling. Simulated gambling means visual or structural similarity to gambling. For example, a roulette table or something resembling one is a visual similarity, whereas a loot box works much like a lottery: you pay to see if you get lucky and find something valuable – that is a structural similarity.
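To make the structural similarity concrete, here is a minimal sketch – in Python, with invented item names and drop weights that are purely illustrative – of how a loot box draw typically works under the hood, and how the kind of per-item probability disclosure required in China could be derived from the same data:

```python
import random

# Hypothetical loot table: item name -> relative drop weight (illustrative only).
LOOT_TABLE = {
    "common_sticker": 70,
    "rare_outfit": 25,
    "legendary_mount": 5,
}

def disclosed_probabilities(table):
    """Per-item probabilities of the kind a publisher would have to publish."""
    total = sum(table.values())
    return {item: weight / total for item, weight in table.items()}

def open_loot_box(table):
    """One paid draw: the player pays first, then chance decides what drops."""
    items = list(table)
    weights = [table[item] for item in items]
    return random.choices(items, weights=weights, k=1)[0]

if __name__ == "__main__":
    for item, probability in disclosed_probabilities(LOOT_TABLE).items():
        print(f"{item}: {probability:.0%}")
    print("You got:", open_loot_box(LOOT_TABLE))
```

The player commits money before knowing the outcome – exactly the lottery-like structure regulators are concerned about.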
For adults, who understand how gambling works, these simulated gambling features are fine, but for younger players they might be harmful. Legislators and pundits in many countries think these features introduce young players to the concept of gambling at an age when they might not yet be able to fully understand the value of the money they are spending or the mechanics behind a loot box or other simulated gambling device.
If stricter rules on simulated gambling in games targeted at minors are introduced, this would mean, for example, that a Frozen-themed game with loot boxes would be off limits, as the theme directly suggests it is targeted at a young audience.
Simulated gambling in games is being discussed in many countries, and changes in legislation might be coming our way too. The International Age Rating Coalition (IARC) is moving the age rating for simulated gambling mechanics to 16+ and has set the following criteria:
- Visual or structural similarities with gambling
- Payable stakes (real or virtual)
- Possible influence of in-app purchases
- Social media involvement
The second issue I would like to address is ads in free-to-play games. Features that enable faster progress or some other gain if you watch more ads might be problematic from a youth protection standpoint. Why? Young players cannot yet weigh the value of their time the way adults can. For this reason, the possibility to watch one ad per day for some perks is acceptable, but giving players the option to watch an unlimited number of advertisements in order to gain assets or progress in the game would not be, as that requires a more mature understanding of the value of one's time. The same goes for mechanics that can be seen as exploiting a child's desire to keep playing, such as skippable waiting times and short-timed offers, and the future might bring restrictions here in order to protect young players.
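Purely as an illustration, a cap such as "one rewarded ad per player per day" can be enforced with a small amount of bookkeeping. The sketch below assumes a hypothetical in-memory store and a made-up MAX_REWARDED_ADS_PER_DAY limit; a real game would persist this server-side and tune the limit per market and age rating:

```python
from datetime import date

MAX_REWARDED_ADS_PER_DAY = 1  # hypothetical daily cap

# player_id -> (day, number of rewarded ads watched that day)
_ad_counts = {}

def try_watch_rewarded_ad(player_id, today=None):
    """Record one rewarded-ad view if the player is under today's cap.

    Returns True if the ad (and its reward) may be shown, False otherwise.
    """
    today = today or date.today()
    day, count = _ad_counts.get(player_id, (today, 0))
    if day != today:          # a new day resets the counter
        day, count = today, 0
    if count >= MAX_REWARDED_ADS_PER_DAY:
        return False
    _ad_counts[player_id] = (day, count + 1)
    return True
```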
Additionally, gaining progress by providing personal data – for example social media accounts – can be considered a red flag for the same reason: minors cannot be expected to understand the value of their personal data, and the risks of giving it up, as well as adults do. Collecting data from minors is also regulated by the GDPR in the European Union – as a rule, a parent's consent is needed to process the data of a minor under 16 (member states may set a lower age limit).
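Again only as a sketch – assuming a hypothetical player profile with a self-reported age, an EU flag and a parental-consent flag – the consent rule described above could look roughly like this:

```python
from dataclasses import dataclass

GDPR_CONSENT_AGE = 16  # the GDPR default; some member states set a lower age

@dataclass
class PlayerProfile:
    age: int
    in_eu: bool
    has_parental_consent: bool = False

def may_collect_personal_data(player):
    """Data of an EU minor under 16 may only be processed with parental consent."""
    if player.in_eu and player.age < GDPR_CONSENT_AGE:
        return player.has_parental_consent
    return True
```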
The third issue, and a very serious one, arises from the fact that a lot of games nowadays offer in-game communication via chat rooms. This can enable something known as cyber grooming. Cyber grooming means, in practice, an adult befriending a minor online in order to later abuse the trusting relationship they have created for their own gain – very often the motivation is to sexually abuse the child. The groomer tries to get the child to send pictures, videos or other material they can later use to blackmail the child.
Cyber groomers are able to operate in anonymous chat rooms and forums of games and websites targeted at minors. This is why limitations to these have been discussed: in the future, game publishers targeting minors might no longer be able to offer unmoderated chat rooms at all. Chat rooms could still be provided, but a moderator would always have to be present – to keep an eye on the discussions and to delete any content that might be harmful to minors. These limitations might lead to chat rooms being closed when no moderator is present, or to fewer chat rooms for young players. In addition, the discussion suggests banning private chat options in games targeted at children.
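Such a policy could be enforced at the chat-service level. The sketch below is hypothetical: it refuses to open a room aimed at minors unless a moderator is marked as present, and it rejects private rooms for that audience altogether:

```python
from dataclasses import dataclass, field

@dataclass
class ChatRoom:
    name: str
    aimed_at_minors: bool
    is_private: bool = False                  # one-to-one or invite-only chat
    moderators_online: set = field(default_factory=set)

def may_open_chat(room):
    """Apply the discussed restrictions: no private and no unmoderated chat for minors."""
    if not room.aimed_at_minors:
        return True                           # rooms for adult audiences are unaffected here
    if room.is_private:
        return False                          # private chat banned in children's games
    return bool(room.moderators_online)       # otherwise a moderator must be present

# Example: a kids' room stays closed until a moderator comes online.
room = ChatRoom(name="clubhouse", aimed_at_minors=True)
assert may_open_chat(room) is False
room.moderators_online.add("mod_anna")
assert may_open_chat(room) is True
```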
Lastly, there has been talk in countries such as Germany about making strong age verification mandatory for all 16+ content. According to the proposed legislation in Germany, even typing in your passport number would not be enough, as passport numbers are fairly easy to forge, and minors would probably also resort to the old trick of using an older sibling's or parent's passport to get in. There would have to be an even more reliable way to verify the user's age, which would most likely also mean a more complicated one. Age verification might therefore become a very hard problem for game publishers to solve, not to mention the potential players lost along the way if the verification process becomes too troublesome.