• July 7, 2021

Gaming the System: Raising Responsible Kids in the Digital Age




I wear many hats in life — television producer, attorney, entrepreneur — but the role that speaks to me most is being a mother. Despite the many joys parenthood brings, it opened my eyes to dangers I was previously unaware of. One risk in particular has crept into many of our homes, ironically with our blessing: the faceless social network of online multiplayer video games.

The rise of the internet in the 1990s was immediately accompanied by fear of the unknown in cyberspace. For decades, the world wide web has been a new playground for kids of all ages, with many of the same potential menaces that pre-digital generations were trained to avoid in real life: predators, crooks, and con artists.

This makes it especially important to teach younger generations that anybody can be anything online. A person’s age, identity, and intentions can come cloaked in a deviously coded anonymity. It’s a lesson that many adults struggle with themselves, as the continued sharp rise in fake-profile scams shows. The constantly evolving world of online multiplayer video games adds new and sophisticated dimensions to this age-old struggle.

Almost every teen boy in the United States, and a vast majority of teen girls, plays video games: a recent Pew Research Center estimate puts the figures at 97% and 83% respectively. Games are no longer just a fun activity; they can be a gateway to scholarships and careers in a wide range of areas, from game design to eSports.

As top-tier games like Fortnite and Minecraft have achieved unparalleled reach with youth demographics over the last decade, online sex crimes have risen as well. Between 2013 and 2019 alone, the number of reported “sextortion” crimes (in which a child is coerced online into producing or distributing illegal sexual content of themselves) grew from 50 to over 1,500, a figure that experts believe represents only a fraction of the actual total.

This problem is the result of several interconnected factors. First, there is the issue of user privacy: how can developers monitor private messaging without infringing on the rights of their customers?

Second, the actual process of identifying inappropriate or illegal content is extremely complicated. Inferring something as vague as a user’s age is difficult for algorithms, and forensic specialists have warned that new machine learning systems are unlikely to be effective “any time in the near future.” This is further compounded by the risk of false identification.

Children need private spaces, which are important for their development, but those spaces should not be environments where an adult can end up one-on-one with a child in a chat room.

Some systems are already in place. Microsoft’s PhotoDNA scans for child pornography, and Project Artemis looks for conversations that indicate child grooming. Roblox filters offensive language from all chats, and it also seeks out and blocks attempts by one player to persuade another to move the conversation offline, usually by asking for a phone number.

There is no simple solution; it takes a village. We need both gaming and social media platforms to step up and commit to doing their part. One way to achieve this is to standardize a moderation policy that combines human review with artificial intelligence. Parents also need to play an active role: monitoring their children’s activity and ensuring their children know how to block and report inappropriate or offensive content. The least controversial approach is education — parents must inform and prepare kids to deal with online predators.

Raising the bar means raising our standards. We should demand that online communities foster healthy environments, protecting their users from toxic behavior and being unapologetic in doing so. We must also raise our standards as users by understanding the motivation behind our own behavior and asking, “Why am I sharing this content? Why am I making this comment? Does this contribute to the greater good? Would I say this in real life?”

Online resources can benefit their users in a multitude of ways. However, the anonymity of online communication creates a world where kids can say — and be subjected to — language that is detrimental to their well-being and may lead to depression and self-esteem issues. Talk to your children openly about online etiquette, and urge them to speak up by reporting or blocking anyone who may pose a threat.

Luciana Brafman | July 7, 2021 | Photo Credit: Jessica Lewis / Pexels