Getting Away With It


This evening I watched an ITV News report about extremist content on the Roblox gaming platform. The UK's Independent Reviewer of Terrorism Legislation has warned that parents and teenagers "should not be using" Roblox, which is available to children as young as seven. Users could take part in simulated mosque and school shootings, follow a weblink to a white supremacy group, and obtain costume 'skins' representing far-right political and white supremacist groups. According to ITV, the platform has parental controls, but even with these activated users could still access this content. All pretty disturbing, especially as a parent, if you have put in place controls you believe will protect your child from content like this. In their defence, a spokesperson for Roblox said, "No system is perfect, but as a result of the effectiveness of our teams and technology, it is very unlikely that someone would stumble upon these types of experiences or content. For example, in one of the games shared with us by ITV News, the visit numbers were very low - just a few hundred. By comparison, the most popular games on Roblox have billions of visits. In any case, we took swift action to remove the game from the platform."

This defence is similar to one we have seen repeated by social media platforms for years: "no system is perfect", and the harm affects only a small number of users compared with the billions who use the platform. This is where I'm getting increasingly annoyed. In virtually any other industry, governments step in and enact legislation to protect people (food, drugs, civil engineering, and so on). However, online social platforms have for many years seemingly been 'getting away with it': little or no legislation, and increasingly less content moderation. The blame always seems to be put back on the users; the platforms never take responsibility, even when,
"It's no secret that social media has devolved into a toxic cesspool of disinformation and hate speech. Without any meaningful pressure to come up with effective guardrails and enforceable policies, social media platforms quickly turned into rage-filled and polarising echo chambers with one purpose: to keep users hooked on outrage and brain rot so they can display more ads." Futurism
When an industry creates a product that harms its customers, legislation is created to protect us; take the laws around food standards, toy standards, smoking or building regulations, for instance. In a recent post I wrote about whether it is possible to use generative AI ethically. There was a lovely quote from the author of a post I referred to in my blog. I'm going to borrow that quote and adapt it for online platforms.
"Here’s my proposal: It’s not on us, on you and me, to use online social platforms ethically or responsibly. It’s on the companies to build safe, reliable, ethical products. If you can’t do that and still make money, you don’t deserve to make money. And until that happens, I’d like governments to lead with the message that these platforms as they currently exist simply cannot be used ethically."

Whilst I agree it's arguable that no system can be perfect, in many industries there is external oversight ensuring that products meet a safe standard for their users. Online platforms (and GenAI) don't appear to be held to the same standards.
