Beyond WhatsApp and Facebook, there are many other platforms used by children and teens that may be open to abuse
Since 14-year-old Molly Russell killed herself in 2017, the apps and services our teenagers and children use – and their safety – have become a key concern for parents. Last week, the digital minister, Margot James, stated that “the tragic death of Molly Russell is the latest consequence of a social media world that behaves as if it is above the law”. James went on to announce plans to introduce a legally binding code for social media companies, including a duty of care towards young users.
Britain’s children are not just using the likes of Facebook, Instagram, WhatsApp, YouTube, Pinterest and Snapchat on a daily basis. There is a wealth of apps targeted at teens and children that have their own ecosystems and controversies.
Houseparty
What is it?
In essence, a multi-person Skype or FaceTime video conversation. The screen – the app works on smartphones as well as laptops and desktops – is split into up to eight tiles. It’s the preteen equivalent of a conference call, with participants talking over one another – a way for the day’s gossip to continue beyond the school gates.
How safe is it?
You can connect on calls not only with friends but with friends of friends (with an on-screen warning about “stranger danger”). Different security levels can be set on an account to limit who you can chat to, but some users may choose to chat to anyone. “That would worry me,” says Dr Victoria Nash, deputy director of the Oxford Internet Institute, who has researched child safety online. “It combines live streaming video with stranger danger.” There is some discouragement of dubious behaviour, though: users are asked to enter their phone number when registering, theoretically making them trackable.
Controversies
The phone number tie doesn’t stop criminals: in 2017, two Manchester children aged 11 and 12 were reportedly confronted by men who exposed themselves during chats. Anti-child sexual exploitation agencies in Rochdale have also investigated the app after preteen users were allegedly targeted by adult men on the platform.
Kik
What is it?
Kik has been around for the best part of a decade, but the text messaging app remains popular among teenagers and children, partly because it allows anonymous sign-ups that don’t require tethering an account to a phone number. Which is an immediate red flag.
How safe is it?
Kik’s anonymity makes it particularly problematic. Users can create accounts and groom children (or send explicit messages and images) without fear of being traced. “On Kik, people can’t be traced if they’re on it for nefarious reasons; that’s concerning, definitely,” says Nash. An investigation by the UK’s National Society for the Prevention of Cruelty to Children (NSPCC) found that it was the seventh most recorded method used by child groomers last year (the first six were Facebook, Snapchat, Instagram, text messages, WhatsApp and face-to-face conversation). And even if it’s not grownup users trying to communicate with children, there are still risks. The lack of an easily accessible digital trail makes it a boon for cyberbullies.
Controversies
Last September, it was reported that British police forces had investigated more than 1,100 child sexual abuse cases involving the app in the last five years.
TikTok
What is it?
TikTok – a Chinese app with hundreds of millions of users – is a successor to the defunct video sharing app Vine. Originally called musical.ly, the app allows children to lip-sync to their favourite songs and is a hotbed for memes: short, sharable, community-driven posts that mimic a theme.
How safe is it?
A significant proportion of TikTok’s users are children and teenagers, which is a draw to predators. “There’s something weird about the performative nature of TikTok,” explains Kyle McGregor, assistant professor at the Department of Child and Adolescent Psychiatry, New York University Langone Health, who has studied the impact of social media on children. “How do we make sure there aren’t risque, provocative, potentially illegal things being shown on this platform?” It’s something that concerns Nash too. “I worry about age-inappropriate content, which might be song lyrics, for example, or sexualised content. Young children mouthing along to adult songs doesn’t sit well with me.” TikTok has introduced a “Digital Wellbeing” feature that is meant to limit young users’ access to inappropriate content, but its effectiveness is debatable.
Controversies
The app has battled negative headlines in recent months after journalists discovered that older users were soliciting nude images from underage users and that it was being used to host neo-Nazi propaganda. In a world where personal data is a prized commodity, Nash also worries about the app being owned by ByteDance, a Chinese company. “I have concerns about how far they would comply with GDPR, even though they’re bound by it,” she says.
Twitch (and Discord)
What is it?
Twitch was formerly called Justin.tv and has become the home for video game streaming. Users can watch “streamers” play games while on camera, interacting with fellow fans in text comment sections that run alongside the video.
How safe is it?
The main concern around Twitch is common to most platforms: that users your children meet on Twitch may have bad intentions and aim to take the interaction offline. There’s also a text-based chat app popular with gamers called Discord, which features chatrooms, often containing hundreds of participants. “Peer pressure via text chat on video streaming services is one of the things we need to be aware of, particularly with online grooming,” says Nash. “It’s encouraging people to commit acts they might not necessarily think of as sexual. People who do this deliberately are clever and manipulative people.”
Controversies
Some of Twitch’s biggest controversies aren’t around security flaws but stem from the opinions it hosts. Critics have said that the text chat that accompanies video streams is a hotbed of racism, with casual slurs tossed around – potentially warping young users’ beliefs. “Children can be exposed to inappropriate content that’s posted or broadcast in real time,” explains an NSPCC spokesperson. “It’s important the government considers how it ensures these types of sites are moderated properly.”
Depop
What is it?
An eBay-like app for selling unwanted goods. Load up the app and you’re presented with a grid of perfectly presented pictures showing off colour-coordinated clothes you can buy with a minimum of friction. More than 10 million users – most of them teenagers or in their early 20s – log on to the app to buy and sell.
How safe is it?
Relatively so, unless you think capitalism is dangerous. You may regard it as teaching children important business skills – marketing, negotiation, pricing and so on.
Controversies
Platforms selling items are often used to market contraband, and Depop is no different. In 2016, Depop users were offering Ritalin and dextroamphetamine, as well as unlicensed “smart drugs” such as modafinil, for sale through the app, despite Depop’s zero-tolerance rule on restricted sales. The issue persists; last year, an investigation found laughing gas, imported cigarettes, vodka and cannabis on sale.
Coming soon…
Squad is similar to Houseparty, but differs in one key aspect: you can share your screen with the other participants on your call. That feature helps replicate the way teens interact in real life, where digital gossip and drama bleed into the offline world. Squad’s screen sharing is a great idea, but the ability to share anything is a worry.
Released this spring, Byte is the ballyhooed brainchild of Dom Hofmann, the co-founder of Vine. Vine 1.0 gave the world Jake and Logan Paul, so savvy teenagers may see Byte as a springboard to superstardom. The risk is that children embarrass themselves on camera in front of a potentially massive audience. As with all these apps, using them is fraught with danger.