While efforts have been made to ensure children are safe online, the internet remains a wild playground of inappropriate and harmful content. This week, UK health secretary Jeremy Hunt called for social media to be safer for children, accusing platform holders of turning a "blind eye" to the safety of kids.
YouTube is also taking steps to become a safer platform after its problems with unsafe content plaguing children's channels last year. Parents can now choose to restrict the YouTube Kids app to only content that has been vetted by humans, to ensure nothing inappropriate slips through.
In light of these recent slip-ups, we asked our panel of experts whether and how social networks could be working harder to protect children online. Is the onus on the platform or the parents?
YouTube is a dangerous space to leave your kids to navigate on their own, and truth be told, the onus is on more than the influencers here: parents need to have more oversight of where their children are consuming video, YouTube needs to put more controls and parameters around where kids can go, and influencers need to recognise that YouTube is flooded with kids who may stumble on their content.
Since YouTube is far from tackling these issues, it’s up to other leaders in the industry to find a solution. Which is exactly what we’ve done through SafeFam.
After the adpocalypse in November, and the recent ultimatum from Jeremy Hunt, social media and video platforms have got to figure it out. We decided to take matters into our own hands with kid-tech company SuperAwesome back in November, when we launched our partnership.
SuperAwesome has long-standing relationships with premium kids' brands, who turn to them for answers on how to advertise to the U13s in a safe and compliant way, which means not collecting data on kids and running child-safe ads on carefully vetted publishers' sites and YouTube channels.
When Cherry Pick Talent and SuperAwesome joined forces to offer a safer solution for kids' brands looking to partner with influencers, we knew we had to take the safety and compliance issues that were rampant across YouTube into our own hands and offer our clients talent they could book with confidence. Enter SafeFam.
SafeFam comprises young talent, family channels and aspirational talent with a recognisably young audience, all committed to creating safe, fun, positive content. The channels have been vetted by our in-house team; they don't swear or upload content unfit for children, based on an array of guidelines they promise to adhere to by signing a pledge.
This allows brands to work with talent who are squeaky clean, and parents to sit their kids in front of YouTube channels they know are deemed safe!
The solution has to be a combination of work from both platforms and parents.
The platforms' responsibility lies in creating child-safe versions that are simple to activate, along with relevant education to ensure that parents understand these options exist. They also have a responsibility to ensure that content on these versions is vetted and suitable.
The final responsibility has to lie with the parents, who are the ones who will have to activate these versions for their child. YouTube and Twitch can create the perfect system that checks the user’s age before they create accounts, but can do nothing about a parent handing their own account over to their child.
Overall, all parties involved need to work together to build a safer internet for children (and everyone else for that matter). To be concrete:
- Online platforms could improve their policies and actions on unacceptable language and behavior (for example, timeouts and bans in chats help moderate the conversation). The platforms also need to do a better job of filtering recommended content and vetting channels, videos and streams.
- Content creators could think about who their audience is, and openly communicate what kind of content they publish.
- Last but not least, parents need to actively educate their children about the platforms, help them discover suitable channels and streamers, and monitor what sites and platforms their children use.
Pascal Clarysse started looking for so-called Growth Hacks a good decade before the buzzword was coined.
Clarysse used to be the marketing driving force at Lik-Sang.com, where he was in charge of relentlessly spotting new trends, waves and magic holes. In recent years, he's served as a marketing consultant for various indie studios, participating in launching mobile games and the occasional Kickstarter campaign.
Of course, platforms have a duty to filter content proactively and to keep refining their moderation systems. Content creators with a vast young audience have responsibilities too.
But since the world will never be perfectly filtered and will always have a random element of chaos to it, be it online or outdoors, the ultimate burden of responsibility for any child remains with the parents. It's not enough to turn some settings on and off; good old-fashioned parental dialogue is still the decisive part.
I don't mind if my children stumble upon something that provokes weird thoughts in them, as long as they feel compelled to talk to me about it and we discuss it as a family, comparing what we saw with our grid of values. Ask questions about what they see and what they think of it, listen to the answers openly, and be wise and poised in your own answers.
Don't hesitate to sit down and watch the stuff with them, like you would have done with your parents in front of the TV back in the day. Don't expect the platforms or the teachers or any other authority to do that for us. My children's education is my responsibility, and that includes complicated subject matters that won't disappear just because there's a paradigm shift in the media spectrum.
Nineteenth-century parents had to care about what their kids were reading, twentieth-century parents about what they watched on TV; now it's YouTube and games. Nobody ever said parenting was easy, and technology is not going to change that by much.
People may not like my response, but the reality is that, just like with TV programming, you have to monitor what your kid watches and teach them what's right and wrong. If a kid is taught morals and how to think, he/she will know what not to do or say.
That being said, YouTube has already age-gated channels to 6+, 12+ or 18+, which harmed a lot of channels a while back by removing active subscribers who could no longer find or search for the channels they used to watch. Whether or not YouTube should do that is up for debate, as I don't think it's really their job; but since they also wish to protect advertisers, it is something of a double win for YouTube.
As for talent (influencers), they should have the least responsibility in this matter. First off, the point of social media like YouTube is to be able to express yourself, be okay with others' judgements, and be happy with who you are, so that other people (fans) know it's okay to be themselves and can relate.
Having people force them to be fake teaches kids to conform rather than to be happy with who they are and to explore who they are. Again, this is why parents are the ones who need to actively choose what their kids watch and teach them why and why not to watch certain people; but the overall concept of your kids watching someone who is showing the world they are themselves, even with flaws, is not a bad thing in my opinion.
Another important fact is that most influencers are very young (under 25) and can't be expected to be perfect role models at all times, let alone know what you or society is okay with kids watching. That being said, YouTubers should use common sense and not purposely do anything that is negative, hateful, or morally wrong when they have influence.
Also, talent who have kids watching need to be extra careful about what content they show, as there are many kids and parents for whom they need to be role models.
We live in a society where technology is asked to take on the responsibility of parenting, which to me is wrong (in the past, TV was supposed to act as a parent, then video games, and now it's the internet). I believe it's right to have safeguarding systems online for Twitch, YouTube and the like; however, the ultimate responsibility needs to stay with parents.
As in everything, the balance resides in the middle. Gated content is a great idea, as is making sure streamers are accountable and responsible for their own actions to the same degree that traditional celebrities and journalists are expected to be.
So to summarise: for me, it is a combination of common sense from the content producer and age gates from the platforms, but ultimately the responsibility needs to stay with parents. They need to be the ultimate gatekeepers of what their children are exposed to.