TikTok is on a mission to protect its younger community, creating a safe and inclusive space for its users aged 13-17. There is a lot going on behind the scenes to allow this goal to be met, with both the internal TikTok team and external partners playing an important role.
James Chandler, Chief Marketing Officer at IAB UK, led two panel sessions, both of which reinforced the overarching theme that ‘safety by design’ is TikTok’s core principle.
People are at the heart of safety
Dr Liam Hackett, Global CEO of youth charity, Ditch the Label, and Lynn Sutton, Head of Outreach & Partnerships, Trust & Safety EMEA at TikTok, chatted to James about the partnership they have embarked on in order to educate the young TikTok community.
Lynn emphasised how the TikTok team represents the diversity of usership on the platform. She also noted that being a second-generation platform is beneficial for TikTok, as it can base its learnings on the pain points that other social media platforms have historically faced. This puts TikTok in a fortunate position to pioneer solutions to social media challenges.
For many young teens, TikTok is the first social media platform they will use, so it is essential to educate them on how to use it safely; they can then apply these learnings to future online activity. Lynn explained that, by default, 13-16 year old TikTok users do not have access to the private messaging feature and their accounts are set to private. The platform is also consistently monitored to ensure that no users below the age of 13 have managed to slip the net.
With the help of youth experts such as Dr Liam Hackett and the team at Ditch the Label, the TikTok team is open to criticism that informs new safety updates from a technological and personal point of view.
Happy and safe users = happy and safe advertisers
It’s not just TikTok users and the parents of those younger users who have a desire for TikTok to be as safe and robust as possible. Advertisers care deeply about safety on TikTok as they want to feel confident about where they’re spending their dollars.
Through a blend of human activity and technological advances, TikTok is able to do its utmost to ensure that the platform is fun and reaches diverse audiences, still with safety being paramount.
Focussing on the needs of younger users
In the second panel session, James chatted with Vicki Shotbolt, CEO of Parent Zone; TikTokker and teacher Mister MBA; and Jade Nester, Head of Data Public Policy, Europe at TikTok, about how to educate the parents of social media users on navigating the risks of social media, rather than shying away from them.
When parents hear scary anecdotes about TikTok, their natural instinct may be to keep their child as far away from the app as possible. But this isn’t realistic in the world we live in; it’s far more productive to set boundaries and stay up to date with your child’s digital life.
Luckily, TikTok does offer some hard controls that can settle parents' minds, such as turning off the direct messaging feature and being able to control screen time.
Balancing creativity and safety
In order for users to unlock their full creative potential on TikTok, it’s essential that they feel safe enough to do so. Creative features are developed with age brackets in mind; for example, users aged 13-15 have access to different features compared to those aged 16-17.
As a society, we are moving ever more rapidly towards desiring an authentic experience from social media, rather than a polished depiction of reality. The internet as a whole offers a vast number of opportunities; Vicki even described it as the ‘most democratising thing that’s ever been invented’.
Safety shouldn’t have to be a barrier to creativity. According to Katie Eyton, Chief Ethics and Compliance Officer at Omnicom Media Group UK, and Giles Derrington, Senior Government Relations and Public Policy Manager, who spoke on the third panel, it would be unfair to deny teens access to the internet; instead, the emphasis needs to be on education.
The education that TikTok and its partners are offering to young users can be applied to more than just the use of social media, and can also play a huge part in preparing teenagers for the hurdles they may face as they enter young adulthood.
Remember, there is always work taking place behind the scenes with regard to GDPR policies, monitoring of algorithms, and tightening thresholds for damaging content. Today’s Transparency Forum did an excellent job of informing us of this.
The next Transparency Forum will take place at the beginning of 2023, so be sure to check back then to see what we learn and unpack.