TikTok is the wild west of social media feeds (and social media feeds are all kinds of wild west). A scroll could start on a dance trend, jump to a clip of raw chicken ‘marinating’ in NyQuil, and end on a video of someone filing their own teeth. It’s weird out there, and occasionally dangerous. Now TikTok is taking its next steps to rein things in.
In a Wednesday blog post, the company announced a rating system called “Content Levels,” an early version of which it plans to roll out “in the coming weeks.” TikTok had indicated back in February that it was moving toward age-based feed restrictions, and Content Levels offers the first details of what that might look like. App users will also now have more control over their own video streams, with the ability to selectively mute hashtags.
Though the social media giant wrote that its new moderation scheme is based on the ones used by the film, TV, and gaming industries, the company won’t immediately display ratings alongside video clips. Instead, the sorting and filtering will happen on the back end.
“When we detect that a video contains mature or complex themes, for example, fictional scenes that may be too frightening or intense for younger audiences, a maturity score will be allocated to the video to help prevent those under 18 from viewing it across the TikTok experience,” wrote the company. “We have focused on further safeguarding the teen experience first and in the coming months we plan to add new functionality to provide detailed content filtering options for our entire community so they can enjoy more of what they love.”
Each “maturity score” will be assigned by a TikTok moderator, though in the past the company has mentioned the possibility of platform creators assigning a rating to their own content before posting.
Gizmodo requested additional details on how the new system will work. Here is what a company spokesperson had to say:
Each video uploaded to TikTok is reviewed to ensure it complies with our Community Guidelines. If the video passes this review, it will be allowed on the platform. Once the video is visible on TikTok, it might be sent to our Trust and Safety colleagues for moderation, for example as a video increases in popularity or when a community member reports the video.
During the review process, not only will the moderator review the video for any violations of our Community Guidelines, they will also assign a Content Level to the video. As we develop the system, we’re looking at ways we can introduce sophisticated technical solutions to the classification process whilst maintaining a level of accuracy we’re comfortable with.
As far as whether creators will be able to find the rating for their content, this isn’t a functionality that’s available in the first version of Content Levels. Over the next few months, we’re eager to spend some time listening to feedback before making further adjustments.
TikTok emphasized in its announcement that the incoming content moderation system is in its early days. “We also acknowledge that what we’re striving to achieve is complex and we may make some mistakes,” the company wrote. But in the meantime, while we wait for comprehensive top-down, age-based content filtering, app users can now create their own restrictions. Hashtags or words can now be muted in the “For You” and “Following” feeds, so scrolls can be slightly more curated than they were before. The platform said that this, along with further efforts to diversify recommended videos, will be coming in the next few weeks.
TikTok has had a meteoric rise, especially among teenagers and even younger children. In the first three months of 2022, it was the most downloaded app worldwide. Throughout its rocket journey to the top, though, TikTok has faced plenty of flak, both for its controversial and allegedly flawed privacy practices and for its impact on users.
The platform already has content guidelines, and it bans specific categories of videos based on user reporting and on employees tasked with sifting through posts. In March, two former TikTok moderators sued the company over trauma they say they incurred while working to filter violent or otherwise inappropriate videos from the platform. The lawsuit claims that TikTok doesn’t provide adequate mental health services or protection to moderators, which doesn’t necessarily bode well for a planned expansion of moderation across the app.
The company is also facing lawsuits from parents who claim their children were hurt or even killed because of content they saw on TikTok. In May, the mother of a 10-year-old girl sued the company after she said her daughter died of asphyxiation attempting a “Blackout Challenge,” popularized on the app. More parents filed similar lawsuits this month. New legislation in California could further allow parents to sue over claims of social media addiction.
It remains to be seen if the platform’s new content moderation efforts can make a dent in the issue of potentially dangerous, viral video trends.
Update 7/13/2022, 3:00 p.m. ET: This post has been updated with additional information from a TikTok spokesperson.