Like many social media platforms and apps, TikTok’s feed is built using a recommendation algorithm that draws on a number of signals and factors to personalize it for each person. Now, TikTok has published a new blog post explaining how its recommendation feed works, and it offers tips for personalizing the feed to avoid being served random videos you’re not interested in.
TikTok’s recommendation algorithm is built around input factors in ways similar to how YouTube measures and monitors engagement. The way people interact with the app affects the recommendations served, including posting a comment or following an account. If somebody only follows cute animal accounts, and only double-taps to like or comments on videos about animals, TikTok will serve them more animals. This also helps inform TikTok’s algorithm about videos people might not be interested in — if you’re only interested in Hype House creators, for instance, TikTok may not offer videos from the “bean side” subgenre on the app.
User interactions are just one part of the equation, though. TikTok states that video information, which “might include details like captions, sounds, and hashtags,” and device or account settings also have an effect on the feed. Language preference, country setting, and device type factor in to ensure “the system is optimized for performance,” according to the post. The post also notes, however, that device and account settings “receive lower weight in the recommendation system relative to other data points we measure since users don’t actively express these as preferences.”
Again, like YouTube, everything boils down to engagement. If somebody finishes a video instead of flipping to the next one halfway through, that action is registered as a stronger indication of interest. The post also stresses that its recommendation system is based on the content, not the creator. Anecdotally, that means unless Charli D’Amelio — TikTok’s most-followed creator — suddenly starts making videos about frogs, beans, or self-deprecating jokes, she’s not going to appear in my feed (and she doesn’t!).
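To make the relative weighting described above concrete, here is a minimal sketch in Python. The signal names and weight values are entirely hypothetical — TikTok hasn’t published its actual weights — but the ordering reflects what the post describes: finishing a video counts more than a like or comment, and device or account settings carry the lowest weight.

```python
# Hypothetical weights reflecting the relative importance TikTok describes:
# completing a video is a stronger interest signal than a like or a comment,
# and device/account settings (language, country, device type) weigh least.
SIGNAL_WEIGHTS = {
    "video_completed": 5.0,        # watched all the way to the end
    "followed_creator": 3.0,
    "liked": 2.0,
    "commented": 2.0,
    "device_settings_match": 0.5,  # passive signal, so lowest weight
}

def interest_score(signals):
    """Combine observed signals into a single interest score.

    `signals` maps signal names to booleans (did the event occur?).
    """
    return sum(
        weight for name, weight in SIGNAL_WEIGHTS.items() if signals.get(name)
    )

# A viewer who finished a video and liked it scores far higher than one
# who merely happens to match on language or device settings.
engaged = interest_score({"video_completed": True, "liked": True})  # 7.0
passive = interest_score({"device_settings_match": True})           # 0.5
```

In a real system these scores would feed into a ranking model rather than a flat sum, but the sketch captures the point the post makes: active behavior outweighs passive metadata.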
TikTok is frequently applauded for its recommendation system; once it’s finely tuned, the app becomes one of the best scrolling experiences around. My personal theory is that’s why TikTok is so addicting — everything is so perfectly curated to your unique interests that it’s hard to put the phone down once you’re sucked in. But TikTok’s recommendation algorithm still has flaws of its own, which the company raises in its post.
“One of the inherent challenges with recommendation engines is that they can inadvertently limit your experience — what is sometimes referred to as a ‘filter bubble,’” the post reads. “By optimizing for personalization and relevance, there is a risk of presenting an increasingly homogenous stream of videos. This is a concern we take seriously as we maintain our recommendation system.”
Some of this is innocuous — people who only like horse videos may only see horse videos. Some of it can be exclusionary. The app might not surface videos from the Black Lives Matter protests or may not recommend disabled or queer creators if a user doesn’t specifically go out of their way to tune the algorithm in that direction. TikTok’s post addresses the filter bubble by explaining its goal of interrupting repetitive content. The “For You” feed “generally won’t show two videos in a row made with the same sound or by the same creator,” the post says.
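That “no two in a row” rule can be sketched as a simple re-ranking pass over an already-ranked candidate list. Everything here is an assumption for illustration — the field names and the greedy strategy are invented, not TikTok’s actual implementation:

```python
def diversify(ranked_videos):
    """Greedily reorder a ranked list so that, where possible, no two
    consecutive videos share the same sound or the same creator."""
    remaining = list(ranked_videos)
    feed = []
    while remaining:
        prev = feed[-1] if feed else None
        # Pick the highest-ranked candidate that differs from the previous
        # video in both sound and creator; fall back to the top candidate
        # if every remaining video would repeat.
        pick = next(
            (v for v in remaining
             if prev is None
             or (v["sound"] != prev["sound"] and v["creator"] != prev["creator"])),
            remaining[0],
        )
        remaining.remove(pick)
        feed.append(pick)
    return feed

videos = [
    {"id": 1, "creator": "a", "sound": "s1"},
    {"id": 2, "creator": "a", "sound": "s1"},
    {"id": 3, "creator": "b", "sound": "s2"},
]
# Video 3 gets promoted between the two videos that share a creator and sound.
print([v["id"] for v in diversify(videos)])  # [1, 3, 2]
```

Note the fallback: as the article observes next, a rule like this can’t prevent repetition when the candidate pool itself is dominated by one trend.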
The idea is that more new types of videos will surface on a feed than ones that feel like more of the same. But that doesn’t always work. I’ve scrolled through three or four videos, one after the other, that all used a well-known song for a popular trend on the app. Exactly how TikTok chooses which videos to surface for each personalized feed is still somewhat of a black box, but it’s an area the company is at least highlighting as in need of improvement.
Another issue TikTok takes seriously is not surfacing dangerous content — an issue that YouTube specifically has faced criticism over for years. According to TikTok, content that has graphic material, like a surgical procedure or the “legal consumption of regulated goods” such as alcohol, may not be eligible for recommendation since it could come across as “shocking if surfaced as a recommended video to a general audience” — in other words, young kids. That’s why many creators on TikTok will upload a video multiple times or talk openly about feeling shadow banned over particular content.
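The eligibility rule described above amounts to a filter applied before ranking ever happens: flagged videos simply never enter the recommendation pool. A hypothetical sketch, with category names invented for illustration:

```python
# Hypothetical content categories that, per the post, may be excluded
# from the general "For You" recommendation pool.
INELIGIBLE_CATEGORIES = {"graphic_medical", "regulated_goods"}

def eligible_for_recommendation(video):
    """Return True if the video may be recommended to a general audience."""
    return not (set(video.get("categories", ())) & INELIGIBLE_CATEGORIES)

videos = [
    {"id": 1, "categories": ["dance"]},
    {"id": 2, "categories": ["regulated_goods"]},  # e.g., alcohol content
]
candidate_pool = [v for v in videos if eligible_for_recommendation(v)]
# Only video 1 remains in the pool; video 2 can still be viewed directly,
# it just won't be recommended.
```

This distinction — viewable but not recommendable — is what creators experience as a shadow ban.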
TikTok has faced criticism from marginalized groups, including members of the LGBTQ+ community, for not recommending their content. It’s a concern YouTube routinely faces, too: the Google-owned video site is currently facing a lawsuit after several LGBTQ+ creators claimed YouTube hid their videos in restricted mode and wasn’t surfacing their content in its recommendations. TikTok admitted it had suppressed content from some creators, intending it to be a short-term solution to bullying.
“Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy,” a spokesperson told The Verge in December 2019. “While the intention was good, the approach was wrong and we have long since changed the earlier policy in favor of more nuanced anti-bullying policies and in-app protections.”
The full blog post has more in-depth instructions on how to personalize your own “For You” page, but it’s refreshing to see the company open up about one of its competitive advantages. TikTok’s algorithm is one of the more fascinating ingredients in its worldwide success — it’s even part of the daily conversation within the app’s fast-growing culture, where TikTok users refer to different growing trends and subgenres as “sides” favored by the algorithm.
Lots of virality-hungry users try to work out how to game TikTok to get more views and capitalize on new trends — and that comes down to feeding the algorithmic recommendation tool different bits of data to promote videos that may not surface naturally on their own. Now, TikTok is pulling back the curtain a little more to give people the chance to do so themselves.