TikTok algorithm gives way to personal choice, tackles the rabbit hole problem

The TikTok algorithm has been both the key to the video streaming app’s success and the biggest criticism leveled against it. But the company is now giving users the option to filter out topics they don’t want to see.

The company is also introducing new automated moderation tools, including one that (finally!) enforces age limits on non-kid-friendly videos, and another that aims to solve the “rabbit hole” problem, where users are shown a succession of depressing or otherwise potentially harmful videos …

The TikTok algorithm

TikTok differs from conventional video streaming apps like YouTube in that its algorithm has much more control over what you see. Instead of picking the videos you want to watch, you just choose a few initial interests, and from there the algorithm takes over.

TikTok determines your interests using a range of signals, including which videos you watch all the way through, which ones you like and share, and which creators you follow.
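TikTok hasn’t published how these signals are weighted, but the general shape of engagement-based interest scoring can be sketched. In the Python sketch below, the signal names, weights, and scoring scheme are all illustrative assumptions, not TikTok’s actual model:

```python
# Illustrative sketch of engagement-based interest scoring.
# All signal names and weights here are assumptions for illustration;
# TikTok has not published its actual ranking model.
from dataclasses import dataclass

@dataclass
class Interaction:
    topic: str
    watch_fraction: float   # share of the video watched, 0.0-1.0
    liked: bool = False
    shared: bool = False
    followed_creator: bool = False

# Assumed weights: stronger actions count more than passive watching.
WEIGHTS = {"watch": 1.0, "like": 2.0, "share": 3.0, "follow": 4.0}

def interest_scores(history: list[Interaction]) -> dict[str, float]:
    """Accumulate a per-topic interest score from engagement signals."""
    scores: dict[str, float] = {}
    for i in history:
        s = WEIGHTS["watch"] * i.watch_fraction
        s += WEIGHTS["like"] if i.liked else 0.0
        s += WEIGHTS["share"] if i.shared else 0.0
        s += WEIGHTS["follow"] if i.followed_creator else 0.0
        scores[i.topic] = scores.get(i.topic, 0.0) + s
    return scores

# Fully watching two sad videos (or one video twice, as the bot
# below does) registers strongly even without a like or share.
history = [Interaction("sadness", 1.0), Interaction("sadness", 1.0)]
print(interest_scores(history))  # {'sadness': 2.0}
```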

This proved to be a hugely successful approach for the company, measured by both app downloads and usage, but was also heavily criticized. One of the main criticisms has been that it quickly puts users into “silos”, where they only see a small subset of content.

A study conducted last year showed that this siloing can be actively dangerous.

A bot was programmed with sadness and depression as “interests.” Less than three minutes into using TikTok, on its 15th video, [the bot] kentucky_96 stops on it [a sad video about losing people from your life]. kentucky_96 watches the 35-second video twice. Here TikTok gets its first inkling that the new user may have been feeling depressed lately […]

The user instead stops on one about mental health, then quickly skips past videos about missing an ex, advice on how to move on, and how to hold a lover’s interest. But kentucky_96 lingers on this video with the hashtag #depression, and these videos about suffering from anxiety.

After 224 videos in the bot’s overall journey, or about 36 minutes of total watch time, TikTok’s understanding of kentucky_96 is taking shape. Videos about depression and mental health issues outnumber those about relationships and breakups. From here on, kentucky_96’s feed is a deluge of depressive content: 93% of the videos shown to the account deal with sadness or depression.

TikTok also appears to be extremely poor at filtering out particularly dangerous content, such as a “blackout challenge” believed to be responsible for the deaths of seven children.

Keyword filters

For the first time, TikTok is giving users the ability to filter certain types of content by blacklisting specific words and hashtags.

Viewers can [already] use our “not interested” feature to automatically skip videos from a creator or that use the same sound. To further empower viewers to customize their viewing experience, we’re rolling out a tool people can use to automatically filter out videos with words or hashtags they don’t want to see from their For You or Following feeds – whether it’s because you’ve just finished a home project and don’t want any more DIY tutorials, or because you want to see fewer dairy or meat recipes as you move toward more plant-based meals. This feature will be available to everyone in the coming weeks.
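Mechanically, this kind of filter amounts to a blocklist check against each candidate video’s caption and hashtags before it reaches the feed. Here is a minimal Python sketch; the field names and data shapes are assumptions for illustration, not TikTok’s actual API:

```python
# Minimal sketch of a keyword/hashtag blocklist filter.
# Field names and data shapes are illustrative assumptions.

def passes_filter(caption: str, hashtags: list[str], blocked: set[str]) -> bool:
    """Return True if the video mentions none of the blocked terms."""
    words = {w.lower().strip(".,!?") for w in caption.split()}
    tags = {t.lower().lstrip("#") for t in hashtags}
    terms = {b.lower().lstrip("#") for b in blocked}
    return not terms & (words | tags)

videos = [
    {"caption": "Easy DIY shelf build", "hashtags": ["#diy", "#woodworking"]},
    {"caption": "10-minute vegan curry", "hashtags": ["#plantbased"]},
]
blocklist = {"diy"}

feed = [v for v in videos if passes_filter(v["caption"], v["hashtags"], blocklist)]
print([v["caption"] for v in feed])  # ['10-minute vegan curry']
```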

Age-restricted videos

TikTok is finally introducing age restrictions on videos that aren’t suitable for kids. Previously, the app warned young users that a video might not be suitable, but still let them watch it; now it will actually stop children from watching such videos.

In the coming weeks, we will begin introducing an early version to help prevent content with overtly mature themes from reaching audiences aged 13-17. When we detect that a video contains mature or complex themes, for example, fictional scenes that may be too frightening or intense for younger audiences, a maturity score will be assigned to the video to help prevent those under 18 from viewing it across the TikTok experience.
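In effect this is a classifier score gated by the viewer’s age. A minimal sketch of that gating logic follows; the 0-1 score scale and the cutoff value are assumptions for illustration, as TikTok hasn’t published how the score works:

```python
# Sketch of maturity-score gating. The 0-1 scale and the cutoff
# are illustrative assumptions; TikTok hasn't published details.

MATURE_CUTOFF = 0.7  # assumed score above which a video is 18+

def can_view(viewer_age: int, maturity_score: float) -> bool:
    """Block under-18 viewers from videos scored as overtly mature."""
    if maturity_score >= MATURE_CUTOFF:
        return viewer_age >= 18
    return True

print(can_view(15, 0.9))  # False: mature video hidden from a teen
print(can_view(15, 0.3))  # True: ordinary content unaffected
```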

TikTok algorithm will reduce potentially harmful content

The TikTok algorithm is also being trained to tackle the rabbit hole problem, in which a user is served a continuous stream of potentially harmful content.

Last year, we began testing ways to avoid recommending a series of similar content on topics that may be fine as a single video but potentially problematic if viewed repeatedly, such as topics related to dieting, extreme fitness, sadness, and other well-being topics. We’ve also tested ways to recognize if our system may inadvertently be recommending a narrower range of content to a viewer.

As a result of our testing and iterations in the US, we’ve improved the viewing experience so that viewers now see fewer videos on these topics at once.
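TikTok hasn’t described the mechanism, but one common way to implement this kind of dispersion is to re-rank the candidate feed so that no single topic runs more than a few videos in a row. The Python sketch below illustrates the idea; the run cap and topic labels are assumptions:

```python
# Sketch of topic dispersion: re-rank a feed so no topic appears
# more than max_run times in a row. Cap and labels are assumptions.

def disperse(feed: list[tuple[str, str]], max_run: int = 2) -> list[tuple[str, str]]:
    """feed is a ranked list of (video_id, topic) pairs."""
    pending = list(feed)
    out: list[tuple[str, str]] = []
    while pending:
        tail = [topic for _, topic in out[-max_run:]]
        # topic that would exceed the cap if shown again
        blocked = tail[0] if len(tail) == max_run and len(set(tail)) == 1 else None
        # take the highest-ranked video that doesn't extend the run;
        # if every remaining video has the blocked topic, accept it
        idx = next((i for i, (_, t) in enumerate(pending) if t != blocked), 0)
        out.append(pending.pop(idx))
    return out

feed = [("v1", "dieting"), ("v2", "dieting"), ("v3", "dieting"), ("v4", "travel")]
print(disperse(feed))
# [('v1', 'dieting'), ('v2', 'dieting'), ('v4', 'travel'), ('v3', 'dieting')]
```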

Photo: Florian Schmetz/Unsplash
