The findings come from a joint technical investigation by Amnesty International, the Algorithmic Transparency Institute (ATI) of the National Conference on Citizenship, and AI Forensics. The two reports, “Driven into the Darkness: How TikTok Encourages Self-harm and Suicidal Ideation” and “I Feel Exposed: Caught in TikTok’s Surveillance Web”, draw attention to human rights violations to which underage and young TikTok users are exposed.
The reports expose how manipulatively and addictively TikTok’s feeds are designed, aimed at keeping users on the platform for as long as possible. Through the way TikTok works and the recommendations its algorithms display, “(…) children and young people with existing psychological problems can be exposed to serious dangers,” Lisa Dittmer, research expert at Amnesty, is quoted as saying. According to Amnesty, TikTok thereby risks exacerbating mental health problems such as depression, anxiety and self-harm.
Potentially harmful content
TikTok’s “For You” feed is a highly personalized page that users can scroll through endlessly. It contains algorithmically recommended content that the system selects based on the presumed interests of the user. The research found that after just five to six hours on the platform, almost half of the feed consisted of mental health videos with potentially harmful content – ten times as many such videos as were shown to accounts that had expressed no interest in mental health content, Amnesty warned.
For the technical research, more than thirty automated accounts were set up that supposedly belonged to 13-year-old users in Kenya and the USA to record the impact of the algorithmic recommendations on young users. An additional manual simulation included one account each in Kenya, the Philippines and the USA.
This downward spiral set in even faster when Amnesty’s researchers clicked on and rewatched the mental health videos suggested to the test accounts. After just 3 to 20 minutes of this, more than half of the “For You” feed consisted of videos addressing mental health issues. Within an hour, numerous recommended videos appeared that normalized or even romanticized suicide.
According to Amnesty International’s research, TikTok’s business model is inherently abusive: it rewards users in order to bind them to the platform and to collect ever more data about them. In addition, TikTok applies its protective measures only in certain parts of the world, leaving some children and young people even more exposed to exploitative data collection than others, Amnesty said.
Amnesty demands that TikTok – not just in Europe – respect the rights of all its younger users by banning targeted advertising aimed at people under 18 worldwide. In addition, the “For You” feed should not be personalized by default. Instead, users should decide actively, on the basis of informed consent, whether they want a personalized feed and which interests influence their content recommendations.
TikTok refers to community guidelines
In response to Amnesty International’s research, TikTok referred to its community guidelines, which outline what types of content are banned and therefore removed from the platform if reported or otherwise identified. This includes a ban on content that depicts, promotes or provides instructions for suicide and self-harm, as well as related challenges, dares and games that “depict or promote acts of suicide and self-harm” and “share plans for suicide and self-harm”.
TikTok also said it has initiated a process to develop a “company-wide human rights due diligence process that includes conducting regular human rights impact assessments,” Amnesty said in the release. However, the social network did not provide any information about what specific risks it had identified for the human rights of minors and young users. The fact that TikTok is currently not conducting a company-wide human rights due diligence review is a clear failure by the company to assume its responsibility for respecting human rights, Amnesty criticized.