In my blog, I talk a lot about fashion and the fashion industry. One topic I want to shed light on is the fashion industry’s beauty standard and the harmful effects it has on a girl’s body image. It is no secret that TikTok is run by an algorithm, and this algorithm has enormous power over the content you are fed. My TikTok page is filled with fashion-related content because that is the type of video I engage with the most. Fashion-related content, however, comes at a price. Thinness and fashion go hand in hand, and when you are shown hundreds of thin girls wearing trendy outfits, it starts to take a toll.
This next section was written for my CMNS 455 class last semester. However, I wanted to include this in my blog as well!
According to the National Eating Disorder Association (NEDA), in the United States alone 20 million women and 10 million men suffer from an eating disorder at some point in their lifetime. Eating disorders disproportionately affect young people, with the highest prevalence among those between the ages of 12 and 25. Correspondingly, TikTok’s target audience is teenagers and young adults, with 41% of users falling between 16 and 24. This overlap is important to note, as I argue that the TikTok algorithm is failing to stop pro-anorexia (pro-ana) communities from infecting the For You page, which is triggering for people who are currently dealing with, or recovering from, an eating disorder.
Almost all social media platforms and visually interactive websites have fallen victim to pro-ana communities: spaces dedicated to promoting eating disorders by presenting graphic material that encourages and motivates users to continue their efforts with anorexia and bulimia. The visual nature of these platforms allows this toxic type of content to run rampant. Discovering and handling pro-ana communities is nothing new for web companies. In 2012, Tumblr announced that it would be banning blogs that actively promote self-harm, including “blogs that glorify or promote anorexia, bulimia, and other eating disorders; self-mutilation; or suicide”. Given this well-documented history of pro-eating-disorder content on social media sites, it is disheartening to see that TikTok was not better prepared. TikTok released a statement saying that it will take down content that “encourages or may encourage acts that are likely to lead to physical self-inflicted injury”. Yet this has not stopped the spread. The algorithm is not doing a good enough job of detecting pro-ana content, because watching just one video related to weight loss or eating disorders can result in hundreds of pro-ana suggestions on the For You page.
The “For You” section is run by an algorithm. In social media, algorithms help maintain order and facilitate the ranking of content and search results. The For You page curates a feed by showing videos related to your profile and location, and by suggesting videos similar to ones you have already engaged with. However, as the app grows in popularity, the downfalls of its algorithm are starting to show. For users like myself who have dealt with disordered eating, the TikTok algorithm is becoming one of my worst enemies. One random suggested video can pop up, and the algorithm takes note of how you interact with it. If you end up liking enough of these videos, or watching them all the way through, that is all you will see on your main feed. Simply engaging with one video can lead to a spiral of more and more triggering content, which can have devastating results. It is also important to note that the engineers working on the algorithm, or screening for triggering content, are most likely unable to fully understand or relate to what type of content may be triggering. Eating disorder patients often report feeling “triggered” by certain images or words. Content such as weight loss progress reports, “what I eat in a day” videos, weight loss hacks, and body checks may be triggering for some. Body checking is an obsessive behaviour in which an individual fixates on certain features of his or her body; on TikTok it has manifested as people filming different angles of their bodies. These types of videos are often overlooked when screening for harmful content and continue to flood users’ homepages.
Some may argue that the algorithm is functioning exactly as intended: TikTok is giving users a personalized and entertaining experience, and there are options to keep unwanted content off your For You page. There is the option to select “not interested” on any video recommended by the algorithm. Moreover, TikTok will only show eating disorder-related content if you are already engaging with it and continue to do so. Therefore, if you are actively trying to keep this type of content off your For You page, it will not show up. Finally, social media companies’ recommendation algorithms are not trained to make moral or health-related judgments.
The issue I have with this argument is that although the algorithm is not trained to make moral or health-related judgments, it is TikTok’s responsibility to keep its users safe and to create an environment that does not promote unhealthy eating habits. Researchers have reported a “clear pattern of association” between social media use and disordered eating thoughts and behaviours. As a platform that is currently the hub for teenagers and young adults, the app must uphold and maintain a safe community for all its users. Taking all of this into consideration, I call on TikTok to work more closely with trained professionals to effectively handle the content that is landing on so many users’ For You pages. Let this be a learning opportunity for other social media platforms, so that this type of content does not infect any more sites.