TikTok has gained incredible popularity within the past year or so, accelerated by a pandemic that has kept people across the world in their homes. The creative video app is an appealing way to occupy oneself while the world tries to stop the spread of COVID-19, but unforeseen ramifications have emerged as well. In my experience, one of the most apparent issues is the presence of harmful messages surrounding eating disorders and body image. The problem stems from the “For You” page algorithm, which can place negative or disturbing videos in front of hundreds of millions of users, and from the app’s inability to truly monitor harmful content as it is uploaded. As much as users may think themselves immune, research suggests that increased social media use introduces eating concerns more often than anticipated.
In a recent effort to be more transparent following the tumultuous period in which President Donald Trump attempted to block the app in America, TikTok released a blog post describing how its “For You” page, or “fyp,” operates. Essentially, every active or passive reaction to a video shapes which videos appear next on a user’s “For You” page. Some actions are stronger indicators than others: sharing or liking a video, following its creator, or watching it to the end all make similar clips show up in the future. Conversely, if a user scrolls away before the end or taps “not interested,” the algorithm adjusts accordingly. The algorithm also weighs the video’s language, the region where it was made, and the type of device it was made on. When a video is posted, it is first shown to a small group of users’ “For You” pages; if it earns high engagement, it is pushed to progressively larger audiences, while a video with poor reactions stalls out.
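The engagement weighting described above can be sketched as a toy scoring model. To be clear, the signal names, weights, and threshold below are entirely hypothetical; TikTok has never published its actual formula, and this is only a minimal illustration of how strong signals (shares, likes) could outweigh weak or negative ones (early skips, “not interested”) when deciding whether a video reaches a wider audience.

```python
# Illustrative only: hypothetical signal weights for a "For You"-style
# ranking score. TikTok's real signals and weights are not public.
WEIGHTS = {
    "shared": 4.0,               # strong positive signals
    "liked": 3.0,
    "followed_creator": 3.0,
    "watched_to_end": 2.0,
    "skipped_early": -2.0,       # negative signals
    "marked_not_interested": -5.0,
}

def engagement_score(signals):
    """Sum the weights of the signals one viewer produced for one video."""
    return sum(WEIGHTS[s] for s in signals if s in WEIGHTS)

def should_boost(scores, threshold=2.0):
    """Push a video to a wider audience only if its average score
    across the initial test group clears a (made-up) threshold."""
    return sum(scores) / len(scores) >= threshold

# One viewer who liked the clip and watched it through:
print(engagement_score(["liked", "watched_to_end"]))  # 5.0
```

Under a model like this, a clip that most of its test group likes and finishes keeps spreading, while one that viewers skip or flag quietly disappears, which matches the viral-or-stall behavior the blog post describes.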
Hashtags are another way for users to gain followers and views: tagging videos #fyp and #foryoupage can speed their journey to millions of “For You” pages. Similarly, if a user reacts positively to a video carrying other hashtags, related videos will surface on their “For You” page. TikTok predisposes certain videos of interest through a short survey asking new users what content they like, but the company also claims to mix in videos unlike those a user has responded to, to keep feeds from collapsing into a cycle of identical ideas and content. In practice, however, similar videos persist, and a user is fed more of whatever they responded to.
In my use of the app, I began by saying I was interested in topics like sports, cooking, fitness, and entertainment. As soon as I started scrolling through my “fyp,” I was shown numerous videos of young women and men describing what they eat in a day, their eating disorder recovery (which can be inspiring, when it sends the right message!), or a weight-loss transformation set to one of the many songs or sounds that have become a trend for showing off a “glow up.” One flaw, in my mind, of the “For You” algorithm is that if a viewer watches a video, even one they do not like, similar content will keep appearing unless they actively express disinterest. Often I watch a video expecting something different and end up seeing more unhealthy practices around eating and more harmful ways of thinking about one’s body.
Some captions and the text embedded in videos also prove graphic; even when a clip opens with a “#TW” (trigger warning), some leave little to the imagination. Within a matter of ten minutes, my “For You” page became a disturbing parade of teens and adults alike obsessing over their appearance and becoming thinner. TikTok’s blog post claims the company monitors posts that violate its use policies, such as nudity, “graphic medical procedures,” or “legal consumption of regulated goods.” That leaves much room for content about body image and eating to go unnoticed; it was harmful enough for me that I locked my phone after ten minutes of using the app.
Enter: the common debate about social media sites harming the mental health of their users.
A recent study found a prevalence of “subclinical” eating disorders (ones that may not have been diagnosed but still manifest as unhealthy eating habits and approaches to food, and that may develop into full eating disorders later on) linked to the influence of social media. Even without a formal diagnosis, dissatisfaction with one’s body, a negative body image, and disordered or insufficient eating can affect an individual’s overall health later in life and should be taken seriously.
Though social media cannot be deemed the sole reason for increased body image problems, the study makes clear that there is a relationship between high volume and frequency of social media use and eating concerns. The more that American adults aged 19–32 used social media, the more likely they were to report concerns about what they eat and body image-related issues. This suggests that visually oriented platforms may be the heart of the problem: seeing thin models in magazines or actors on TV cannot compare to seeing people with the “ideal” body type every time one checks one’s phone.
Another problem with TikTok is that users can easily get around the blocks the app has put up to prevent harmful ideas about body image and eating from being shared. If a hashtag is blocked, users simply swap in a number that resembles a letter, for example a “1” for an “i.” This allows the posts to persist, still reaching the “For You” pages of millions of people. Even when people post body-positive content, TikTok has a darker side that can plant negative ideas in people’s minds, perhaps starting a cycle of deeming their bodies ‘imperfect.’
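The letter-for-number evasion described above is a well-known moderation problem, and the standard countermeasure is to normalize such substitutions before checking a tag against a blocklist. The sketch below is not TikTok’s actual system; the substitution table is a common-sense example, and the blocked tag is a harmless placeholder rather than a real hashtag.

```python
# Illustrative sketch: undoing common character substitutions
# ("1" for "i", "0" for "o", etc.) before checking a hashtag
# against a blocklist. Table and blocklist are examples only.
SUBSTITUTIONS = str.maketrans({"1": "i", "0": "o", "3": "e", "4": "a", "$": "s"})
BLOCKED = {"blockedtag"}  # placeholder entry, not a real hashtag

def is_blocked(hashtag):
    """Lowercase, strip the '#', map look-alike characters back to
    letters, then check membership in the blocklist."""
    normalized = hashtag.lower().lstrip("#").translate(SUBSTITUTIONS)
    return normalized in BLOCKED

print(is_blocked("#Bl0ckedTag"))  # True: the "0" maps back to "o"
print(is_blocked("#harmless"))    # False
```

Even this simple normalization would catch the “1”-for-“i” trick the paragraph describes, which suggests the persistence of such tags is less a hard technical problem than a matter of how much effort a platform invests in enforcement.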
While it seems impractical to insist that TikTok screen each video before it is posted, I do believe the company owes its users more forethought to keep them from being introduced to these harmful topics and feelings about themselves. Yes, there is a line of freedom of speech that should not be crossed, but TikTok is not Twitter; it does not serve primarily as a mode of communication. TikTok was invented for entertainment and fun, and the messages I saw in my first use of the app were anything but. In an age when kids and young adults are maturing in the shadow of social media sites, especially TikTok, some protection should be provided so that people vulnerable to these ideas can avoid developing dangerous and harmful habits of their own.