When Elyse Stieby opens her Instagram app, among the first things she sees are weight loss tips on the “explore” page: The number of calories in eggs, a medium coffee and a potato.
Stieby says she tries to look only at her friends’ posts, rather than the recommended content Instagram serves her in her feed and through the explore tab — the app’s version of a personalized landing page and search bar, accessed through the magnifying glass icon at the bottom of the app. But she says she knows the app’s algorithm chooses what it shows her based on what it thinks she wants to see — so the makeup, hair and body tips are tough to avoid.
“I don’t need to lose weight. I’m 102 pounds,” said the 18-year-old materials science major at Ohio State University.
Experiences like Stieby’s are at the center of a storm of criticism surrounding Instagram owner Facebook. In September, Facebook paused plans for an Instagram app designed especially for children after lawmakers voiced concerns about the app’s effects on young people’s mental health. Instagram is supposed to be for children older than 13, but younger kids have been able to get on the platform. Facebook whistleblower Frances Haugen leaked internal documents to the Wall Street Journal and the Securities and Exchange Commission suggesting the company knew that Instagram use may hurt the mental health of young women and girls. She testified before a Senate committee, saying Facebook put growth and profit above everything else. Facebook has fought back, denying the claims.
Instagram has been steadily increasing the amount of recommended content it shows people. In July, the app started putting videos from people you don’t know right alongside your friends’ posts in the main feed. And the explore tab — a curated collection of algorithmically recommended content — is a Wild West of images the app thinks you will like, based largely on other posts you’ve interacted with. Impressionable teens may ultimately pay the price as the explore tab spits out content including idealized images and dubious “self help” recommendations.
Social media apps Snapchat and TikTok have also been criticized for promoting content that could warp self image or encourage harmful behaviors.
Still, experts say there are some steps teens, parents and schools can take to help teens handle the challenges that come with social media use.
While some experts caution that the impact of social media on mental health isn’t fully understood, others have found demonstrable effects.
“The idea that Facebook just learned about this, as a problem for kids’ mental health, is complete baloney,” Jim Steyer, founder and CEO of family advocacy organization Common Sense Media, said.
How the recommended content works
The photo feed we see on Instagram is filled mostly with posts from accounts we follow, and the same is true of stories — temporary posts that hover at the top of the feed and disappear after 24 hours.
But we can’t control what shows up in the explore tab or the slots for recommended posts inside our feeds. Instagram’s algorithm selects those based on a few factors. According to the company, it’s determined by the post’s and account’s popularity, whether the user has interacted with posts from that account before, and the types of content the user has interacted with — even if they just tapped to read a caption or look closer.
Unlike your “Ads Interests,” which Instagram uses to target you with ads and which are listed under Settings → Security → Access Data → Ads Interests, you can’t view what types of recommended content the app thinks you want to see. The only way to cut back on unwanted content is to tap the offending image in the explore tab, tap the three dots in the corner and select “Not Interested.” Over time, the app should show fewer similar posts. You can also ask to see less sensitive content, which includes bare bodies, drugs and firearms, by going to Settings → Account → Sensitive Content Control and choosing “Limit Even More.” Instagram automatically limits sensitive content for people under 18.
Teens are savvy, but ‘the algorithm’ is a burden
Gloria Wetherbee, 20, took a social media marketing class as part of her coursework at the University of Mary Hardin-Baylor in Belton, Texas, where students learned the best ways to compel audiences to interact with content. The class made her more aware of the ways content creators and social media companies drive engagement as she tries to avoid images of idealized bodies on Instagram, she said.
She’s careful not to tap on images of influencers, fashion tips or weight-loss content. Even sending them to a friend to make fun of the images means she’ll see more of them, she said. Instead, she carefully scrolls past them.
“I know part of the algorithm is sending new things and seeing what sticks, but I feel like I’ve honed my usage down so I don’t get it as much any more,” Wetherbee said.
Stieby says her explore page on Instagram has some self-help infographics with messages like, “Don’t let technology blind and consume you.”
The boys she knows see different content, she said.
“A lot of stuff is about the way you look and feeling pretty, or how to get skinnier or more toned or, ‘This is how you do your makeup so that guys will like you. Wear this perfume so that guys like you,’ ” she said. “But a guy’s Instagram, it’s like, ‘Oh, look at this car, it makes a cool sound.’ ”
Discrepancies in the ways boys and girls use social media — and the content they’re served — ring true for many teens, Wagstaff said. As Stieby put it: Boys see cars, girls see beautification tips.
But that doesn’t mean boys don’t struggle with self image, according to Wagstaff. Researchers are uncovering more instances of disordered eating in men, she said. And body image issues aren’t the only social media trap boys can fall into: Some pockets of the Internet promote violent or bigoted ideologies and teen boys are especially vulnerable, she added.
Ways to mitigate impact
Some parents may feel the itch to snatch the phone and ban the app. But pause a moment before launching your teen’s smartphone into the nearest body of water.
Kids whose phones are taken away will likely get their hands on new ones, Wagstaff cautioned, and deleted apps can still be accessed from any Internet-connected device.
Instead, parents, schools and platforms must work together to educate kids not only about the risks of social media, but also about the mindset it takes to move through a tough world with confidence and self-love, she said. Parents should connect teens with resources to practice mindfulness and self-compassion, both of which help build resilience in the face of constant comparison.