In a recent article, leftist tech outlet Wired describes how Facebook’s Instagram algorithm repeatedly showed one young mother videos and images of ill, dying, or dead children. The constant drumbeat of negative content on social media platforms is particularly harmful to young women and teen girls, with China’s TikTok only expanding the negativity.
Wired reports, in an article titled “Instagram Keeps Showing Me Children’s Tragedies,” that one young mother was regularly shown videos and photos of ill, dying, or dead children after the Instagram algorithm tracked her online browsing and learned that she had recently had a child. After months of her Googling normal new-parent worries about her pregnancy and children, the Instagram algorithm appeared to catch on to this and added to her fears.
Wired writes:
But something on my screen has continually surprised and rattled me in this first year of parenthood. During quiet nap times spent scrolling my feeds, I’ve found myself transfixed by posts about babies and children who are ill, dying, or dead. As I watch recipe breakdowns and home-makeovers on TikTok, videos from mothers grieving the untimely deaths of their children pop up, impossible to flick away. My Instagram Explore page often suggests accounts focused on or memorializing babies with severe health challenges and birth defects. My husband has walked in on me looking at my phone and crying about children I don’t know so many times that he’s (gently, reasonably) suggested a social media hiatus.
Despite the visceral distress they provoke, these videos keep appearing on my screen for a reason: because I watch them. Raptly. I remember the names and conditions of these imperiled children, whether they are living with Sanfilippo syndrome or enduring chemotherapy, whether they have just died of myocarditis or SIDS. I remember their siblings and favorite things. I check up on them. If they have died, I check on their parents. A tourist snooping into the land of sick kids, I have absorbed the morbid lingo of digitally mediated death, like “so-and-so gained his wings” and the eerily popular “happy heavenly birthday!” All the social platforms, at their core, demand engagement; I am so engaged, I tremble.
Am I consuming content about sick and dead babies as entertainment, in the same way that someone might watch a horror movie? There’s some overlap, I think, in my behavior here and the habits of ardent true-crime fans, who hoover up grisly dispatches about real-life violence—including child abductions—with such enthusiasm they’ve fueled a content boom for all things murder and gore. There’s a theory that true crime’s popularity with women, in particular, is tied to their fears of becoming a victim of crime. Watching it can provide a cathartic moment, an opportunity for releasing pent-up anxieties. This is, undoubtedly, connected to my anxiety.
Instagram is also toxic for teen girls, according to Facebook’s own internal research, which the Wall Street Journal published as part of its “Facebook Files” series outlining the company’s findings.
“Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” the researchers said in a March 2020 slide presentation posted to Facebook’s internal message board, reviewed by The Wall Street Journal. “Comparisons on Instagram can change how young women view and describe themselves.”
…
“We make body image issues worse for one in three teen girls,” said one slide from 2019, summarizing research about teen girls who experience the issues.
“Teens blame Instagram for increases in the rate of anxiety and depression,” said another slide. “This reaction was unprompted and consistent across all groups.”
These concerns also extend to other platforms, like the massively popular TikTok. In 2021, Breitbart News reported that the popular Chinese-owned video app uses its algorithm to suggest videos to minors featuring illegal drug use and sexually explicit content.
At the time, the Wall Street Journal recounted the story of a 13-year-old TikTok user who searched the app for “OnlyFans,” the subscription website primarily used to host pornographic content. The underage user then watched multiple videos, including two advertising access to pornography.
Upon returning to TikTok’s “For You” feed, which displays content to users based on their interests and previously watched videos, the 13-year-old user was served a number of videos related to the tag “sex.” These included role-playing videos in which people pretended to be in relationships with caregivers; in one, the WSJ notes, a man’s voice instructs a woman wearing a latex leotard: “Feel free to cry. You know it’s daddy’s favorite.”
The Wall Street Journal noted that the 13-year-old user served these videos doesn’t actually exist; the account was one of dozens of automated accounts the WSJ created to examine TikTok’s algorithm.
The WSJ writes:
TikTok served one account registered as a 13-year-old at least 569 videos about drug use, references to cocaine and meth addiction, and promotional videos for online sales of drug products and paraphernalia. Hundreds of similar videos appeared in the feeds of the Journal’s other minor accounts.
TikTok also showed the Journal’s teenage users more than 100 videos from accounts recommending paid pornography sites and sex shops. Thousands of others were from creators who labeled their content as for adults only.
Read more about TikTok at Breitbart News. Read more about Instagram at Wired.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or contact via secure email at the address lucasnolan@protonmail.com