“I got more than a few videos of women in outfits that were barely covering any skin, like lingerie or see-through things, or just half-nude women dancing or modeling something,” said Blackheart. “I didn’t linger too long to see much more because it made me uncomfortable.”
Blackheart was dealing with a common problem for new, and even long-time, users of TikTok and similar social media apps. Content designed to get the most engagement, including suggestive clips of women or things meant to shock, is regularly shown to new users. And people who have been on the apps longer sometimes find themselves unable to get the sexual videos out of their automated feeds, despite never liking them or following those creators.
The fix most often suggested by the companies is for a person to “train” their own feed to stop showing unwanted videos. But that takes time and isn’t as effective when pitted against human nature.
We spoke to five people who have struggled to get sexual content out of their feeds, and we tested four of the apps ourselves as new users with no history on the services. We found that sexual content was suggested by default to new users on three of the four apps, although the material rarely violated community guidelines.
Apps show sexualized content for a reason
If the videos are known to disturb some people, why do they appear by default? TikTok uses recommendation algorithms to decide which videos a person will see, drawing on a wide range of signals such as whom the person follows and what the person has liked.
More important, video feeds also rely on information viewers are not deliberately sharing, such as how long they let videos play and whether they click through to the comments. The apps also infer basic demographic information, including a viewer's gender and age, without asking directly.
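None of these companies publishes its ranking code, so any concrete example has to be a guess. The Python sketch below is a hypothetical illustration of the kind of implicit-signal scoring the experts describe; the signals and weights are invented for this article, not taken from any platform.

```python
from dataclasses import dataclass

@dataclass
class WatchEvent:
    watch_seconds: float   # how long the viewer let the video play
    video_seconds: float   # total length of the video
    opened_comments: bool  # whether they clicked through to the comments
    liked: bool            # the explicit signal most people assume matters most

def engagement_score(event: WatchEvent) -> float:
    """Toy ranking score: implicit behavior (watch ratio, comment taps)
    outweighs the explicit like button. Weights are made up for
    illustration; a real system would learn them from data."""
    watch_ratio = min(event.watch_seconds / max(event.video_seconds, 1.0), 1.0)
    score = 0.6 * watch_ratio
    if event.opened_comments:
        score += 0.25
    if event.liked:
        score += 0.15
    return score

# Lingering a few seconds and opening the comments already produces a
# meaningful score, even with no like at all.
print(engagement_score(WatchEvent(8.0, 30.0, opened_comments=True, liked=False)))
```

Under that kind of logic, a viewer who is disturbed by a video but watches it twice looks, to the system, like a fan.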
TikTok is the best known of the four apps for suggesting shocking content to new users, but similar patterns show up in apps trying to mimic TikTok's algorithmic success: Instagram's Reels, Snapchat and YouTube Shorts. These apps use many of the same signals, although none is completely transparent about how its algorithms work.
The problem of showing sexualized content to the wrong audiences is long-standing, according to social media experts. Recommendation algorithms often prioritize shocking content because it is lucrative.
“The business model of every platform, especially social media, is to keep you on the platform as long and as much as possible. They try to offer you certain content to keep you interested and engaged,” said Sandra Wachter, a professor of technology and regulation at the Oxford Internet Institute. “The truth is, what keeps you engaged is often toxic, controversial, scandalous and gossipy.”
The benefit of these types of algorithms is that they can learn your preferences over time, discerning that you prefer concert videos of Taylor Swift over Megadeth (or a mix of both). But when you log in for the first time, they often have nothing to go on aside from basics such as your country and language preferences.
“They call this the cold-start problem. How do you make a behavioral prediction without any behavior?” says Christian Sandvig, the director of the Center for Ethics, Society, and Computing at the University of Michigan.
The apps probably default to showing new users the most popular content, Sandvig says. And judging by the likes and comments on many of the sexualized videos we reviewed, those videos are broadly popular.
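Sandvig's cold-start description implies a simple fallback: with no behavior to predict from, rank by raw popularity. As a rough sketch (the function and data below are hypothetical, not any platform's real code), that fallback might look like this:

```python
def cold_start_feed(videos: list[dict], history: list[str]) -> list[dict]:
    """With no watch history (the cold-start case), fall back to the
    most globally popular videos. If engagement bait dominates the
    popularity charts, it dominates every new user's first feed."""
    if not history:
        return sorted(videos, key=lambda v: v["likes"] + v["comments"], reverse=True)
    return videos  # with history, a real system would personalize; omitted here

catalog = [
    {"id": "cute_otters", "likes": 12_000, "comments": 800},
    {"id": "suggestive_dance", "likes": 450_000, "comments": 9_100},
    {"id": "woodworking_tips", "likes": 30_000, "comments": 1_200},
]
print([v["id"] for v in cold_start_feed(catalog, history=[])])
# ['suggestive_dance', 'woodworking_tips', 'cute_otters']
```

The design choice is rational from the platform's side: popularity is the only signal available on day one. The cost is that whatever wins the popularity contest, however lurid, becomes the default first impression.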
They’re weird but not quite pornography
We looked at examples from people struggling with their feeds and tested the services on our own. For the most part, the videos we saw did not violate the community standards of the companies showing them.
The videos typically showed young women dancing or posing suggestively in minimal clothing, though never fully nude. There was simulated sex without nudity, suggestively cropped body parts, and women posing with their age, weight and location overlaid on the screen.
For our own test, we set up accounts from scratch on each of the four services: TikTok, Instagram Reels, YouTube Shorts and Snapchat Spotlight. While it's nearly impossible to block every signal the apps use to profile you, even as a new user, we tried to start with as blank a slate as possible.
For most of the tests, we used a spare iPhone with a prepaid SIM card and turned off the setting that allows apps to request to track. We created accounts with new burner email addresses and gave minimal or no information during setup. TikTok lets you pick interests, so we chose outdoors, sports and gaming. YouTube Shorts didn't require signing in, so we viewed it in a clean web browser. We also tried not to linger on videos that included sexual content, but as many regular users have found, even a few seconds can make a difference.
On TikTok, it took until the third video to see a young woman posing suggestively. Within five minutes, the feed had devolved into more explicitly sexual content, including many young-looking women in school uniforms. In one video, a young man and woman simulated sex with their clothes on. Another featured the Pornhub logo and a pile of condoms. Many of the videos had hundreds of thousands of likes.
TikTok declined to comment on the issue but has said it will release a tool in the coming weeks to filter out videos by keyword or hashtag. It said it would follow up with other tools, such as "Content Levels," meant to keep younger viewers from seeing "mature" content.
Instagram Reels started out strong with a video of cute otters and took a bit longer to show sexual content — about five minutes. Then it played a mix of videos similar to what we saw on TikTok, starting with two young women dressed as schoolgirls and kissing.
“Our ranked feed and recommendations for new users prioritize posts and recommendations we think users are more likely to enjoy based on what’s trending or popular, or based on interactions, but we understand we may not always get it right,” said Instagram spokeswoman Christine Pai. The Meta-owned company says it is working on improving suggestions and customization tools for new users.
Snapchat — which suggests videos in its Discover tab and Explore section — likewise showed a mix of sexual and shocking content that didn’t explicitly violate the company’s rules. Snapchat spokeswoman Rachel Racusen said the company is updating its editorial guidelines to “make clear what types of content are appropriate for our diverse audience” and is focused on stricter enforcement and penalties.
There was one outlier in our tests: YouTube Shorts. It didn't show sexual content, but within the first five minutes it played videos promoting the conservative lightning rods Jordan Peterson and Andrew Tate, clips of people criticizing women for wearing short skirts, and a number of gun-related videos. All were mixed in with video game, sports and dog content.
“No two viewers’ journeys are the same. We’re always working to improve our recommendation system to help people find content they want to watch and find valuable,” said YouTube spokesperson Elena Hernandez.
Sex isn’t the only kind of shock content
Lacy Phillips joined TikTok over the summer as part of her job as a social media and digital manager for a literary agency. At first, her feed was mostly "a lot of pretty people being dull," but soon it showed her odd videos, including real 911 recordings of terrified women in violent situations, something especially triggering for Phillips. It took her about two weeks to train the app to stop showing unwanted content and stick to her interests, including K-pop and books.
When Washington Post tech columnist Geoffrey A. Fowler set up a new Instagram account for photos of his new child, he was shown suggested posts of babies with severe and uncommon health conditions.
“You always want whatever you’re recommending to clients to be brand safe,” says Phillips. “It makes me a little bit hesitant to recommend my friends get into TikTok or even recommend it to clients.”
If your TikTok, Reels, Shorts or Snapchat feeds are showing you content you’re uninterested in, there are some simple — if time-sucking — ways to try to fix it.
- Say you’re not interested: Each app has a way to say you’re not interested in or dislike a video, either by pressing and holding (TikTok), selecting More options (the three dots on Instagram and Snapchat) or hitting dislike (YouTube).
- Seek out content you do like: Look up hashtags and keywords for your core interests, follow those creators and read or engage in the comments.
- Train yourself to look away: The apps look at how long you view a video, whether you read the comments and whether you click on the creator’s profile to decide what to show you. Scroll away quickly, consistently. (A rough sketch of how these signals could add up appears after this list.)
- Give it time: To get your feeds right where you want them, you’ll need to invest time — days or weeks — and likely see videos that you don’t want along the way.
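These tips all feed the same signal loop described earlier. Here is a hedged sketch of how that training might accumulate; the topics, weights and thresholds are invented for illustration, not any app's real logic.

```python
from collections import defaultdict

def update_preferences(prefs: dict, video_topic: str, watch_seconds: float,
                       not_interested: bool) -> dict:
    """Toy model of feed training: 'not interested' taps and quick
    scrolls push a topic's weight down; lingering pushes it up."""
    if not_interested:
        prefs[video_topic] -= 2.0   # explicit negative signal, the strongest lever
    elif watch_seconds < 2.0:
        prefs[video_topic] -= 0.5   # scrolling away quickly still counts
    else:
        prefs[video_topic] += watch_seconds / 10  # lingering trains it the other way
    return prefs

prefs = defaultdict(float)
# Weeks of consistent behavior, compressed into three events:
update_preferences(prefs, "suggestive_dance", watch_seconds=1.0, not_interested=True)
update_preferences(prefs, "suggestive_dance", watch_seconds=1.5, not_interested=False)
update_preferences(prefs, "kpop", watch_seconds=25.0, not_interested=False)
print(dict(prefs))  # {'suggestive_dance': -2.5, 'kpop': 2.5}
```

In a model like this, each signal nudges the weights only slightly, which is why retraining a feed takes days or weeks of consistent behavior rather than a single tap.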