
Why Instagram suggests sexualized content

Families feed the algorithm

Exposed photos and videos of children are increasingly appearing on Instagram - often posted by the parents themselves.


Instagram is, in principle, a harmless photo platform. Its algorithm, however, delivers a toxic mix and apparently surfaces sexualized content of children - who should not be on the platform at all.

In spring 2010, two Stanford graduates, Kevin Systrom and Mike Krieger, were working on a photo service in San Francisco. A few months later, the Instagram app launched. The rush was huge: thousands of people downloaded it within hours, and the computer systems crashed again and again. In 2012, Facebook bought the company for one billion US dollars.

Today, two billion people worldwide are registered on Instagram, making the photo app the fourth-largest social network after Facebook, YouTube and WhatsApp. It is no longer just a photo community: video uploads have been possible since 2013, and in 2020 came Reels, short clips with effects like on TikTok.

Since 2016, Instagram has no longer displayed posts chronologically but has sorted them with an algorithm. This is now becoming the platform's downfall, because the algorithm washes sexualized clips of children into users' timelines. Journalists at the "Wall Street Journal" (WSJ) discovered this using test accounts that followed young athletes, teenagers and influencers - as well as users who follow these young girls. The result was a toxic mix of sexualized content involving minors and adult pornography.

Companies stop advertising on Instagram

For Jürgen Pfeffer, Professor of Child Protection on Social Media Platforms at the Technical University of Munich, this is hardly surprising. After all, this is how the network works: we see what Instagram thinks we want to see. "As long as you're not interested in these topics, it's quite possible to spend years on social media without seeing problematic content," says Pfeffer in the ntv podcast "Wieder was gelernt".

Between July and September alone, Facebook removed 16.9 million pieces of content relating to the sexual exploitation of children - twice as many as in the previous quarter. On X, ten to twenty percent of the content revolves around pornography, according to the expert. The former Twitter is the only major platform where this is not banned.

According to the WSJ's research, advertisements from major brands such as Disney, Walmart, Tinder and the Wall Street Journal itself run between the lewd videos. The brands are anything but enthusiastic about this, and some, such as Tinder's parent company Match Group, have since stopped advertising in Instagram Reels as a result.

"Strict guidelines for nude images"

Parent company Meta seems to have been aware of the problem for some time. However, a Meta spokeswoman answered ntv.de's interview request only in writing: "We do not allow content on Instagram that is related to the abuse and endangerment of children and have strict guidelines regarding nude depictions of children," she writes in an email. "We remove content that explicitly sexualizes children. More subtle forms of sexualization, such as sharing images of children in conjunction with inappropriate comments, are also deleted."

According to Instagram, it uses photo matching to filter out images and content that endanger children - photos of naked children, for example. The system automatically scans all uploaded images and deletes them if necessary. Nevertheless, a lot of content featuring children that should not be there under youth-protection rules or the platforms' own guidelines remains on Instagram and other social platforms.
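
Photo matching of this kind is commonly implemented with perceptual hashing: each upload is hashed and compared against a database of hashes of known abusive images, similar to Microsoft's PhotoDNA. Below is a minimal sketch, assuming the open-source Python libraries Pillow and imagehash; the hash value, threshold and function names are illustrative assumptions, not Instagram's actual system.

```python
# Minimal sketch of hash-based photo matching (illustrative, not
# Instagram's real pipeline). Uses the open-source imagehash library.
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of known abusive images.
KNOWN_HASHES = {imagehash.hex_to_hash("ffd8e0c0b0a09080")}

def is_known_image(path, max_distance=5):
    """Return True if the image's perceptual hash is close to a known hash."""
    h = imagehash.phash(Image.open(path))
    # Perceptual hashes tolerate small edits (resizing, re-encoding,
    # cropping), so we compare by Hamming distance, not exact equality.
    return any(h - known <= max_distance for known in KNOWN_HASHES)
```

The key property is robustness: unlike a cryptographic hash, a perceptual hash changes only slightly when the image is slightly altered, so re-uploads of known material can still be caught - but the approach only works for images that are already in the database.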

One of the reasons: automated systems find it harder to analyze video content than text or still images, writes the WSJ. Reels are even more difficult, as they show content from outside the user's own circle of friends - from sources the user does not follow. Meta employees had raised security concerns even before Reels was introduced in 2020.

Algorithm needs to be trained

Instagram filters out problematic content using AI and machine learning, explains Angelina Voggenreiter, research assistant at the Technical University of Munich, in the podcast. "To do this, however, the algorithm first needs a test set with sufficient data to show which images should not be there." This is particularly difficult with child pornography, because such data is not necessarily available in that quantity, especially at the beginning.
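
To illustrate Voggenreiter's point: a classifier can only learn "what should not be there" from labeled examples, and with very few positive examples it effectively learns nothing about the forbidden class. A toy sketch, assuming scikit-learn; all data here is synthetic placeholder, not real moderation data.

```python
# Toy sketch: a moderation classifier needs enough labeled examples
# of forbidden content to learn from. Everything here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 128))             # stand-in image embeddings
y = (rng.random(10_000) < 0.001).astype(int)   # ~10 "forbidden" examples

clf = LogisticRegression(max_iter=1000).fit(X, y)

# With so few positives, the model barely learns the forbidden class
# and tends to predict "allowed" for almost everything.
print("flagged:", int(clf.predict(X).sum()))
```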

The systems are good at recognizing nudity, "but when it comes to children who are half-dressed, it is very, very difficult to filter them out automatically," explains Voggenreiter. The platforms therefore depend largely on other users reporting such content, but they are overwhelmed by the sheer number of reports and cannot delete everything immediately.

Beyond the fact that automated systems have more difficulty analyzing video than text or still images, video is also the content social media platforms can monetize best. Users are meant to spend as much time on the apps as possible, and experience shows that they stay on screen longer with moving images than with text or photos. More screen time means more advertising can be shown, which generates more money.

Employees are not allowed to change the algorithm

Instagram has also optimized its algorithm to suggest videos that match a user's interests or previously clicked content. Anyone who often looks at young girls or women in bikinis is therefore shown similar videos - an endless stream that only stops when the user actively searches for other content or the recommendation algorithms are "significantly" changed. But Meta does not want that, current and former Instagram employees told the WSJ: the company fears losing users, according to the report.
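
The WSJ's test-account findings follow directly from this kind of similarity-based ranking. A deliberately simplified sketch of the feedback loop is below; the function names and dot-product scoring are illustrative assumptions - Meta's real ranking system is far more complex and not public.

```python
# Highly simplified sketch of engagement-driven recommendation:
# rank candidate clips by similarity to what the user already watched.
import numpy as np

def recommend(user_history, candidates, k=10):
    """user_history and candidates: lists of (clip_id, embedding) pairs."""
    # Build a taste profile as the average of watched-clip embeddings.
    profile = np.mean([emb for _, emb in user_history], axis=0)
    # Score each candidate by its affinity to the profile.
    scored = sorted(
        candidates,
        key=lambda c: float(np.dot(profile, c[1])),
        reverse=True,
    )
    # Feedback loop: whatever the user watches next is folded back into
    # the profile, narrowing future suggestions ever further - the
    # "endless stream" the article describes.
    return [clip_id for clip_id, _ in scored[:k]]
```

The design choice that matters here is the objective: the ranking optimizes for predicted engagement, not for what the content depicts, so without a separate safety filter the loop happily reinforces whatever the user lingers on.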

Technically, this is not a problem, says social media professor Jürgen Pfeffer. "Have you ever tried to watch the goals from your favorite Bundesliga club's last match on YouTube? You probably won't find them, because media companies have strong copyright laws behind them. Hiding content that people are not supposed to see works very well." Better regulation at the European level could be a solution.

User age is not checked

Another weakness of Instagram is the age of its users. Registration is actually only allowed from the age of 13, but in reality there are probably millions of younger children on the platform. Although children are asked to state their age correctly when registering, the information is not checked, and once registered they can move freely on the platform. Only when they upload content themselves do they have to verify their age, says expert Voggenreiter - and even that is difficult and rarely enforced.

Meta is said to have known about this problem for years but ignored it. Dozens of US states have therefore sued the company. According to the complaint, Instagram deliberately lured children to the platform and collected personal data such as their locations and email addresses - without the parental consent that is actually mandatory.

This comes as no surprise to Pfeffer. After all, user numbers are the most important "commodity" for companies like Meta. "A key metric for the stock market price of platforms is how many new users they have per month," the social media expert says in the podcast. Vetting the hundreds of thousands of new Instagram accounts created every day would be expensive and time-consuming, and any delay in the registration process could hurt the company's success.

Despite all the justified criticism of Meta, the experts see another, much bigger problem on Instagram: family influencers. These are parents who show off their children on their accounts, sometimes half-naked in diapers or bikinis. They feed the algorithm with these images and videos and supply pedosexual users with new "material" for their fantasies.


Source: www.ntv.de
