Social media platform Instagram continues to recommend adult-oriented content to underage users, a report by the Wall Street Journal has found.

The report has exposed cracks in Meta Platforms’ earlier claims to provide a secure and age-appropriate digital environment for teenagers. 

The publication partnered with an academic researcher on a seven-month test to investigate the issue, which involved creating new Instagram accounts registered as 13-year-old users.

In its investigation, the WSJ found that within twenty minutes of the teen accounts watching Reels, their feeds were flooded with promotions from creators offering to send explicit photos to users who engaged with their posts.

Meta’s past promise falls through 

In January, the Mark Zuckerberg-led social media giant announced that it would implement new content guidelines to give teenagers on the platform a secure and age-appropriate digital environment, as advised by experts.

The policy aimed to remove any material deemed unsuitable for teenagers, whether it appeared in Reels or on the ‘Explore’ page.

The company had said:

We’re automatically placing teens into the most restrictive content control setting on Instagram and Facebook. We already apply this setting for new teens when they join Instagram and Facebook, and are now expanding it to teens who are already using these apps.

Past investigations and accusations 

Recent news reports have also placed Instagram in the dock.

A WSJ investigative report published earlier this month claimed that the platform helped connect and promote a vast network of accounts openly devoted to commissioning and purchasing underage sex content.

In April, Meta’s new encryption technology for direct messages on Instagram and Facebook was slammed by the Virtual Global Taskforce, an alliance of 15 law enforcement agencies.

The VGT had said the announced implementation of the encryption was an example of a “purposeful design choice” that degraded safety systems and weakened the ability to keep child users safe.

The VGT’s statement stemmed from concerns that the feature, while aimed at enhancing privacy, could shield online child predators and paedophiles.

Meta had defended the feature.

Platform already under the scanner in Europe

Last month, the EU opened fresh investigations into Facebook and Instagram over concerns that the platforms were failing to protect child users.

The bloc had said that the platforms’ recommendation engines could “exploit the weaknesses and inexperience” of children and stimulate “addictive behavior”.

It said the engines could also reinforce the so-called “rabbit hole” effect that leads users to watch increasingly disturbing content.

As part of the probe, the commission will look into Meta’s use of age verification tools to prevent children under the age of 13 from accessing Facebook and Instagram. 

It will also determine whether the company is complying with the bloc’s Digital Services Act (DSA) and enforcing a high level of privacy, safety and security for minors.

More trouble for Meta?

The latest WSJ report could thus bolster the EU’s case, potentially spelling trouble for both platforms.

If the EU investigation finds violations, Meta could face fines of up to 6% of its annual worldwide revenue.
