The European Commission has launched an investigation into TikTok, YouTube, and Snapchat, aiming to uncover how their video recommendation algorithms operate.
The inquiry, announced on 2 October, focuses on the platforms’ responsibility for distributing harmful content, particularly to vulnerable users. This investigation comes under the framework of the EU’s Digital Services Act (DSA), which imposes strict transparency requirements on major online platforms.
The Focus of the Investigation
The investigation stems from concerns about the role of video algorithms in spreading harmful and misleading content. The EU is especially concerned about fake news related to elections, as well as content that glorifies eating disorders, depression, and drug abuse.
The probe will also examine autoplay and infinite scrolling, design features that can encourage users to passively consume large amounts of content. Regulators want to know whether adequate safeguards are in place to prevent these features from amplifying harmful or dangerous material.
This is a crucial aspect, as these platforms host millions of users, including vulnerable individuals who may be more susceptible to negative influence.
Obligations Under the Digital Services Act
Under the DSA, online platforms with more than 45 million monthly active users in the EU must comply with stricter transparency rules. This means they are required to explain how their algorithms work and demonstrate what actions they take to reduce users' exposure to harmful content.
A senior EU official emphasized that the inquiry should be seen as a “wake-up call” for platforms to improve their practices, including providing users with the option to hide certain types of videos.
Failure to cooperate with the investigation could result in hefty fines under EU law. This would apply to platforms that refuse to furnish documents, provide false information, or fail to meet the DSA’s requirements.
First Step in a Larger Process
While the current investigation is a significant step, it is just the beginning. The European Commission will scrutinize the platforms' responses and decide whether to initiate formal proceedings.
This initial phase is critical for the Commission to assess the depth of compliance and transparency shown by the platforms in question.
The investigation is not limited to TikTok, YouTube, and Snapchat. Other platforms like Meta, which owns Facebook and Instagram, are being examined separately.
The Commission initiated an investigation into Meta’s practices in May to ascertain if the platform’s interface intentionally exploits children’s inexperience and fosters addictive behavior.
Ongoing Scrutiny and More to Come
This probe is part of a broader initiative to enforce the DSA. The EU already has pending investigations into platforms like TikTok, X (formerly Twitter), and Chinese e-commerce giant AliExpress.
A top EU Commission official recently indicated that more investigations regarding DSA non-compliance are expected to follow.
The European Commission’s investigation into TikTok, YouTube, and Snapchat marks an important moment in the regulation of social media platforms.
As these platforms continue to shape the way millions of people consume content, the EU aims to hold them accountable for ensuring their algorithms do not spread harmful or misleading information. The coming months will reveal how these platforms respond and whether further action will be taken.
Information collected from Euronews and MSN.