Meta under investigation in the EU for DSA violations against minors
posted Sunday May 19, 2024 by Scott Ertz
This week, the European Commission, the executive arm of the European Union, initiated an investigation into Meta, the parent company of Facebook, Instagram, and Threads. The Commission believes the company and its social networks have not done enough to prevent harm to minors or to verify that minors cannot access parts of the company's offerings from which they are barred under the Digital Services Act (DSA).
The DSA and minors
The DSA represents a significant shift in the digital landscape, particularly for minors. One of the key aspects of the DSA is its focus on the safety and rights of children online. It includes measures such as the swift removal of illegal content, which encompasses child sexual abuse material, illegal hate speech, terrorist content, and illegal products. The act aims to create a safer digital space in which the fundamental rights of all users, especially minors, are respected and protected.
Another important provision of the DSA is the ban on targeted advertising aimed at children. This means that platforms like Snapchat, Google, YouTube, Instagram, and Facebook are no longer allowed to show targeted ads to underage users. This move is expected to significantly reduce the exposure of minors to potentially harmful or inappropriate content, as well as limit the commercial exploitation of their online behavior.
Furthermore, the DSA sets out clear due diligence obligations for online platforms, ensuring that all users, including minors, can flag illegal content and have a clear means of contesting platforms' content moderation decisions. This empowers young users to have more control over their online experience and ensures that their voices are heard in the digital environment.
Most importantly for this case, the DSA also protects children from harm created by online platforms themselves. This includes barring children under a certain age from accessing certain online platforms and requiring those platforms to employ sufficient checks to enforce those age limits.
The EU's case against Meta
The European Commission believes that Meta has potentially violated several aspects of the DSA. In particular, the probe will investigate whether Meta has done enough to ensure that minors cannot sign up for accounts. Of course, there are ways around any technological check, but platforms are required to at least make a genuine attempt to enforce these rules.
This is a major difference between the DSA in the EU and COPPA in the US. Under COPPA, the responsibility falls mostly on the user to self-regulate, whereas the DSA requires the platforms themselves to enforce the rules. Interestingly, the EU does not provide clear guidelines, technology to implement, or even a certification process, leaving compliance largely a guessing game.
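To make that ambiguity concrete, here is a minimal sketch, in Python, of the kind of self-declared age gate a platform might implement. The minimum age, the function name, and the date-of-birth input are all hypothetical assumptions for illustration; this is not a description of Meta's actual checks or of anything the DSA itself prescribes.

```python
from datetime import date
from typing import Optional

# Hypothetical minimum age for account creation; the DSA does not
# prescribe a specific threshold or verification technology.
MINIMUM_AGE = 13

def is_old_enough(date_of_birth: date, today: Optional[date] = None) -> bool:
    """Return True if a self-declared birth date meets the assumed minimum age.

    A self-declared check like this is the weakest form of age gating;
    the question in the EU probe is whether checks of this kind are
    sufficient, not merely whether they exist.
    """
    today = today or date.today()
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return age >= MINIMUM_AGE

# A sign-up attempt with a self-declared birth date, evaluated on a fixed date
print(is_old_enough(date(2015, 6, 1), today=date(2024, 5, 19)))  # False
```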
In addition to the age restrictions, the probe will look into how Meta's recommendation algorithms work and whether they purposefully create "rabbit holes" into content that can be harmful to users, especially young users. Several studies have shown that social media, and in particular brands owned by Meta, can have significant negative effects on children, so this probe is likely to return some interesting results.
If the company is found to be in violation of the DSA, the penalty can be steep. For a first violation, the EU can levy a fine of up to 6 percent of a company's global annual turnover, which for Meta would be a major penalty. If the company does not fix the problems outlined in the complaint, repeated violations can ultimately lead to an outright ban within the jurisdiction.
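For a rough sense of scale, the 6 percent ceiling is simple arithmetic. The sketch below uses an assumed turnover figure purely for illustration and is not based on Meta's reported financials.

```python
# Illustrative only: the turnover figure passed in below is an assumption
# for the example, not a reported number.
FINE_CAP_RATE = 0.06  # DSA cap for a first violation: 6% of global annual turnover

def max_dsa_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound of a first-violation DSA fine under the 6 percent cap."""
    return FINE_CAP_RATE * global_annual_turnover_eur

# Example with a hypothetical global annual turnover of 100 billion euros
print(f"{max_dsa_fine(100e9):,.0f} EUR")  # 6,000,000,000 EUR ceiling
```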