Tensions run high as EU lawmakers are faced with American oligarchs refusing to abide by social media regulations. Aoife Kilbane-McGowan discusses the past, present and future safety of EU citizens on the internet.
Two decades after Bebo and Facebook revolutionised Irish society, it is undeniable that our public sphere is now completely integrated with social media platforms. According to a report from DataReportal, Ireland had the third highest percentage of internet users in the world in 2023, with a 99% internet adoption rate.
The first comprehensive framework governing social media platforms was not introduced until the Digital Services Act (DSA) was passed by the European Parliament in 2022. Prior to the DSA, social media functioned as a Wild West in which private companies could collect user data, advertise, and experiment with algorithms largely unchecked. The lack of regulation had major social impacts, including widespread smart device addiction, mass data leaks, and the dissemination of fringe conspiracy beliefs.
The legislation has set a global benchmark for online platform regulation, but now that the implementation deadline has passed, it has caused significant tension between the EU Commission and the American tech conglomerates that dominate the internet.
Meta CEO Mark Zuckerberg has repeatedly blamed the DSA for delays in rolling out new services to European users, particularly AI integration and Instagram Threads. Meanwhile, political groups in the European Parliament, such as the Socialists and Democrats and The Left, have lambasted the company for violating privacy laws and child protections, and for undermining the integrity of democratic elections.
The DSA sets out four major responsibilities for Very Large Online Platforms (VLOPs), defined as online platforms with more than 45 million users in the EU: making it easier to report and remove illegal content, protecting children and user data, ensuring consistent and transparent moderation, and preventing election interference.
While many of these measures had previously been adopted voluntarily by VLOPs, the DSA has formalised them and created transparency around how and why platforms remove and censor content. The DSA Transparency Database requires all VLOPs and Very Large Online Search Engines (VLOSEs) to submit reports on every instance of account and content removal, along with the reasoning behind it, and to provide a fair out-of-court appeals process.
The database now contains more than 9 billion such reports. Removal reasoning is classified into broad categories such as 'scope of platform service', 'illegal or harmful speech', and 'negative effects on civic discourse or elections', which make up the vast majority of removals on Instagram, Facebook, TikTok and X (formerly Twitter).
The deadline for compliance with the DSA for all online platforms was February 2024, and in the year since, the EU Commission has already been forced to open inquiries into Meta platforms and X. Since Elon Musk's takeover of X in 2022, the company has been accused of manipulating its internal algorithms to promote far-right content in the For You section of the app's home page, as well as through the reply function beneath posts.
The reply function prioritises accounts verified through a paid subscription, a feature significantly more popular among right-wing users, over organic engagement, further boosting far-right content.
In addition, Musk himself is facing allegations of election interference over his endorsement of Germany's hard-right Alternative for Germany (AfD) party ahead of the parliamentary elections on 23 February. Meta is also likely to face a second investigation under the DSA should it attempt to roll out its new 'Community Notes' programme, launched in North America, which replaces in-house fact-checking professionals by outsourcing the responsibility to users themselves.
If Meta or X are found to have violated the DSA, they could face fines of up to 6% of their global annual turnover, equivalent to hundreds of millions, if not billions, of euro. Such fines could be existential to the operation of American tech companies in the EU. At the Artificial Intelligence Summit held in Paris earlier this month, US Vice President JD Vance took direct aim at the DSA, suggesting that EU-US relations could suffer if the regulations remain in place and fines are applied.
What does this mean for the future of social media in the EU? That remains unclear, but the uncertainty opens up a massive opportunity for policy development and innovation by MEPs. One possibility is that X and the Meta platforms exit the European market rather than comply with DSA regulations and pay their fines. Drastic as it may seem, such a shift could also usher in a more diversified social media landscape.
Already, smaller existing platforms that are willing to operate under the DSA, such as Bluesky and Substack, are growing. More extreme reactions have proclaimed that such an exit would be tantamount to a Digital Iron Curtain, cutting EU citizens off from the most popular social networking platforms in the world.
An exit remains highly unlikely for now, but a real divergence between European and North American users is already underway. In the meantime, Meta continues to provide Europe with a fact-checking team, and despite Elon Musk's best efforts, X must still take down illegal content when it is reported in the EU, which across most of the continent includes neo-Nazi and far-right rhetoric and imagery.
Now, the real question is how far the new administration in the US is willing to go diplomatically on behalf of American VLOPs, and how much pressure the EU Commission can withstand in order to protect our rights to privacy, protection from hate speech, and access to accurate and unbiased information online.