Having helped re-elect Donald Trump with a $250 million donation and generous boosting on his X platform, Elon Musk has turned his gaze to Europe, notably the UK and Germany.
He is currently convulsing British politics with an assault on Keir Starmer over historic cases of girls being groomed by gangs of Muslim men. He has called Labour minister Jess Phillips a "rape genocide apologist," and claimed that "hundreds of thousands of little girls in Britain [are]... being systematically, horrifically gang-raped".
Musk is now enthusiastically endorsing the far-right AfD in the German elections.
Alarm over his potential to single-handedly turn an election through his X platform reached fever pitch this week, crystallising around Musk's conversation on Thursday with Alice Weidel, the AfD’s joint leader, just five weeks ahead of the German election.
Musk has certainly become more radical in his political views. He stands accused of amplifying extremist, anti-immigrant and conspiracy-laden discourse through the global megaphone that he controls.
Spanish prime minister Pedro Sánchez, marking the 50th anniversary of the death of Franco, charged this week that Musk "attacks our institutions, stirs up hatred and openly calls for the support of the heirs of Nazism in Germany's upcoming elections".
Musk himself, and his supporters, counter that mainstream politicians and the media are trying to censor anti-establishment parties and a new groundswell of voters who are disenchanted with the status quo, especially on issues like immigration, and who are crying out for "common sense" policies.
Whichever side is right, the Musk phenomenon has cast a fresh light on a landmark piece of EU legislation: the Digital Services Act (DSA).
The DSA obliges very large social media and other online platforms to remove harmful and illegal content, and to mitigate the risks that content can pose to vulnerable people, to children, and to voters’ ability to make a free and informed choice in elections.
The law, and the European Commission, whose small but growing teams of experts implement it, have been caught in the crossfire between Musk and his free-speech supporters on one side, and on the other a growing band of European leaders who want the Commission to use the DSA immediately to bring Musk to heel.
This week French foreign minister Jean-Noël Barrot called on the Commission to apply the DSA "with the greatest firmness". If it did not, he said, its enforcement powers should be handed back to member states.
Indeed, there has been a suspicion - denied by officials - that Commission President Ursula von der Leyen is deliberately slow-walking an ongoing DSA investigation into Musk’s X platform so as not to alienate the incoming Trump administration.
JD Vance, the incoming US vice-president, has already suggested continued US support for NATO should be conditional on the EU dropping threats to regulate Musk.
The true picture is a complicated one.
The DSA is a huge and powerful new piece of legislation which is operating in uncharted waters.
It is taking on corporations like X, TikTok, Instagram, Facebook, YouTube and others which have deep pockets, expensive lawyers, and - in the case of Musk - the enthusiastic support of Donald Trump (Meta this week dropped its fact-checking policy in favour of a user community approach, seen by critics as the company pre-emptively pandering to the new administration).
Unlike the Commission’s antitrust approach, the DSA gives Brussels direct enforcement powers: ultimately, a private company can be fined up to 6% of its global annual turnover if found in breach.
While the EU is hoping that the DSA will lead a global push to bring free-wheeling tech companies into line, the sheer ambition of the law means that officials are extremely careful about how it is used, and about ensuring that cases stand up in court.
In fact, 53 of the DSA’s 93 articles cover "procedural safeguards" to prevent the Commission from over-reaching when investigating online platforms.
None of this has saved the DSA from a highly partisan debate.
When the former Industry Commissioner Thierry Breton warned Musk last July that his online conversation with Trump could be in breach of the DSA (if it artificially boosted the content into the timelines of users who didn’t necessarily ask to see it), Musk invited Breton to "f*ck your own face".
Ahead of the interview with Weidel, Musk posted: "First, the EU tried to stop me from having an online conversation with president @realDonaldTrump. Now they want to prevent people from hearing a conversation with Alice Weidel, who might be the next chancellor of Germany. These guys really hate democracy."
In reality - and officials have been at pains to say this all week - there is nothing in the DSA that prohibits an online interview with a political party, no matter what their views.
Ireland’s EU Commissioner Michael McGrath told RTÉ News this week: "There is never an issue or problem with the conducting of any interview. Freedom of Expression is a fundamental right in the European Union.
"But clearly, very large online platforms have enormous power, and they have the ability to amplify certain content above and beyond other content.
"When it comes to the conduct of elections, we do have to make sure that our elections are conducted in a free and fair manner without undue interference."
In other words, the Musk-Weidel conversation could unduly influence the German election if it was boosted more prominently - through so-called algorithmic ranking - in the news feeds of millions of voters, giving the AfD an advantage that other parties would be denied.
However, there is a high bar to establishing whether a social media platform like X has maliciously written hundreds of thousands of lines of code - ie, the algorithms - in order to disrupt the democratic process.
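To see in concrete terms what "algorithmic ranking" means - and why proving intent inside it is so difficult - consider a deliberately simplified sketch. This is not X’s code: the engagement weights and the "platform_boost" multiplier are invented for illustration, to show how a single weighting choice buried in ranking logic can decide which posts reach millions of feeds.

```python
# Illustrative sketch only: a toy, engagement-weighted feed-ranking function.
# The field names and the "platform_boost" multiplier are hypothetical; they
# show how one small weighting choice can change what users see.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    reposts: int
    replies: int
    platform_boost: float = 1.0  # hypothetical multiplier set by the platform

def rank_score(post: Post) -> float:
    """Score a post for a user's feed from raw engagement signals."""
    engagement = post.likes + 2 * post.reposts + 1.5 * post.replies
    return engagement * post.platform_boost

posts = [
    Post("ordinary_user", likes=900, reposts=120, replies=80),
    Post("favoured_account", likes=500, reposts=60, replies=40, platform_boost=3.0),
]

# The boosted post outranks one with more genuine engagement.
for post in sorted(posts, key=rank_score, reverse=True):
    print(post.author, round(rank_score(post)))
```

In a real platform the score is built from thousands of signals and models, which is precisely why investigators need access to internal documents and data rather than just the public output.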
The European Commission started out with 100 staff, now boosted to 150, to run the DSA. Much of the work is done by the European Centre for Algorithmic Transparency (ECAT) in Seville, and many contractors have been hired from the ranks of staff recently laid off by tech giants.
Essentially, the DSA describes categories of risk that can come into play when enormous social media companies interact with hundreds of millions of users under the cloak of anonymity.
These risks include - among others - content that is illegal (such as child abuse imagery), risks to freedom of expression, violence against women, and risks to civic discourse and free and fair elections.
The law requires a much greater degree of openness and cooperation from X, Instagram, TikTok and others than before: companies must publish annual reports on how they are moderating content and mitigating the risks, they must make their data systems and algorithms open to independent researchers, and they must respond when asked to take harmful content down.
Since many of the laws covering illegal content are national - as opposed to EU - laws, it is up to digital coordinators in member states (in Ireland’s case, Coimisiún na Meán) to demand a response from the platforms.
As the DSA beds down, however, it has settled into an uneasy web of relationships between national regulators, the European Commission and the tech giants themselves, who may be more interested in staying one step ahead of the regulators.
"Algorithms are rapidly and constantly changing," says former Dutch MEP Marietje Schaake, currently international director at Stanford’s Cyber Policy Centre and author of The Tech Coup: How to Save Democracy from Silicon Valley.
"But that should not mean companies can opt out of compliance. It does mean we need to innovate and update the oversight mechanisms to be able to meet the challenges and new realities that emerging technologies bring."
Furthermore, social media output is round the clock and relentless, with volumes of new content being blasted into the information space.
Companies may be unwilling to hand over documents, but the Commission has the power to demand material at short notice, with 24- or 72-hour deadlines.
In some instances, the tech companies are willing to respond quickly.
When TikTok launched its TikTok Lite service in Spain and France without a prior risk assessment, regulators were concerned that this new version of the app, which encouraged users to earn points by watching more videos, was potentially highly addictive and harmful to younger users.
The Commission threatened interim measures, but within six weeks TikTok voluntarily withdrew the app from its EU operation and committed never to deploy such a system in Europe again.
However, several other open investigations are an early test for the DSA.
In December, after Călin Georgescu, the far-right, pro-Moscow candidate in the Romanian presidential election, came out of nowhere to win the first round, there were immediate allegations that a Kremlin-backed campaign on TikTok was responsible.
The European Commission stepped in after declassified secret service documents purported to reveal a large number of coordinated and fake TikTok accounts posting support for Georgescu.
These had overloaded the algorithm operating the so-called "recommender system", meaning that voters’ news feeds were flooded with content they wouldn’t have chosen.
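A stripped-down sketch of such a recommender shows how raw engagement counts can be gamed. This is not TikTok’s system - the account and video names are invented - but the principle, coordinated volume translating directly into visibility, is what the declassified documents alleged.

```python
# Illustrative sketch only: a toy engagement-count recommender, showing how a
# coordinated burst of activity from fake accounts floods feeds with content
# genuine users never asked for. Account and video names are hypothetical.

from collections import Counter

# (account, video) engagement events from genuine users
events = [
    ("real_1", "cooking_tips"),
    ("real_2", "football_clip"),
    ("real_3", "cooking_tips"),
]

# 500 coordinated accounts all pushing the same political clip
events += [(f"fake_{i}", "candidate_promo") for i in range(500)]

def recommend(events, top_n=2):
    """Naive recommender: surface whatever has the most raw engagement."""
    counts = Counter(video for _account, video in events)
    return [video for video, _ in counts.most_common(top_n)]

print(recommend(events))  # ['candidate_promo', 'cooking_tips']
```

Real recommenders weigh far more signals, but any system that rewards sheer engagement is vulnerable to this kind of coordinated flooding unless inauthentic clusters are detected and discounted.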
In May, the Commission opened an investigation into whether Facebook and Instagram’s risk systems failed to protect minors from addictive behaviour and from the so-called "rabbit hole" effect, whereby children could be driven to ever more harmful suicide- or eating disorder-related content.
A further probe has been looking at whether Facebook’s advertising policies failed to protect against so-called doppelganger sites: fake web pages designed to look like genuine online newspaper pages and filled with disinformation.
In December 2023, a probe was launched into X’s blue check system on the basis that an $8 per month charge to have a blue tick would be no deterrent to a motivated individual wanting to boost disinformation to a greater number of X users (the investigation is also looking at whether independent analysts - such as those who uncovered the doppelganger phenomenon - were being shut out of X’s data systems).
Progressing these cases is proving highly labour intensive.
In July, the Commission issued a preliminary finding that X’s blue check system breached the DSA.
Under the rules, X is entitled to challenge the finding. That has meant a large team of Musk’s lawyers receiving an estimated 6,000 pages of Commission documents, including confidential insider information on the functioning of the company’s algorithms (which has obliged the Commission to protect its sources), as well as documents that might exonerate X’s methodology.
It’s understood that since September, the Commission has been sifting through hundreds of pages of rebuttals and annexes from X. The company has contested the case against it and challenged the Commission on procedural grounds.
Meanwhile, a second case is ongoing against X, looking at whether both its "community notes" system of user-driven fact-checking and its policies on hate speech and violent content are effective.
This week Commission officials braved relentless questions about Musk’s interventions in Germany and elsewhere. Why weren’t the year-long investigations into X bearing fruit?
"We can’t afford to wait until the German elections," admitted spokesperson Thomas Reignier.
"We opened an inquiry in 2023 and we are awaiting the outcome of that inquiry. We need to collect evidence. The Commission is working…24/7 on all these major online platforms, gathering evidence. We will continue that work, and we will ensure that measures are taken."
Officials insist that it is only when the back and forth with X is complete, and when they believe they have a watertight case, that the investigation will move from the technical to the political level.
Sources were unable to say if Musk’s interview with Alice Weidel on Thursday evening would be added to the investigation or whether a fresh probe would be opened.
"No decision has been made by the Commission to open any new investigation, but we are clearly monitoring developments," Ireland’s Commissioner Michael McGrath told RTÉ News.
"It is the use of the technological tools in an unfair manner that could potentially influence elections, [that] is where the commission has the power to step in where required, but that will be used only on a case by case basis and always based on the evidence that we can assemble," he said.
In the event, Musk and Weidel livestreamed their conversation on X, discussing everything from nuclear energy to whether Musk believed in God to the prospect of human colonisers on Mars returning to Earth - at some future date - to save humanity from annihilation.
There were many startling and unchallenged assertions. Musk said it was "literally" legal to steal in California if the stolen item was worth less than $1,000; Weidel said German students were taught "nothing" at school or university except gender studies, and that Hitler was a "communist".
Whether the event is in breach of the DSA remains to be seen, and officials admit that Musk - with 211 million followers and a tireless posting habit - is not your average X user.
On Friday, Commission spokesperson Thomas Regnier strongly denied a claim by Weidel that "150 bureaucrats" were watching the conversation; he said what was being monitored was not the content, but the systems put in place to broadcast the event (he said "two or three" officials were doing this).
Either way, it is unlikely that there will be any conclusion to the investigations into X before the German elections on 23 February (the AfD is currently in second place, polling around 20%), but some observers take a more fearful view of Musk’s powerful influence on electoral politics and the information space.
"We've gone from these big tech operators having been accountable to no one to Musk himself being a super-spreader of innuendo, of conspiracy theories, or of outright lies," says Georg Riekeles, associate director at the European Policy Centre.
"What we are seeing is the algorithmification of our public space. If these algorithms are not put together properly, they will render the capacity to organise liberal democratic societies virtually impossible, if everything that is promoted tends to be the most shocking, the most extreme views.
"With the DSA, the EU has given itself the powers to enforce transparency and ban algorithms that amplify lies. Now the EU must be ready to sanction."
The Commission insists that the DSA has established a framework for sincere and proactive cooperation between regulators, at EU and national level, and the tech companies.
During a period last year of real fear about deepfakes, officials followed elections in Slovakia, Poland, Finland and the Netherlands, as well as the European Parliament elections in June, concluding that problems were kept to a minimum thanks to the guidelines issued and a collaborative approach with tech companies.
According to Coimisiún na Meán, this also applied to the Irish election in November, with the agency regularly engaging with tech companies to ensure election integrity was upheld.
"Throughout the election period we scrutinised platforms’ compliance with the DSA," says a spokesperson, "including the obligation of VLOPs [very large operating platforms] to assess and mitigate systemic risks of negative effects on civic discourse and electoral processes arising from how their services were used during that time".
The fact remains that it could take a long time before breaches of the law are established and fines imposed.
In the case of X, Elon Musk is unlikely to be deterred by the threat of financial sanctions, and the potential for retaliation by the Trump administration cannot be discounted.

"The next US administration can and will bring a lot of trouble for the EU on numerous fronts, and the DSA fines won't bankrupt Musk," says Marietje Schaake.
"It is vital that the EU starts doubling down on European alternatives and public digital infrastructure to limit the dependencies. Sanctions or fines are always reactive. This moment in time reminds us that the EU needs to be pro-active."