This story originally appeared in Jacobin on Oct. 23, 2024. It is shared here with permission.
This month, when Emmanuel Macron’s newly chosen prime minister, Michel Barnier, laid out his first government agenda to the National Assembly, much attention was naturally focused on the budget and immigration. But a seemingly throwaway line pointed to another aspect of security and policing. “We will generalize the methods experimented with during the Olympic and Paralympic Games,” Barnier promised.
I have previously written for Jacobin about the controversial algorithmic video surveillance that France rolled out in advance of the Olympics — a test that was supposed to last through March 2025 and concern only large-scale public events like sporting matches and concerts. Experts in surveillance and human rights told me about the debilitating effects that such mass surveillance can have on dissent and peaceful protest — creating a dissuasive “chilling effect.”
“You get people used to it in that happy face, ‘celebration capitalism’ environment of the Olympics, and then that new technology that was injected during the Games in that state of exception becomes the norm for policing moving forward,” Jules Boykoff, a political scientist who has published multiple books about the Olympics, warned at the time.
As if on cue, as soon as the Olympics ended, a steady drumbeat of Macronist politicians began manufacturing consent around the need to keep this technology, which has not yet been independently studied or analyzed.
In September, less than two weeks after the Paralympic closing ceremony, Paris police chief Laurent Nuñez declared himself “in favor” of extending the technology. The news channel France Info, citing a government source, reported that Barnier’s interior minister also envisioned making the technology permanent. (Currently, a bill floated by a member of his right-wing Républicains party proposes a three-year extension.) Barnier’s speech, while not specifically mentioning the algorithmic video surveillance tool, pushed in this same direction.
“The government is going to do everything in its power to entrench [algorithmic video surveillance technology],” Élisa Martin, a member of the left-wing France Insoumise, told me over the phone. “We’re absolutely certain of this.”
“A Security Showcase for the World”
On July 26, as hundreds of boats carrying Olympians made their way down the river Seine during a rain-soaked opening ceremony watched by twenty-five million viewers worldwide, another show was taking place underground. On the French capital’s metro platforms, nearly five hundred state-of-the-art surveillance cameras were capturing and analyzing human behaviors in real time, assisted by CityVision, an artificial intelligence tool produced by a French start-up and rolled out across the metro system.
Above ground, forty-five thousand national police officers and an additional twenty thousand military personnel patrolled the city. An estimated fifty-three drones were shot down by military anti-drone units in the first several days of the Games.
“The Olympics, and especially the opening ceremony on the Seine, were sold as ‘a security showcase for the world and a moment of experimentation,’” Noémie Levain, a legal expert at La Quadrature du Net, a digital rights NGO, told me.
After the National Assembly passed an omnibus Olympics bill on May 19, 2023, which included, among other things, the legalization of AI-assisted mass surveillance tools, French tech start-ups submitted bids for Olympics contracts — with several then selected in January 2024. One has been likened to a French version of Palantir — the surveillance company cofounded by Peter Thiel, best known for its discriminatory policing tool used in cities like Los Angeles, which is set to host the next Summer Olympics in 2028.
“The bread and butter of these companies is the analysis of human bodies,” Levain told me. “The idea is to analyze them, classify them, and come up with data points.”
The tool, as currently intended, is supposed to catch “predetermined events,” such as a terrorist attack or an “unusual crowd movement.” But researchers worry that the growing use of predictive policing algorithms, which take racially biased statistics as their initial input, creates a pipeline for additional surveillance of vulnerable communities. “Increasing evidence suggests that human prejudices have been baked into these tools because the machine-learning models are trained on biased police data,” Will Douglas Heaven wrote in MIT Technology Review.
In the French case, little is known about how the technologies actually operate and at what point they’re deployed by police, Yoann Nabat, a jurist and lecturer at the University of Bordeaux, said. “It’s a black box,” he told me.
Algorithmic video surveillance “is only supposed to be a decision aid,” Nabat added. “It is supposed to alert the person behind the screens to say, ‘Be careful, you have to look in this place, at this time.’ Except that there’s a thin line between automation and human intervention. We know that, given the lack of existing resources, decision support often turns into the decision itself.”
From Experiment to Fait Accompli
Much has been written about the shock doctrine — the way vulture-like private companies swoop in amid the chaos that often follows a natural disaster to take over public services. According to Boykoff, a similar process takes place before, during, and after mega-events like the Olympics.
Katia Roux, the head of advocacy at Amnesty International France, described a similar phenomenon in France. “We haven’t had the full assessment yet,” she told me of the Olympics surveillance tools. Yet, “there’s a clear political desire to legalize this technology, and the Olympics were just a way to get a foot in the door.”
Elia Verdon, a member of France’s Observatory on Surveillance and Democracy, agreed. “I think we have to be careful with periods of experimentation,” she warned. “We end up accepting a technology at a given time in response to a possible threat, and then, with the next threat, they go even further.”
Since pronouncing themselves in favor of the new technology, Barnier and Nuñez have walked back their statements about the extension of the measures, saying that they are still waiting for the results of a government report that must be presented before parliament by the end of the year. But it seems that the French public may have already been swayed by government rhetoric — with a recent poll showing that 65 percent of French people supported augmented video surveillance in public spaces.
“If it’s deemed successful, they’ll extend it,” Levain, from La Quadrature du Net, said. “If it’s not, they’ll say we need more experiments with it. There are so many actors involved, so much money, so much lobbying, they’re not just going to say, ‘Alright, let’s just stop.’”
With an increasingly hard-line government using immigration as a wedge to pass restrictive policies, it’s hard to imagine that the army of private companies in what Macron calls his “start-up nation” won’t step up with more offers.