

Paris is preparing for an AI-monitored Olympic Games



Hundreds of thousands of athletes and visitors attending this year’s Olympics could have their movements analyzed by a real-time AI video surveillance tool.

When this year’s Summer Olympics open in Paris next week, nearly 100 boats carrying the world’s best athletes are expected to make their way down the River Seine. Around half a million fans will cheer as their countries’ sporting ambassadors pass the Louvre, the Eiffel Tower, and a guidebook’s worth of other historic monuments. But fans won’t be the only ones watching. Thousands of CCTV cameras overlooking the river will monitor events in real time. Behind the scenes, powerful new artificial intelligence models will scan the footage for signs of danger hidden in the hustle and bustle. The controversial new AI-based surveillance system, which critics claim could breach the European Union’s broader privacy laws, is one of the many ways France is using technology to make this year’s Olympic Games one of the most tightly controlled ever.

AI surveillance will look for public disruptions

French lawmakers passed a new law at the end of last year that temporarily gives law enforcement the ability to use “experimental” artificial intelligence algorithms to monitor public video feeds and provide “real-time crowd analytics.” In practice, the AI detection models will reportedly scour the feeds of thousands of CCTV cameras, looking for signs of potentially dangerous anomalies hidden in the Olympic crowds. Those warning signs can include people wielding weapons, larger-than-expected crowds, fights and brawls, and unattended luggage.
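The vendors’ actual models are not public, but the description above maps onto a fairly standard event-detection loop: read frames from a camera feed, run a detector over each frame, and surface only the flagged categories to operators. Below is a minimal, hypothetical Python sketch of that loop; the `detect` callable, the category names, and the alert threshold are all assumptions for illustration, not details of the real system.

```python
# Minimal, hypothetical sketch of a real-time crowd-analytics loop like the one
# described above. The detector, event categories, and threshold are assumptions
# for illustration only, not the actual system used by French authorities.
from dataclasses import dataclass
from typing import Any, Callable, Iterable

import cv2  # OpenCV, used here only to read frames from a video source

# Assumed event categories, loosely based on the warning signs listed above.
EVENT_CATEGORIES = {
    "weapon_visible",
    "abandoned_luggage",
    "crowd_surge",
    "fight_or_brawl",
}


@dataclass
class Detection:
    category: str      # one of EVENT_CATEGORIES
    confidence: float  # detector score in [0, 1]
    frame_index: int   # frame in which the event was observed


def analyze_stream(
    source: str,
    detect: Callable[[Any, int], Iterable[Detection]],
    alert_threshold: float = 0.8,
) -> None:
    """Read a CCTV feed frame by frame and surface high-confidence detections.

    `detect` stands in for a vendor model (hypothetical); it receives a frame
    and its index and returns zero or more Detection objects.
    """
    capture = cv2.VideoCapture(source)
    frame_index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        for det in detect(frame, frame_index):
            if det.category in EVENT_CATEGORIES and det.confidence >= alert_threshold:
                # The alert carries only the event type, score, and frame
                # reference; no identity lookup happens at this stage.
                print(f"ALERT frame={det.frame_index} {det.category} ({det.confidence:.2f})")
        frame_index += 1
    capture.release()
```

A real deployment would fan a loop like this out across thousands of feeds and push alerts to an operations center rather than printing them, but the basic flow of frames in, flagged situations out is the same.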

A police officer stands in front of a giant screen showing videos taken by surveillance cameras on the streets of Levallois-Perret, outside Paris on January 10, 2012 at the Levallois police station. Credit: LIONEL BONAVENTURE/AFP via Getty Images

France is working with a number of technology companies on the AI analytics, including Wintics, Videtics, Orange Business, and ChapsVision. Law enforcement agencies have already tested the new system at several subway stations, at the Cannes Film Festival, and at a packed Depeche Mode concert. Paris Police Chief Laurent Nunez recently told Reuters the concert trial went “relatively well” and that “all lights are green” for use of the system during the Olympic Games.

If the AI model detects a potential threat, it will flag it to a human law enforcement officer, who will then decide whether or not to take further action. French officials claim the real-time analytics will take place without ever using facial recognition or collecting other unique biometric identifiers. Instead, law enforcement and their private partners say the model will only measure “behavioral patterns” such as movement and positioning. Officials claim the AI cannot identify individuals based on their biometric identity.
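How that handoff to a human officer works has not been disclosed, but the design described, in which the model can only raise situation-level alerts stripped of identifying data and a person decides what happens next, could look something like the hedged Python sketch below. The `Alert` fields, the queue, and the officer callback are hypothetical.

```python
# Hedged sketch of the human-in-the-loop step described above. The model can
# only enqueue anonymous, situation-level alerts; a human officer decides what,
# if anything, to do with each one. All names here are hypothetical.
from dataclasses import dataclass
from queue import Queue
from typing import Callable


@dataclass
class Alert:
    camera_id: str     # which feed raised the alert
    category: str      # e.g. "abandoned_luggage" or "crowd_surge"
    confidence: float  # model score
    # Deliberately absent: face crops, gait templates, or any identifier that
    # could single out "Mr. X" in the crowd. Only the situation is described.


class ReviewDesk:
    """Queue of pending alerts that only a human reviewer can act on."""

    def __init__(self) -> None:
        self._pending: Queue = Queue()

    def submit(self, alert: Alert) -> None:
        # The model's only privilege is adding to the queue; it cannot trigger
        # any enforcement action on its own.
        self._pending.put(alert)

    def review_next(self, officer_decision: Callable[[Alert], None]) -> None:
        # `officer_decision` represents the human officer's judgment call,
        # e.g. dispatching a patrol or dismissing a false positive.
        if not self._pending.empty():
            officer_decision(self._pending.get())


# Example usage (hypothetical camera ID and event):
desk = ReviewDesk()
desk.submit(Alert(camera_id="seine-cam-042", category="crowd_surge", confidence=0.91))
desk.review_next(lambda alert: print(f"Reviewing {alert.category} on {alert.camera_id}"))
```

The key design claim from officials is captured in what the `Alert` object omits: the system describes situations, not people.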

“It’s not about recognizing ‘Mr. X’ in a crowd,” French Interior Minister Gérald Darmanin said at a meeting with French lawmakers earlier this year. “It’s about recognizing situations.”

Olympics will put France’s new ‘experimental’ AI video surveillance to the test

But some critics question whether it is technically possible to perform this kind of AI video analysis without inadvertently collecting and comparing biometric identifiers. If it is not, that could put France at odds with Europe’s General Data Protection Regulation (GDPR) and the recently passed EU AI Act. A coalition of 38 European civil society organizations argued in an open letter earlier this year that the model’s reported monitoring of gait, body positions, and gestures could still qualify as biometric markers used to identify individuals or groups. If that is the case, the groups argue, the system would violate existing GDPR rules that limit the scope of biometric data collection in public spaces.

The GDPR allows certain exceptions to its biometric data rules on grounds of substantial public interest, but rights groups argue that the permissions granted in the French case are too broad and disproportionate to any demonstrated threat. Rights groups and some lawmakers who opposed the fast-tracked law also fear it could set a dangerous precedent for future public surveillance legislation and potentially undermine the EU’s broader efforts to rein in AI surveillance. Amnesty International adviser on AI regulation Mher Hakobyan warned that by authorizing the surveillance, even temporarily, “France is in danger of permanently transforming into a dystopian surveillance state.” Human Rights Watch, which wrote its own letter to French lawmakers opposing the fast-tracked law, also fears it poses a “serious threat to civil liberties and democratic principles” and risks further exacerbating racial disparities in law enforcement.

“The proposal paves the way for the use of invasive, algorithm-driven video surveillance under the pretext of securing major events,” Human Rights Watch wrote in its letter. “The very existence of untargeted (often indiscriminate) algorithmic video surveillance in publicly accessible areas can have a chilling effect on fundamental civil liberties.”

Others, meanwhile, fear that the supposedly temporary new measures will inevitably become the status quo. The surveillance law officially expires in 2025, although lawmakers will have the option to extend it if they wish. Supporters of the expanded powers argue they are necessary tools to strengthen the country’s defenses against potentially deadly terrorist attacks. France has experienced more than half a dozen major attacks over the past two decades, including a series of coordinated shootings in 2015 that killed 130 people. That incident led France to declare a temporary state of emergency, which was ultimately extended for more than two years.

“We have seen this before at previous Olympic Games, such as in Japan, Brazil, and Greece,” digital rights activist Noémie Levain of La Quadrature du Net said in an interview with the BBC earlier this year. “What were supposed to be special security measures for the special circumstances of the games were eventually normalized.”

France is stepping up security for a massive open-air opening ceremony

France’s emphasis on security at this year’s Olympics goes beyond video surveillance. Authorities have designated the stretch of the Seine where the opening ceremony will take place, along with its immediate surroundings, as an “anti-terrorism perimeter.” The approximately 6 km stretch will be subject to heightened security between July 18 and 26.

About 20,000 French residents who live or work within that perimeter will reportedly be required to undergo background checks ahead of the Games to determine whether they have alleged links to suspected Islamist extremist groups. Each of them will receive a government-issued QR code allowing them to move through the area during the event. Heavily armed police and military units, which have become a common sight across Paris over the past decade, will reportedly be deployed at ten times their normal numbers. Local law enforcement will reportedly work alongside hundreds of diving specialists, counter-terrorism units, and specialist troops trained to take down potential drone threats.

For years, the Olympics have served as a testing ground for countries around the world to advertise and deploy their latest digital monitoring tools. China used facial recognition at security checkpoints during the 2008 Olympic Games in Beijing and again during its more recent Winter Games. Russian intelligence officials similarly monitored digital communications and internet traffic from both competitors and visitors during the 2014 Winter Olympics in Sochi. In each of these cases, host countries justified going beyond the bounds of normal surveillance operations as a means of ensuring security at a time of unprecedented scrutiny. There is legitimate reason for concern: the Olympic Games have been a target of violence on more than one occasion. But even as the immediate perceived threat diminishes, host countries have been known to cling to their new monitoring capabilities, a pattern critics say ultimately erodes civil liberties over time. Whether France will follow that same playbook, however, remains to be seen.