Real-time cameras equipped with artificial intelligence (AI) are set to monitor the streets of Paris during the upcoming summer Olympics, aiming to detect suspicious activity such as abandoned luggage and unexpected crowds. Civil rights groups, however, see the technology as a potential threat to civil liberties. Among the bidders for the Olympics video surveillance contract is the French AI company XXII, led by François Mattens, who rejects the idea of a surveillance state.
A recent law permits the police to employ CCTV algorithms capable of identifying anomalies such as crowd rushes, fights, and unattended bags. Facial recognition of the kind used in China to track “suspicious” individuals is explicitly prohibited. Opponents nonetheless argue that this is just the beginning, fearing that the French government intends to make the measures permanent even though the experimental period ends in March 2025. According to Noémie Levain of the digital rights campaign group La Quadrature du Net, the same pattern played out at previous Olympic Games in Japan, Brazil, and Greece, where security arrangements initially justified as exceptional for the event became the new norm.
Elements of the new AI security system have already been deployed in some police stations across France, including Massy, a southern suburb of Paris. The town’s mayor, Nicolas Samsoen, explains that the AI monitors the feeds from Massy’s network of 250 security cameras and alerts police officers to potential issues such as a sudden grouping of people. The decision on what action to take remains with the human officers: the algorithm serves as an aid, not a decision-maker.
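As the mayor describes it, the software only raises alerts; officers review them and decide what, if anything, to do. A minimal sketch of that human-in-the-loop pattern might look like the following, where the event labels, confidence threshold, and `Alert` structure are illustrative assumptions rather than details of the Massy system:

```python
from dataclasses import dataclass
from datetime import datetime
from queue import Queue

@dataclass
class Alert:
    """An anomaly flagged by the video-analysis model for human review."""
    camera_id: str      # which camera in the town's network raised it
    event_type: str     # e.g. "crowd_gathering" or "abandoned_bag" (illustrative labels)
    confidence: float   # model score between 0 and 1
    timestamp: datetime
    snapshot_path: str  # still frame shown to the operator

# Alerts are queued for the control room; the software never acts on them itself.
review_queue: "Queue[Alert]" = Queue()

CONFIDENCE_THRESHOLD = 0.8  # assumed cut-off to limit false alarms

def on_model_detection(camera_id: str, event_type: str,
                       confidence: float, snapshot_path: str) -> None:
    """Called whenever the detector flags something; it only files an alert."""
    if confidence >= CONFIDENCE_THRESHOLD:
        review_queue.put(Alert(camera_id, event_type, confidence,
                               datetime.now(), snapshot_path))

def operator_loop() -> None:
    """Human officers pull alerts, look at the footage, and decide on any action."""
    while True:
        alert = review_queue.get()
        print(f"[{alert.timestamp:%H:%M:%S}] camera {alert.camera_id}: "
              f"{alert.event_type} ({alert.confidence:.0%}) -> awaiting officer decision")
```

The essential design choice mirrors the mayor’s description: the model can only place items in a review queue, never trigger an intervention on its own.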
During a demonstration, a piece of luggage was deliberately abandoned on the street near the police station. Within thirty seconds an alarm sounded and CCTV footage of the suitcase appeared on the control room screen. The detection algorithm had been trained on a large and still-growing database of images of lone bags on the street. More complex tasks, such as spotting a person on the ground in a crowd or telling the start of a fight from a momentary increase in crowd density, are acknowledged to be far harder.
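The demonstration implies a simple rule: a bag becomes suspicious once it has sat in the same spot, with no one nearby, for long enough. Below is a hypothetical sketch of that dwell-time logic; `detect_objects` stands in for whatever detector the real system uses, and the label set, distance threshold, and thirty-second window are assumptions drawn only from the demo described above:

```python
import math
import time

# Placeholder for the real detector: returns a list of (label, (x, y)) detections
# for one video frame, e.g. [("suitcase", (412.0, 300.5)), ("person", (120.0, 280.0))].
def detect_objects(frame):
    raise NotImplementedError("plug in an actual object detector here")

DWELL_SECONDS = 30          # alarm after roughly 30 s, matching the demonstration
OWNER_RADIUS_PX = 150.0     # assumed: a person within this distance "owns" the bag
BAG_LABELS = {"suitcase", "backpack", "handbag"}  # illustrative label set

first_seen_alone: dict[tuple[int, int], float] = {}  # coarse bag position -> time it became unattended

def check_frame(frame) -> list[tuple[int, int]]:
    """Return positions of bags that have sat unattended for at least DWELL_SECONDS."""
    detections = detect_objects(frame)
    people = [pos for label, pos in detections if label == "person"]
    bags = [pos for label, pos in detections if label in BAG_LABELS]
    now = time.monotonic()
    alarms = []
    for bx, by in bags:
        # Snap to a coarse grid so a stationary bag maps to the same key frame after frame.
        key = (round(bx / 50) * 50, round(by / 50) * 50)
        attended = any(math.hypot(bx - px, by - py) < OWNER_RADIUS_PX for px, py in people)
        if attended:
            first_seen_alone.pop(key, None)   # someone is nearby: reset the timer
        else:
            start = first_seen_alone.setdefault(key, now)
            if now - start >= DWELL_SECONDS:  # unattended long enough: raise the alarm
                alarms.append(key)
    return alarms
```

Reliably tracking the same physical bag across frames and coping with occlusion in a dense crowd are exactly the kinds of harder problems the paragraph above alludes to.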
XXII, a French start-up specializing in computer vision software, is awaiting further specifications from the French government to fine-tune its bid for a portion of the Olympics video surveillance contract. The company expects the system to be required to detect fire, fights, people on the ground, and abandoned luggage, but acknowledges that implementing it will take significant time and effort, making it unlikely to be ready for the Rugby World Cup in September.
François Mattens of XXII emphasizes that the company’s technology operates strictly within the law and ethical bounds, and that facial recognition will not be employed. Digital rights activist Noémie Levain dismisses the distinction, arguing that AI video monitoring remains a tool of state surveillance that analyzes and scrutinizes people’s bodies and behavior. In her view, it erodes anonymity, hampers personal freedom in public spaces, and leads to mass control.