Thousands of people catching trains in the United Kingdom likely had their faces scanned by Amazon software as part of widespread artificial intelligence trials, new documents reveal. The image recognition system was used to predict travelers’ age, gender, and potential emotions—with the suggestion that the data could be used in advertising systems in the future.

During the past two years, eight train stations around the UK—including major hubs such as London’s Euston and Waterloo and Manchester Piccadilly, as well as smaller stations—have tested AI surveillance technology on CCTV camera feeds, with the aim of alerting staff to safety incidents and potentially reducing certain types of crime.

The extensive trials, overseen by rail infrastructure body Network Rail, have used object recognition—a type of machine learning that can identify items in video feeds—to detect people trespassing on tracks, monitor and predict platform overcrowding, identify antisocial behavior (“running, shouting, skateboarding, smoking”), and spot potential bike thieves. Separate trials have used wireless sensors to detect slippery floors, full bins, and drains that may overflow.

The scope of the AI trials, elements of which have previously been reported, was revealed in a cache of documents obtained in response to a freedom of information request by civil liberties group Big Brother Watch. “The rollout and normalization of AI surveillance in these public spaces, without much consultation and conversation, is quite a concerning step,” says Jake Hurfurt, the head of research and investigations at the group.

The AI trials used a combination of “smart” CCTV cameras that can detect objects or movements from the images they capture and older cameras whose video feeds are connected to cloud-based analysis. Between five and seven cameras or sensors were included at each station, according to the documents, which are dated April 2023. One spreadsheet lists 50 possible AI use cases, although not all of these appear to have been used in the tests. One station, London Euston, was due to trial a “suicide risk” detection system, but the documents say the camera failed and staff saw no need to replace it because Euston is a “terminus” station.

Hurfurt says the most “concerning” element of the trials focused on “passenger demographics.” According to the documents, this setup could use images from the cameras to produce a “statistical analysis of age range and male/female demographics,” and could also “analyze for emotion,” detecting states such as “happy, sad, and angry.”