During the past two years, eight train stations around the UK tested artificial intelligence (“AI”) surveillance technology with CCTV cameras. Thousands of people catching trains likely had their faces scanned by cameras at ticket barriers, with the data sent to Amazon Rekognition for analysis.

The AI trials used a combination of “smart” CCTV cameras, which can detect objects or movements in the images they capture, and older cameras whose video feeds were connected to cloud-based analysis. The image recognition system was used to predict travellers’ age, gender, and potential emotions, with the suggestion that the data could be used in advertising systems in the future.
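For a sense of what this kind of cloud analysis actually returns, here is a minimal sketch of parsing a response in the shape produced by Amazon Rekognition’s `DetectFaces` API. The payload below is a hand-written sample following the documented response format, not data from the trials themselves, and the parsing function is illustrative rather than anything the rail operators are known to have used:

```python
# Hand-written sample in the shape of a Rekognition DetectFaces response.
# Real calls go through an AWS SDK (e.g. boto3); a static payload is used
# here so the sketch is self-contained.
sample_response = {
    "FaceDetails": [
        {
            "AgeRange": {"Low": 25, "High": 35},
            "Gender": {"Value": "Female", "Confidence": 97.2},
            "Emotions": [
                {"Type": "CALM", "Confidence": 88.1},
                {"Type": "HAPPY", "Confidence": 7.4},
            ],
        }
    ]
}

def summarise_faces(response):
    """Reduce each detected face to the attributes the trials reportedly
    extracted: an estimated age range, a gender label, and the emotion
    the model scored highest."""
    summaries = []
    for face in response.get("FaceDetails", []):
        age = face["AgeRange"]
        top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
        summaries.append({
            "age_range": (age["Low"], age["High"]),
            "gender": face["Gender"]["Value"],
            "emotion": top_emotion["Type"],
        })
    return summaries

print(summarise_faces(sample_response))
# → [{'age_range': (25, 35), 'gender': 'Female', 'emotion': 'CALM'}]
```

Note that the `Confidence` score on each emotion only expresses how strongly the model favours that label, not whether the person actually feels that way — which is precisely the gap researchers flag when they call emotion detection unreliable.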

As Wired reported on Monday, AI researchers have frequently warned that using the technology to detect emotions is “unreliable,” and some say the technology should be banned due to the difficulty of working out how someone may be feeling from audio or video. In October 2022, the UK’s data regulator, the Information Commissioner’s Office, issued a public statement warning against the use of emotion analysis, saying the technologies are “immature” and “they may not work yet, or indeed ever.”

The scope of the AI trials, elements of which have previously been reported, was revealed in a cache of documents obtained in response to a freedom of information request by civil liberties group Big Brother Watch.
