Privacy Concerns Raised Over Network Rail's Emotion Detection Trial

Network Rail, the public body managing most of the UK's railway infrastructure, has been secretly trialling Amazon's AI software to monitor rail passengers at major stations across the country. The controversial scheme, which has drawn criticism from civil liberties groups, involved cameras filming people's faces at ticket barriers to detect their approximate age, gender, and even emotions.

The two-year trial took place at nine stations, including London Waterloo, Euston, Manchester Piccadilly, and Glasgow. According to documents obtained through a Freedom of Information request, images were captured every five seconds, or whenever a person passed through the barriers, and then sent to Amazon's Rekognition software for analysis.

While Network Rail claims the trial aimed to improve customer service and enhance passenger safety, the documents reveal a broader intent to use AI to tackle trespassing, crime, overcrowding, accidents, and anti-social behaviour. The public body also stated that the information could be used to "maximise advertising and retail revenue."

Civil liberties group Big Brother Watch has submitted a complaint to the Information Commissioner's Office (ICO) over the trial, citing privacy concerns. Jake Hurfurt, the group's head of research and investigations, stated, "Technology can have a role to play in making the railways safer, but AI-powered surveillance could put all our privacy at risk."


The ICO had previously advised companies against using emotion detection technology, deeming it "risky" and the technologies "immature." AI researchers have also warned that using such technology to detect emotions is "unreliable" and should be banned.


In the European Union, such systems are banned or deemed "high risk" under the Artificial Intelligence Act. The ICO has expressed concerns about the potential for incorrect analysis leading to inaccurate assumptions, judgments, and discrimination. As the use of AI surveillance technologies continues to grow, regulators and civil society groups are calling for greater accountability, transparency, and public trust in these systems. The Network Rail trial highlights the need for robust data protection measures and a thorough assessment of the risks and benefits of such technologies. Network Rail has refused to answer questions about the scheme but stated that it takes "proportionate action" and complies with relevant legislation regarding surveillance technologies.
