Privacy advocates say facial recognition tech ‘unfit for policing’ – Irish Times

Human rights advocates have raised serious concerns with Minister for Justice Helen McEntee about legislative plans to allow gardaí to use facial recognition technology, describing it as “unfit for policing.”

The Irish Council for Civil Liberties (ICCL) and six academics working in law, technology and digital policy jointly wrote to Ms McEntee last week about plans to introduce legislation enabling An Garda Síochána to use the technology in criminal investigations.

Following a November 10 briefing with the Department of Justice and An Garda Síochána, the group rejected the department’s contention that the technology was safe, ethical or legal simply because it was used by police organizations in other jurisdictions.

“These comments obscure the fact that the technology is increasingly being banned or suspended pending regulation around the world,” they said in a letter to Ms McEntee dated November 23.

They said the remarks “ignored” the fact that 200 civil society organizations around the world, as well as 13 NGOs and seven universities in Ireland, had also called for a ban.

“We do not want to replicate the authoritarian character of regimes where such technology is gradually being normalized in all spheres of public life,” the letter said.

The letter was signed by Liam Herrick, executive director of the ICCL; Barry O’Sullivan, professor of computer science at University College Cork; Elizabeth Farries, director of UCD’s Centre for Digital Policy; and TJ McIntyre, associate professor of law at UCD and chair of the data privacy campaign group Digital Rights Ireland.

Ms McEntee, who laid out plans for the technology earlier this year, said safeguards and codes of conduct would be put in place to comply with EU privacy laws and protect people’s privacy.

The minister said a significant amount of crime involved technology and police needed to have the technical resources to deal effectively with it.

She pointed to the troves of CCTV footage collected by gardaí investigating crimes such as child abduction, child sex abuse and murder.

Human rights and privacy advocates told Ms McEntee in the letter that they agreed that keeping children safe was “of the utmost importance”, but that “protecting children also means protecting them from the dangers and disproportionate consequences of technology”.

The group told the minister they were particularly concerned about Garda members denying “significant and robust scientific evidence that demonstrates issues of accuracy and bias”.

“Not only was the risk to vulnerable groups, especially those with darker skin, not acknowledged, but members repeatedly said accuracy was not an issue,” they said.

They questioned why the government is committing money and resources to enacting a law that may require major changes in the near future, given that Ireland will have to comply with the forthcoming EU Artificial Intelligence Act, which will cover the technology.

The group asked the minister to consult with interested parties before deciding to use the technology for policing, as the processing of biometric data “constitutes serious interference” in all cases.

They were also concerned that neither the Data Protection Commission nor the community groups most at risk from the technology’s “established issues of accuracy and bias” had been formally consulted.

The letter also expressed concern about potential risks to the administration of justice.

“The risks to fair trial rights and privacy rights, as well as the possibility of miscarriages of justice, require clear evidence that this technology is necessary and that it is the least intrusive way to achieve its goals,” the group said.

“We do not believe these tests have been met. Trials of facial recognition technology by other police forces have highlighted these human rights law concerns.”
