At the moment, An Garda Síochána is using software from the company Siren to monitor and manage its IT systems, said a spokesperson. But it’s considering whether to use it for more.
An Garda Síochána spent €30,855 on algorithmic software last year, records released under the Freedom of Information Act show.
That sum went to Siren, which, according to its website, is an “investigative intelligence platform” that specialises in analysing multiple large data sets to spot and explore possible connections.
At the moment, An Garda Síochána is using Siren, and its version of ElasticSearch, in a limited way "to manage Outages, Key Dependencies and display Action dashboards in the Network Operations Centre (NOC)", according to the Garda press office.
That means monitoring IT systems and servers to check they’re working. "The importance to policing is that these systems are essential to maintain connectivity and continuity of service," they said.
But An Garda Síochána is also “currently evaluating Siren Solutions in one of the policing business areas”, they said. “It may be used for other search related problems in the future, including policing, but this is at early stage of evaluation.”
It’s the first time that An Garda Síochána has spent money on artificial intelligence systems, the records say – and a move in line with trends in policing in other countries, such as the United Kingdom and United States.
Using algorithms for policing calls for caution, says Elizabeth Farries, assistant professor of digital policy at University College Dublin, speaking generally.
There’s the unresolved problem of bias, she says. “Studies show that policing operates in a biased and discriminatory manner. Law enforcement AI risks similarly discriminatory policing outcomes because AI is not neutral.”
Siren says that the functions their application uses “are not known for biases and errors can be corrected in the system”.
“We do not use people recognition imaging software in our solution or other AI technologies known to have bias problems,” says a spokesperson for Siren.
Siren won a tender put out by the Department of Justice “for the provision of ElasticSearch, Associated Management Products and Support Services”, according to An Garda Síochána.
ElasticSearch is commonly used in the software industry by developers and system engineers, says Laura Nolan, a software engineer and member of Tech Won’t Build It, which runs events raising awareness around ethics and tech.
She’s used it herself, she says. “It’s very important for diagnostics, for example if you’re investigating software errors or having some performance problem.”
A constant stream of data is ingested, and you can set up queries to spit out alerts when criteria are met, she says. "So you can use it to be a sort of always-on monitor for data coming in looking for certain types of conditions."
You can also use ElasticSearch to link up data from multiple sources. “That’s obviously where privacy and civil rights implications come in if they’re doing this with data that relates to people,” says Nolan.
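As a rough illustration of the "always-on monitor" Nolan describes, the sketch below sets up a recurring check over log data using recent versions of the official ElasticSearch Python client. The index name, field names and time window here are hypothetical, chosen purely for illustration; they are not taken from any Garda or Siren system.

```python
# A rough illustration of an "always-on monitor": data streams into an
# ElasticSearch index, and a recurring query checks whether a condition
# has been met. Index and field names are hypothetical examples.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local test cluster


def errors_in_last_five_minutes() -> int:
    """Count 'error'-level log entries ingested in the last five minutes."""
    response = es.search(
        index="noc-logs",  # hypothetical index of server log entries
        query={
            "bool": {
                "must": [
                    {"match": {"level": "error"}},
                    {"range": {"@timestamp": {"gte": "now-5m"}}},
                ]
            }
        },
    )
    return response["hits"]["total"]["value"]


if errors_in_last_five_minutes() > 0:
    print("Alert: error-level log entries in the last five minutes")
```

Run on a schedule, a query like that flags outages as they happen. Point the same kind of query at records about people rather than servers, and the privacy questions Nolan raises follow.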
According to Siren’s website, its software can also be used by law enforcement for big data analytics: searching across large, diverse data sets and visualising the connections and relationships within them.
“Siren provides a platform to link data together and draw conclusions in a graphical manner to help investigations through data visualisations,” they say.
“In essence, it is used to catch ‘bad guys’ and discover relationships between data sets that were simply not visible before,” they say, in a Q&A on their product.
Videos from Siren show how its products can be used more widely for policing, something gardaí are looking at but not yet doing, they say.
One video tracks how searching through data sets of people, vehicles, crime reports and traffic cameras, alongside location data and messages from seized mobile phones, can help solve a vehicle theft. It shows how to map it all, too.
“The investigative world is made up of disjointed data that needs to be connected. People (for example) are connected to vehicles they own, which are connected to locations where they’ve been, which may be connected to events, and so on,” says a blog post, describing how Siren works through ElasticSearch.
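At its core, the linking the blog post describes is a join across data sets on shared identifiers. The toy sketch below shows the idea with invented records and field names; it is an illustration of the general technique, not Siren’s actual data model.

```python
# A hypothetical illustration of linking disjointed data sets on shared
# identifiers: person -> vehicle -> locations where it was seen.
# All records and field names are invented for illustration only.
people = [
    {"person_id": "P1", "name": "A. Example", "owns_reg": "11-D-12345"},
]
vehicles = [
    {"reg": "11-D-12345", "make": "Ford"},
]
sightings = [
    {"reg": "11-D-12345", "camera": "M50-J7", "time": "2020-11-02T21:14:00"},
]

# Index vehicles and sightings by registration number so records can be joined.
vehicles_by_reg = {v["reg"]: v for v in vehicles}
sightings_by_reg = {}
for s in sightings:
    sightings_by_reg.setdefault(s["reg"], []).append(s)

# Walk the links: person -> vehicle they own -> places that vehicle was seen.
for person in people:
    reg = person["owns_reg"]
    vehicle = vehicles_by_reg[reg]
    for sighting in sightings_by_reg.get(reg, []):
        print(person["name"], "->", vehicle["make"], reg, "->",
              sighting["camera"], sighting["time"])
```

A platform like Siren does this across far larger and more varied data sets, but the linking principle is the same.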
These are systems designed to suck in huge amounts of data, hundreds of thousands of records, and to search across that and to analyse it, says Nolan. “This very much points to them using a mass surveillance approach rather than a targeted surveillance approach.”
A Garda spokesperson said that there are no data protection aspects, or personally identifiable information, in the low-level log files in the network operations centre where ElasticSearch is being used at the moment.
“Privacy impacts and software solution design were carried out in the design of the AGS Policing Systems,” they said.
Garda technology is outdated and inadequate, making it hard for police to do their job, said the Commission on the Future of Policing in Ireland’s 2018 report.
The report also says that data should be seen as a “strategic asset and a key factor in determining police decisions”.
“Professional data analysis is an essential tool in modern policing and should be available in each Garda division and in an enlarged Garda Analysis Service at Headquarters,” it says.
Expanding use of algorithms is one way that police forces elsewhere have modernised.
In England and Wales, police have ramped up their use of algorithms, a report commissioned by the Centre for Data Ethics and Innovation found last year.
It’s driven by the need to handle more and more complex data, to allocate limited resources, and to respond to the expectation that policing become more preventative, the report says.
But it has also bred concerns in those countries about whether police are using these tools in the right way.
According to the Centre for Data Ethics and Innovation’s report, interviewees had concerns about a lack of an evidence base, poor-quality data, and insufficient skills and expertise.
There are also issues around bias. Biases are often baked into the datasets that police use, says Farries, the assistant professor of digital policy at University College Dublin.
“If police profile certain people based on their membership in a historically marginalised community, then the data sets from these policing practices can be used to build biased policing algorithms,” says Farries.
“While democracy requires that we hold our governments to a high standard of transparency, private companies are not held to the same standards,” says Farries.
Transparency from the Gardaí about how they use such technology is one thing, but there’s also the issue of how tech algorithms operate within a black box, she adds.
A Siren spokesperson said: “All actions within the Siren platform are captured for audit.”
According to An Garda Síochána, all new technology systems that the force uses are approved by the ICT Governance Committee, which looks at how they work for the business, how data is managed, stored and used, and the uses of algorithms.
Gardaí are committed to the United Nations Interregional Crime and Justice Research Institute goal of “responsible innovation that adheres to human rights principles and maintains public trust”, said a spokesperson.
An Garda Síochána works closely with the Data Protection Commission on all matters around processing personal data, they said.
That includes regular consultation and assessments “for any new projects where potential risks might present”, they said.
The Department of Business, Enterprise and Innovation is on the cusp of releasing an artificial intelligence strategy which would give guidance on its use, says Farries.
“I would ask if our governments should draw a line at investing in policing surveillance AI where the human rights impacts have not been fully evaluated,” says Farries, noting similar calls for facial recognition tech AI moratoriums at the EU level.
[A correction was made on 13 January at 11.48am. A previous version of this article said that the Department of Enterprise, Trade and Employment is on the cusp of releasing an artificial intelligence strategy when it is the Department of Business, Enterprise and Innovation that is leading Ireland’s National AI strategy. We apologise for the error.]