Before the refurbishment of the football pitch at Bluebell Road in the west of the city, it was an anti-social blackspot, says Michael O’Shea, chairman of Inchicore Athletic FC.

“That was our home ground and it was a grass pitch, and they were constantly getting cars burned on it and broken bottles,” says O’Shea. “It was in darkness all the time, so you never really seen who was on it.”

During a refurbishment carried out for Dublin City Council in November 2018, a new artificial astroturf pitch was installed, along with floodlighting and a new high-tech closed-circuit television (CCTV) security system.

“I think it’s after drastically reducing the rate of anti-social behaviour in the area,” says O’Shea.

Installed on the football pitch are Hikvision cameras, embedded with what is called “deep learning” technology, AI that trains on people visiting the pitch.

It’s not totally clear how this AI works: how it identifies people, whether it has access to a database in order to identify them, or whether it creates a database of its own by recognising and storing people’s biometric data.

The technology is not something that the council should be using without carrying out a Data Protection Impact Assessment, says Elizabeth Farries of the Irish Council for Civil Liberties.

Deep Learning

Hikvision cameras were installed in November 2018 to protect the council’s investment in a new full-size all weather pitch at Bluebell Community Centre from anti-social behaviour, says a spokesperson for Dublin City Council.

Four cameras were installed at the new full-size, all-weather pitch at a cost of €9,416 plus VAT, according to council records released under the Freedom of Information Act.

According to the council, “The information will be recorded on a local monitoring station in Bluebell Community Centre and out of hours by a Dublin City Council contracted security company, which will be encrypted and secure.”

According to the website of CTS Tech, the firm that both supplied and installed the Hikvision CCTV system, “a Hikvision deep in-mind NVR” was included in the installation.

The Hikvision Deep in Mind NVR, according to the company’s website, mimics human learning and memory processes by incorporating algorithms that improve video analytics performance, including the ability to identify human activity to a high degree of accuracy. This includes the capabilities of facial recognition.

The advantage of this technology, according to Hikvision’s website, is that it will save costs by preventing false alarms triggered by things like animals, which regularly prompt unnecessary call-outs of security personnel.

Was this feature necessary? “There wasn’t an awareness of issues around ‘deep learning’ technology so concerns weren’t raised at the time,” says a spokesperson for Dublin City Council.

Eoin O’Dell, academic lawyer in Trinity College with an interest in General Data Protection Regulation, says there’s always a tendency to over-engineer and buy the best technology that you can afford.

“So they bought the best cameras they could afford. It turns out that it came with this additional feature,” he says.

There may be a good reason for the cameras for purposes of security, says O’Dell. But whether the use of this specific technology is proportionate should have been analysed through a Data Protection Impact Assessment, he says.

A Belated Impact Assessment

Dublin City Council didn’t carry out a data-protection impact assessment for the installation of the cameras, according to documents released by the council under FOI.

“When you are undertaking something that’s going to have a significant impact on data protection or privacy, before you do it, you should undertake an assessment exactly how much of an impact it’s going to have and whether the impact is justified or proportionate,” says O’Dell.

According to the Data Protection Commission’s website, when an organisation collects, stores or uses personal data, the individuals whose data is processed are exposed to risks. These include the personal data being stolen, or being used for purposes other than those for which it was collected.

“A Data Protection Impact Assessment (DPIA) describes a process designed to identify risks arising out of the processing of personal data and to minimise these risks as far and as early as possible,” says the commission’s website.

Under GDPR, a DPIA is mandatory “where data processing is likely to result in a high risk to the rights and freedoms of natural persons”, continues its webpage on DPIAs. This, it says, is particularly relevant when a new data-processing technology is being introduced.

In other words, with GDPR having been rolled out in May 2018, a DPIA should have been carried out before the cameras were installed.

“The Data Protection Commissioner has said that people making good faith attempts to comply in circumstances where they don’t quite achieve full compliance, there would be a little bit of give and take there,” says O’Dell, “whereas somebody that is deliberately flouting the rules, they would be less accommodating.”

It’s not just for such deep learning CCTV cameras that a DPIA would be needed, says O’Dell. It’s generally needed for traditional CCTV cameras too to analyse whether or not their use is proportionate for reasons of security.

According to a spokesperson for Dublin City Council, a data-protection impact assessment is currently being undertaken retrospectively.

“The cameras are currently turned [off] and because of the unease around HKVision [sic] camera Dublin City Council will make arrangements to substitute the cameras with more traditional CCTV cameras,” says a spokesperson for Dublin City Council.

Unease around Hikvision cameras is a result of their use for surveillance in China. Hikvision is a Chinese company, partly owned by the Chinese state, and one of the world’s leading manufacturers of surveillance technology, including facial-recognition technology.

The council have yet to respond to queries as to whether similar cameras were installed in other places in the city without data-protection impact assessments being properly carried out.

There’s a danger, says O’Dell, that such technology creeps in without a proper conversation around whether its use is proportionate or not.

“It’s a classic example of mission creep, you put something in for a good reason and then you ramp up the use. So CCTV has a good underlying reason and then you expand and you expand and you’ve increased the level of surveillance to an almost impossible extent,” he says.

[CORRECTION: This article was updated on 19 March at 11.30. An earlier version stated that DPIA stands for “Data Impact Analysis” when in fact it means Data Protection Impact Assessment. Apologies for the error.]

Sean Finnan is a freelance journalist. You can reach him at
