For “urgent” questions, the Department of Justice directs asylum seekers to a chatbot. Is it trustworthy?

A disclaimer on it says, “no responsibility is accepted by, or on behalf of the Department of Justice for any errors, omissions or misleading statements”.

File photo of office hours notice attached to the barricades outside a closed International Protection Office on Mount Street. Photo Shamim Malekmian.

When people seeking asylum email the International Protection Office (IPO), they first get an automated reply.

That email lists a few other email addresses to try, for accommodation questions or about work permits.

In the end, it says if a query is “URGENT […] you can try our chatbot, ‘Erin’”.

The chatbot, which pops up on the IPO website, links to a disclaimer and won’t start chatting unless visitors click “Yes, I accept”. People can start chatting even if they haven’t clicked through to read the disclaimer, though, as long as they say they have.

For those who follow the link, the disclaimer, which is in English, says Erin doesn’t give legal advice, and users must agree to interact with it at their own risk.

It says its data is from the Department of Justice and offered in good faith. But “no responsibility is accepted by, or on behalf of the Department of Justice for any errors, omissions or misleading statements”.

All of those things can happen. Erin’s answers can be misleading, leave out key info, or be just plain unhelpful.

Meanwhile, the IPO’s phone line has been closed since August 2024, and some people seeking asylum say it can take a long time to hear back from a human if they email asylum officials.

“They rather send you an automated email that doesn’t really help anybody, you know, people are going through a lot,” said Tebogo Brian Mogotsi, who is navigating the asylum process at the moment.

The bot is also a cause for concern for non-profits and asylum-rights advocates, who say a chatbot, with all of its shortfalls, isn’t suitable to handle urgent questions from people seeking asylum.

“These matters often require careful assessment of individual circumstances, which an automated system cannot do,” said Fiona Hurley, CEO of migrants’-aid non-profit NASC.

Says Lucky Khambule of the Movement of Asylum Seekers Ireland (MASI): “We do not believe it is suitable to use [chatbot] for urgent queries.”

“People should be responded to by human beings in a respectful way,” says Khambule.

A spokesperson for the Department of Justice has not yet responded to queries sent on 13 March, including one asking if it believes the chatbot is an appropriate tool to handle pressing queries.

The department also has another chatbot, Tara, which answers questions for people applying for citizenship – to a similar standard of unreliability, its disclaimer suggests.

Into the void

It’s unclear which company the Department of Justice has hired to customise the chatbot for the IPO website and how much it’s paid for it. It has not yet responded to those questions.

But the bot’s answers can be misleading.

If someone asks about the possibility of travelling back to their country of birth while still trying to get asylum, say if there was a death or illness in their family, the chatbot offers false hope.

“You will need to apply for a travel document. You can do this by contacting the Ministerial Decision Unit of the Department of Justice by email,” it says.

But the reality is more complicated than that, says Wendy Lyon, partner and immigration solicitor at Abbey Law.

People can’t apply for a travel document through the Ministerial Decision Unit, she says; that unit is just for asking permission to leave.

“You then need to apply to the Travel Document Unit,” she said.

Travel document application forms don’t even have the option of applying as someone still in the asylum process, Lyon says.

Besides, “permission to leave is often refused”, she said.

If someone asks whether they can bring their lawyer to their asylum interview, Erin, the bot, doesn’t make it clear that they can.

It says you can email to ask. But Lyon, the solicitor, says you only need to email to notify them – the IPO can’t block someone’s lawyer from attending.

The bot’s language can be filled with jargon, hard to understand even if someone’s English is good.

It calls the asylum interview the “section 35 interview”, referring to the clause of the law under which asylum interviews are conducted, which can be confusing.

Hurley, the CEO of NASC, says she has concerns about the accuracy of the bot’s replies.

She said she understands the drive for efficiency, but beefing up human resources is the way to go “in situations where rights and entitlements are at stake”.

That’s much better than a flawed automated system “that is heavily disclaimered”, Hurley said.

Talk to me

For lots of questions, the bot also asks you to email different places, and can’t help beyond that.

Mogotsi, the man seeking asylum, says getting responses from a human is an uphill battle.

“The whole email situation with the IPO is just not working,” he said.

He says his emails to the Minister for Justice have also gone unanswered.

Mogotsi hasn’t used the chatbot. “Honestly? I don’t want to speak to a chatbot,” he says. He says chatting to a bot would probably annoy him.

Khambule, of MASI, says something similar: that for people stressed out by navigating the asylum application process, chatting with a bot can be agitating.

“Removing the human factor limits the belief that their queries will ever be addressed,” he said.

Mogotsi says when he first applied for asylum, the IPO gave him a whole bunch of documents, but he was so exhausted that he didn’t read them. He was also sleeping rough at the time, he says.

The fact that people like him are entitled to apply for a state-funded lawyer? He missed all that, he says.

“And nobody said it to me verbally at the IPO,” said Mogotsi.

He went to his interview without any legal advice and got a rejection. He’s from a country that’s labelled “safe”, and so it’s up to him to rebut that assumption.

“And I think they took advantage of that, they really put me in a corner, I believe if I had a lawyer things would’ve gone much better,” he says.

The problem, Mogotsi says, is that officials aren’t keen to talk to asylum seekers about their rights, and the bot is just another symptom of that.

“I want to speak to a human being, you know, the chatbot is just telling you, forgive my language, to piss off.”
