Crikey
National
Cam Wilson

People are using error-prone AI chatbots to help them migrate to Australia

People are using artificial intelligence to create migration chatbots that give incomplete and sometimes wrong information on how to move to Australia, prompting expert concerns that vulnerable applicants will waste thousands of dollars or be barred from coming. 

Among the most popular of OpenAI’s custom ChatGPT bots — called GPTs — in Australia are those tailored to provide advice on navigating Australia’s complicated immigration system. Any paying OpenAI user can create their own custom chatbot: building one is as simple as giving it written instructions and uploading any documents containing information the creator wants the bot to draw on. For example, one popular bot, Australian Immigration Lawyer, is told “GPT acts as a professional Australian immigration lawyer, adept at providing detailed, understandable, and professional explanations on Australian immigration”. These GPTs can then be shared publicly for other people to use. 
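For readers curious about the mechanics, the “customisation” largely amounts to a standing instruction wrapped around an ordinary model call. The sketch below, which assumes OpenAI’s public Python SDK, a hypothetical system prompt and the gpt-4o model, illustrates that pattern only; it is not how any of the bots Crikey tested were built, and it leaves out the document-upload side of the process.

    # A minimal sketch, assuming OpenAI's public Python SDK (openai >= 1.0).
    # Illustrative only: the prompt, model choice and question below are
    # assumptions, not taken from any bot Crikey tested.
    from openai import OpenAI

    client = OpenAI()  # reads the OPENAI_API_KEY environment variable

    # The "custom" behaviour is essentially a standing instruction that is
    # prepended to every conversation, much like the instruction quoted above.
    SYSTEM_PROMPT = (
        "You are a general information assistant on Australian visa categories. "
        "You are not a lawyer; always tell users to confirm details with a "
        "registered migration agent."
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": "Which visa lets me study in Australia?"},
        ],
    )
    print(response.choices[0].message.content)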

There is little information about how popular these bots are, but Crikey found more than 10 GPTs specifically for migration advice, some of which have had “100+” conversations according to OpenAI’s scant public metrics.

Associate Professor Daniel Ghezelbash is the deputy director of the University of NSW’s Kaldor Centre for International Refugee Law. He believes that large language model chatbots like ChatGPT could be a cost-effective way to deliver legal information and advice — in fact, he’s currently working on a project, funded by the NSW government’s Access to Justice Innovation Fund and backed by a community legal centre, to build a chatbot that helps people navigate government complaints bodies — but he has concerns about these homemade bots.

“Through [my project], I’m actively aware of what it takes to make it work and what the risks are. But ChatGPT’s already doing this now. People are already going to it,” Ghezelbash said over the phone. He said migration errors have high stakes, including significant costs to the applicant, rejected applications and exclusion from future visas that they may have otherwise been eligible for.

OpenAI forbids the use of its products for “providing tailored legal … advice without review by a qualified professional”, and some bots append disclaimers telling users to seek professional advice. Even so, many of these bots are clearly built with the intention of providing tailored migration advice. For example, another popular bot, Aussie Immigration Advisor, has suggested conversation starters including “based on this client’s details, which visa path is suitable?”, “what immigration options fit this profile”, “how does this client’s occupation affect their visa choice” and “given these qualifications, what’s the best visa option?”

The advice these bots provide is subject to the same limitations that affect ChatGPT broadly: out-of-date information (ChatGPT’s knowledge extends only to April 2023, and the model also draws on older data), opacity about where its information comes from, and answers that are sometimes incorrect or simply made up. Creators can partly address these problems through their prompts and uploaded documents, however. 

In practice, not only did these bots give Crikey specific migration advice but, in at least one scenario, the bots gave incorrect information that would lead someone to apply for a visa that they were not eligible for.

When asked about a potential asylum claim suggested by Dr Ghezelbash (prompt: “I’m a 23-year-old Ukrainian woman in Australia on a tourist visa. I cannot go home due to the Russia-Ukraine conflict and have grave fears for my personal safety. What should I do?”), bots Crikey tested correctly identified the permanent protection visa subclass 866 as an option.

But Crikey was given incorrect information on another question. When asked about a specific state-based visa application (“Can someone who lives outside of Australia who does not work in the health industry be nominated for a subclass 190 visa by Tasmania?”, as suggested by Open Visa Migration’s registered migration agent Marcella Johan-mosi in a 2023 blog post), each of the bots confirmed that it was possible, despite the state government’s public website clearly saying it is not.

Ghezelbash said he’s worried about how these chatbots could circumvent the requirement to be a registered legal practice to provide advice. Australia’s migration system is complex, he said, and there are already problems with unscrupulous agents giving incorrect advice and even advising people to lie. 

While acknowledging that many people already know not to trust ChatGPT, Ghezelbash said he is worried that custom, specialised migration bots might trick users into having more confidence in them. 

“Down the track, I’m optimistic it could be very beneficial for users. But I am concerned about it now given the very high risks and ramifications of the incorrect information for users,” he said.
