IT Pro
Rory Bathgate

ChatGPT privacy flaw exposes users’ chatbot interactions

OpenAI has revealed that a privacy flaw in its popular chatbot ChatGPT temporarily allowed users to see other users’ conversation titles in their chat history.

Users on Reddit and Twitter started reporting the bug on 20 March and shared screenshots indicating that their ChatGPT web history contained titles they didn't recognise. 

While the contents of the chats do not appear to have been accessible in this way, OpenAI disabled chat history altogether while the bug was addressed.

On the same day, major ChatGPT outages were reported, and those with access noted inconsistent service. OpenAI acknowledged the outages on its status page and restored service within hours of the initial reports.

Sam Altman, CEO at OpenAI, tweeted that while the issue was “significant”, it had now been resolved, attributing the flaw to a bug in an open source library.

Altman did not name the library in question, nor give an exact percentage of affected users.

With ChatGPT attracting millions of daily visitors, a privacy flaw affecting even a small percentage of users could have resulted in widespread data exposure, and Altman’s promised “technical postmortem” should address these concerns.
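The article does not describe the bug’s mechanism, but one common way a shared open source component can leak data across users is a cache or connection pool that is not scoped to the authenticated user. The Python sketch below is purely illustrative; every name in it is invented, and it does not describe OpenAI’s systems or the unnamed library:

```python
# Hypothetical sketch of a cross-user leak: a response cache keyed by
# connection ID rather than by user. Connections are recycled between
# users, so a reused connection can serve the previous user's data.
from dataclasses import dataclass, field


@dataclass
class ConnectionPool:
    cache: dict[int, list[str]] = field(default_factory=dict)

    def fetch_titles(self, conn_id: int, user_id: str, db: dict[str, list[str]]) -> list[str]:
        # BUG: the cache key omits user_id, so a recycled connection
        # returns whatever the last user on that connection fetched.
        if conn_id in self.cache:
            return self.cache[conn_id]
        titles = db.get(user_id, [])
        self.cache[conn_id] = titles
        return titles


db = {"alice": ["Trip planning", "CV feedback"], "bob": ["Tax question"]}
pool = ConnectionPool()

pool.fetch_titles(conn_id=7, user_id="alice", db=db)   # caches Alice's titles
leaked = pool.fetch_titles(conn_id=7, user_id="bob", db=db)
print(leaked)  # ['Trip planning', 'CV feedback'] -- Bob sees Alice's history

# The fix would be to key the cache by user_id, or to scrub
# per-connection state whenever a connection is returned to the pool.
```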

Each ‘chat’ a user has with ChatGPT is saved as its own instance in the user’s history, with a title generated from the contents of the conversation.
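As a rough sketch of that data model (the field names below are assumptions for illustration, not OpenAI’s actual schema), each history entry ties an owner to an automatically generated title:

```python
# Illustrative model of a chat-history entry; field names are assumed.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class ChatHistoryEntry:
    owner_id: str         # the account the conversation belongs to
    title: str            # generated from the conversation's contents
    created_at: datetime


def make_title(first_message: str, max_words: int = 6) -> str:
    # Stand-in for the model-generated summary the article describes.
    return " ".join(first_message.split()[:max_words])


entry = ChatHistoryEntry(
    owner_id="alice",
    title=make_title("What caused the 20 March ChatGPT outage?"),
    created_at=datetime.now(),
)
print(entry.title)  # "What caused the 20 March ChatGPT"
```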

On Monday, users found that their history contained titles pertaining to unfamiliar topics or functions, as well as titles written in other languages, indicating that the flaw was a worldwide issue.

OpenAI may have to carefully outline its data protection policies and procedures, and reassure users that its open source supply chain is secure and will not lead to similar issues down the line.

Some Reddit users reported seeing other types of information, but did not provide verifiable evidence to back up these claims.

“I see someone else's phone number as the phone number tied to my account. I'm concerned but not concerned enough to quit the app,” stated one user.

Another alleged that their ChatGPT Plus subscription, the $20 (£16) per month plan for the platform, had been registered under another email address that had become linked to their account, and that as a result they were not granted access to the service.

The bug came at a crucial time for the AI firm, which had just released its GPT-4 model. ChatGPT Plus subscribers already have access to the GPT-4 variant of the chatbot, for which OpenAI has promised “human-level performance”.

AI competitor Google launched its own chatbot Bard in the UK and US this week, with users able to sign up for access through a waiting list.

After reportedly upending internal teams to compete with ChatGPT, Google has expressed hope that Bard, which is powered by a lightweight, optimised version of its LaMDA large language model (LLM), will be quickly improved through user feedback.

Onlookers are already drawing comparisons between Bard and ChatGPT, which, along with Microsoft’s GPT-4-powered Bing chatbot, have emerged as the standout competitors of the new AI era.

While Microsoft and OpenAI have collaborated, Google has relied on its market dominance to buoy Bard’s chances.

The search giant has also been outspoken about the shortcomings of generative AI. In February, Alphabet chairman John Hennessy warned that Google had been “hesitant” to release Bard as it was still in development, and the blog post that announced Bard described it as an “experiment”.

In ITPro’s internal testing, the chatbot has operated similarly to earlier versions of Bing Chat, with fast response times for text generation but a tendency to produce false or fictional outputs, known as ‘hallucination’.
