The conversation histories of ChatGPT users were leaked due to a bug.

by THE GULF TALK

According to the CEO of OpenAI, Sam Altman, a glitch in ChatGPT allowed certain users to view the titles of other users’ conversations. Some users shared images of chat histories on social media platforms Reddit and Twitter, claiming they were not their own. Altman expressed regret over the incident and stated that the company had fixed the “significant” error.

However, many users remain concerned about the platform’s privacy. Since its launch in November of last year, millions of people have used ChatGPT for various purposes such as drafting messages, writing songs, and coding. Each conversation is saved in the user’s chat history bar for later reference.

Users reported on Monday that conversations were appearing in their ChatGPT chat history that they claimed they had not had with the chatbot. One Reddit user even shared a photo of their chat history showing titles such as “Chinese Socialism Development” and conversations in Mandarin. On Tuesday, OpenAI briefly disabled the chatbot to fix the problem, according to a statement made to Bloomberg.

The company clarified that users could see only conversation titles, not the contents of the chats themselves. Even so, the error has raised concerns among users who fear their private information could be exposed through the tool. Some have speculated that the glitch suggests OpenAI has access to user chats, although the company’s privacy policy states that data is used for training the model only after personally identifiable information has been removed. OpenAI’s CEO tweeted that a “technical postmortem” would be conducted soon.

The incident occurred just one day after Google unveiled its chatbot Bard to beta testers and journalists, as Google and Microsoft compete for dominance in the rapidly growing market for AI tools. The rapid pace of new product releases and updates, however, has some observers worried that mistakes like this one could cause harm or have unintended consequences.
