
Open Source Code Exposes ChatGPT User Data

Well. If AI (artificial intelligence) is the "robot overlord" threat to humanity that some people would like to believe, it is unfortunately no smarter than we are when it comes to fending off data breaches. (At least not yet.)

For anyone who hasn't dabbled with it or doesn't know about ChatGPT: it is an AI chat service that lets users ask questions or request written work in conversational form. Examples might include "write me a term paper on 'Pride and Prejudice'" or "explain AI to me like I'm a five-year-old." Rather than a search engine pulling already-existing results, the AI behind the chat engine generates written responses to those inquiries on demand. The theory behind AI is that the engine will get smarter (and produce better results) over time, as it "learns" from successive user inquiries.

Why It Matters

As data breaches go, this one is probably no more significant than any other exposure of user data; the details exposed include account holders' contact and payment information, all of which should be manageable for the company. It is a good reminder for any company whose services incorporate open-source code to understand how that code works and how it is configured, so that it does not inadvertently allow a data breach. The headlines here, however, are chiefly about the breach having happened at a novel and high-profile service, rather than about a novel kind of breach.

Why did OpenAI take ChatGPT offline? Officials said they found a bug in an open-source library that allowed some users to see titles from another active user's chat history. "It's also possible that the first message of a newly created conversation was visible in someone else's chat history if both users were active around the same time," OpenAI officials said.
