A Meta AI leak has revealed a serious privacy flaw in Meta’s chatbot system. For a short time, users could view other people’s prompts and AI-generated responses because of a vulnerability in Meta’s backend.

Meta has since patched the issue. But cybersecurity experts say this flaw serves as a warning to any company offering AI chatbot services.

How the Vulnerability Worked

Sandeep Hodkasia, founder of security testing firm AppSecure, discovered the issue in December 2024. While experimenting with Meta AI’s prompt regeneration feature, he analyzed network traffic and noticed something odd.

When a user modified a prompt, Meta’s backend assigned it a unique number. But that number was easy to guess. By changing it in his requests, Hodkasia could retrieve other users’ data—both the prompt and the AI’s response—a classic insecure direct object reference (IDOR).

Meta’s system didn’t verify whether the requester had permission to see the data.
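The fix for this class of bug is an object-level authorization check on every lookup. Here is a minimal sketch in Python; the data store, function names, and record fields are all illustrative, since Meta’s actual backend is not public.

```python
# Illustrative in-memory store: prompts keyed by the kind of sequential,
# guessable IDs described in the article. (Hypothetical data.)
PROMPTS = {
    101: {"owner": "alice", "prompt": "draft my resume", "response": "..."},
    102: {"owner": "bob", "prompt": "summarize my notes", "response": "..."},
}

def fetch_prompt_vulnerable(prompt_id: int) -> dict:
    """The IDOR pattern: any valid ID returns the record, with no check
    that the requester owns it."""
    return PROMPTS[prompt_id]

def fetch_prompt_fixed(requester: str, prompt_id: int) -> dict:
    """Object-level authorization: return the record only if the
    authenticated requester is its owner."""
    record = PROMPTS.get(prompt_id)
    if record is None or record["owner"] != requester:
        # Use the same error for "missing" and "forbidden" so the
        # response doesn't confirm which IDs exist.
        raise PermissionError("not found or not authorized")
    return record
```

With the vulnerable version, any user who increments the ID reads someone else’s conversation; with the fixed version, the same request fails, and the error deliberately does not reveal whether the ID exists.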

Meta Responds

Hodkasia responsibly disclosed the flaw to Meta on December 26. The company fixed it by January 24 and rewarded him with a $10,000 bug bounty.

A Meta spokesperson confirmed the fix and said no evidence suggests the bug was exploited. Still, the exposure lasted long enough to raise questions about the company’s internal testing and user safeguards.

Second Privacy Issue in Weeks

This isn’t the first recent privacy misstep for Meta. Just weeks earlier, users discovered that Meta AI could accidentally publish private chatbot interactions to public feeds. Meta responded by adding a warning to notify users when they’re about to post something publicly.

These back-to-back incidents suggest the company may need stronger privacy protocols for its AI services.

Conclusion

The Meta AI leak shows just how fragile data privacy can be in the age of AI. One simple backend error exposed private conversations that users believed were secure. As AI tools become more integrated into our lives, companies must ensure users’ data stays private—no matter how fast innovation moves.
