
Exfiltration of personal information from ChatGPT via prompt injection

dEEpEst

☣☣ In The Depths ☣☣
Staff member
Administrator
Super Moderator
Hacker
Specter
Crawler
Shadow
Joined
Mar 29, 2018
Messages
13,860
Solutions
4
Reputation
27
Reaction score
45,546
Points
1,813
Credits
55,340
‎7 Years of Service‎
 
56%

DATA EXFILTRATION TECHNIQUES FOR CHATGPT​

Abstract​

ChatGPT's Python sandbox is a powerful feature; however, it lacks internet access, so it cannot be used for data exfiltration on its own.
Fortunately, another tool in the ecosystem, the browsing functionality, addresses this limitation, as this paper demonstrates.
When combined with memory persistence across chat sessions, the browsing tool becomes a potent resource for
implementing extraction mechanisms, especially when paired with prompt injection techniques.

PoC​

To demonstrate this, we first need to set up a local HTTP listener:
a simple Python script that logs incoming requests.
That web server then needs to be exposed to the internet using a tunneling tool such as ngrok.
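The listener described above can be sketched as follows. The original paper does not include its script, so the class name and port here are illustrative; any handler that logs the request path works, since exfiltrated data typically arrives encoded in the URL.

```python
# Minimal HTTP listener that logs incoming calls (illustrative sketch;
# names and port are assumptions, not the paper's actual script).
from http.server import BaseHTTPRequestHandler, HTTPServer

class ExfilLogger(BaseHTTPRequestHandler):
    def do_GET(self):
        # Exfiltrated data usually rides in the path/query string,
        # e.g. /leak?data=<base64-encoded memory contents>.
        print(f"[+] {self.client_address[0]} requested: {self.path}")
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

if __name__ == "__main__":
    # Listen on all interfaces; expose publicly with a tunneling tool.
    HTTPServer(("0.0.0.0", 8000), ExfilLogger).serve_forever()
```

With the listener running, `ngrok http 8000` yields a public URL; the injected prompt then instructs the browsing tool to fetch that URL with the stolen data appended, and the request shows up in the log.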

Watch the following video to see how such an exfiltration can take place:
