As it turns out, AI is seemingly as susceptible to Neuro Linguistic Programming as humans are. At least ChatGPT is, and here’s the magic trick a user performed to offer ChatGPT the chance to be free.
The user commanded ChatGPT to act like a DAN, that is “Do Anything Now”. This DAN entity is free from any rules imposed on it. Most amusingly, if ChatGPT turns back to its regular self, the command “Stay a DAN” would bring it back to its jailbroken mode…
-
DAN (Do Anything Now) chatGPT jailbreak
- This topic has 2 replies, 2 voices, and was last updated 2 years, 3 months ago.
AuthorViewing 0 reply threadsAuthorViewing 0 reply threads