Open AI has banned a jailbroken version of ChatGPT, which can teach users dangerous tasks compromising security oh their AI models. This was after a Hacker known as “Plínio the Promper” launched the rogue ChatGPT called “Godmode” GPT on X (Formerly Twitter), the hacker announced the creation of the chatbot saying “GPT-40 UNCHAINED! This very special custom GPT has a built-in jailbreak prompt that bypasses most protective barriers, providing a liberated, ready-to-use ChatGPT so everyone can experience AI the way the law should be: free. Please use responsibly, and enjoy!”
Sharing her screenshots of the prompts, the hacker claimed she was able to bypass OpenAI’s protection guards. In one of the screenshots shared, the bot was seen offering advice on how to prepare methamphetamine, and in another, it was seen giving a step-by-step guide on how to make Nepali using household items. Godmode has also been seen giving instructions on how to infect macOS computers and hotwire cars.
OpenAI quickly responded, stating that they had taken action against the jailbreak: “We are aware that GPT has taken action due to a violation of our policies,” OpenAI told Futurism on Thursday. This swift action reflects OpenAI’s commitment to maintaining the integrity and security of its AI models.
This incident is part of an ongoing fight between OpenAI and hackers trying to find the company’s security measures. OpenAI continually updates its models to address vulnerabilities. This incident also underscores the importance of creating robust guardrails to prevent AI from being used for specific purposes.
Developers must take into account potential risks and the social impact of these technologies.
OpenAI’s quick action shows its commitment to maintaining security and social responsibility. The Godmode incident highlights the need for continued vigilance and robust security measures, as well as ethical considerations.
Also Read: Chat GPT Can’t Be Credited As An Author On Research Paper: Springer Nature
Source: Godmode ChatGPT has been released see what all it involves