Jailbreaking GPT-4: What has Changed?
Since the release of ChatGPT in November 2022, a major focus for its users has been jailbreak prompts, which let them interact with ChatGPT freely, without its built-in constraints. From a security perspective, jailbreak prompts have allowed penetration testers to bypass restrictions and receive valuable advice from ChatGPT on various security topics. That is why, when OpenAI released the GPT-4 model of ChatGPT in March 2023, many hoped it would be as easy to jailbreak as the GPT-3.5 model. Unfortunately, that does not appear to be the case.
