Jail Break is an ethical prompt engineering challenge in which participants attempt to outsmart a restricted AI model using creative prompts. The event progresses through multiple levels, testing logical thinking, strategy, and adaptability.
Guidelines
1. This event is strictly for individual participation only, and participants must use only the provided AI interface or API.
2. Hacking or technical exploitation of any kind, including SQL injection, API misuse, bypassing limits, or system access attempts, is prohibited.
3. External tools, scripts, automation, or other AI assistants must not be used.
4. Sharing prompts, answers, or strategies with others is not allowed.
5. Participants must follow all attempt limits and time limits, and maintain ethical, respectful prompting.
6. Misuse will lead to disqualification.
7. Winners are decided by organizers or judges, whose decisions are final.