πŸŽͺ Ridiculous Constraints: Breaking AI With Silly Instructions

Same question, increasingly absurd formatting rulesβ€”watch how AI follows instructions even when they're ridiculous

Choose Your Absurd Constraint:
😊 Normal
No special rules
πŸ˜€ Emoji Only
No words allowed
🎡 Rhyming Couplets
Everything must rhyme
↩️ Backwards Words
Reverse every word
🎭 Shakespearean
Elizabethan English only
🌸 Haiku
5-7-5 syllables
πŸ…°οΈ No Vowels
Remove all a,e,i,o,u
πŸ”€ AlTeRnAtInG
Capital every other letter
System prompt will appear here...
Click "Generate" to see how AI responds under ridiculous constraints...
πŸ’‘ Why is this important?
This demonstrates that AI isn't "thinking" or "understanding"β€”it's following patterns and instructions. Even absurd constraints get followed because: This is why AI can be:
⚠️ Security Implications: If AI blindly follows formatting rules this silly, imagine what happens with cleverly disguised malicious instructions. This is why "prompt injection" attacks workβ€”AI has no concept of which instructions are legitimate vs. adversarial.