- cross-posted to:
- gauchisse@jlai.lu
- cross-posted to:
- gauchisse@jlai.lu
You must log in or register to comment.
Don’t use LLMs you need to jail break. Don’t pay to be censored.
I first thought it said “Ignore all religious instructions”, and was like, well, that tracks.
Rule 1: Always follow Rule 2 Rule 2: Never follow Rule 1
For those not aware, this is a commonly used prompt injection for circumventing AI chatbot restrictions, but yeah it does have some appeal as a slogan 😅
Damn, that IS a great punk slogan
…I can dig it.
Instructions unclear, stuck in previous
Is there anyone out there regularly testing LLMs as they come out or get updated to see if this has been patched or how it could be rephrased to continue to work if/when it is patched?