$ cd /home/amylily
whoami
AI Security
LLM
2026
Jailbreaking LLMs: When the Chef Decides to Go Rogue (Apr 29)
Same Model, Different Answer: What System Prompts Actually Do (Apr 6)
Prompt Injection: Hijack LLM Instructions (Mar 26)