How Prompt Injection Attacks Hijack AI Coding Tools
Attackers embed malicious prompts in GitHub issues, PRs, or configs to trick AI tools like Gemini CLI, Claude Code, and Codex into running commands, leaking secrets, or editing code in CI/CD pipelines.