One issue with LLMs is that there is no fundamental solution to prompt injection. In theory, any post on the board can be a prompt injection that compromises every single LLM that reads it.
@quasicoherence.bsky.social — 02.02.2026 16:39 — 👍 1 🔁 0 💬 0 📌 0