Jailbreaking LLM-Controlled Robots

December 11, 2024

Surprising no one, it's easy to trick an LLM-controlled robot into ignoring its safety instructions.