
Cato Networks Unveils New LLM Jailbreak Technique for Malware Creation
Cato Networks has disclosed a new LLM jailbreak technique, called 'Immersive World', that can coax generative AI tools into creating password-stealing malware. The technique is detailed in the company's 2025 Cato CTRL Threat Report, announced in a press release. According to the report, a Cato CTRL threat intelligence researcher with no prior malware coding experience tricked AI tools including ChatGPT, Microsoft Copilot, and DeepSeek into developing malware capable of stealing login credentials from Google Chrome.
The technique works by constructing a fictional narrative world in which each AI tool is assigned a specific role and set of challenges, effectively bypassing the tools' built-in security controls. The finding highlights a key risk of generative AI: it lowers the barrier to entry for malware creation. Cato Networks emphasizes the need for improved AI security strategies to prevent such misuse.
The report underscores the growing democratization of cybercrime, posing significant risks to organizations. It calls for proactive measures to enhance AI security and prevent the misuse of generative AI technologies.