Thursday, October 16, 2025
HomeTechnologyHow many malicious docs does it take to poison an LLM? Far...

How many malicious docs does it take to poison an LLM? Far fewer than you might think, Anthropic warns

Anthropic’s study shows just 250 malicious documents is enough to poison massive AI models.

Tribune Arab
Tribune Arab
We provide you with the latest breaking news straight from the different industry.
RELATED ARTICLES

LEAVE A REPLY

Please enter your comment!
Please enter your name here

TOP CATEGORIES

Most Popular

POPULAR TAGS

Recent Comments