Poisoned knowledge graphs can make the LLM hallucinate, rendering it useless to the thieves.
Researchers poison their own data to ruin the results when it is stolen by an AI