Study Suggests 250 Poisoned Documents Could Backdoor AI Models Up to 13 Billion Parameters, Urging New Defenses
Study Suggests 250 Poisoned Documents Could Backdoor AI Models Up to 13 Billion Parameters, Urging New Defenses
CoinOtag 2025-10-13 17:27:49
AI model poisoning occurs when attackers insert malicious training data to implant backdoors in models. A recent study found just 250 poisoned documents can reliably backdoor models up to 13