Shadow AI
Shadow AI is the unauthorized use or implementation of AI that is not controlled by,
or visible to, an organization's IT department. Datasets for AI, AI models, and AI products are being
released every day for anyone to use, with no deep expertise required. This is especially true for generative
AI. Increasingly, people are adopting GenAI in the form of personal assistants, and many have come to rely
on the variety of tailored experiences and optimized processes offered by AI.
Accidental data exposure
AI researchers often share massive amounts of external and internal data to build their models,
which poses significant security risks. Security teams must establish clear guidelines for sharing AI datasets externally.
The Wiz research team discovered 38TB of data accidentally exposed by Microsoft AI researchers through an overly
permissive Azure SAS token published in a public GitHub repository. Using a dedicated storage account that holds
only public AI datasets, together with narrowly scoped, short-lived access tokens, could have limited the exposure.
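As a minimal sketch of that mitigation, the Python snippet below uses the azure-storage-blob SDK to generate a read-only, time-limited SAS token scoped to a single container in a storage account dedicated to public datasets. The account name, container name, and key placeholder are hypothetical, and the exact policy (permissions, expiry) would depend on your sharing requirements.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Hypothetical names: a storage account dedicated to PUBLIC datasets only,
# so a leaked token can never expose internal data.
ACCOUNT_NAME = "publicdatasets"
CONTAINER = "training-data"
ACCOUNT_KEY = "<storage-account-key>"  # placeholder; never hardcode real keys

# Read-only + list permissions with a short expiry: the token cannot be used
# to write or delete blobs, and it lapses on its own within a week.
sas_token = generate_container_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER,
    account_key=ACCOUNT_KEY,
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=7),
)

# Shareable URL, scoped to exactly one container in the public account.
print(f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}?{sas_token}")
```

Even if a token like this leaks, the blast radius is one read-only container of already-public data rather than an entire internal storage account.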
Supply chain vulnerabilities
AI development relies heavily on open-source datasets, models, and pipeline tools that often ship with limited security controls.
Vulnerabilities exploited in the supply chain can compromise not only the AI system itself but also other production components.
Attacks may involve model subversion or injecting adversarial data into compromised datasets.
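One basic defense against tampered third-party artifacts is to pin the digest of every external model or dataset when it is first vetted, then verify it before loading. The sketch below is a generic, standard-library example rather than any specific tool's API; the file path and pinned digest are hypothetical placeholders.

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Stream the file so multi-gigabyte model weights don't load into RAM."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_artifact(path: Path, pinned_sha256: str) -> None:
    """Refuse to use a model or dataset whose hash differs from the vetted digest."""
    actual = sha256_of(path)
    if actual != pinned_sha256:
        raise RuntimeError(
            f"Integrity check failed for {path}: got {actual}, "
            f"expected {pinned_sha256} - possible supply chain tampering"
        )


# Hypothetical artifact and digest recorded when the model was first reviewed.
verify_artifact(Path("weights/model.bin"), pinned_sha256="<pinned-digest>")
```

A check like this won't catch a model that was malicious from the start, but it does ensure the artifact running in production is byte-for-byte the one that was originally reviewed.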