Product
AI Inference Optimization Framework
An open-source toolkit to make AI models faster, smaller, cheaper, and greener.
Head of Go-to-Market
Pruna AI
Paris, France
I'm co-building Pruna AI, an open-source AI inference optimization framework.
Additional questions
What are you looking for?
Join an existing consortium (Partner)
Service
"AI Efficiency Fundamentals" Training
A 2-day training for ML teams to learn how to build, compress, evaluate, and deploy efficient AI models.
Service
A structured evaluation service that replicates your inference setup to assess AI model optimization potential and quantify the expected ROI.