PH Deck
Object store built for AI workloads
High-performance, AI-native object store with an S3 API included
Developer Tools
Featured on: Nov 20, 2025
What is Object store built for AI workloads?
Anvil is an open-source, AI-native object store designed for modern workloads. We built it after hitting the limits of Git LFS, Hugging Face repos, S3, and others when working with multi-GB model files. It is S3-compatible and gRPC-native, and supports:
* Model-aware indexing, so it understands safetensors, gguf, and ONNX
* Tensor-level streaming
* Erasure-coded storage
* Open source (Apache-2.0)
If you’re storing large models, versioning fine-tunes, or running local inference, we want your feedback.
Problem
Users managing large AI model files face inefficiencies with current solutions like Git LFS, S3, or Hugging Face repos, which lack model-aware indexing, struggle with multi-GB file handling, and offer limited versioning capabilities for fine-tuned models.
Solution
An open-source, AI-native object store with an S3-compatible API that lets users store, version, and stream large AI models efficiently via model-aware indexing (supporting safetensors, gguf, and ONNX) and tensor-level streaming. Example: streaming individual model layers during inference without downloading the full file.
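Anvil's own API is not documented on this page, but the tensor-level streaming it describes is feasible because formats like safetensors begin with a JSON index mapping each tensor to a byte range. A minimal sketch (all function names hypothetical, not Anvil's API) showing how a client could parse just that index and derive the byte range to request, rather than downloading the whole file:

```python
import json
import struct

def build_safetensors(tensors):
    """Build a minimal safetensors-style blob: an 8-byte little-endian
    header length, a JSON header with per-tensor byte offsets, then the
    raw tensor data."""
    header, offset, data = {}, 0, b""
    for name, raw in tensors.items():
        header[name] = {"dtype": "U8", "shape": [len(raw)],
                        "data_offsets": [offset, offset + len(raw)]}
        offset += len(raw)
        data += raw
    hj = json.dumps(header).encode()
    return struct.pack("<Q", len(hj)) + hj + data

def tensor_byte_range(blob, name):
    """Parse only the length prefix and JSON header, then return the
    absolute byte range of one tensor -- the range a model-aware store
    could serve via an HTTP Range GET instead of the whole object."""
    (hlen,) = struct.unpack("<Q", blob[:8])
    header = json.loads(blob[8:8 + hlen].decode())
    start, end = header[name]["data_offsets"]
    return 8 + hlen + start, 8 + hlen + end

# Demo: index a two-tensor file and locate just the "bias" bytes.
blob = build_safetensors({"weight": b"\x00" * 16, "bias": b"\x01" * 8})
s, e = tensor_byte_range(blob, "bias")
assert blob[s:e] == b"\x01" * 8
```

A store that keeps this index server-side can answer "give me layer N" with a single ranged read, which is the essence of the tensor-level streaming claim.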
Customers
AI/ML engineers, data scientists, and developers working with multi-GB models, fine-tuning workflows, or local inference pipelines.
Unique Features
1. Model-aware indexing for AI file formats
2. Erasure-coded storage for reliability
3. Apache-2.0 open-source license
4. Native gRPC and S3 API compatibility
5. Tensor-level data streaming
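The page names erasure-coded storage but gives no scheme details. As a toy illustration of the idea, a single XOR parity shard lets any one lost shard be rebuilt from the survivors (production systems typically use Reed-Solomon codes to tolerate multiple failures; this sketch is not Anvil's implementation):

```python
def encode_with_parity(data: bytes, k: int):
    """Split data into k equal shards plus one XOR parity shard
    (a toy single-parity erasure code)."""
    padded = data + b"\x00" * ((-len(data)) % k)
    size = len(padded) // k
    shards = [bytearray(padded[i * size:(i + 1) * size]) for i in range(k)]
    parity = bytearray(size)
    for shard in shards:
        for i, b in enumerate(shard):
            parity[i] ^= b
    return shards, parity

def recover_shard(shards, parity, lost_index):
    """Rebuild one missing shard by XOR-ing the parity shard with
    every surviving shard."""
    rebuilt = bytearray(parity)
    for idx, shard in enumerate(shards):
        if idx == lost_index:
            continue
        for i, b in enumerate(shard):
            rebuilt[i] ^= b
    return bytes(rebuilt)

# Demo: lose shard 2, rebuild it from the other shards plus parity.
shards, parity = encode_with_parity(b"multi-GB model bytes", 4)
assert recover_shard(shards, parity, 2) == bytes(shards[2])
```

The storage overhead here is one extra shard per k data shards; codes that tolerate m failures store m parity shards instead, trading capacity for durability.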
Traction
Launched on Product Hunt (exact metrics unspecified). Open-source repository available with Apache 2.0 license. Actively seeking user feedback for scaling.
Market Size
The global AI infrastructure market, including storage solutions, is projected to reach $309.4 billion by 2032 (Allied Market Research).