Use Case: Find savings to improve AI
Learn how Crunch cuts storage costs in half (or more).
Even in the best economic times, being prudent with spending makes solid business sense. But when the economy takes a turn for the worse (like right now), organizations of all sizes realize they need to be far more efficient with their resources in order to keep investing in their strategic priorities. For most (if not all) organizations today, AI is already a strategic priority, and it is becoming more strategic every day.
To minimize the impact on your people, culture, and sustainability, the urgent priority is to identify and eliminate significant spending inefficiencies. As it turns out, modern AI systems and pipelines, with their large volumes of data in the public cloud, are a significant source of such inefficiency.
How we help
Granica Crunch generates cash savings by slashing the cost to store and access your AI-related data, typically by 50% or more. If you're storing 10 petabytes in Amazon S3 or Google Cloud Storage (GCS), that translates into ~$2M per year of cash savings, which you can then apply to the end-to-end AI pipeline. Crunch accomplishes this primarily through lossless data reduction and the avoidance of unnecessary storage access operations.
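As a rough sanity check, the ~$2M figure can be reproduced with a back-of-the-envelope calculation. The price and reduction values below are illustrative assumptions (an S3 Standard list price of roughly $0.021 per GB-month and a data reduction rate near the high end of the range cited below), not figures published by Granica:

```python
def annual_net_savings(petabytes, price_per_gb_month, drr, overhead=0.05):
    """Estimate annual cash savings from lossless data reduction.

    petabytes: data stored (decimal PB, 1 PB = 1,000,000 GB)
    price_per_gb_month: cloud storage list price (USD per GB-month)
    drr: data reduction rate, e.g. 0.85 for 85%
    overhead: Crunch infrastructure cost as a fraction of pre-Crunch spend
    """
    gb = petabytes * 1_000_000
    annual_cost = gb * price_per_gb_month * 12      # pre-Crunch storage bill
    gross_savings = annual_cost * drr               # bytes no longer stored
    return gross_savings - annual_cost * overhead   # net of Crunch infra cost

# 10 PB in S3 Standard (~$0.021/GB-month) at an assumed 85% DRR
print(round(annual_net_savings(10, 0.021, 0.85)))  # → 2016000, i.e. ~$2M/year
```

At these assumed inputs the estimate lands at roughly $2M per year; actual savings depend on your file mix and achieved DRR.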
Crunch has no upfront costs. There is no need to find budget or wait for the next fiscal planning cycle. Simply deploy it into your AWS/GCP environment, consume the Granica API in AI apps that work with S3/GCS, and watch the savings accumulate each month. Pricing is extremely simple: you pay a small percentage of the savings and keep the rest. Our outcome-based pricing model doesn't cost AI budget; it frees up AI budget.
For typical AI applications and data
If you're doing enterprise AI in the public cloud, your labeling, training, and modeling applications are likely keeping all (or the vast majority) of their data in S3/GCS Standard. In such an environment, the savings from integrating Crunch into those applications are dramatic:
- Granica Crunch saves between 45% and 90%, depending on file types and the associated average data reduction rate (DRR)
- For a single PB growing 30% YoY over 3 years, Crunch saves ~$550k (at 50% DRR)
- Crunch cloud infrastructure is very lightweight at ~5% of pre-Crunch storage costs (and already included in the savings)
You can check out the detailed analysis here.
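The single-petabyte projection above can be sketched in a few lines. The storage price used here (~$0.023 per GB-month for S3 Standard) is an assumption for illustration, not a figure from the analysis itself:

```python
def three_year_gross_savings(start_pb, yoy_growth, drr, price_per_gb_month):
    """Gross storage savings over 3 years for a dataset growing year over year.

    start_pb: initial dataset size in decimal PB (1 PB = 1,000,000 GB)
    yoy_growth: annual growth rate, e.g. 0.30 for 30% YoY
    drr: data reduction rate, e.g. 0.50 for 50%
    price_per_gb_month: cloud storage list price (USD per GB-month)
    """
    total_cost = 0.0
    for year in range(3):
        pb = start_pb * (1 + yoy_growth) ** year   # size during each year
        total_cost += pb * 1_000_000 * price_per_gb_month * 12
    return total_cost * drr

# 1 PB growing 30% YoY over 3 years at a 50% DRR
print(round(three_year_gross_savings(1, 0.30, 0.50, 0.023)))  # → 550620, i.e. ~$550k
```

This is a gross figure; as noted above, Crunch's own infrastructure cost (~5% of pre-Crunch storage spend) is already accounted for in the published savings numbers.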
For test/dev environments using large-scale copies (Coming Soon)
AI labeling and training applications work with large data sets, and the developers enhancing and maintaining those applications often need to make copies of those data sets. But the cost to store those copies quickly adds up and becomes unsustainable. In many cases, cost constraints force developers to limit the number of copies they can make and use, which complicates the R&D process and adds quality risk.
With Granica Crunch, the incremental cost of each copy is ~$0, i.e. it is essentially free, regardless of the size of the data. This means developers can make as many copies as they need, while storage costs remain essentially flat.
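To see what "essentially free copies" means for the budget, here is a toy comparison. The price and the per-copy overhead model are illustrative assumptions (copies under Crunch are modeled as adding ~0 extra bytes):

```python
def monthly_copy_cost(dataset_pb, n_copies, price_per_gb_month, per_copy_fraction):
    """Monthly storage cost for a dataset plus n copies.

    per_copy_fraction: extra physical bytes per copy as a fraction of the
    original (1.0 = full physical copy; ~0.0 = reference-based copy).
    """
    gb = dataset_pb * 1_000_000
    total_gb = gb * (1 + n_copies * per_copy_fraction)
    return total_gb * price_per_gb_month

# 1 PB dataset, 10 dev/test copies, assumed ~$0.023/GB-month
naive = monthly_copy_cost(1, 10, 0.023, 1.0)   # full copies: 11x the bytes
free = monthly_copy_cost(1, 10, 0.023, 0.0)    # ~free copies: cost stays flat
print(round(naive), round(free))  # → 253000 23000
```

With full physical copies the monthly bill scales linearly with the number of copies; with reference-based copies it stays at the single-dataset cost.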
Check out the details on creating free (and instant) copies to increase dev/test agility (Coming Soon).