Snowflake Optimization is a Job for Robots, Not Humans. Here’s Why.
Snowflake optimization isn’t like fixing a leaky faucet – you don’t fix it once and call it done for good. It’s more like landscaping: mow the lawn, trim the hedges, rinse and repeat. And don’t be surprised by the random weeds that will pop up the next day. When considering how to optimize Snowflake, it’s critical to keep this distinction in mind.
Why? Because the cloud moves fast. Your user count, query count & complexity, and cloud resource needs fluctuate. If you get a Snowflake optimization recommendation, it only applies to a single moment in time. Within hours, even minutes, it’ll be outdated.
For a human Snowflake optimizer, keeping up with those constant changes and adjustments quickly becomes a full-time job. And even then, they'll miss other cost reduction opportunities that, while small, compound over time.
That’s why Snowflake optimization isn’t a job for humans. It’s a job for AI-powered “robots.” Read on to learn how this works, and why teams that are serious about reducing cloud costs are taking the plunge.
Why Snowflake optimization should be a top priority
Even though the cloud frees you from managing physical infrastructure, organizations of all sizes struggle to control costs. This is as true for Snowflake as for any other cloud software.
While Snowflake offers a pay-as-you-go model, the most common approach is to purchase Snowflake credits at Capacity. This involves pre-purchasing a set number of discounted credits (which translate to resource usage), then paying a significantly higher on-demand rate for any credits consumed beyond that number.
However, organizations that adopt this approach run the risk of either over-provisioning (which 59% of organizations do) or under-provisioning. And because expectations rarely align 100% with reality, identifying that Goldilocks amount in advance is next to impossible.
If you’re over-provisioned, you need to make sure you’re using all the resources you’ve already paid for. If you’re under-provisioned, you need to keep your usage as small as possible to avoid incurring extra costs. In either case, a robust Snowflake optimization strategy is critical.
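To see why that provisioning gamble matters, here’s a quick back-of-the-envelope sketch in Python. The credit rates and usage figures are purely illustrative assumptions (not Snowflake’s actual pricing); the point is how quickly a forecasting miss turns into real money.

```python
# Illustrative only: rough cost of a capacity commitment vs. actual usage.
# The rates below are assumptions for this example, not Snowflake's real pricing.
CAPACITY_RATE = 2.50    # assumed $ per pre-purchased (capacity) credit
ON_DEMAND_RATE = 3.10   # assumed $ per credit consumed beyond the commitment

def annual_cost(purchased_credits: int, used_credits: int) -> float:
    """You pay for every pre-purchased credit, plus the higher
    on-demand rate for any usage beyond the commitment."""
    overage = max(used_credits - purchased_credits, 0)
    return purchased_credits * CAPACITY_RATE + overage * ON_DEMAND_RATE

# A team commits to 100,000 credits for the year:
print(annual_cost(100_000, 80_000))    # over-provisioned:  $250,000 (20% of it unused)
print(annual_cost(100_000, 100_000))   # perfect forecast:  $250,000
print(annual_cost(100_000, 120_000))   # under-provisioned: $312,000 (overage at the higher rate)
```

Either way you miss, you're left paying for credits you don't use or paying a premium for the ones you do, which is exactly why the optimization work never really ends.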
The problems with human-driven Snowflake optimization
Given the continuous nature of Snowflake optimization, the question I often ask organizations is: are constant manual adjustments the best use of your data engineers’ time?
Because optimization isn’t something you can just hand off to an intern. The data is too complex and the tools require too much in-depth knowledge for an entry-level employee to handle.
But you’re also not paying a data engineer to spend their time tweaking and adjusting Snowflake. You want them building new data pipelines, managing your database, maintaining overall system security, etc. Optimization, while important, can’t become their full-time focus. They’re too valuable for that.
And believe me, Snowflake optimization often does become a full-time job. You can optimize for what you know today, but tomorrow is different. Any number of changes, most of them outside your control, can shift your resource needs:
- User count fluctuates (ideally, grows)
- Queries become more (or less) frequent and complex
- Your rapidly scaling platform requires additional cloud resources
- You outgrow your current warehouses and need to provision more compute
Unless you’ve got a crystal ball in your back pocket, your optimizations will quickly become irrelevant. You can get your arms around this once or twice, but as your organization scales, the complexity and frequency of change become too much for one person to handle. And, frankly, that’s assuming your data engineers make it a priority in the first place.
What’s more, these changes happen fast, often minute by minute. If your engineers don’t act in time, they’ll miss the boat. Sure, any single adjustment may only save you $0.50. But multiply that across multiple warehouses over hours, days, and weeks, and the savings add up to tens of thousands of dollars per month.
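If that sounds like an exaggeration, run the numbers yourself. The figures below are deliberately modest assumptions, not measurements from any particular account:

```python
# Illustrative only: how small, frequent savings add up across a fleet of warehouses.
saving_per_adjustment = 0.50    # assumed $ saved by one timely resize or suspend
adjustments_per_hour = 2        # assumed opportunities per warehouse per hour
warehouses = 25                 # assumed number of active warehouses
hours_per_month = 24 * 30

monthly_savings = saving_per_adjustment * adjustments_per_hour * warehouses * hours_per_month
print(f"${monthly_savings:,.0f} per month")   # -> $18,000 per month
```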
How to optimize Snowflake: the power of AI
So now you can probably see where most cloud optimization tools fall short: they rely on humans to receive and implement their recommendations.
And let’s give credit where it’s due. These companies do a pretty decent job giving advice (some more than others). But it’s like having a talkative backseat driver. Sure, they can be helpful. But you’re still the one driving the car.
So in reality, they aren’t so much optimization tools as recommendation tools. The question is: once you’re up to your ears in advice, how are you going to act on it all?
If a human engineer isn’t the best choice when it comes to optimizing Snowflake, what’s the best alternative? The answer: a “robot.”
Specifically, I’m talking about advanced AI algorithms that monitor dozens of parameters and adjust them dynamically to avoid over- or under-provisioning. Most importantly, these algorithms work around the clock, saving you pennies here and dollars there. Compound that over hours, days, weeks, and months, and it can have an outsized (positive) effect on your budget.
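To make that concrete, here’s a deliberately naive sketch of the kind of feedback loop involved. This is not Keebo’s algorithm; it’s a toy policy built with the standard snowflake-connector-python package, and the warehouse name, thresholds, and polling interval are all assumptions for illustration.

```python
# Toy sketch of continuous warehouse right-sizing. NOT Keebo's algorithm --
# just an illustration of the always-on adjustment loop no human can
# realistically run by hand. Warehouse name, thresholds, and polling
# interval are arbitrary assumptions.
import time
import snowflake.connector  # pip install snowflake-connector-python

SIZES = ["XSMALL", "SMALL", "MEDIUM", "LARGE", "XLARGE"]
WAREHOUSE = "ANALYTICS_WH"  # hypothetical warehouse name

def adjust_forever(conn, size: str = "MEDIUM") -> None:
    cur = conn.cursor()
    while True:
        # SHOW WAREHOUSES reports, among other columns, the number of
        # running and queued queries for each warehouse.
        cur.execute(f"SHOW WAREHOUSES LIKE '{WAREHOUSE}'")
        row = cur.fetchone()
        cols = [col[0] for col in cur.description]
        running = int(row[cols.index("running")])
        queued = int(row[cols.index("queued")])

        # Naive policy: scale up when queries queue, scale down when idle.
        if queued > 0 and size != SIZES[-1]:
            size = SIZES[SIZES.index(size) + 1]
            cur.execute(f"ALTER WAREHOUSE {WAREHOUSE} SET WAREHOUSE_SIZE = '{size}'")
        elif queued == 0 and running == 0 and size != SIZES[0]:
            size = SIZES[SIZES.index(size) - 1]
            cur.execute(f"ALTER WAREHOUSE {WAREHOUSE} SET WAREHOUSE_SIZE = '{size}'")

        time.sleep(60)  # re-evaluate every minute, around the clock

conn = snowflake.connector.connect(user="...", password="...", account="...")
adjust_forever(conn)
```

A real optimizer weighs far more signals than this (query history, spilling, latency targets, concurrency patterns) and learns from them over time, but even this toy loop makes the point: the adjustments are small, constant, and relentless.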
How to optimize Snowflake with Keebo
This isn’t theoretical. It’s real. I know it’s real, because companies of all sizes and across industries are putting it into practice with Keebo. They’re reducing stress, limiting their resource use, and maximizing their cloud potential at the same time. Here are two examples.
Barstool Sports
Barstool Sports used Snowflake to improve users’ self-service experience. However, the growing number of dashboards and filters drove up their usage. By optimizing Snowflake warehouses in real time, Keebo helped Barstool Sports cut cloud costs by 50 to 70%, enabling them to explore new solutions and engage their audience without worrying about incurring unnecessary costs.
Read the full Barstool Sports case study here.
Hyperscience
Hyperscience saw rapidly increasing dashboard usage but didn’t have the resources to tune or optimize Snowflake workloads, resulting in a 400% increase in Snowflake costs. After implementing Keebo, Hyperscience cut their Snowflake costs by 50% without sacrificing dashboard response times.
Read the full Hyperscience case study here.
The cloud is meant to be elastic. Snowflake is meant to be elastic. Unfortunately, when most organizations consider how to optimize Snowflake, they choose the least elastic option possible: manual, human-driven tuning.
Don’t make that mistake. Instead, leverage AI and machine learning to make the cloud work as intended: to make resourcing and scalability easier, not harder.
Ready to get started? Reach out to our team to start cutting your Snowflake costs now.