Snowflake Summit 2025 Keynote Recap: Agents for Everyone
We’re back in San Francisco for the second year in a row and we’re covering the Snowflake Summit keynotes.
Typically we stick to the Platform Keynote for our recaps (check out 2024 and 2023), but this year, Snowflake kicked things off with an opening keynote featuring a pretty heavy hitter in the world of data + AI: Sam Altman, CEO of OpenAI. We cover some takeaways from his conversation with Snowflake’s CEO, Sridhar Ramaswamy, below.
The official theme of the Summit this year? Building the future of AI and apps. The unofficial theme everyone’s talking about? Agents, agents, and more agents.
Read on for the major announcements from the Snowflake Summit 2025 keynotes this year.
Table of Contents
Opening Keynote, featuring OpenAI and NYSE
Day One of Summit was pretty buzzy, and a lot of that energy seemed to stem from excitement over the approaching keynote. When the evening finally rolled around, crowds were ushered in and greeted with a 7-piece(!) live band performance – and then it was time to get down to business.
Snowflake’s CEO Sridhar Ramaswamy takes the stage and shares that this year’s Snowflake Summit is actually the biggest one yet. He teases that attendees are going to be “blown away” by the new platform capabilities they’re set to announce the following day (and he wasn’t wrong – scroll down!).
It’s pretty exciting, and there’s an air of anticipation in the room.
He sets the stage (pun intended) with an important disclaimer: despite all of the AI innovations we’re going to see and hear about at Summit, the foundation still lies in the data and its quality. The most important component of getting your AI in order is getting your data in order. Sridhar sums it up nicely: “There’s no AI strategy without a data strategy.”
His call to action centers around simplicity. Working with complex AI should feel simple, deriving value from your data should feel simple, and so on. Kind of funny, given the anything-but-simple production behind these keynotes. But, I digress.
Lynn Martin, President of the NYSE
Next, he ushers Lynn Martin, President of the NYSE onto the stage, and they have an insightful conversation about one of the most tangible pain points for most data professionals: data quality.
NYSE has trillions of incoming order messages a day – an extraordinary constant influx of data – so ensuring the integrity, quality, and accuracy of those messages is paramount for effectively evaluating risk. Her team also needs to keep a constant eye on the markets to surveil for nefarious behavior – another data quality imperative.
“It’s all about the sanctity of data. What really powers AI is a good source of truth. Without that, you’re going to have unfortunate outcomes,” says Lynn.
Her advice for effective AI development? “Stick to your core principles and your use cases will show up,” she says. “Be deliberate about how you can drive efficiencies.”
Simply put: your AI use cases should actually drive value for the end user. We couldn’t agree more.
Sam Altman, CEO of OpenAI
Next up, we have the most anticipated panel of the night. Sridhar welcomes Sam Altman onto the stage, along with Sarah Guo, Founder of venture capital firm Conviction, who moderates their discussion.
Sarah starts out with the tactical, asking for their advice when it comes to AI development. Sam keeps it short and sweet: “I think just do it.” Easy enough, right?
He elaborates on the importance of speed in an industry that’s changing quickly, and emphasizes that the companies that make early bets and iterate quickly are the ones that win. He shares that there’s been an inflection point in OpenAI’s model usability and reliability, and that’s enabling more enterprise businesses to enter the race. Buckle up!
They discuss AGI, both technically and philosophically. It’s clear that Sam is pretty impressed by Codex, OpenAI’s recently announced software engineering agent. He says that if we’d seen Codex five years ago, we’d all think humanity has achieved AGI. He and Sri agree that we all continually adjust our expectations when it comes to AGI based on the consistent incremental development of AI. So, maybe we’ll never get there?
Sarah asks a question that all engineers would love to consider: “What would you do with 1000x more compute?”
Sri says he’d ask the model to work on some of our most difficult problems, like curing major diseases. Sam says: “I’d work super hard to build a better model, then I’d ask the model what we should do with 1000x more compute.” So, he’d dogfood it—spoken like a true engineer.
Sam leaves us with this nugget of foreshadowing: “The models over the next two years are going to be quite breathtaking.” Even if AGI is still a ways off, it sounds like the models solving our hardest problems isn’t so far away…
After a much-needed night of rest, we reconvene for the awaited Product Keynote.
Product Keynote
Benoit Dageville: Full-time co-founder, part-time conductor
This keynote opens with another live band, but in a surprise turn of events, it’s revealed that the conductor is actually Benoit Dageville, Co-founder & Head of Product at Snowflake! He finishes up his musical orchestration to head onto the stage to orchestrate the beginning of the big event.
He opens with the theme we’re all expecting: AI is changing everything – and fast. It’s streamlining innovation in new ways, and noticeably, the first innovation he calls out is unstructured data – turns out, it’s foreshadowing for some of the announcements to come.
It’s clear he’s excited about the future. He lays the groundwork for Snowflake’s data + AI development, boiling it down into three primary foundations:
- Easy: AI should be easy, and a unified platform like Snowflake provides that simplicity by enabling users to innovate faster.
- Connected: The data and business orgs shouldn’t be siloed. You have to be able to share data and run AI applications across the entire organization.
- Trusted: You have to be able to run LLMs and AI applications with built-in governance workflows and processes.
Build with these foundations in mind, says Benoit, and the possibilities for innovation with Snowflake’s data + AI cloud are limitless. With that, he ushers onto the stage a familiar face: Christian Kleinerman, EVP Product at Snowflake.
Christian Kleinerman: A product announcement marathon
Christian brings a warm and friendly energy onto the stage, and it feels like the audience collectively leans in to get closer so as to not miss a beat. He takes the opportunity to dive into the announcements right away. A 2-hour keynote means he has more than a few announcements to unveil, so I don’t blame him.
He structures each batch of product announcements into buckets of “I want” statements. It’s like the new and improved, conference-keynote-style boolean! Let’s go through each one.
1. I want a data architecture that’s future-proof.
In other words, you probably want to make sure your data architecture is interoperable and doesn’t become inflexible as you scale. Snowflake addresses this with a single unified system that provides flexibility whatever the topology of your architecture – data mesh, data lakehouse, data warehouse, or a hybrid stack.
Apache Iceberg innovations
This flexibility starts with Snowflake now bringing their performance, secure data sharing, and data protection to Apache Iceberg tables. This means Snowflake customers can accelerate their open lakehouse strategies, unlocking data access and analysis across open and managed environments to build, scale, and share advanced insights and AI-powered apps faster.
This also means Snowflake now contributes to open source projects like Apache Iceberg, Apache NiFi, Modin, Streamlit, and more, and they’re incubating Apache Polaris.
“We want to enable you to choose a data architecture that evolves with your business needs,” says Christian. No more lock-ins – good news for all.
2. I want to get better economics from my data platform.
Cost management is an increasingly important topic of conversation in data cloud land, especially with larger jobs and AI workloads. Understanding and managing resource consumption and budgeting is a huge undertaking – and often a blind spot – for many organizations.
Christian shares their plans to make cost management easier via several new features, including:
- Organization Usage View: A single pane of glass to see all of your spend and consumption in Snowflake.
- Spend Anomalies: Insights and alerting to know if and when spend is, as Christian put it, “out of whack.”
- Query Tags & Insights: Classify consumption with tags to make chargeback capabilities easier.
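Query tagging builds on a mechanism Snowflake already exposes today: the `QUERY_TAG` session parameter. A minimal sketch of tag-based chargeback (the team and project values here are hypothetical):

```sql
-- Tag every query in this session so spend can be attributed later.
alter session set query_tag = '{"team": "analytics", "project": "churn_model"}';

-- Roll up elapsed time by tag from the account usage views.
select query_tag, sum(total_elapsed_time) as total_ms
from snowflake.account_usage.query_history
where query_tag is not null
group by query_tag;
```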
Christian then brings Julia Morrison, VP Data & Analytics at Marriott International onto the stage to share the story of their digital transformation, including the democratization of their data and analytics. One thing they needed? An easier way to manage compute resources. Luckily, they were able to work with Snowflake to launch…
Adaptive Compute
Adaptive compute! Snowflake warehouses have required hands-on sizing and configuration since the company’s founding in 2012 – but not anymore. With Adaptive Compute, organizations can manage compute resources at scale. It helps users understand which resources, types, scaling properties, etc. they need to enable better performance and, most importantly, better utilization.
It automatically selects the appropriate cluster size, the number of clusters, and auto-suspend/resume duration for jobs on your behalf to minimize required configuration. Queries are even routed intelligently to right-sized clusters without any user action.
On the back of Adaptive Compute, Christian unveiled another feature built to get more from your Snowflake economics: the Simplified Ingest Pricing Model.
Simplified Ingest Pricing Model
Since the team launched Snowpipe in 2017, its compute costs have been charged per file. With the Simplified Ingest Pricing Model, organizations are instead charged based on the volume of data ingested – resulting in ~50% better “economics.”
3. I want to govern all of my data.
What do we talk about when we talk about trusted AI? Data governance, data quality, data + AI observability.
Christian’s keynote is no exception to the rule – data governance is a huge focus area for Snowflake’s innovation. Christian elaborates on their latest updates.
Out-of-the-box security, governance, and discovery with Snowflake Horizon Catalog
Christian announced the latest security and governance enhancements for Snowflake Horizon Catalog, Snowflake’s connected catalog for the entire data estate, including:
- Trust Center extensions
- New MFA methods and account security updates, like passkey support and authenticator apps
- Sensitive data monitoring and reporting through automatic detection and tagging of sensitive data
- Synthetic data generation
- Enhanced Private Link support
He also announces that Horizon Catalog will federate across Iceberg REST catalogs through Catalog-linked databases, which automatically sync Iceberg objects managed by remote catalogs into Horizon Catalog. Horizon will also be able to discover additional assets outside of Snowflake, including SQL Server, PostgreSQL, Power BI, Tableau, Airflow, dbt, and more.
Plus, the newly announced Copilot for Horizon Catalog will help more users get critical information about data sets faster and with fewer dependencies – simplifying data product management within Snowflake. It’s in private preview currently.
4. I want to integrate all types of data.
Benoit mentions this during his introduction, and Christian reiterates: eliminating silos is essential for effective data + AI product development. But, as data moves through fragmented legacy systems and pipelines, silos can be common. This pain point sets the stage for one of the bigger announcements he makes: Snowflake Openflow.
Snowflake Openflow
Snowflake Openflow is an open, extensible, managed service for multimodal data integration that makes data movement between data sources – structured and unstructured – and destinations effortless. With Openflow, customers can more easily manage unstructured data movement and build data products with the same methods as structured data.
Users have the choice between using Snowflake Openflow as a Snowflake managed service or customer-managed with a bring your own cloud option. Christian mentions a few other announcements in private preview:
- Snowpipe Streaming: Includes a new SDK, access to different clients, high throughput (10 GB per second), and data queryable from the moment it’s ingested.
- dbt projects in Snowflake: Users can now build, test, deploy, and monitor dbt pipelines in Snowflake.
All of these lead to another big announcement – Workspaces, lightweight development environments for managing code artifacts – and our first demo of the keynote, where we see Snowflake Workspaces being leveraged to:
- Build and deploy an ETL pipeline
- Activate the data
- Transform it with dbt directly in Snowflake
- Make edits, compare changes, and push to git
- Run it via an aforementioned adaptive warehouse
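For readers who haven’t used dbt: a model is just a templated SQL file, which is what makes running it natively in Snowflake attractive. A minimal sketch of the kind of model the demo might transform (source, table, and column names are invented for illustration):

```sql
-- models/daily_orders.sql – a hypothetical dbt model that, with dbt
-- projects in Snowflake, can be built and run without leaving the platform.
{{ config(materialized='table') }}

select
    order_date,
    count(*)         as order_count,
    sum(order_total) as revenue
from {{ source('shop', 'raw_orders') }}
group by order_date
```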
It’s pretty cool to see it all in action, and it garners a solid reaction of claps and cheers from the audience.
5. I want more business impact.
Don’t we all? Christian reflects on how enabling better and easier data usage can make the data more valuable for the business. Already, zero ETL and zero copy sharing have helped teams collaborate without having to make copies of data, and they’ve reduced SLAs significantly, but Snowflake has even more enterprise support up their sleeves.
Snowflake Postgres
He takes this moment to expand on the previously announced acquisition of Crunchy Data and Snowflake Postgres — an enterprise-ready PostgreSQL solution combining open source flexibility with secure scale. It’s another step toward Snowflake’s mission to become the ultimate destination for enterprise data workloads and accelerate innovation for… you guessed it… AI.
6. I want faster insights.
There’s some palpable excitement around this one, and it’s because 99% of the data analysts in the audience are being tasked to drive more value from their data every single day.
For starters, Christian announces that SnowConvert, which helps teams migrate from legacy systems to Snowflake, is now available to everyone free of charge. To add icing on the cake, they’ve also introduced SnowConvert AI, to make that migration even easier. More AI!
Little do we know that he’s gearing up for some additional exciting announcements.
Gen2 Warehouses
A Gen2 warehouse is an updated version of Snowflake’s current standard warehouse with upgraded hardware and additional performance enhancements. Already, it’s delivered 2.1x faster performance for core analytics workloads, and it’s 1.9x faster than managed Spark.
Data Science Agent
It’s hard to believe that this is the first real “agent” we’ve seen from the announcements so far! The Data Science Agent helps users build ML pipelines from ideas to production. We’ve launched our own Observability Agents, so we see the value here. Enabling data professionals to leverage AI to drive more business value is a core pillar that all teams should embrace and employ.
Cortex AI SQL
Potentially one of the most exciting announcements, Cortex AI SQL allows users to use natural language in SQL queries to do things like aggregate and filter in a multimodal format.
With Cortex AI SQL, users get:
- Expressive and composable AI operators
- Native support for multimodal data
- Significant performance and cost improvements
We get a pretty cool demo that shows how a user might take unstructured data points, put them into an LLM, and then organize them into a multimodal table based on a natural language query (like sentiment).
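The exact Cortex AI SQL operators weren’t spelled out on stage, but Snowflake’s existing Cortex functions give a feel for the pattern of mixing AI calls into plain SQL. A sketch using functions that exist today (table and column names are made up):

```sql
-- Hypothetical table of free-text support tickets (unstructured data).
select
    ticket_id,
    snowflake.cortex.sentiment(ticket_text) as sentiment_score,
    snowflake.cortex.classify_text(
        ticket_text, ['billing', 'bug', 'feature request']) as category
from support_tickets
where snowflake.cortex.sentiment(ticket_text) < -0.5  -- surface the unhappiest tickets
order by sentiment_score;
```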
7. I want to leverage AI with our data.
We hear the phrase “AI-ready data” a whole lot, but what does it really mean? For Christian, “a lot of data is not AI-ready.” But that doesn’t mean it can’t get there. He introduces a few product features that help bolster AI-readiness:
- Semantic Views: Views geared toward a specific use case by defining the metrics, definitions, etc used by business users and translating them into schemas.
- Semantic SQL: A set of query constructs using the specified metrics and definitions above as context, leading to better, more accurate responses for both AI and BI use cases.
- Cortex Knowledge Extensions: Unstructured data also needs to be AI-ready. Cortex Knowledge Extensions help users leverage published data that’s already vectorized, using publicly available datasets on Snowflake Marketplace.
8. I want to accelerate business growth with AI agents.
You didn’t think we’d go this entire keynote session without Agents being a major announcement, did you? They saved the best for last, in my opinion, because the opportunities for innovation and productivity with Agents are huge.
Cortex Agents
Christian starts with Cortex Agents, which orchestrate across both structured and unstructured data sources to deliver insights. Agents use Cortex Analyst (structured) and Cortex Search (unstructured) as tools, along with LLMs, to analyze data. Cortex Search extracts insights from unstructured sources, while Cortex Analyst generates SQL to process structured data.
They’re also bringing Cortex Agents to Microsoft Teams to bring data to business users.
Cortex Agents are especially exciting for our team – we just announced data + AI observability support for Cortex Agents to provide more reliable AI-driven decisions within the Snowflake Cortex AI platform.
With data + AI observability monitoring Cortex Agents, teams can confidently scale their AI initiatives with reliable, high-quality data and address the growing demand for trustworthy, production-ready data + AI systems. Check that out here.
Snowflake Intelligence
With Snowflake Intelligence, we’ve finally reached the climax of the keynote. This is cool and they know it, so they bring out the big guns to demo this one: Jeff Hollan, Director of Product at Snowflake. He’s high-energy and his enthusiasm is contagious. Right away, the room is brought back to life after a significant amount of sedentary time.
Snowflake Intelligence is a secure Agentic capability built to enable business transformations and promote productivity. The tagline? Agents for everyone!
It’s built on top of Snowflake to:
- Identify sales trends using Agentic AI
- Create semantic views for accurate insights
- Take immediate action through workflows
So, a business user can query the data in natural language to get insights and analysis faster. And, not only can it surface insights – it can then spin up an email to your exec team explaining the insights and sharing where it found them.
As Jeff says, “It’s like having a trusted analyst by your side.”
It’s a purpose-built assistant for bringing insights to life. It’s in private preview now, so we’ll all wait with bated breath for the day it becomes GA.
At the core of all of these updates? Data quality
From the introduction of Sridhar’s Opening Keynote through to every product announcement in Christian’s Product Keynote, there was one core through-line for everything to be possible: your data has to be high-quality, accurate, and reliable.
You might think it goes without saying, but it’s actually foundational and hugely important to the innovation we’re hearing during the Summit. Getting “AI-ready” doesn’t just happen – it’s an ongoing process that requires clear visibility into the health of your data estate from end-to-end. If enterprise organizations want to invest in AI, they need to invest in a data quality program with data + AI observability, which operationalizes the workflows and processes to build and promote data trust.
Everything we’ve seen in these keynotes is inspiring, but the first step is getting your data quality house in order. Then you can get your Agents into production!
From “I want” to “You can” to “What’s next”
At the end of an exhaustively long keynote, Christian wraps up all these announcements with a bow. With all of Snowflake’s new features and announcements, every “I want” statement can be replaced with an “I can” statement. It’s like a smooth pickup line!
It’s clear that Snowflake is driving innovation at all parts of the data + AI lifecycle. “I can” definitively say that I’m excited for what’s to come.
Be sure to swing by Booth #1508 to chat with the Monte Carlo team while you’re at Snowflake Summit!
Our promise: we will show you the product.