How Warner Bros. Discovery Built a Culture of Self-Service Data Quality with Data + AI Observability
Are you a Sopranos fan? Game of Thrones? The Wire? All of HBO’s biggest hits rely on one common foundation:
Data. Lots of high-quality, real-time streaming data, to be exact.
When Warner Bros. Discovery set out to unify the data ecosystem across its newly merged streaming organization, Pam Zirpoli, Director of DICE (Data, Insights, Collection, & Engineering), knew that ensuring trusted data across teams wouldn’t just be an operational need; it would be foundational to the success of the business.
With Pam leading the charge, Warner Bros. Discovery (WBD) partnered with Monte Carlo’s data + AI observability solution to scale observability, build a thriving self-service data culture, and empower thousands of employees across one of the world’s largest media companies to prioritize and take ownership of data quality. Let’s hit the “play” button.
On a mission to make data reliable, discoverable, and actionable
WBD’s DICE organization was formed with a clear, actionable mandate: enable discovery and self-service access to high-quality, reliable data across the organization.
To make that possible, the DICE team operated like a lean internal startup during WBD’s early streaming era. They built data quality management frameworks, taught teams to think about data as a product, and created a bridge between data producers and the downstream users relying on them.
Leadership support was strong from day one. Even WBD’s CTO, Avi Saxena, championed the importance of data quality, echoing an oft-repeated sentiment: “Low-quality data is worse than no data at all.”
But executive buy-in doesn’t always mean smooth sailing. In an organization of WBD’s size, maintaining the reliability of massive volumes of real-time data is no walk in the park. The challenge ahead was still daunting.
Introducing the Data Quality Forum
In the early days, the DICE team operated as a centralized function. As a lean team, they were responsible for defining data quality standards, building monitors, and responding to issues across the entire streaming data platform. This model helped establish strong foundations, but it wasn’t built to scale.
As more teams onboarded to the platform and data usage expanded across regions, products, and business functions, the DICE team faced a familiar challenge: centralized ownership created bottlenecks. Engineers and analysts relied on a small group to identify, validate, and resolve data issues, which slowed response times and limited accountability.
To break that pattern, Pam and her team set out to change not just who owned data quality—but how the organization talked about it. The turning point was the creation of the Data Quality Forum (DQF), a structured, recurring space where data producers and consumers could come together to discuss data quality openly.
Creating a shared language for data issues
The DQF wasn’t designed as a reactive incident-review meeting. Instead, it operationalized observability with Monte Carlo, serving as a bridge between teams, bringing visibility, context, and shared responsibility to data quality challenges. The forum enabled teams to:
- Surface anomalies early, before they escalated into major incidents
- Provide context around data changes, such as schema updates or pipeline modifications
- Align on expectations between upstream producers and downstream consumers
- Troubleshoot collaboratively, rather than in isolated Slack threads
This was especially valuable during high-risk moments—like launching the streaming platform in new countries—where changes in regional data sources, localization logic, or ingestion patterns could easily introduce unexpected issues.
Instead of reacting after dashboards broke, teams used the forum to anticipate impact, review Monte Carlo alerts together, and agree on next steps.
Redefining data quality ownership
Over time, the DQF helped reinforce a broader strategic shift: moving data quality ownership closer to the source.
Rather than relying on the DICE team to manage data quality, the forum encouraged all teams to understand their data and its impact at a deeper level. Client engineers could understand how their changes affected downstream analytics, analytics engineers could articulate what “good data” meant for their use cases, and product and business stakeholders could participate in prioritization discussions – all in one place.
This shift laid the groundwork for a vertical data quality strategy, where domain teams—such as engagement, growth, or subscriptions—could take the core data quality framework and adapt it to their specific data products.
As adoption grew, the Data Quality Forum evolved from a support mechanism into a cultural flywheel. Patterns and lessons surfaced in the forum were incorporated into onboarding and enablement sessions, translated into new Monte Carlo monitors or improved thresholds, and used to refine their priority matrix (P0, P1, P2).
In other words, the Data Quality Forum didn’t just resolve issues. It continuously improved the entire system, workflows, and processes at scale.
Enabling a culture of self-service observability
With shared language, clearer ownership, and growing confidence, the organization was ready for the next step: self-service data + AI observability.
The DICE team partnered with Monte Carlo to run enablement sessions, teaching teams how to create and tune their own monitors, interpret anomalies in business context, and respond proactively instead of reactively. At the same time, the team doubled down on the cultural shifts necessary for this data advocacy, focusing on collaboration between business and engineering, education to empower teams, and building continuous improvement loops.
To bring this to life, Pam launched several major internal events, including Data Days, a two-day internal conference with more than 1,000 participants; Data Palooza, a weekly show-and-tell where data engineers, analysts, and finance pros demoed projects (complete with the beloved t-shirt slogan: “Not all heroes wear capes. They present data.”); and enablement sessions to deepen data quality skills.
The result was a proactive data culture and scalable model where the DICE team could focus on governance, education, and optimization—ensuring data quality remained strong amidst scaling.
Catching real-time errors with the DQF
How did this work in action? When HBO Max (at the time, simply “Max”) was nearing launch, real-time data pipelines needed to deliver reliably – and at global scale. It wasn’t just WBD’s teams depending on trustworthy viewership, engagement, and subscriber data; viewers across the globe were expecting to tune into buffer-free, seamless streaming experiences.
With Monte Carlo at the foundation of the DQF’s frameworks, the team’s freshness monitors flagged clogged and delayed pipelines during pre-production. This proactive alerting shortened their debugging cycles and prevented further delays before the launch.
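A freshness monitor like the one described above boils down to a simple rule: alert when a table hasn’t received new data within its expected update interval. As a minimal, library-free sketch (the function name, SLA value, and timestamps below are illustrative assumptions, not Monte Carlo’s actual implementation):

```python
from datetime import datetime, timedelta, timezone

def is_stale(last_loaded_at: datetime, expected_interval: timedelta) -> bool:
    """Return True when a table's most recent load is older than its SLA."""
    return datetime.now(timezone.utc) - last_loaded_at > expected_interval

# Example: a pipeline expected to land data every 15 minutes
# that last loaded 45 minutes ago would trigger an alert.
last_load = datetime.now(timezone.utc) - timedelta(minutes=45)
if is_stale(last_load, timedelta(minutes=15)):
    print("ALERT: pipeline is delayed")
```

In practice, observability tools learn the expected interval from historical load patterns rather than hard-coding it, which is what allows delays to be flagged without manually defined thresholds.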
Once the launch was live, the metadata team relied heavily on Monte Carlo during high-viewership events, like the Olympics. During these events, the team used custom SQL checks to detect missing content metadata—allowing them to fix issues before they broke reporting (or the livestream).
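A custom SQL check of this kind typically counts rows whose required fields are null and alerts when the count is nonzero. Here is a self-contained sketch using an in-memory SQLite table; the table and column names are hypothetical, not WBD’s actual schema:

```python
import sqlite3

# Illustrative content table with two rows missing required metadata.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE content (content_id TEXT, title TEXT, genre TEXT);
    INSERT INTO content VALUES
        ('c1', 'Pilot',  'Drama'),
        ('c2', NULL,     'Drama'),
        ('c3', 'Finale', NULL);
""")

# The check: count rows where any required metadata field is NULL.
missing = conn.execute("""
    SELECT COUNT(*) FROM content
    WHERE title IS NULL OR genre IS NULL
""").fetchone()[0]

if missing > 0:
    print(f"ALERT: {missing} rows with incomplete metadata")
```

Running the same query on a schedule against the production warehouse, and routing a nonzero count to an alert channel, is the basic shape of the checks described above.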
These early wins validated the DICE team’s vision: operationalizing data + AI observability effectively is a strategic advantage.
Data quality is not optional
At the end of the day, you can skip the intro of Game of Thrones. What you can’t skip out on? Data quality.
For teams beginning their own data quality journey, Pam offers a simple but vital recommendation: “Invest in data quality early—it has an outsized impact on everything you build.”
Especially with the continued rise of AI and automation, a strong data quality foundation isn’t just helpful; it’s essential.
DICE’s story is proof that with the right tools, frameworks, and advocacy, any organization can spark an enterprise-wide movement toward reliable, discoverable, and actionable data. Want to learn how you can get started with data + AI observability? Speak with our team.
Our promise: we will show you the product.