It’s no secret: data is your company’s most valuable asset.
Your Marketing Analytics team uses data to inform their email campaigns; your product managers leverage insights about user behavior to prioritize the development of new features; and even your Operations team relies on data to develop growth strategies.
Unfortunately, most companies fail to realize their data’s full potential due to an all-too-common reality for data teams: data downtime. Data downtime, meaning periods when data is inaccurate, unreliable, or otherwise erroneous, spares no one. It manifests in broken pipelines, stale dashboards, and outdated reports, leading to lost revenue, poor decision-making, and, perhaps worst of all, a loss of customer trust. In fact, it’s reported that 1 in 5 companies have lost a customer due to bad data, and a staggering 57 percent of businesses find out about data downtime from their customers or prospects.
The good news? You don’t have to settle for unreliable data. In fact, many of the best data teams are on to the solution: Data Observability, the path forward for data engineers and analysts looking to eliminate data downtime and accelerate the adoption of data at their companies.
To celebrate the leaders of this new category, we’re honored to highlight 2021’s Data Reliability Pioneers. These companies are charting a new course for what it means to be truly data-driven in today’s world: having trusted, reliable data in industries such as online marketplaces, financial technology, eCommerce, B2B software, consumer technology, and retail.
Auto Trader, the UK’s largest digital automotive marketplace, sits at the heart of the UK’s vehicle buying process, receiving over 50 million cross-platform visits each month.
To support the analytics needs of the company and their customers, the Data Engineering team at Auto Trader have built a cloud-based data platform, leveraging a BigQuery warehouse, Looker for analytics, and an AWS lake. The team is responsible for ensuring that this data is accurate and reliable at each stage of the data life cycle.
Before Monte Carlo, embedded analysts downstream would notify Edward Kent, Principal Developer, and his Data Engineering team when their dashboards broke, an unscalable and time-intensive approach to tackling bad data.
“Providing accurate and reliable data is essential to build trust in the data products created on our platform. We needed a solution that would give us full visibility into the health of our data pipelines through end-to-end lineage and anomaly detection. We partnered with Monte Carlo to integrate data observability across our stack, from data ingestion to analytics. By monitoring and alerting for freshness, volume, and distribution issues down to the field level, my team can ensure we’ll be the first to know and solve when issues do arise.”
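The freshness and volume monitoring Edward describes can be illustrated with a toy sketch. This is not Monte Carlo’s API or Auto Trader’s implementation, just a minimal example of the two checks: flag a table as stale when its most recent load is older than an expected window, and flag a volume anomaly when the row count falls outside an expected range.

```python
from datetime import datetime, timedelta, timezone

# Illustrative only: the thresholds and function names here are assumptions,
# not part of any real observability product's API.

def check_freshness(last_loaded_at: datetime, max_staleness: timedelta,
                    now: datetime) -> bool:
    """Return True if the table is fresh (updated within the allowed window)."""
    return now - last_loaded_at <= max_staleness

def check_volume(row_count: int, expected_min: int, expected_max: int) -> bool:
    """Return True if the row count falls within the expected range."""
    return expected_min <= row_count <= expected_max

now = datetime(2021, 6, 1, 12, 0, tzinfo=timezone.utc)
last_load = datetime(2021, 6, 1, 3, 0, tzinfo=timezone.utc)

# The table was loaded 9 hours ago, so a 6-hour freshness SLA is violated.
print(check_freshness(last_load, timedelta(hours=6), now))             # False
print(check_volume(48_000, expected_min=40_000, expected_max=60_000))  # True
```

In practice these signals would be derived automatically from warehouse metadata rather than hard-coded thresholds, but the underlying questions are the same: did the data arrive on time, and did roughly the right amount arrive?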
FinTech & InsurTech
The Zebra, a leading insurance comparison site, makes it easy for consumers across the United States to compare and save on car and home insurance quotes.
Currently, the company provides over 1,800 car insurance products from more than 200 national insurers, a monumental feat that relies on real-time decision making powered by data from disparate third-party vendors and partners. The Zebra’s data team is responsible for ensuring that their data pipelines are reliable from ingestion to analytics, and that if issues arise, they’re the first to know and solve.
For Shea Spalding, Head of Data Governance, data observability is foundational to this mission.
“To ensure that our customers have the best possible experience on our platform, we’ve invested in a robust, multi-pronged approach to data management that prioritizes data trust,” said Shea. “As part of this vision, we leverage Monte Carlo’s Data Observability Platform to ensure that data is accurate and up-to-date at all times. Monte Carlo gives us visibility into our data ecosystem by helping us understand the health of our data through automated lineage and data quality monitoring and bringing best practices of DevOps to our data pipelines.”
Optoro is a technology company that leverages data and real-time decision making to help retailers and manufacturers manage and resell their returned and excess merchandise. As part of this mission, Doug Bemis, Optoro’s SVP of Data & Analytics, leads the team responsible for building and managing the data pipelines and analytics powering these rich insights.
“Optoro is committed to making the resell process more sustainable and seamless for retailers worldwide, from customer initiation to warehouse processing and resale. Reliable data and analytics are critical to ensuring that we can deliver on this vision,” said Doug. “By proactively monitoring for data downtime, data observability helps us identify and resolve data quality issues in our data infrastructure before they impact downstream data consumers. My stakeholders are happier and my data engineering team no longer has to worry about firefighting broken data pipelines.”
New Relic, creator of the industry-leading application observability platform, develops cloud-based software that helps engineering teams understand the health of their applications. With over 17,000 customers across the globe, New Relic relies on accurate, timely data to drive product development and deliver better user experiences on their platform.
Guy Fighel, GM of AIOps and GVP of Product Engineering, leads the team building New Relic machine learning and AIOps solutions, and charters the course for data reliability at their company.
“New Relic is committed to helping engineers create more perfect software and derive better insights through full-stack observability,” said Guy Fighel. “In the same way, Monte Carlo’s approach to end-to-end data observability makes it easier for data teams to monitor and alert for abnormalities in their data pipelines so that they can unlock the true potential of data-driven decision making.”
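The “monitoring and alerting for abnormalities” Guy describes is often implemented as statistical anomaly detection over pipeline metrics. The following is a hedged sketch, not New Relic’s or Monte Carlo’s actual method: it flags a daily row count as abnormal when it deviates from recent history by more than a chosen number of standard deviations.

```python
import statistics

# Toy z-score anomaly check. The history window, metric, and threshold are
# illustrative assumptions, not any vendor's documented defaults.

def is_abnormal(history: list, latest: float, threshold: float = 3.0) -> bool:
    """Flag `latest` if it is more than `threshold` sample standard
    deviations away from the mean of `history`."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > threshold

daily_counts = [1000, 1020, 980, 1010, 990, 1005, 995]

print(is_abnormal(daily_counts, 1002))  # within normal variation: False
print(is_abnormal(daily_counts, 400))   # sudden drop: True
```

Production systems typically layer seasonality handling and machine-learned thresholds on top of this basic idea, but the core intuition is the same: learn what “normal” looks like from history, then alert on significant deviations.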
For Blinkist, a global book insights subscription service, broken data pipelines led to increased marketing spend, costly data fire drills, and loss of executive trust. Gopi Krishnamurthy, Director of Data Engineering, Blinkist, leads the team responsible for managing their data infrastructure and ensuring that data is reliable and accurate at all stages of the data pipeline.
“When COVID-19 hit, we realized that the growth of our company was being hindered by bad data — inaccurate analytics were preventing us from making the best possible decisions for our business. We needed a way to automatically identify, resolve, and prevent data quality issues before they affected downstream consumers, including our Marketing team,” said Gopi. “We partnered with Monte Carlo to increase visibility into the health of our data pipelines and help us meet reliability SLAs for our most critical data assets. This end-to-end approach to data observability has increased trust in our data and will enable us to grow ROI of our marketing analytics 40 percent in 2021 and 2022.”
Mercari is an e-commerce company dedicated to making it fast, safe, and easy for people to sell and buy almost anything. With Mercari’s ‘everything ships’ model, millions of buyers and sellers across the U.S. can exchange goods while avoiding in-person meetups. With over 50 million downloads in the U.S. and 350K new listings every day, the company relies on accurate, data-driven insights to fuel decision-making and deliver optimal user experiences across its marketplace.
Mark Robinson, Data Engineering Manager at Mercari, leads the team responsible for building and maintaining the company’s data engineering stack, powering a series of data pipelines that handle millions of transactions per day from across the U.S. Mercari uses Monte Carlo’s end-to-end data observability platform to monitor and alert for abnormalities in its data, from ingestion to ETL and analytics.
Healthcare and medical technology
Collaborative Imaging is a radiologist-owned alliance dedicated to improving the lives of physicians and patients everywhere through technology. With over 18,000 patients seen per day and nearly 1M radiological exams read per month, the alliance is charting a course for the future of this industry by giving providers the solutions and processes necessary to better serve their communities.
To power their business, Collaborative Imaging needs clean and reliable data, no easy feat given the size and scope of their distributed data ecosystem. To gain greater visibility into the health of their data at each stage of its life cycle, they needed a way to understand how data flows into their data warehouse and to be alerted when schema or distribution changes occur in the many data sources ingested into it. This empowers the analytics team to troubleshoot and resolve issues before downstream consumers of the data warehouse are impacted.
Jacob Follis, VP of Analytics & Digital Transformation, is responsible for building and scaling the company’s analytics platform and shepherding the organization on their path to digital transformation.
“For many patients, radiology is the gateway into the healthcare system. Collaborative Imaging’s analytics platform helps physicians and patients connect the dots between disparate data points in the healthcare journey, but our insights are only as reliable as the data feeding our system. Dirty data is a hallmark of healthcare analytics, and a lot of the work we do revolves around cleaning and making sense of this data to put it into one repository,” said Jacob. “Through end-to-end Data Observability, Monte Carlo helps us resolve these problems before they impact the business. With Monte Carlo, my executives are happy and I can trust our data.”
The promise of Data Reliability and the emerging data observability category extends far beyond this select group of enterprises, reaching companies across healthcare, banking, hospitality, education, and virtually every other space that relies on data to innovate and maintain a competitive edge.
The best part of the Monte Carlo journey? Working with data leaders across the space to pioneer this category. With the right approach, teams across industries can eliminate data downtime and unlock the full potential of their data.