Build the Business Case for Datazoom

The platform is simple to configure yet powerful enough to solve complex business challenges. In just three steps, you can have Datazoom capturing your data and sending it where your operations engineers, product managers, and business executives need it to be.


There Are Many Ways Delay Can Creep Into Your Data Operations

Understand the Impact of Data Delay

The first step to building the business case for implementing the Datazoom platform is to quantify the impact of data delay. How many seconds are you losing to the various types of delay that stand between you and understanding and resolving issues quickly?


When data doesn’t arrive fast enough, operations teams can’t resolve issues, such as slow performance, quickly enough. Continued performance issues can result in churn.


When issues must be resolved quickly, every hour spent post-processing data, such as relating tables or variables together, adds to the time it takes to solve the underlying problem, such as identifying ad errors.


Analysis is key to determining the root cause of issues, whether they are ad errors or CDN performance problems. But processing too much data, when just a sample will do, can slow everything down.


Data Delays Can Have A Measurable Impact On Your Revenue

Quantify The Impact

The second step is to quantify that delay in terms of lost revenue. Below are some examples.

Protecting Ad Revenue

Detecting Ad Errors

Let’s say your total inventory is 1,000,000 impressions and you are 100% subscribed. You generate approximately $10 per impression, for monthly revenue of $10MM. Approximately 10% of your ad inventory, or 100,000 impressions, results in an error (i.e., the ad is not displayed, it stalls, etc.). You are contractually obligated to fulfill these impressions, but because you are 100% subscribed, those make-up impressions must happen next month. That drops the total impressions you can sell to 900,000, costing you $1MM in ad revenue for that month. The loss compounds until you can find and address the errors.
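The arithmetic above can be sketched in a few lines, using only the figures given in the example:

```python
# Figures from the example above: a 100%-subscribed inventory of
# 1,000,000 impressions at $10 each, with a 10% ad-error rate.
total_inventory = 1_000_000      # impressions per month
price_per_impression = 10.0      # USD
error_rate = 0.10                # share of impressions that error out

monthly_revenue = total_inventory * price_per_impression
errored = int(total_inventory * error_rate)

# Make-goods for errored impressions displace next month's sellable inventory.
sellable_next_month = total_inventory - errored
lost_revenue = errored * price_per_impression

print(f"Errored impressions: {errored:,}")               # 100,000
print(f"Sellable next month: {sellable_next_month:,}")   # 900,000
print(f"Ad revenue at risk: ${lost_revenue:,.0f}")       # $1,000,000
```

Plugging in your own inventory size, price, and error rate gives a first-pass estimate of monthly revenue at risk.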

Protecting Subscriber Revenue

Improving Performance

Delivery is a key component of the streaming video workflow, but not every CDN performs equally. A CDN with poor cache efficiency can hurt metrics such as video start time and rebuffer ratio, both of which influence video abandonment. For example, if pre-roll ad delays reach 5 seconds, 13.6% of viewers abandon the stream. Failing to resolve CDN performance issues quickly, such as by switching to an alternate provider, can result in lost advertising opportunities or even lost subscription revenue, potentially millions of dollars each month.
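A rough sketch of that abandonment math, using the 13.6% figure from the text; the audience size and per-viewer ad revenue here are illustrative assumptions, not figures from the example:

```python
# Assumed figures (for illustration only): 500,000 viewers per event,
# $0.25 of ad revenue per completed view. The 13.6% abandonment rate at
# a 5-second pre-roll delay comes from the text above.
viewers = 500_000
abandon_rate = 0.136
ad_revenue_per_viewer = 0.25     # USD, assumed

abandoned = int(viewers * abandon_rate)
lost_ad_revenue = abandoned * ad_revenue_per_viewer

print(f"Abandoned viewers: {abandoned:,}")
print(f"Lost ad revenue for this event: ${lost_ad_revenue:,.2f}")
```

Run across every event in a month, losses at this rate accumulate quickly, which is why detecting CDN degradation in real time matters.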

Identifying New Revenue Opportunities

Gaining Deeper Insight

Not all quantification is about protecting revenue; some is about maximizing it. In this example, you are analyzing multiple datasets to find opportunities to upsell subscribers (to a higher membership tier) and increase engagement (such as watching more videos and producing more impressions). Let’s say it takes 100 man-hours to derive an insight that can lead to more revenue, and 50% of that analysis time is spent post-processing the data: normalizing variables, relating tables, and so on. This work must be repeated each time new data comes in. Although you have found a new revenue opportunity, the cost of capturing it is much higher than it would be if the data were already standardized.
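The cost of that repeated post-processing can be quantified directly. The hours and split come from the example; the hourly rate is an assumption for illustration:

```python
# From the example: 100 man-hours per insight, half spent post-processing.
# The fully loaded analyst cost is an assumed figure, not from the text.
hours_per_insight = 100
post_processing_share = 0.50
hourly_rate = 150.0              # USD/hour, assumed

post_processing_hours = hours_per_insight * post_processing_share
cost_per_insight = hours_per_insight * hourly_rate
avoidable_cost = post_processing_hours * hourly_rate

print(f"Post-processing hours per insight: {post_processing_hours:.0f}")   # 50
print(f"Total cost per insight: ${cost_per_insight:,.0f}")                 # $15,000
print(f"Avoidable cost if data arrives standardized: ${avoidable_cost:,.0f}")  # $7,500
```

Multiply the avoidable cost by the number of insights your team chases per quarter to estimate the recurring savings.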


You Can Mitigate That Impact With A Proven Technology Solution

The Results of Using Datazoom

What benefits do you get from Datazoom? How does it solve the problems that might plague ad error detection, delivery performance, and new opportunities?

Data Standards

With Datazoom, you can have variable names standardized across datasets to ensure consistency in analysis.
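A minimal sketch of what standardization means in practice: mapping each source’s field names onto one canonical schema before analysis. The field names below are illustrative, not Datazoom’s actual schema:

```python
# Illustrative mapping from source-specific field names to one canonical
# schema; these names are assumptions, not Datazoom's data dictionary.
CANONICAL = {
    "startupTime": "video_start_time_ms",
    "vst": "video_start_time_ms",
    "rebufferRatio": "rebuffer_ratio",
    "stall_pct": "rebuffer_ratio",
}

def standardize(event: dict) -> dict:
    """Rename known fields to the canonical schema; keep unknown fields as-is."""
    return {CANONICAL.get(k, k): v for k, v in event.items()}

print(standardize({"vst": 1200, "stall_pct": 0.02}))
# {'video_start_time_ms': 1200, 'rebuffer_ratio': 0.02}
```

When the platform performs this normalization at capture time, analysts never have to reconcile naming differences downstream.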

Fast Data Delivery

With Datazoom, you can configure data to be delivered at sub-second intervals for true real-time monitoring.

Data Enrichment

With Datazoom, you can add datapoints to a JSON stream to avoid relating tables and speed up analysis.
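A sketch of the idea: enriching each JSON event in-stream so the extra datapoints arrive pre-joined, rather than being looked up via table joins later. The lookup table and field names here are placeholders, not Datazoom output:

```python
import json

# Placeholder lookup table; in practice this context would come from an
# enrichment source, not a hard-coded dict.
GEO_BY_IP = {"203.0.113.7": {"country": "US", "city": "Denver"}}

def enrich(raw_event: str) -> str:
    """Merge lookup datapoints into the event before it reaches storage."""
    event = json.loads(raw_event)
    event.update(GEO_BY_IP.get(event.get("client_ip"), {}))
    return json.dumps(event)

print(enrich('{"client_ip": "203.0.113.7", "event": "play"}'))
```

Because the enrichment happens once, in the stream, every downstream consumer gets the joined record for free.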

Data Sampling

With Datazoom, you can take a sample of a large dataset, like CDN logs, to significantly reduce compute time.
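The mechanics of sampling are simple; the sketch below keeps roughly 1% of a log stream. The keep rate and seed are illustrative, not Datazoom configuration:

```python
import random

# Illustrative sampler: keep roughly keep_rate of the input lines.
# A fixed seed makes the sample reproducible across runs.
def sample(lines, keep_rate=0.01, seed=42):
    rng = random.Random(seed)
    return [line for line in lines if rng.random() < keep_rate]

logs = [f"cdn-log-{i}" for i in range(100_000)]
subset = sample(logs)
print(f"Kept {len(subset)} of {len(logs)} lines")
```

Aggregate metrics such as cache-hit ratio computed on a 1% sample are usually close to the full-population values, at a hundredth of the compute cost.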
