Get the video data you need to power Observability & Optimization
There Are Many Ways Delay Can Creep Into Your Data Operations. Identify How You Can Remove Delay to Improve Observability.
Understand the Impact of Data Delay
Data Delays Can Have A Measurable Impact On Your Revenue. Improve Observability To Protect Revenue.
Quantify The Impact
The second step is to quantify that delay in terms of lost revenue. Below are some examples.
Protecting Ad Revenue
Detecting Ad Errors
Let’s say your total inventory is 1,000,000 impressions and you are 100% subscribed. You generate approximately $10 per impression, for monthly revenue of $10MM. Approximately 10% of your ad inventory, or 100,000 impressions, results in an error (i.e., the ad is not displayed, it stalls, etc.). You are contractually obligated to fulfill these impressions, but because you are 100% subscribed, those make-up impressions must happen next month. That drops the total impressions you can sell to 900,000, costing you $1MM in ad revenue for that month. This loss compounds until you can find and address the errors.
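The arithmetic above can be sketched as a quick sanity check, using only the figures from the example (variable names are illustrative):

```python
# Monthly ad-revenue impact of ad errors, using the figures from the example.
total_inventory = 1_000_000      # impressions, 100% subscribed
revenue_per_impression = 10.0    # dollars
error_rate = 0.10                # 10% of impressions result in an error

monthly_revenue = total_inventory * revenue_per_impression   # $10MM
errored = int(total_inventory * error_rate)                  # 100,000 make-goods owed
sellable_next_month = total_inventory - errored              # 900,000 impressions
lost_revenue = errored * revenue_per_impression              # $1MM that month

print(f"Lost ad revenue this month: ${lost_revenue:,.0f}")
```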
Protecting Subscriber Revenue
Delivery is a key component of the streaming video workflow, but not every CDN performs equally. A CDN with poor cache efficiency can degrade video start time and rebuffer ratio, both of which drive video abandonment. For example, if pre-roll ad delays reach 5 seconds, 13.6% of viewers abandon the stream. Failing to resolve CDN performance issues quickly, such as by switching to an alternate provider, can result in lost advertising opportunities or even lost subscription revenue, potentially millions of dollars each month.
Identifying New Revenue Opportunities
Gaining Deeper Insight
Not all quantification is about protecting revenue; some is about maximizing it. In this example, you are analyzing multiple datasets to find opportunities to upsell subscribers to a higher membership tier and to increase engagement (such as watching more videos, which produces more impressions). Let’s say it takes 100 person-hours to derive an insight that can lead to more revenue, and 50% of that analysis time is spent post-processing the data: normalizing variables, relating tables, etc. This work has to be repeated every time new data arrives. Although you have found a new revenue opportunity, the cost of capturing it is much higher than it would be if the data were already standardized.
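To put a dollar figure on that recurring post-processing, the hours and share come from the example; the hourly analyst cost is an illustrative assumption:

```python
# Recurring cost of deriving one insight when half the time is post-processing.
hours_per_insight = 100          # from the example
post_processing_share = 0.50     # from the example
hourly_rate = 150                # assumed analyst cost in dollars (illustrative)

post_processing_hours = hours_per_insight * post_processing_share  # 50 hours
recurring_cost = post_processing_hours * hourly_rate               # paid per data refresh

print(f"Recurring post-processing cost: ${recurring_cost:,.0f} per refresh")
```

Because the post-processing repeats on every refresh, this cost recurs for as long as the data arrives unstandardized.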
You Can Mitigate That Impact With A Proven Technology Solution That Will Reduce Delay and Improve Observability.
The Results of Using Datazoom
With Datazoom, you can have variable names standardized across datasets to ensure consistency in analysis.
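As a rough illustration of what standardization looks like in practice (the field names and mapping here are hypothetical, not Datazoom’s actual schema):

```python
# Map vendor-specific field names onto one shared vocabulary so the same
# analysis works across datasets. All names here are illustrative.
FIELD_MAP = {
    "vendor_a": {"vst": "video_start_time", "rebuf": "rebuffer_ratio"},
    "vendor_b": {"startupTime": "video_start_time", "bufferRatio": "rebuffer_ratio"},
}

def standardize(record: dict, vendor: str) -> dict:
    """Rename a record's keys using the vendor's mapping; unknown keys pass through."""
    mapping = FIELD_MAP[vendor]
    return {mapping.get(key, key): value for key, value in record.items()}

a = standardize({"vst": 1.2, "rebuf": 0.01}, "vendor_a")
b = standardize({"startupTime": 1.5, "bufferRatio": 0.02}, "vendor_b")
# Both records now share the same keys, so one query covers both datasets.
```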
Fast Data Delivery
With Datazoom, you can configure data to be returned in sub-second intervals for true, real-time monitoring.
With Datazoom, you can add datapoints to a JSON stream to avoid relating tables and speed up analysis.
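One way to picture in-stream enrichment (a sketch, not Datazoom’s actual payload format): attach the lookup fields to each event before it lands, so downstream analysis never needs a join.

```python
import json

# Reference data that would otherwise live in a separate table (illustrative).
DEVICE_LOOKUP = {"dev-42": {"device_type": "smart_tv", "os": "tizen"}}

def enrich(event_json: str) -> str:
    """Add lookup datapoints directly to the event so no table join is needed later."""
    event = json.loads(event_json)
    event.update(DEVICE_LOOKUP.get(event.get("device_id"), {}))
    return json.dumps(event)

raw = '{"device_id": "dev-42", "event": "rebuffer_start"}'
print(enrich(raw))
```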
With Datazoom, you can take a sample of a large dataset, like CDN logs, to significantly reduce compute time.
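A minimal sketch of that kind of sampling, assuming plain-text CDN log lines (the 1% rate and seed are illustrative):

```python
import random

def sample_lines(lines, rate=0.01, seed=7):
    """Keep roughly `rate` of the log lines; a fixed seed makes runs repeatable."""
    rng = random.Random(seed)
    return [line for line in lines if rng.random() < rate]

logs = [f"cdn-log-{i}" for i in range(100_000)]
sample = sample_lines(logs, rate=0.01)
print(f"Kept {len(sample)} of {len(logs)} lines")
```

Downstream aggregates (error rates, cache-hit ratios) computed on the sample can then be scaled back up by the sampling rate.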