The Video Data Platform

Improving the experience, efficiency, and profitability of streaming video requires data. Datazoom helps video publishers operate distributed architectures by centralizing, standardizing, and integrating data in real time, creating a more powerful data pipeline and improving observability, adaptability, and optimization.

Datazoom is a video data platform that continually gathers data from endpoints, such as a CDN or a video player, through an ecosystem of collectors. Once gathered, the data is normalized using standardized data definitions. It is then sent through available connectors to analytics platforms like Google BigQuery, Google Analytics, and Splunk, and can be visualized in tools such as Looker and Superset. Datazoom is your key to a more effective and efficient data pipeline.
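
As a rough illustration, that collect, standardize, distribute flow can be sketched in a few lines of Python. The function and field names here are hypothetical stand-ins, not Datazoom's actual API:

```python
# Hypothetical sketch of the flow described above: events come in from
# collectors, get normalized against a shared dictionary, then fan out
# to connectors. None of these names are Datazoom's real API.

def collect() -> dict:
    # Stand-in for a player or CDN collector emitting a raw event.
    return {"evt": "playback_start", "bitrateKbps": 4500}

def standardize(raw: dict) -> dict:
    # Stand-in for dictionary-based normalization (see section 03).
    renames = {"evt": "event_type", "bitrateKbps": "bitrate_kbps"}
    return {renames.get(k, k): v for k, v in raw.items()}

def distribute(event: dict, connectors) -> None:
    # Stand-in for fanning the event out to BigQuery, Splunk, etc.
    for send in connectors:
        send(event)

distribute(standardize(collect()), connectors=[print])
```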

01
What Can Datazoom Do For Your Data Pipeline?

Increase Speed of Issue Resolution

Get the data you need in real time. Don’t wait for your data when an issue needs immediate resolution.

Get Data Consistency Across Sources

Use the Datazoom Data Dictionaries to automatically standardize variable names across sources.

Reduce Data Post-Processing

Enrich data gathered through collectors to reduce the need for table joins and post-processing.

Save Resources on Data Processing

Processing big data sets costs time and money. Select samples instead to analyze your data faster and more cheaply.
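
For illustration, here is one common sampling approach: hash-based sampling that deterministically keeps a fixed fraction of sessions, so every event from a sampled session stays together for analysis. The rate and field names below are hypothetical, not a Datazoom setting:

```python
import hashlib

# Hypothetical sketch: keep a deterministic 10% sample of sessions.
SAMPLE_RATE = 0.10

def in_sample(session_id: str, rate: float = SAMPLE_RATE) -> bool:
    digest = hashlib.sha256(session_id.encode()).digest()
    # Map the first 4 bytes of the hash onto [0, 1) and compare to the rate.
    return int.from_bytes(digest[:4], "big") / 2**32 < rate

events = [{"session_id": f"s{i}", "event": "heartbeat"} for i in range(1000)]
sampled = [e for e in events if in_sample(e["session_id"])]
print(f"kept {len(sampled)} of {len(events)} events")
```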

02
Datazoom Is More Than Just Technology

How The Platform Improves Your Data Pipeline

The Datazoom self-service platform is accessed and managed through an intuitive drag-and-drop portal. With a few basic configuration settings for each collector and connector placed onto the canvas, you can be up and running in a matter of minutes, on your way to a better data pipeline.
Step 1: Collect

Gather Data From Endpoints

Via the Datazoom Collectors Library, you specify where in the video stack you want to collect data. It could be a CDN, a player, both, or another endpoint entirely. Collectors make gathering the insight you need for your data pipeline plug-and-play.
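
To make the idea concrete, here is a hypothetical sketch of what a player-side collector might capture when a playback event fires. The event names and fields are illustrative, not an actual collector schema:

```python
import time

def on_player_event(name: str, player_state: dict) -> dict:
    """Package a player callback into a timestamped event for the pipeline."""
    return {
        "event": name,
        "timestamp_ms": int(time.time() * 1000),
        "player": player_state.get("player_name"),
        "bitrate_kbps": player_state.get("bitrate_kbps"),
    }

# Example: a buffering event reported by a hypothetical hls.js integration.
event = on_player_event("buffer_start", {"player_name": "hls.js", "bitrate_kbps": 3200})
print(event)
```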

Step 2: Standardize and Enrich

Reduce Data Post-Processing

Datazoom standardizes incoming data according to our Data Dictionaries, which serve as a common data language and definition resource. Our Data Dictionaries enforce the CTA-2066 and CTA-5004 video data standards. You can also enrich a captured event with data from another source. All of this cuts down on post-processing for a more efficient data pipeline.
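
Here is a minimal sketch of enrichment, assuming a shared request ID links a player event to a CDN log entry; both the key and the fields are hypothetical:

```python
# Attach fields from a second source to a captured event so downstream
# tools don't have to join tables themselves.

cdn_log_by_request = {  # hypothetical CDN records keyed by a shared request ID
    "req-123": {"cdn_pop": "LAX", "cache_status": "HIT"},
}

def enrich(event: dict, lookup: dict) -> dict:
    extra = lookup.get(event.get("request_id"), {})
    return {**event, **extra}

player_event = {"event_type": "playback_start", "request_id": "req-123"}
print(enrich(player_event, cdn_log_by_request))
# {'event_type': 'playback_start', 'request_id': 'req-123',
#  'cdn_pop': 'LAX', 'cache_status': 'HIT'}
```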

Step 3: Distribute

Send Your Data Where It Needs to Go

Via the Datazoom Connector Library, you can send your data to common third-party tools such as BigQuery, Splunk, Google Analytics, and Amplitude, or even to a custom system. New Connectors are continually being added to make building your data pipeline even easier.
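
As a rough sketch, a connector can be thought of as something that batches normalized events and ships them to a destination. A real connector for BigQuery or Splunk would use that system's own client library and authentication; this toy version just serializes batches to JSON:

```python
import json

class JsonBatchConnector:
    """Hypothetical connector: buffer events, flush them as a JSON batch."""

    def __init__(self, send, batch_size: int = 100):
        self.send = send          # callable that delivers one serialized batch
        self.batch_size = batch_size
        self.buffer = []

    def write(self, event: dict) -> None:
        self.buffer.append(event)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        if self.buffer:
            self.send(json.dumps(self.buffer))
            self.buffer = []

connector = JsonBatchConnector(send=print, batch_size=2)
connector.write({"event_type": "playback_start"})
connector.write({"event_type": "buffer_start"})  # triggers a flush of the batch
```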

03
Normalize Your Data Before It Gets To Your Visualization Tools For Faster And Better Observability

Data Dictionary

The Datazoom Data Dictionary is a powerful component of the platform. Its event, metadata, and fluxdata definitions power the platform's normalization capabilities. Without data normalization, that is, without standardizing similar elements from different data sources against a common lexicon, your observability will be hampered by the time and resources spent post-processing data.
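
To illustrate the common-lexicon idea, here is a hypothetical dictionary that maps two sources' field names onto canonical names; the source and field names are invented for the example:

```python
# Two sources report the same concepts under different names; the dictionary
# maps both onto one canonical field so they can be compared directly.

DICTIONARY = {
    "player_a": {"br": "bitrate_kbps", "st": "event_type"},
    "cdn_b": {"throughput": "bitrate_kbps", "record_type": "event_type"},
}

def normalize(source: str, raw: dict) -> dict:
    mapping = DICTIONARY[source]
    return {mapping.get(k, k): v for k, v in raw.items()}

print(normalize("player_a", {"br": 4500, "st": "playback_start"}))
print(normalize("cdn_b", {"throughput": 4500, "record_type": "delivery"}))
# Both outputs now expose 'bitrate_kbps' and 'event_type' under the same names.
```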

Streaming Protocol

This metadata value indicates the streaming format being used to deliver the content to the viewer, such as HLS or MPEG-DASH.

04
A Growing Library of Integrations To Build a Better Data Pipeline, Faster.

Ecosystem
