Fast And Efficient Data Delivery With Datazoom

When time is of the essence, any delay in using data to make critical decisions, such as identifying a performance problem, can have significant business impact. When you rely on APIs from third-party services for data access, you are limited by the frequency and speed at which those services expose and refresh new data. Data freshness challenges, combined with fragmented storage, disparate visualization tools, and the resulting barriers to decision making, can create serious problems. With Datazoom, you get the data you need, where you need it, when you need it, leveraging pre-integrated connectors to the tools you already use.


You probably already have visualization tools, dashboards, and data lakes which form the foundation of your streaming video operations. The Datazoom Platform includes a library of connectors to many of the systems you might already use, so you can deliver your data where it needs to be.

Get Data When And Where You Need It

There are many challenges associated with getting data from endpoints, like video players, audio players, and web/native apps, but none greater than delivery. When operators rely on third-party providers for analytics, monitoring, and other data tools, they can become locked into systems that don’t provide the delivery configuration and flexibility they need to improve and speed up their decision making.

Time it takes for data to be available
Without Datazoom: Depends on when the provider determines it. Delivery could be delayed by additional processing.
With Datazoom: Data is collected and delivered at the rate and speed you need.

Data is stored in separate locations
Without Datazoom: Third-party providers may store data in their own cloud data lakes, making it difficult to relate data to other sources.
With Datazoom: Through off-the-shelf connectors, data is delivered to the storage locations and tools you already use.

Proprietary visualization tools
Without Datazoom: Third-party providers may require that the data they collect is only viewed through their proprietary tools, which obfuscates calculations and complicates operations.
With Datazoom: Through off-the-shelf connectors, data can be delivered right to locations to which existing operational tools (such as Looker and Datadog) are already connected.

With delivery that is configurable and flexible enough to meet an operator's specific business and technology stack requirements, businesses can realize significant value:

Improved data relationships

When data collected from endpoints is delivered to existing data storage (such as an enterprise data lake), data can be better related to generate improved insights which, in turn, may generate new revenue opportunities.

Faster time-to-decision

When data is delivered as quickly as it’s needed and to the tools that are already used, business decisions can be made faster.

Fewer errors

When data is delivered in a consistent manner, there is less chance of errors creeping into datasets, improving data governance.


Delivery of data through the Datazoom DaaS platform is flexible, configurable, and highly reliable.

How Data Delivery Works Through Datazoom

Datazoom’s data delivery feature is scalable, reliable, and resilient. The platform ensures that the data users rely on to make business decisions flows uninterrupted. Flexibility is built in: users can simply drop new or different Connectors onto the visual data pipe builder canvas to deliver data somewhere else.


We collect a superset of the behavioral events you want to receive and route them to different Connector tools based on what data each tool needs. As events are processed in our system, we confirm the current routing rules and direct these messages to the correct destinations.
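The routing step described above can be sketched as a lookup from event type to the set of Connector destinations that should receive it. This is a minimal illustration only; the connector names, event types, and rule structure here are hypothetical, not Datazoom's actual configuration model.

```python
# Hypothetical routing rules: each Connector destination lists the event
# types it should receive. Names are illustrative, not Datazoom's API.
ROUTING_RULES = {
    "datadog": {"playback_start", "buffer_start", "error"},
    "s3_data_lake": {"playback_start", "buffer_start", "error", "heartbeat"},
}

def route(event: dict) -> list[str]:
    """Return the Connector destinations that should receive this event."""
    event_type = event.get("type")
    return [dest for dest, accepted in ROUTING_RULES.items()
            if event_type in accepted]

print(route({"type": "heartbeat"}))  # -> ['s3_data_lake']
```

Because the rules are consulted per message, changing a pipe's Connectors takes effect on the next event processed, without interrupting delivery to other destinations.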


We also allow the metadata & fluxdata contained within each event to be filtered, so the customer can control who has access to different data points in their various Connector destinations. For example, an Operations team may want different data in its tool than the Marketing team needs in its tool; they may share some, but not all, of the user client data.
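The per-destination filtering idea can be sketched as an allow-list of fields for each Connector. The destination names and field names below are invented for illustration, assuming a simple key-based filter.

```python
# Hypothetical allow-lists: each Connector destination only receives the
# metadata/fluxdata keys its team should see.
ALLOWED_FIELDS = {
    "ops_tool": {"event_id", "type", "buffer_duration", "cdn"},
    "marketing_tool": {"event_id", "type", "campaign", "device"},
}

def filter_event(event: dict, destination: str) -> dict:
    """Strip any fields the destination is not allowed to receive."""
    allowed = ALLOWED_FIELDS[destination]
    return {k: v for k, v in event.items() if k in allowed}

event = {"event_id": "e1", "type": "playback_start",
         "cdn": "cdn-a", "campaign": "spring", "device": "tv"}
print(filter_event(event, "ops_tool"))        # keeps event_id, type, cdn
print(filter_event(event, "marketing_tool"))  # keeps event_id, type, campaign, device
```

Note how both tools share the common fields (`event_id`, `type`) while each receives only its own team's data points.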

Learn More About Filtering


Once the events have been properly filtered, our Connector converts the received Datazoom message into a format compliant with what the destination needs to receive. This can include anything from restructuring the message before it leaves Datazoom to renaming keys or values to conform to a customer's or Connector's preferred data model.
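A key-renaming transform of the kind described above can be sketched as a mapping from Datazoom field names onto a destination's preferred names. The mapping below is invented for illustration and does not reflect any real Connector's data model.

```python
# Hypothetical rename map: Datazoom key -> destination's preferred key.
KEY_MAP = {
    "event_id": "id",
    "buffer_duration": "bufferingTimeMs",
}

def transform(message: dict, key_map: dict) -> dict:
    """Rename keys per the destination's data model; pass others through."""
    return {key_map.get(k, k): v for k, v in message.items()}

msg = {"event_id": "e1", "buffer_duration": 120, "cdn": "cdn-a"}
print(transform(msg, KEY_MAP))
# -> {'id': 'e1', 'bufferingTimeMs': 120, 'cdn': 'cdn-a'}
```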

Learn More About Data Transformations


Our Connectors are optimized to deliver data payloads in the most efficient way possible for each Connector destination. We support batching & multi-threaded delivery to minimize latency. For Object Store connectors like S3, GCS & Azure Blob storage, which perform better with larger objects, we have a Bulking process that balances optimal object size with customer latency goals. For these connectors we create batches of up to 1,000 messages, but wait no more than 30 seconds when traffic is low.
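The bulking trade-off described above, flush when a batch reaches 1,000 messages or 30 seconds have elapsed, whichever comes first, can be sketched as follows. This is an illustrative model only; the class and callback names are hypothetical, not Datazoom internals.

```python
import time

MAX_BATCH = 1000   # size limit from the policy above
MAX_WAIT_S = 30    # time limit from the policy above

class Bulker:
    """Accumulate messages; flush on size or age, whichever comes first."""
    def __init__(self, flush):
        self.flush = flush      # callable that writes one object to S3/GCS/etc.
        self.batch = []
        self.started = None

    def add(self, message, now=None):
        now = now if now is not None else time.monotonic()
        if not self.batch:
            self.started = now  # clock starts at the first message of a batch
        self.batch.append(message)
        if len(self.batch) >= MAX_BATCH or now - self.started >= MAX_WAIT_S:
            self.flush(self.batch)
            self.batch = []

flushed = []
b = Bulker(flushed.append)
for i in range(1000):
    b.add({"n": i}, now=0.0)
print(len(flushed[0]))  # 1000 -- the size limit triggered the flush
```

A production version would also flush on a background timer so that a half-full batch is not stranded waiting for the next message; the sketch checks age only when a message arrives.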


If we run into any issues delivering data to the Connector destination, we have multiple processes in place to redeliver the failed payloads. We immediately retry up to 3 times within our Connectors to cover intermittent internet interruptions, adding minimal lag to message delivery. If those attempts also fail, a longer-term retry mechanism comes back and attempts delivery up to 6 more times over the following hours. This allows us to ride out outages that the customer's Connector destination may be experiencing.
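The two-tier retry policy described above can be sketched as up to 3 immediate attempts followed by up to 6 deferred ones. The `deliver` callable and the control flow here are stand-ins for illustration, not Datazoom internals; in production the deferred attempts would be spread over hours rather than run in a loop.

```python
def send_with_retries(deliver, payload, immediate=3, deferred=6):
    """Try immediate retries first, then fall back to longer-term retries."""
    for _ in range(immediate):      # covers brief network blips
        if deliver(payload):
            return True
    for _ in range(deferred):       # covers longer destination outages
        if deliver(payload):
            return True
    return False  # undeliverable after 9 total attempts

attempts = {"n": 0}
def flaky(payload):
    attempts["n"] += 1
    return attempts["n"] >= 5  # simulate a destination that recovers later

print(send_with_retries(flaky, {"msg": "hello"}))  # True, on the 5th attempt
```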
