Deliver Your Data

Most technology components in the streaming video stack that provide monitoring or other data do so on their own schedule. You don’t have a choice. But with Datazoom, you can specify how quickly the data gets to where you need it to be.

01
Are You Waiting For Your Data?

Monitoring Latency Is a Big Issue

In the streaming industry, there is a lot of discussion about latency. But that discussion is mostly concerned with viewer-facing delays: how long the viewer waits for the video to start, for playback to resume after a buffering event, or for the next segment to arrive, and how far behind the live broadcast the stream is. There is another latency discussion that needs to be had: how long are you waiting for your monitoring data to reach your visualization tools? Is it every few seconds, or every few minutes? The answer to that question can significantly impact subscriber retention and engagement.

02
Get The Data You Want When You Want It

Deliver Example: Real-Time Configuration

Real-Time Explained

Each Datazoom collector supports a flexible time configuration that lets you specify the frequency at which data is collected. When datapoints are configured for collection at sub-second intervals, Datazoom employs scalable, resilient high-speed transmission protocols to ensure that the data is processed and delivered to the Connector in near real-time.
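To make the idea concrete, here is a minimal sketch of what such a time configuration might look like. The field names (`datapoints`, `interval_ms`, `delivery_mode`) are illustrative assumptions, not Datazoom's actual API; the sketch only shows the decision described above, where sub-second intervals select the near real-time delivery path.

```python
# Hypothetical collector time configuration. Field names are assumptions
# for illustration only; the 1-second threshold comes from the text above.

def collection_plan(datapoints, interval_ms):
    """Describe how often each datapoint is sampled and how it will be shipped."""
    # Sub-second collection intervals imply the high-speed, near real-time path.
    realtime = interval_ms < 1000
    return {
        "datapoints": list(datapoints),
        "interval_ms": interval_ms,
        "delivery_mode": "near_real_time" if realtime else "batched",
    }

plan = collection_plan(["bitrate", "buffer_length"], interval_ms=500)
```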

03
The Technology Of Delivery

How Does It Work?

Delivering data to a Connector involves five aspects: Route, Filter, Package, Deliver, and Retry.

  • Route

We collect a superset of the behavioral events a customer wants to receive and route them to different Connector tools based on what data is needed in each specific tool. As events are processed in our system, we confirm the current routing rules and direct each message to the correct destinations.

  • Filter

We also allow the metadata & fluxdata contained within each event to be filtered, so the customer can control who has access to different data points across their various Connector destinations. For example, an Operations team may want different data in its tool than the Marketing team needs in theirs; they may share some, but not all, of the user client data.

  • Package

Once the events have been properly filtered, our Connector converts the received Datazoom message into the format the destination needs to receive. This can include anything from restructuring the message before it leaves Datazoom to renaming keys or values to conform to a customer's or Connector's preferred data model.

  • Deliver

Our Connectors are optimized to deliver data payloads in the most efficient way possible for each Connector destination. We support batching & multi-threaded delivery to minimize latency. For Object Store connectors like S3, GCS & Azure Blob Storage, which perform better with larger objects, we have a Bulking process that balances optimal object size with customer latency goals. For these connectors we create large batches of up to 1,000 messages, but wait no more than 30 seconds when traffic is low.

  • Retry

If we run into any issues delivering data to the Connector destination, we have multiple processes in place to redeliver the failed payloads. We immediately retry up to 3 times within our Connectors to cover intermittent internet interruptions, keeping the added lag on message delivery minimal. If those attempts also fail, a longer-term retry mechanism comes back and attempts delivery up to 6 more times over a couple of hours. This allows us to ride out outages that the customer’s Connector destination may be experiencing.
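The Route, Filter, and Package steps above can be sketched as three small transformations. The function names, rule shapes, and sample event below are illustrative assumptions, not Datazoom internals; they only demonstrate the flow of one event through routing rules, field filtering, and key renaming.

```python
# Illustrative Route -> Filter -> Package pipeline. All names and data
# shapes here are hypothetical, used only to mirror the steps above.

def route(event, routing_rules):
    """Return the destinations whose routing rules match this event's type."""
    return [name for name, wanted_types in routing_rules.items()
            if event["event_type"] in wanted_types]

def filter_fields(event, allowed_fields):
    """Keep only the fields a given destination is allowed to see."""
    return {k: v for k, v in event.items() if k in allowed_fields}

def package(event, key_map):
    """Rename keys to conform to the destination's preferred data model."""
    return {key_map.get(k, k): v for k, v in event.items()}

# Hypothetical routing rules: each tool subscribes to the event types it needs.
routing_rules = {
    "ops_tool": {"buffer_start", "playback_error"},
    "marketing_tool": {"play", "ad_view"},
}

event = {"event_type": "playback_error", "user_id": "u1", "bitrate": 4500}

destinations = route(event, routing_rules)
filtered = filter_fields(event, {"event_type", "bitrate"})
packaged = package(filtered, {"event_type": "eventName"})
```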
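The Deliver and Retry behavior can be sketched the same way. The 1,000-message batch size, 30-second wait, and 3 immediate retries come from the text above; everything else (class names, error types, the flush check) is an illustrative assumption, not an implementation of Datazoom's actual Connectors.

```python
# Sketch of the Bulking and immediate-retry behavior described above.
# Thresholds are from the text; the structure is hypothetical.
import time

MAX_BATCH = 1000        # flush once the batch is full
MAX_WAIT_SECONDS = 30   # ... or once the oldest message has waited this long
IMMEDIATE_RETRIES = 3   # quick retries for intermittent interruptions

class Bulker:
    """Accumulate messages into large objects for Object Store connectors."""

    def __init__(self):
        self.batch = []
        self.opened_at = None

    def add(self, message):
        if not self.batch:
            self.opened_at = time.monotonic()
        self.batch.append(message)

    def ready(self):
        """Flush when full, or when low traffic has kept the batch open too long."""
        if not self.batch:
            return False
        full = len(self.batch) >= MAX_BATCH
        stale = time.monotonic() - self.opened_at >= MAX_WAIT_SECONDS
        return full or stale

def deliver_with_retry(send, payload):
    """One attempt plus up to 3 immediate retries; then defer to the
    longer-term retry mechanism (not modeled here)."""
    for _attempt in range(1 + IMMEDIATE_RETRIES):
        try:
            send(payload)
            return True
        except ConnectionError:
            continue
    return False  # handed off to the longer-term (hours-scale) retries
```

A real implementation would also persist failed payloads so the longer-term mechanism can pick them up hours later; the boolean return here just marks that hand-off point.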
