Real-Time Data Controls

Configure which data travels through each of your data pipes and how much, what to combine it with, and how to transform it as needed for your data tools.

1. Choose data points

From data pipe to data pipe, choose which data points you want to collect from which Collectors.
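A minimal sketch of what that per-pipe selection could look like as configuration; the pipe, Collector, and event names below are illustrative assumptions, not product identifiers.

```python
# Illustrative per-pipe selection of data points by Collector; the pipe,
# Collector, and event names are assumptions, not product identifiers.
DATA_POINT_SELECTION = {
    "qoe_pipe": {
        "web_collector": {"PlaybackStart", "Rebuffering", "BitrateChange"},
        "mobile_collector": {"PlaybackStart", "Rebuffering"},
    },
    "ad_pipe": {
        "web_collector": {"AdStart", "AdEngagement", "AdComplete"},
    },
}

def is_selected(pipe: str, collector: str, event_name: str) -> bool:
    """Collect an event only if its data point is selected for this
    data pipe and Collector."""
    return event_name in DATA_POINT_SELECTION.get(pipe, {}).get(collector, set())
```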

2. Optimize your data volume

Sampling: specify the percentage of sessions to sample. Example: I want 5% of sessions on my QoE data pipes and 100% of sessions on my ad data pipes.
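A sketch of how percentage-based session sampling can be expressed, assuming sampling is applied per session rather than per event; the pipe names and the hash-based selection are illustrative, not the product's API.

```python
import hashlib

# Illustrative sampling rates per data pipe (pipe names are assumptions).
SAMPLING_RATES = {
    "qoe_pipe": 0.05,  # 5% of sessions on QoE data pipes
    "ad_pipe": 1.00,   # 100% of sessions on ad data pipes
}

def keep_session(pipe: str, session_id: str) -> bool:
    """Deterministically decide whether a session is sampled, so every
    event from a kept session is kept: hash the session ID into [0, 1)
    and compare against the pipe's sampling rate."""
    rate = SAMPLING_RATES.get(pipe, 1.0)
    digest = hashlib.sha256(session_id.encode("utf-8")).hexdigest()
    bucket = int(digest[:8], 16) / 2**32
    return bucket < rate
```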

Filtering: collect only events with certain metadata or remove metadata from certain events. Example: I only want to collect geographic and device metadata on the first event of the session.
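A sketch of the "metadata on the first event only" example; the event shape and field names (seq, geo_*, device_*) are assumptions, not the product's schema.

```python
# Assumed event shape: {"session_id": ..., "seq": ..., "metadata": {...}}.
GEO_AND_DEVICE_FIELDS = {"geo_country", "geo_city", "device_model", "device_os"}

def filter_metadata(event: dict) -> dict:
    """Keep geographic and device metadata only on the first event of a
    session (seq == 0) and strip those fields from every later event."""
    if event.get("seq", 0) == 0:
        return event
    trimmed = dict(event)
    trimmed["metadata"] = {
        key: value
        for key, value in event.get("metadata", {}).items()
        if key not in GEO_AND_DEVICE_FIELDS
    }
    return trimmed
```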

Enrichment: add information from backend systems to event streams. Example: I want CPMs from my Google Ad Manager report correlated to my ad engagement events.
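A sketch of enrichment as a lookup keyed on a shared identifier; the report columns ("line_item_id", "cpm") and the event field names are assumptions about an exported report, not Google Ad Manager's actual schema.

```python
import csv

def load_cpm_lookup(report_path: str) -> dict:
    """Build an in-memory CPM lookup from an exported report file.
    Column names are illustrative assumptions."""
    with open(report_path, newline="") as report:
        return {row["line_item_id"]: float(row["cpm"]) for row in csv.DictReader(report)}

def enrich(event: dict, cpm_by_line_item: dict) -> dict:
    """Attach the reported CPM to ad engagement events before delivery."""
    if event.get("event_name") != "AdEngagement":
        return event
    cpm = cpm_by_line_item.get(event.get("line_item_id"))
    return {**event, "cpm": cpm} if cpm is not None else event
```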

3. Avoid post processing

Transformations: change any event or dimension name or its value based on logical operators. Example: when a Milestone event has a Milestone Percent dimension that equals ‘25’, change the event name to First Quartile.
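A sketch of that rule as a conditional rename; the event and dimension structure shown is an assumption used for illustration.

```python
def apply_transformations(event: dict) -> dict:
    """Conditional rename: a Milestone event whose Milestone Percent
    dimension equals '25' is delivered as First Quartile, so no
    post-processing is needed downstream."""
    is_first_quartile = (
        event.get("event_name") == "Milestone"
        and event.get("dimensions", {}).get("Milestone Percent") == "25"
    )
    return {**event, "event_name": "First Quartile"} if is_first_quartile else event
```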

Excellence in Data Quality

Guaranteed under SLA

Hundreds of validations and alerts ensure that expected messages are received in the expected time, format, and location; a sketch of two of these checks follows the list below.

Expected number of messages (prevent loss)

Expected message types (selected events, metadata, fluxdata)

Enrichment source data is up to date and applied to messages

Duplicate messages are removed

Failed delivery to connector is retried

Transformation rules are applied

Event message chronology

Data type validation

Ingress and egress lag does not exceed max threshold

Errors (4XX, 5XX) do not exceed max threshold

Infrastructure capacity matches expected load trajectories
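As referenced above, here is a minimal sketch of two of the listed checks, duplicate removal and the ingress/egress lag threshold; the message field names and the five-minute threshold are assumptions, not the product's implementation.

```python
from datetime import datetime, timedelta

MAX_LAG = timedelta(minutes=5)  # assumed threshold; configurable in practice
_seen_message_ids: set = set()

def validate(message: dict) -> list:
    """Raise alerts for two of the checks listed above: duplicate
    messages and ingress-to-egress lag beyond the max threshold.
    Field names (message_id, ingress_ts, egress_ts) are assumptions."""
    alerts = []

    # Duplicate messages are flagged so they can be removed.
    message_id = message["message_id"]
    if message_id in _seen_message_ids:
        alerts.append(f"duplicate message {message_id}")
    _seen_message_ids.add(message_id)

    # Ingress-to-egress lag must not exceed the max threshold.
    ingress = datetime.fromisoformat(message["ingress_ts"])
    egress = datetime.fromisoformat(message["egress_ts"])
    if egress - ingress > MAX_LAG:
        alerts.append(f"lag {egress - ingress} exceeds max threshold {MAX_LAG}")

    return alerts
```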