How To Configure The Mezmo Telemetry Pipeline

    4 MIN READ

    Mezmo Telemetry Pipeline helps organizations ingest, transform, and route telemetry data to control costs and drive actionability.

    Modern organizations are adopting telemetry pipelines to manage the growth of telemetry data (logs, metrics, traces, and events) and to get the most value from their data investments.

    Some of the common business outcomes organizations look for include:

    • Cost control: Manage the increasing cost of observability data.
    • Enable access: Route the right data to the right observability tools and platforms, ensuring that teams can access the data they need to identify and solve issues.
    • Ensure data usability: Transform data to provide context and ensure it is formatted for easy consumption by downstream observability tools and systems, increasing developer and SRE productivity.
    • Reduce compliance risk: Lower risk by masking or redacting sensitive data.
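
    As a concrete illustration of the last point, masking sensitive values before data leaves the pipeline can be sketched in a few lines. This is a generic example of the technique, not Mezmo's own processor API, and the patterns shown are illustrative, not exhaustive:

```python
import re

# Illustrative patterns for common sensitive values (assumed, not exhaustive)
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(message: str) -> str:
    """Replace sensitive substrings with a <REDACTED:type> placeholder."""
    for name, pattern in PATTERNS.items():
        message = pattern.sub(f"<REDACTED:{name}>", message)
    return message
```

    A real pipeline would apply a processor like this to every event before it reaches a destination, so sensitive data never lands in downstream tools.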

    To learn more about the benefits of a telemetry pipeline, read the eBook “Getting The Most Value From Telemetry Data” from Ventana Research.

    In this product tour, we’ll take you through the steps to build a telemetry pipeline in the Mezmo platform.

    Step 1: Set Your Telemetry Data Sources

    Your sources can include a variety of systems and applications, such as Splunk HEC, Fluentd, AWS S3, HTTP, or the Mezmo agent.
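
    For example, an HEC-style source receives events as JSON over HTTP. A client shipping a log event to such an endpoint can be sketched roughly as follows; the endpoint URL and token here are placeholders, not real Mezmo values:

```python
import json
import urllib.request

def build_hec_request(endpoint: str, token: str, event: dict) -> urllib.request.Request:
    """Build a Splunk HEC-style POST request; the caller decides when to send it."""
    body = json.dumps({"event": event, "sourcetype": "_json"}).encode()
    return urllib.request.Request(
        url=endpoint,  # e.g. "https://<your-host>/services/collector/event" (placeholder)
        data=body,
        headers={
            "Authorization": f"Splunk {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

    Sending the request with `urllib.request.urlopen` (or any HTTP client) would deliver the event to the pipeline's HEC-compatible source.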

    Step 2: Establish the Destinations

    Your destinations may include Kafka, Splunk, Datadog, Grafana, Mezmo Log Analysis for downstream analysis, or AWS S3 for lower-cost storage. You do not want to send all your data to high-cost platforms: processors for filtering, sampling, and routing ensure that only the necessary data reaches each downstream system. You can also sample data from a live stream and run a simulation to verify that the right data is flowing into the right systems.
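
    The filter-and-sample routing idea can be sketched in a few lines. This is generic illustrative logic, not Mezmo's routing configuration; the destination names and the `level` field are assumptions:

```python
import random

def route(event: dict, sample_rate: float = 0.1) -> list:
    """Decide which destinations receive an event (illustrative logic only)."""
    destinations = ["s3"]  # keep everything in low-cost storage
    level = event.get("level", "info")
    if level in ("error", "critical"):
        destinations.append("splunk")   # all errors go to the analysis platform
    elif random.random() < sample_rate:
        destinations.append("splunk")   # sample a fraction of routine events
    return destinations
```

    With logic like this, every event lands in cheap storage, while the high-cost platform only sees errors plus a small sample of routine traffic.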

    Step 3: Add Processors for Data Formatting and Transformation

    Telemetry data is seldom ready to use; it needs processing before a downstream system can ingest it for useful analysis. This is where you add transformations such as date formatting, deduplication, aggregation, or encryption.
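
    Two of those transformations, date formatting and deduplication, can be sketched like this. The `ts` and `message` field names are assumptions for the example, not a fixed schema:

```python
from datetime import datetime, timezone

def normalize_timestamp(event: dict) -> dict:
    """Convert a Unix-epoch 'ts' field (assumed name) to ISO 8601 UTC."""
    ts = event.pop("ts")
    event["timestamp"] = datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()
    return event

def dedupe(events):
    """Drop consecutive events whose 'message' repeats the previous one."""
    last = object()  # sentinel that never equals a real message
    for event in events:
        if event.get("message") != last:
            last = event.get("message")
            yield event
```

    A downstream system that expects ISO 8601 timestamps can then ingest the events directly, and repeated messages no longer inflate storage and query costs.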

    To learn more about Data Transformations, read this whitepaper.

    Step 4:  Pipeline Management

    As you deploy your pipeline, you want visibility into the overall volume of pipeline data, as well as the ability to dive deeper into data management to control, troubleshoot, encrypt, and more.

    If you are considering implementing a telemetry pipeline for your organization, we recommend reading through these Best Practices before you start. 

    Interested in learning more? Start your free trial of Mezmo Telemetry Pipeline now!

    You can then visit our documentation to learn how to get started.

    We always love to hear your ideas. If you want to share them with our product team, contact us here.