EMPOWER YOUR TEAMS WITH DATA INSIGHTS
Mezmo helps organizations derive more value from their telemetry data. Bring together data from various sources via our open platform, apply out-of-the-box and custom processors to transform data, and route it to any destination for monitoring, visualization, and further analysis.
Control Data Volume
Reduce data volume by identifying unstructured data, removing low-value and repetitive data, and using sampling to reduce chatter. Use intelligent routing rules to send certain data types to low-cost storage. Get data insights before it reaches high-cost destinations.
- Filter: Use the Filter Processor to drop events that may not be meaningful or to reduce the total amount of data forwarded to a subsequent processor or destination.
- Reduce: Combines multiple log input events into a single event based on specified criteria. Reduce merges many events into one over a window of time.
- Dedupe: Reduces “chatter” in logs by emitting only the first matching record from a set of records being compared. Overlap of data across fields is the key to making the Dedupe Processor work effectively.
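To make the volume-reduction idea concrete, here is a minimal Python sketch of filter- and dedupe-style processing. This is an illustration of the technique only, not Mezmo's processor configuration or API; the function names and event shapes are assumptions for the example.

```python
# Illustrative sketch only -- not Mezmo's API. Shows how filter- and
# dedupe-style steps shrink an event stream before it reaches a destination.

def filter_events(events, predicate):
    """Keep only events matching the predicate (Filter-style drop)."""
    return [e for e in events if predicate(e)]

def dedupe_events(events, fields):
    """Emit the first event for each unique combination of field values
    (Dedupe-style chatter reduction)."""
    seen = set()
    out = []
    for e in events:
        key = tuple(e.get(f) for f in fields)
        if key not in seen:
            seen.add(key)
            out.append(e)
    return out

events = [
    {"level": "debug", "msg": "heartbeat"},
    {"level": "error", "msg": "disk full"},
    {"level": "error", "msg": "disk full"},
    {"level": "info", "msg": "started"},
]

# Drop low-value debug chatter, then collapse repeated records.
kept = filter_events(events, lambda e: e["level"] != "debug")
unique = dedupe_events(kept, fields=["level", "msg"])
print(len(events), "->", len(unique))  # 4 -> 2
```

The same two-stage pattern, drop what is never needed and collapse what repeats, is what lets a pipeline cut volume without losing the first occurrence of any distinct record.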
Increase Data Value
Optimize data flows by transforming and enriching data. Modify data as needed for compatibility with various end destinations. Enrich and augment data for better context. Scrub sensitive data, or encrypt it, to maintain compliance standards.
- Event to Metric: This processor provides an easy way to create a new metric event within the pipeline, typically from an existing log message. The new metric event can use data from the log to generate the metric, including the desired value.
- Aggregate: Combines metric data points over a window of time. Metric data can have more data points than needed to understand the behavior of a system, and excess metrics lead to higher storage costs without an increase in value.
- Sample: The Sample Processor is useful when the number of events exceeds the number required to understand the data.
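A short Python sketch of the event-to-metric and sampling ideas follows. Again, this is not Mezmo's API; the field names and 1-in-N sampling strategy are assumptions chosen to illustrate the techniques.

```python
# Illustrative sketch only -- not Mezmo's API. Shows deriving a metric
# event from a log event, and thinning a stream by keeping every Nth event.

def event_to_metric(event, name, value_field):
    """Create a new metric event from a log event, using a field of the
    log as the metric's value (Event to Metric-style)."""
    return {
        "metric": name,
        "value": event[value_field],
        "tags": {"host": event.get("host")},
    }

def sample(events, rate):
    """Keep one event in every `rate` events (Sample-style)."""
    return [e for i, e in enumerate(events) if i % rate == 0]

log = {"host": "web-1", "msg": "request served", "duration_ms": 87}
metric = event_to_metric(log, name="request_duration_ms", value_field="duration_ms")
print(metric["metric"], metric["value"])  # request_duration_ms 87

events = [{"n": i} for i in range(10)]
print(len(sample(events, rate=5)))  # 2
```

Converting high-volume logs into compact metric events, then sampling what remains, is how a pipeline preserves the signal needed to understand system behavior at a fraction of the original data volume.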
Manage Audit, Risk & Compliance
Enable timely identification of risks arising from new code releases and deployments. Know when any compliance-related event happens (e.g., code checked in) and trigger an action in the pipeline. Scrub, mask, or redact sensitive data, and manage data routing to comply with data residency laws.
- Encrypt: Use the Encrypt Processor when sending sensitive log data to storage, for example, when retaining log data containing account names and passwords.
- Route: Separates events from a single stream into multiple streams, letting you choose the next processor or destination to which each event is sent. A typical use of this processor is routing processed log data to different destinations.
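The compliance steps above can be sketched in a few lines of Python. This is an illustration only, not Mezmo's API; it uses simple masking in place of real encryption (a production Encrypt processor would use an actual cipher), and the rule names and predicates are assumptions.

```python
# Illustrative sketch only -- not Mezmo's API. Shows scrubbing a sensitive
# field and splitting one stream into multiple routed streams. Masking here
# stands in for real encryption, which would require a proper cipher.

def mask_field(event, field):
    """Replace a sensitive field's value with asterisks (scrub/mask-style)."""
    masked = dict(event)
    if field in masked:
        masked[field] = "*" * len(str(masked[field]))
    return masked

def route(events, rules):
    """Split one stream into named streams using predicate rules
    (Route-style). Each event goes to the first rule it matches."""
    streams = {name: [] for name in rules}
    streams["_unmatched"] = []
    for e in events:
        for name, pred in rules.items():
            if pred(e):
                streams[name].append(e)
                break
        else:
            streams["_unmatched"].append(e)
    return streams

events = [
    {"level": "error", "user": "alice", "password": "s3cret"},
    {"level": "info", "user": "bob", "password": "hunter2"},
]
scrubbed = [mask_field(e, "password") for e in events]
streams = route(scrubbed, {
    "alerts": lambda e: e["level"] == "error",
    "archive": lambda e: True,
})
print(scrubbed[0]["password"])  # ******
print(len(streams["alerts"]), len(streams["archive"]))  # 1 1
```

Scrubbing before routing matters for compliance: once sensitive values are masked or encrypted upstream, every downstream destination, whether a low-cost archive or an alerting tool, receives only sanitized data.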