Overview
Processors are fundamental components in log processing pipelines that perform specific operations on log data. They are responsible for transforming, enriching, and manipulating log entries as they flow through the system. Each processor is designed to handle a specific type of operation, from simple field modifications to complex data transformations.
AI
AI processors harness the power of artificial intelligence APIs for sophisticated content analysis and processing. These processors utilize various AI services to perform advanced text analysis, classification, and generation tasks. They enable intelligent processing of content, making it possible to extract insights and meaning from complex data.
Anthropic
Processes content with Anthropic's Claude API
Azure OpenAI
Processes content with Azure OpenAI API
OpenAI
Uses OpenAI's API for content analysis
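As an illustration of the kind of request an AI processor issues, the sketch below builds (without sending) a JSON body in the shape of Anthropic's Messages API; the helper name, default model alias, and token limit are placeholder assumptions, not this product's configuration syntax.

```python
import json

# Hypothetical helper: assembles the JSON body a Claude-style processor
# might POST to Anthropic's Messages API. The model name is an assumption.
def build_claude_request(prompt, model="claude-3-5-sonnet-latest", max_tokens=1024):
    return {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

# The body would be sent to the API with an x-api-key header;
# no network request is made here.
body = json.dumps(build_claude_request("Summarize: disk full on /dev/sda1"))
```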
Analytics
Analytics processors gather and shape data points to make them suitable for metrics and analysis. They select the values that reveal critical information about the systems generating the data and process them to surface the relevant signal.
Confidence
Calculates confidence scores from scoring data with multiple normalization methods
Debug
Logs debugging information
Dynamic Sample
Adjusts sampling rates
Sample
Reduces data volume by sampling
Score
Evaluates and scores data against configurable rules for pattern recognition and classification
Suppress
Suppresses duplicate events within a sliding time window using a key expression
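The Suppress entry above describes a common deduplication technique. The following is a minimal Python sketch of the idea, not the processor's actual implementation: events sharing a key are admitted at most once per sliding window.

```python
def make_suppressor(window_seconds):
    """Return a predicate that admits an event key at most once per window."""
    last_emitted = {}  # key -> timestamp of the last admitted event

    def allow(key, timestamp):
        last = last_emitted.get(key)
        if last is not None and timestamp - last < window_seconds:
            return False  # duplicate inside the sliding window: suppress
        last_emitted[key] = timestamp
        return True

    return allow
```

With a 60-second window, `allow("host1:login", 0)` admits the event, a repeat at `timestamp=30` is suppressed, and another at `timestamp=90` is admitted again.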
Arithmetic
Arithmetic processors perform mathematical operations and calculations on numeric field values within log data. They support basic mathematical functions like addition, subtraction, multiplication, and division, as well as more complex operations such as calculating percentages, averages, and statistical computations. These processors enable quantitative analysis of log data by transforming raw numbers into meaningful metrics and derived values.
Abs
Absolute value of a field
Add
Adds numeric values
Ceil
Rounds numbers up
Checksum
Calculates cryptographic and non-cryptographic checksums of field values
Divide
Divides values
Floor
Rounds numbers down
Math
Performs mathematical operations
Max
Calculates the maximum value
Min
Calculates the minimum value
Modulo
Calculates the remainder
Multiply
Multiplies two numeric values
Ordinal
Converts numbers to ordinal format in multiple languages
Power
Raises a numeric value to a power
Round
Rounds numeric values
Sqrt
Calculates the square root
Subtract
Subtracts numeric values
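All of these processors share one shape: read a numeric field, apply an operation, write the result. A hedged Python sketch of that pattern (the helper and its parameters are illustrative, not this product's API):

```python
import operator

# Illustrative operation table covering the basic arithmetic processors.
OPS = {"add": operator.add, "subtract": operator.sub,
       "multiply": operator.mul, "divide": operator.truediv}

def apply_arithmetic(doc, field, op, operand, target=None):
    """Apply `op` to a numeric field, writing to `target` (or back in place)."""
    doc = dict(doc)  # leave the input document untouched
    doc[target or field] = OPS[op](float(doc[field]), operand)
    return doc
```

For example, `apply_arithmetic({"bytes": 2048}, "bytes", "divide", 1024, target="kib")` yields `{"bytes": 2048, "kib": 2.0}`, deriving a new metric without destroying the source field.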
Flow Control
Flow Control processors manage the execution paths and logic within processing pipelines. They direct how documents move through the system, handle conditional processing and filtering, and organize pipeline structure. These processors are essential for creating sophisticated processing logic and maintaining efficient pipeline organization.
Break
Halts execution of remaining processors in the current pipeline chain and forwards the log entry to its target
Case
Conditional field assignment using case-when logic
Check Schema
Validates event data against ASIM or OCSF schema definitions
Commit
Finalizes staged routes from the reroute processor
Continue
Continues to the next processor in the pipeline chain
Contains
Checks for the presence of a value
Date Index
Generates time-based index names
Discard
Removes staged routes from the pipeline processing flow
Drop
Conditionally stops processing a document
Fail
Raises failures when conditions are met
Final
Terminates a pipeline
Foreach
Applies processors to arrays
Go To
Jumps to specific points in the processing pipeline
Group
Groups multiple processors together for conditional execution and organization
IFF
Conditional field assignment processor
Matches
Tests a field value against a text pattern or regular expression
Pipeline
Executes another pipeline
Recover
Terminates the pipeline successfully, ignoring any previous errors
Regex Filter
Filters events using regexes
Return
Finalizes processing and prevents further pipeline execution
Reroute
Directs logs to specific destinations
Script
Executes scripts
Select
Extracts a specific element from arrays by position
Slice
Extracts a portion of an array field
Take
Extracts a specified number of characters or elements from strings and arrays
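The interplay between a processor chain and a drop-style processor can be sketched in a few lines of Python. This is a conceptual model under the assumption that returning nothing signals a dropped document, not the product's actual execution engine:

```python
def run_chain(doc, processors):
    """Run processors in order; a processor returning None drops the document."""
    for processor in processors:
        doc = processor(doc)
        if doc is None:
            return None  # drop-style processors short-circuit the chain
    return doc

def drop_if(predicate):
    """Sketch of a Drop-style processor: discard the document when predicate holds."""
    return lambda doc: None if predicate(doc) else doc
```

Here `run_chain({"level": "debug"}, [drop_if(lambda d: d["level"] == "debug")])` returns `None`: the document is dropped and no later processor runs.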
Date and Time
Date and Time processors handle temporal data operations including parsing, formatting, and manipulating date and time values. They convert between different date formats, extract time components, calculate time differences, and manage timezone conversions. These processors are essential for standardizing temporal data and performing time-based analysis on log entries.
Date
Parses dates from date fields
Duration
Converts durations to seconds
Time Shift
Shifts timestamps by specified amounts with timezone conversion
Wait
Introduces a time delay
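The duration-to-seconds conversion mentioned above can be illustrated with a short Python sketch; the exact unit syntax the Duration processor accepts is an assumption here.

```python
import re

_UNIT_SECONDS = {"d": 86400, "h": 3600, "m": 60, "s": 1}

def duration_to_seconds(text):
    """Convert compact durations such as '1h30m' or '90s' to seconds."""
    total = 0.0
    for value, unit in re.findall(r"(\d+(?:\.\d+)?)([dhms])", text):
        total += float(value) * _UNIT_SECONDS[unit]
    return total
```

For instance, `duration_to_seconds("1h30m")` returns `5400.0`, turning a human-readable duration field into a value that can be aggregated.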
Enrich
Enrichment processors enhance log data by incorporating additional context and information from external sources. They add value to existing data by integrating geographical information, performing DNS lookups, and adding domain intelligence. These processors connect with external databases and services to provide comprehensive context to your log data, making it more valuable for analysis and understanding.
AAD Error Code
Converts Azure Active Directory error codes to human-readable descriptions
Attachment
Extracts content and metadata
Circle
Converts circles to polygons
DNS Lookup
Performs and caches DNS lookups
Error Code
Decodes Windows system error codes into human-readable descriptions
Enrich
Enriches documents using lookup tables and SQL queries
Geo Grid
Converts geo-grid definitions to shapes
Geo IP
Adds geographic information
Lookup
Enriches documents using lookup tables
Registered Domain
Extracts domain components
Snowflake
Generates a unique Snowflake ID
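The Snowflake entry refers to the well-known 64-bit ID layout. A minimal sketch of that composition, assuming the classic field widths (41-bit millisecond timestamp, 10-bit worker id, 12-bit sequence); the widths and epoch the processor actually uses are assumptions:

```python
def snowflake_id(timestamp_ms, worker_id, sequence, epoch_ms=0):
    """Compose a 64-bit Snowflake-style ID from its three components."""
    # Assumed layout: 41-bit timestamp | 10-bit worker | 12-bit sequence.
    assert 0 <= worker_id < 1 << 10 and 0 <= sequence < 1 << 12
    return ((timestamp_ms - epoch_ms) << 22) | (worker_id << 12) | sequence
```

Because the timestamp occupies the high bits, IDs generated later always sort after earlier ones, which is the property that makes Snowflake IDs useful as sortable event identifiers.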
Data Manipulation
Data Manipulation processors modify existing data fields and values to ensure proper formatting and structure. They handle tasks such as appending values, converting data types, managing field structures, string manipulation, and data transformation. These processors are fundamental for maintaining data consistency and preparing information for further processing or analysis.
Append
Appends values to fields
Bag Pack
Creates a map (bag) from key-value pairs with template support
Bytes
Expresses values in bytes
Camel Case
Converts strings to camelCase format
Clean
Removes unwanted characters from string fields with configurable cleaning modes
Coalesce
Returns the first non-null, non-empty value from a list of fields
Compact
Removes empty fields from documents
Convert
Converts values between types
Dot Expander
Expands dot notation field names into nested object structures
Dot Nester
Flattens nested objects into dot notation fields
Dot Case
Converts strings to dot.case format
Enforce Schema
Validates and enforces data schemas on log entries
Expand Range
Expands range expressions into arrays of individual values
Gsub
Regular expression-based replacement
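The dot-expansion transformation described above (the inverse of dot-nesting) can be sketched in a few lines of Python; this is an illustration of the technique, not the processor's implementation:

```python
def expand_dots(doc):
    """Expand dot-notation keys into nested objects: {'a.b': 1} -> {'a': {'b': 1}}."""
    expanded = {}
    for key, value in doc.items():
        *parents, leaf = key.split(".")
        node = expanded
        for part in parents:
            node = node.setdefault(part, {})  # descend, creating objects as needed
        node[leaf] = value
    return expanded
```

For example, `expand_dots({"a.b": 1, "a.c": 2, "d": 3})` produces `{"a": {"b": 1, "c": 2}, "d": 3}`, merging sibling keys under a shared parent object.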