How to parse JSON structured logs with JSONPath expressions
- Step 1: Copy structured log lines — Copy a batch of JSON structured log lines from your terminal, log file, or observability platform's export. Each line should be a complete JSON object (NDJSON format).
- Step 2: Paste into the extractor — Paste the NDJSON batch. The tool parses each line as a separate JSON object and applies the JSONPath expression to every line in the batch.
- Step 3: Write the extraction expression — Use $.msg to extract message text, $.traceId for trace identifiers, or $.err.message for nested error messages. Use filter expressions like $[?(@.level == 'error')] to select only error lines before extracting.
- Step 4: Review and export matched values — The result shows matched values from all log lines. Export as a flat list for frequency analysis, or as filtered NDJSON containing only the matching log lines for further review.
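The steps above can be approximated with a short stdlib-only Python sketch. The log field names (`level`, `msg`, `traceId`) and the `get_path` helper are illustrative assumptions, not part of the tool; a full JSONPath engine would be needed for filters and wildcards.

```python
import json

# Hypothetical NDJSON batch with assumed field names.
ndjson = """\
{"level": "error", "msg": "db timeout", "traceId": "t1"}
{"level": "info", "msg": "request ok", "traceId": "t2"}
{"level": "error", "msg": "db timeout", "traceId": "t3"}
"""

def get_path(obj, path):
    """Resolve a simple dotted JSONPath like '$.err.message' against a dict.

    Illustrative helper only: handles dotted keys, not filters or wildcards.
    """
    cur = obj
    for key in path.lstrip("$.").split("."):
        if not isinstance(cur, dict) or key not in cur:
            return None
        cur = cur[key]
    return cur

# Steps 1-2: parse each line as a separate JSON object.
lines = [json.loads(line) for line in ndjson.splitlines() if line.strip()]

# Step 3: filter like $[?(@.level == 'error')], then extract $.msg.
errors = [line for line in lines if line.get("level") == "error"]
messages = [get_path(line, "$.msg") for line in errors]

# Step 4: review the matched values as a flat list.
print(messages)  # -> ['db timeout', 'db timeout']
```

The same filter-then-extract shape applies to any field path, e.g. `$.err.message` for nested error details.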
Frequently asked questions
How do I find the most frequent error messages across 1000 log lines?
Extract $.msg from all lines, then enable 'Frequency count' in the results panel. The tool groups identical messages and shows the count for each unique value — the highest counts reveal your most frequent error conditions.
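The grouping behind 'Frequency count' corresponds to a plain counter over the extracted values. A minimal sketch with Python's stdlib, using assumed field names:

```python
import json
from collections import Counter

# Hypothetical error-level NDJSON batch (assumed 'msg' field).
ndjson = """\
{"level": "error", "msg": "db timeout"}
{"level": "error", "msg": "rate limited"}
{"level": "error", "msg": "db timeout"}
"""

# Extract $.msg from every line, then count identical messages.
msgs = [json.loads(line)["msg"] for line in ndjson.splitlines() if line.strip()]
counts = Counter(msgs)

# Highest counts first: the most frequent error conditions.
for msg, n in counts.most_common():
    print(n, msg)  # -> 2 db timeout / 1 rate limited
```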
Can I use this to build alerting rules for Datadog or Grafana?
Yes, for field discovery. Use the tool to identify the exact field paths in your log structure, then use those paths in your observability platform's query language — Datadog's @field_name or Grafana Loki's label_filter syntax. The JSONPath syntax is not directly used in alerting platforms but the field paths translate directly.
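For simple dotted paths, the translation to Datadog's @-prefixed attribute syntax is mechanical. This helper is a hypothetical sketch (not part of the tool) and covers only plain dotted paths, not filters or array indexing:

```python
def to_datadog_facet(jsonpath: str) -> str:
    """Rewrite a simple dotted JSONPath into Datadog's @attribute syntax.

    Illustrative only: assumes the path is '$.' followed by dotted keys.
    """
    return "@" + jsonpath.removeprefix("$.")

print(to_datadog_facet("$.err.message"))  # -> @err.message
```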
Is structured log data — including trace IDs and user information — transmitted to JAD Apps?
No. JSONPath evaluation runs entirely in your browser. Log line content, trace IDs, user identifiers, and error messages are never transmitted to JAD Apps servers.
Privacy first
Extraction runs locally in your browser. No file is uploaded — only metadata counters are saved for signed-in dashboard stats.