Search patterns in log data and parse results

Log files typically contain a lot of text. One way to handle a large amount of text is to group similar log entries together and parse them. The Log Viewer enables you to present raw log data in a sortable, filterable table that is easy to work with.

Use the Log Viewer to browse through the contents of individual process log files or search selected log files using keywords. Only hosts or process groups active during the selected time frame are shown. Log results can be returned in either raw or aggregated form. You can parse the result table and narrow down the text pattern search result, show and hide specific columns, and define your own custom columns.

Bookmarks

Use a bookmark to save and reuse your search queries and filters. Any changes you make during your log analysis (log parsing and column filtering) are saved in the bookmark, so you can return to the same settings later. Note that the time frame applied to bookmarked search queries and filters is the time frame currently set on the page.

Sharing log analysis differs from bookmarks. The share link includes the selected time frame, so the person who receives the link sees exactly what you want to share in the time frame that you selected.

Search for text patterns in log files

To search log files for a text pattern, select the logs based on the host or process groups perspective, then search for a text pattern using the Dynatrace search query language (or leave the query box empty to return all results).

Using combinations of keywords, phrases, logical operators, and parentheses, the Dynatrace search query language gives you full flexibility when searching through process log content.

Show me everything

To return all results, leave the query box blank.

Sample queries

Error AND Module1?2
"Connection refused" OR Timeout
Procedure AND (started OR stopped)
Exception AND NOT repeat*

Advanced options

Select Advanced options to create columns based on values extracted from log data. The new columns are applied to the result of searching the log files for a text pattern. Only the first match per log entry is extracted as the column value.

Extract fields

To create a column, define a section of log data that should be extracted as a column. Provide a string directly preceding (val_pref) and directly following (val_suff) the value that you want to use in the column. If a match is found, everything between val_pref and val_suff will be extracted as a value in the column defined in the column declaration.

  • If you use any of the following special characters in a prefix or suffix, you need to escape them with a backslash (\):
    " - double quote
    % - percent
    , - comma
    \ - backslash
  • An extracted value type has to be one of the supported column types: INTEGER, NUMBER, STRING or BOOLEAN.
  • Matching for prefixes (val_pref) and suffixes (val_suff) is case-sensitive.
  • If the value between val_pref and val_suff doesn't match the type defined in the column type, no value will be returned for the defined column.
  • Only the first occurrence of a prefix counts. If there is no matching suffix, no value will be matched, even if a matching prefix-suffix combination occurs later in the log entry.
  • An empty suffix means matches from the prefix to the end of the log entry.
  • Whitespace isn't trimmed. The prefix and suffix have to match exactly. Additional spaces cause no match.
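The prefix/suffix matching rules above can be sketched in Python. This is a simplified illustration of the documented behavior, not Dynatrace's actual implementation:

```python
def extract_value(entry, prefix, suffix):
    """Return the text between the first occurrence of prefix and the next
    occurrence of suffix, following the matching rules listed above."""
    # Matching is case-sensitive, and only the first prefix occurrence counts.
    start = entry.find(prefix)
    if start == -1:
        return None
    start += len(prefix)
    # An empty suffix matches everything up to the end of the log entry.
    if suffix == "":
        return entry[start:]
    end = entry.find(suffix, start)
    if end == -1:
        # No suffix after the first prefix: no value is matched, even if a
        # later prefix-suffix pair in the entry would match.
        return None
    # Whitespace is not trimmed from the extracted value.
    return entry[start:end]
```

For example, `extract_value("pref123suff", "pref", "suff")` yields `"123"`, while a case mismatch such as `"Pref123suff"` yields no value.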

Column declaration is where you indicate the column type (col_type) and a column name (col_name) separated by a : (colon character).

Use the following syntax to define a column: "val_pref%{col_type:col_name}val_suff"

Literal column declaration

Because column declaration is literal, make sure you enclose it in quotation marks.

The column type determines how the value is matched against log data and greatly affects what is extracted as a value. The following are valid column types that you can use in column declaration:

INTEGER
Potentially extracted value:

  • May contain the digits 0 - 9 and, optionally, a leading + or -.
  • Can't contain spaces or underscores (_).
  • Maximum token length is 20 characters.
  • An empty token is invalid (no value is extracted for the column).
  • The represented value must be within the range of the Java long type.
Custom columns for stored logs only

Custom columns can be applied only to centrally stored logs.

You can add, hide, or remove a custom column in your log display. Each column name must be unique. Don't use the name of any automatically detected column or repeat a custom column name you already used.

Example data that matches the INTEGER column type and will be extracted as a value in a defined column for the log entry in which it was matched:

"1234"
"+1234"
"-1234"
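A minimal Python check of these INTEGER rules (a sketch of the documented constraints; the exact server-side validation is an assumption):

```python
def is_integer_token(token):
    """Check a candidate value against the INTEGER rules above: optional
    leading sign, ASCII digits only, at most 20 characters, non-empty,
    and within the range of a Java long."""
    if token == "" or len(token) > 20:
        return False
    body = token[1:] if token[0] in "+-" else token
    if body == "" or not all("0" <= c <= "9" for c in body):
        return False
    # Java long range: -2**63 to 2**63 - 1.
    return -2**63 <= int(token) <= 2**63 - 1
```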

NUMBER
Potentially extracted value:

  • May contain the characters +, -, ., e, E, and the digits 0 - 9.
  • Must represent a decimal number in normal or scientific notation (for example, 123E456).

Example data that matches the NUMBER column type and will be extracted as a value in a defined column for the log entry in which it was matched:

"123"
"123."
".123"
"123.4"
"123.4e12"
"123.4E56"
"+123.4e+12"
"-123.4e-12"
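The accepted NUMBER forms can be expressed as a regular expression. This grammar is inferred from the examples above; whether Dynatrace accepts exactly this set is an assumption:

```python
import re

# Optional sign, a decimal part ("123", "123.", ".123", "123.4"),
# and an optional exponent with its own optional sign.
NUMBER_RE = re.compile(r"[+-]?(\d+\.?\d*|\.\d+)([eE][+-]?\d+)?")

def is_number_token(token):
    """Return True if token matches the inferred NUMBER grammar."""
    return NUMBER_RE.fullmatch(token) is not None
```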
Multiple column declarations

You can define multiple columns at the same time by separating each column declaration with a , (comma).
For example: "prefix1%{INTEGER:myColumn1}suffix1","prefix2%{INTEGER:myColumn2}suffix2"
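A declaration string can be split into its parts with a small parser. This is an illustrative sketch: the regex and error handling are mine, and it assumes the escaping rules listed earlier:

```python
import re

# One quoted declaration: "val_pref%{col_type:col_name}val_suff"
DECL_RE = re.compile(
    r'^"(?P<prefix>.*?)%\{(?P<type>[A-Z]+):(?P<name>[^}]+)\}(?P<suffix>.*?)"$'
)

def parse_declaration(decl):
    """Split one quoted declaration into (prefix, type, name, suffix)."""
    m = DECL_RE.match(decl)
    if m is None:
        raise ValueError("not a valid column declaration: " + decl)
    return m.group("prefix"), m.group("type"), m.group("name"), m.group("suffix")
```

Multiple declarations can then be handled by splitting the input on commas; note that a naive `split(",")` only works when the prefixes and suffixes contain no escaped commas.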

STRING
All characters found between val_pref and val_suff will be treated and extracted as a string value in the defined column.

BOOLEAN
All characters found between val_pref and val_suff will be treated and matched as a boolean value (true or false) in the log entry.

Field extraction examples

The following examples illustrate the behavior of the extraction mechanism when specific values are used in a column definition. The Output column shows the value extracted into the newly defined myColumn column for each log entry.

| Log entry | Prefix | Suffix | Column type | Column definition | Output |
| --- | --- | --- | --- | --- | --- |
| pref123suff | pref | suff | INTEGER | "pref%{INTEGER:myColumn}suff" | 123L |
| pref123suff | pref | suff | NUMBER | "pref%{NUMBER:myColumn}suff" | 123.0f |
| pref123suff pref456suff | pref | suff | INTEGER | "pref%{INTEGER:myColumn}suff" | 123L |
| some text spanning across multiple pref123suff lines of text | pref | suff | INTEGER | "pref%{INTEGER:myColumn}suff" | 123L |
| | pref | suff | INTEGER | "pref%{INTEGER:myColumn}suff" | - |
| pref123456"" | pref | "" | INTEGER | "pref%{INTEGER:myColumn}\"\"" 1 | 123456L |
| pref123456 | pref | 456 | INTEGER | "pref%{INTEGER:myColumn}456" | 123L |
| pref123suff"" | pref | "" | INTEGER | "pref%{INTEGER:myColumn}\"\"" 1 | - |
| Pref123suff | pref | suff | INTEGER | "pref%{INTEGER:myColumn}suff" | - |
| pref 123 suff | pref | suff | INTEGER | "pref%{INTEGER:myColumn}suff" | - |
| pref pref123suff | pref | suff | INTEGER | "pref%{INTEGER:myColumn}suff" | - |
| abc123def | pref | suff | INTEGER | "pref%{INTEGER:myColumn}suff" | - |
| pref-abc-suff | pref | suff | STRING | "pref%{STRING:myColumn}suff" | -abc- |
| pref:true, | pref: | , | BOOLEAN | "pref:%{BOOLEAN:myColumn}\," 1 | true |

1 If you use any special characters (", %, ,, \) in a prefix or suffix, you need to escape them with a backslash (\).

Column values filter

Filter for Run query only

This filter can be applied only to Run query results. It doesn't apply to Create metric.

You can apply a filter to the results of the text-pattern search, but not to the restricted columns (_Source, _Timestamp, _Content). You can apply the filter to all manually and automatically parsed columns (including columns with values extracted in Advanced options). Both column names and values used in the filter are case-sensitive.

Filter for stored logs only

This filter can be applied only to centrally stored logs.


Every column contains a value of a specific type: string, number, integer, or Boolean (true/false). Depending on the value type, the syntax of your filter can vary. For example, if the column is a number type and you assign a value of a different type (string, decimal, or true/false) in your filter, you'll get a parsing error.

  • Column names containing special characters (including spaces) must be enclosed in backticks.
    For example:

    `integer Column # 2` = 20
    
  • Filtered values containing quotation marks must be enclosed in double quotation marks.
    For example, the value path "test" error should be entered in the filter as:

    expath = "path ""test"" error"
    
  • Column values where the sum of the column name and value length exceeds 8191 characters are shortened with an ellipsis.
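The backtick and quote-doubling rules above can be sketched as small Python helpers. The exact set of characters that requires backticks is an assumption here (approximated with `isidentifier()`); when in doubt, backtick-quoting a column name is always safe:

```python
def quote_column(name):
    """Backtick-quote a column name that contains special characters
    (including spaces). isidentifier() approximates a "plain" name."""
    return name if name.isidentifier() else "`" + name + "`"

def quote_value(value):
    """Double-quote a filter value, doubling embedded quotation marks
    as the filter syntax requires."""
    return '"' + value.replace('"', '""') + '"'
```

For instance, `quote_value('path "test" error')` produces `"path ""test"" error"`, matching the filter example above.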

Bookmarks

An invalid column filter query will not be applied to already saved or shared bookmarks.