Search patterns in log data and parse results

Log files typically contain a lot of text. One way to handle a large amount of text is to group similar log entries together and parse them. The Log Viewer enables you to present raw log data in a sortable, filterable table format that is easy to work with.

Use the Log Viewer to browse through the contents of individual process log files or search selected log files using keywords. Only hosts or process groups that were active during the selected time frame are shown. Log results can be returned in either raw or aggregated form. You can parse the result table to narrow down text-pattern search results, show and hide specific columns, and define your own custom columns.

Bookmarks

Use a bookmark to save and reuse your search queries and filters. Any changes you make during your log analysis (log parsing and column filtering) will be saved within the bookmark. Using bookmarks, you can return to the same settings later, but with the default time frame.

Sharing log analysis differs from bookmarks. The share link includes the selected time frame so the person who receives the link sees exactly what you want to share in the time frame that you selected.

Searching for text patterns

To search log files for a text pattern, first select the logs from either the hosts or process groups perspective. Then search for a text pattern using the Dynatrace search query language (or leave the query blank to return all results).

Using combinations of keywords, phrases, logical operators, and parentheses, the Dynatrace search query language gives you complete flexibility when searching through important process log content.

Show me everything

To return all results, leave the query box blank.

Sample queries

  • Error AND Module1?2
  • "Connection refused" OR Timeout
  • Procedure AND (started OR stopped)
  • Exception AND NOT repeat*

Free tier

Note that not all parsing and filtering options are available with the free tier of Log Analytics. Some analysis tools may not be available in your Log Viewer.
See What data storage advantages does Log Analytics provide?

Parsing method

Beta

  • automatic
    Dynatrace will read the log content and automatically determine the structure of the log, the type of data it contains, and how to present the results in columns.

  • JSON
    Dynatrace will read the content of the selected logs, determine the structure of the log, and present the results in columns, but only for logs that contain JSON-format data. Each automatically parsed column represents a JSON field detected in the log entry (see the example after this list).

  • none
    The results reflect the actual log file content.
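
For example, a JSON-format log entry such as the following (a hypothetical entry; the field names are purely illustrative) would be parsed into one column per detected field, in this case level, service, and message:

    {"level": "ERROR", "service": "checkout", "message": "Connection refused"}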

Too many columns

If more than 50 columns are automatically detected, the displayed results will contain the 50 columns with the highest value counts.

Custom columns

Beta

You can add, hide, or remove custom columns in your log display. Each column name must be unique: don't use the name of an automatically detected column or repeat a custom column name you have already used.

To indicate the section of a log entry that should be listed in your new column, type in the prefix and suffix, and select the type of value that you want to list in the custom column.

The matching value is treated as the type you indicate. A text value is treated as text even when the value is a number; a number value is used for the general display of numbers (positive and negative). This is important for sorting purposes.
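
For example, suppose a log entry contains the hypothetical fragment duration=154ms. Defining a custom column with the prefix duration= and the suffix ms, and selecting number (integer) as the value type, would list 154 in the new column and allow numeric sorting:

    Log entry:    2021-06-01 12:00:00 GET /checkout duration=154ms status=200
    Prefix:       duration=
    Suffix:       ms
    Value type:   number (integer)
    Column value: 154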

Column values filter

Beta

You can apply a filter to the results of the text pattern search for any manually or automatically parsed column; the restricted columns (.Source, .Timestamp, .Content) cannot be filtered. Both the column names and the values used in the filter are case sensitive.

Every column contains a value of a specific type: text (string), number (integer), number (decimal), or Boolean true/false. Depending on the value type, the syntax of your filter can vary. For example, if the column is a number (integer) type and you assign a different value type (string, decimal, or true/false) in your filter, you will generate a parsing error.

  • Column names containing special characters must be enclosed in single quotes.
    For example:

    `integer Column # 2` = 20
    
  • Filtered values containing quotation marks must be enclosed in double quotation marks.
    For example, the value path "test" error should be entered in the filter as:

    expath = "path ""test"" error"
    
  • Column values are shortened with an ellipsis when the combined length of the column name and value exceeds 8191 characters.
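
As a sketch of how the syntax varies by value type (the column names and values below are hypothetical, and the quoting of the text value and the unquoted Boolean literal are assumptions based on the examples above), filters for the four value types could look like this:

    loglevel = "ERROR"        (text)
    statuscode = 404          (number, integer)
    responsetime = 12.5       (number, decimal)
    cachehit = true           (Boolean true/false)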

Bookmarks

An invalid column filter query will not be applied to already saved or shared bookmarks.

Top N occurrences

Beta

Click any column name to view the top 10 values with the most occurrences within that column. Occurrence percentages are calculated relative to the results returned by the current column filter.

For example, if you Include only one value for a particular column in the filter, the occurrence of that value in that column will be 100%.

  • other
    The remainder of the filtered values that do not qualify for the top 10 most occurring values within the column.

  • (blanks)
    A count of the filtered log records that contain no value in the selected column.

Use the Include and Exclude buttons to add the value to the column filter syntax as an included or excluded value. Click Apply to update the filter result.

If you Include the value in the filter, the result will display only the selected (included) values. If you Exclude the value in the filter, the result will display all values except the ones you have excluded in the filter. You can include and exclude multiple values in one filter.
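
As a purely hypothetical illustration (the column name and values are invented, and this does not show the exact filter expression that the buttons generate), for a text column named loglevel:

    Include "ERROR"    →  results show only records where loglevel is ERROR
    Exclude "ERROR"    →  results show records with any loglevel value except ERROR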

Show me everything again

To reset the filter, clear the filter syntax and click Apply.