Logs typically contain lots of text, and reading through them can be a difficult task. One Dynatrace Log Analytics capability that helps you handle large amounts of log text is the grouping of similar log entries—in the Log viewer, we call these “summarized results.”
Another popular log management feature that is now available with licensed versions of Dynatrace Log Analytics (not the free tier) is log data parsing. If your log entries are similar in structure and carry similar information in the same place across entries, you can use Dynatrace to convert raw log data into sortable, filterable tables—all directly in the Dynatrace Log viewer. Presenting data this way is far more useful for analytics tasks because it puts the data in a human-readable form.
Let’s take a look at my use case as a Product Manager at Dynatrace. As a Product Manager, I want to understand the adoption of my product, its performance, and then run some business analysis to see how the product is used by our customer base. Let’s suppose that I ask our developers to build extensive logging each time a customer uses a product feature. Here’s an example server log entry:
As you can see, the server log file contains some useful information, like:
- Time selection: Timeframe duration: 2.0h
- Analysis result
- Number of log files and hosts selected: Log file sources: 1, Host selection size: 1
- Query that was used
- Time taken to complete the search: Full analysis duration: 10.793 s
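To make the idea of log parsing concrete, here is a minimal sketch in Python of what extracting these fields into named columns looks like. The exact format of the real server log line is an assumption; only the field names and example values above come from the post:

```python
import re

# Hypothetical single-line form of the server log entry described above.
LINE = ("Timeframe duration: 2.0h, Log file sources: 1, "
        "Host selection size: 1, Full analysis duration: 10.793 s")

# One named capture group per column we want in the table.
PATTERN = re.compile(
    r"Timeframe duration: (?P<timeframe>\S+), "
    r"Log file sources: (?P<sources>\d+), "
    r"Host selection size: (?P<hosts>\d+), "
    r"Full analysis duration: (?P<duration_s>[\d.]+) s"
)

def parse(line):
    """Turn one raw log line into a dict of named columns, or None."""
    m = PATTERN.search(line)
    return m.groupdict() if m else None

row = parse(LINE)
print(row["duration_s"])  # prints 10.793
```

Each dict produced this way corresponds to one row of the parsed table, with dict keys as column headers.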
If we want to quickly get an overview of which queries were launched by users of the product, their wait times, their selection scope, and the final results, these lines are problematic in their current form; they are long and difficult to follow.
What if we could parse this information into a table that presents each of the values from the list above in a separate column? We can do this now in the Log viewer. If you’re running a fully licensed version of Log Analytics (not the free tier), you’ll notice a small placeholder at the top of your analysis results in the Log viewer:
Use this to show or hide some of the automatically parsed columns (try it for JSON logs!), view their value distribution, or configure new columns of your choice. Here’s how we would configure the column showing the time taken to complete the search from the example above:
If we complete this for other columns from the example above, we can easily display our sample log content in the following format, which is quite useful for analysis:
We can quickly understand the distribution of different values by clicking on a column header. For example, we would click Search query to see which queries have been used to run log analyses:
We can also quickly Exclude or Include individual values—the appropriate filter is added to the filter box above the table. To exclude blank queries from our analysis, we would click Exclude next to (blanks) in the first row. We would get this filter in the filter box:
After clicking Apply, the pie chart is refreshed (see below).
We can also edit filters manually using different logical operators like BETWEEN. See our documentation for details and examples of filtering syntax.
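In spirit, combining an exclude filter with a BETWEEN operator is just a pair of predicates over the parsed rows. A minimal sketch with hypothetical row data, excluding blank queries and keeping durations between 1 and 5 seconds:

```python
# Hypothetical parsed rows with a numeric duration column (seconds).
rows = [
    {"query": "error",   "duration_s": 10.793},
    {"query": "",        "duration_s": 0.412},
    {"query": "timeout", "duration_s": 4.2},
]

# Exclude blanks, then apply a BETWEEN-style range check on duration.
filtered = [r for r in rows
            if r["query"] and 1.0 <= r["duration_s"] <= 5.0]
print([r["query"] for r in filtered])  # prints ['timeout']
```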
The table itself is fully sortable, allowing quick analysis, for example, finding the slowest queries:
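Finding the slowest queries amounts to sorting the rows by the duration column in descending order. A sketch over the same kind of hypothetical parsed rows:

```python
# Hypothetical parsed rows; sort descending by duration so the
# slowest searches appear first, as in the sorted table view.
rows = [
    {"query": "error",   "duration_s": 10.793},
    {"query": "timeout", "duration_s": 4.2},
    {"query": "status",  "duration_s": 0.9},
]

slowest = sorted(rows, key=lambda r: r["duration_s"], reverse=True)
print(slowest[0]["query"])  # prints error
```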
Once you’ve set up the table as you want it, you can save the view as a bookmark so that you can return to it at any time from your Log Analytics dashboard or have it visible as a bookmarked query in your Log viewer query window:
Anytime you open this bookmarked query, all definitions, filters, sorting, and overall construction of the table are preserved.
In the near future, we will add the ability to send such tables by email periodically and to create metric charts out of numeric columns that can be pinned to your custom dashboards.