The feature is available as an Early Adopter release and has the following prerequisites:
- OneAgent version 1.187
- Dynatrace version 1.188 (SaaS or Managed)
- A JVM that supports Low-Overhead Heap Profiling, introduced with JEP 331
Memory profiling enables you to understand the memory allocation and garbage collection behavior of your applications over time. It helps you identify the method calls in whose context most memory was allocated, combined with the number of allocated objects. The Survivors perspective helps you understand the context in which your long-lived objects (objects that survive multiple garbage collection cycles) are created.
You have several options for accessing memory profiling:
via Diagnostic tools
- Go to Diagnostic tools > CPU analysis.
- On the right-hand side of the required process of the Garbage collection CPU type, select the menu (...) and choose Memory allocations.
via process details
- In the navigation menu, select Hosts and choose the required host.
- Choose the process you're interested in.
- Do one of the following:
- Select the menu (...) and choose Memory profiling to open the All allocations tab.
- Open the JVM metrics tab and select Analyze suspension to open the Survivors tab.
You can select any timeframe for the analysis, independent of the global timeframe. We recommend excluding third-party libraries from the analysis to focus on code that is under your control.
The All allocations tab shows how many times garbage collection ran and how much memory was allocated. The table beneath shows the methods of the analyzed process that consume the most memory.
Expand a method to see its callers. You can see how much memory the calling methods consumed.
When you see a calling method you're interested in, select Analyze in the Actions column. The Details page shows the methods that are called by the analyzed method. This page shows every method in the chain, including third-party methods; you can't exclude third-party libraries from this view.
On the Details page you can find the methods that contribute most to the memory consumption. Note that the Allocated types column shows only objects that are allocated in the method itself; it doesn't include allocations made by subsequently called methods.
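For illustration only (the class and method names below are hypothetical, not from the product), a method like the following would rank high in an allocation profile, because every call creates many short-lived objects that a sampling profiler attributes to the method and, one level up, to its callers:

```java
import java.util.ArrayList;
import java.util.List;

public class AllocationHotspot {

    // Each iteration allocates a temporary boxed Integer and a new String.
    // An allocation profiler attributes these allocations to formatIds()
    // and aggregates them under whichever method called it.
    static List<String> formatIds(int count) {
        List<String> result = new ArrayList<>(count);
        for (int i = 0; i < count; i++) {
            result.add("id-" + Integer.valueOf(i)); // short-lived garbage
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> ids = formatIds(100_000);
        System.out.println(ids.size()); // prints 100000
    }
}
```

Most of these objects die young: the boxed integers and intermediate strings become unreachable as soon as the loop iteration ends, so they show up under All allocations but not among survivors.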
When you have found out which methods consume the most memory, you can check the longevity of their allocations on the Survivors tab. The tab shows objects that have survived one or more garbage collection cycles.
The Survivors tab shows the time spent on garbage collection and the amount of memory that survived all garbage collections. Note that objects that survived multiple garbage collections contribute multiple times to the survived memory amount. The table beneath shows the methods that created objects that account for most of the survived memory.
By varying the analysis timeframe you can find the time when these objects were created.
Expand a method to see its callers. You can see how much of the memory allocated by the calling methods has survived.
When you see a caller you're interested in, select Analyze in the Actions column. The Details page shows the methods that are called by the analyzed method. This page shows every method in the chain, including third-party methods; you can't exclude third-party libraries from this view.
On the Details page you can find the methods that contribute most to the survived memory. Note that the Allocated types column shows only objects that are allocated in the method itself; it doesn't include allocations made by subsequently called methods.
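For illustration only (again with hypothetical names), an object becomes a survivor when a reference keeps it reachable across garbage collection cycles, for example a value stored in a long-lived cache:

```java
import java.util.HashMap;
import java.util.Map;

public class SurvivorExample {

    // The static map is reachable for the lifetime of the JVM, so every
    // value stored in it survives each garbage collection cycle and would
    // appear as survived memory attributed to cacheResult().
    static final Map<String, byte[]> CACHE = new HashMap<>();

    static void cacheResult(String key, int size) {
        CACHE.put(key, new byte[size]); // long-lived allocation
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1_000; i++) {
            cacheResult("key-" + i, 1_024); // roughly 1 MB retained overall
        }
        System.out.println(CACHE.size()); // prints 1000
    }
}
```

In an analysis, the arrays allocated in cacheResult() would keep reappearing in the survived memory for as long as the cache retains them, which is the pattern the Survivors tab is designed to surface.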
Exclude third-party libraries
To focus the analysis on your code, we recommend defining third-party library APIs. To do this, mark a user-defined API as a third-party library:
- Go to Settings > Server-side service monitoring > API detection rules.
- Select Edit for the required API.
- Turn on This API defines a third party library.
- Save your changes.
If you want to exclude a library from the Built-in APIs list, create a user-defined rule for the library and mark it as third-party. See Custom API definitions to learn how.