Apache Spark performance
All key performance metrics for your Apache Spark instance, in minutes
In-depth cluster performance analysis
Dynatrace presents Apache Spark metrics alongside other infrastructure measurements, which enables in-depth cluster performance analysis of both current and historical data.
Details about all of your Apache Spark components
Apache Spark performance monitoring provides insight into resource usage, job status, the performance of Spark Standalone clusters, and more.
Pinpoint problems at the code level
Dynatrace automatically pinpoints the components that are causing problems, using big data analytics across billions of dependencies within your entire application stack.
Optimizing your Spark components in minutes
Dynatrace immediately detects your Apache Spark processes and shows key metrics like CPU, connectivity, retransmissions, suspension rate and garbage collection time.
- Manual configuration of your monitoring setup is no longer necessary.
- Auto-detection starts monitoring new hosts running Spark.
- All data and metrics are retrieved immediately.
Improve your Spark performance
Dynatrace shows performance metrics for the three main Spark components: cluster managers, driver programs, and worker nodes.
Apache Spark monitoring provides insight into the resource usage, job status, and performance of Spark Standalone clusters. The Cluster charts section provides all the information you need regarding Jobs, Stages, Messages, Workers, and Message processing.
For the full list of provided cluster metrics, please visit our detailed blog post about Apache Spark monitoring.
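Spark's standalone master also exposes this cluster status itself as JSON on its web UI port. A minimal sketch of reducing that payload to a few cluster-level numbers, assuming the master's standard `/json` endpoint and its worker fields (`state`, `cores`, `coresused`); the host name is a placeholder:

```python
import json
from urllib.request import urlopen

# Placeholder address -- point this at your own standalone master's web UI.
MASTER_URL = "http://spark-master:8080/json"

def fetch_status(url: str = MASTER_URL) -> dict:
    """Fetch the standalone master's JSON status page."""
    with urlopen(url) as resp:
        return json.load(resp)

def cluster_summary(payload: dict) -> dict:
    """Reduce the master's status payload to a few cluster-level metrics."""
    workers = payload.get("workers", [])
    return {
        "alive_workers": sum(1 for w in workers if w.get("state") == "ALIVE"),
        "cores_total": sum(w.get("cores", 0) for w in workers),
        "cores_used": sum(w.get("coresused", 0) for w in workers),
        "active_apps": len(payload.get("activeapps", [])),
    }
```

Calling `cluster_summary(fetch_status())` against a live master yields the worker and core counts that the Cluster charts section visualizes over time.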
Access valuable Spark worker metrics
Apache Spark metrics are presented alongside other infrastructure measurements, enabling in-depth cluster performance analysis of both current and historical data.
Spark node and worker monitoring provides metrics including the number of
- free cores
- worker free cores
- cores used
- worker cores used
- worker executors
For the full list of provided worker metrics, please visit our detailed blog post about Apache Spark monitoring.
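Executor-level numbers like these can also be sampled from Spark's own monitoring REST API, which the driver serves on its UI port (4040 by default). A minimal sketch that totals executor cores and tasks, assuming the standard `/api/v1/applications/<app-id>/executors` endpoint and its documented fields (`id`, `totalCores`, `activeTasks`); the driver host is a placeholder:

```python
import json
from urllib.request import urlopen

# Placeholder address -- the driver serves the monitoring REST API on its UI port.
API_BASE = "http://spark-driver:4040/api/v1"

def fetch_executors(app_id: str) -> list[dict]:
    """Fetch executor summaries for one application from the REST API."""
    with urlopen(f"{API_BASE}/applications/{app_id}/executors") as resp:
        return json.load(resp)

def executor_core_usage(executors: list[dict]) -> dict:
    """Summarize executor core metrics from an executors payload.

    The driver itself appears in the list with id "driver" and is
    excluded from the worker totals.
    """
    workers = [e for e in executors if e.get("id") != "driver"]
    return {
        "executors": len(workers),
        "total_cores": sum(e.get("totalCores", 0) for e in workers),
        "active_tasks": sum(e.get("activeTasks", 0) for e in workers),
    }
```

Polling this endpoint periodically gives the same kind of worker/executor time series that the monitoring views above chart for you automatically.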
Try Spark performance monitoring now!
You’ll be up and running in under 5 minutes:
Sign up, deploy our agent, and get unmatched insights out of the box.