Apache Spark monitoring

Quickly find key performance metrics about your Apache Spark instance


Analyze your Apache Spark components

Dynatrace auto-detects your Spark components and shows key metrics and a timeline chart specific to each component.

Get in-depth cluster performance analysis

Dynatrace provides in-depth cluster performance analysis of both current and historical data.

Pinpoint problems at the code level

Dynatrace uses big data analytics across billions of dependencies within your application stack to pinpoint the components that are causing problems, down to the code level.

Start monitoring your Spark components in under 5 minutes!

In under five minutes, Dynatrace detects your Apache Spark processes and shows metrics such as CPU usage, connectivity, retransmissions, suspension rate, and garbage collection time.

Apache Spark process details
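If you also want Spark's own internal metrics available to JMX-capable agents on the host, Spark's built-in metrics system can expose them. A minimal PySpark sketch, relying on Spark's documented spark.metrics.conf.* configuration prefix and JmxSink; the application name is a placeholder:

    from pyspark.sql import SparkSession

    # Sketch: expose Spark's internal metrics via the built-in JMX sink,
    # so JMX-capable monitoring agents on the host can read them.
    spark = (
        SparkSession.builder
        .appName("metrics-demo")  # placeholder name
        .config("spark.metrics.conf.*.sink.jmx.class",
                "org.apache.spark.metrics.sink.JmxSink")
        .getOrCreate()
    )

    # ... run your normal Spark workload here ...
    spark.stop()

The same sink can alternatively be configured for all applications at once through the metrics.properties file shipped with Spark.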

Monitor your Spark components

Dynatrace shows performance metrics for the three main Spark components.

Apache Spark monitoring provides insight into the resource usage, job status, and performance of Spark Standalone clusters.

The Cluster charts section provides all the information you need regarding Jobs, Stages, Messages, Workers, and Message processing.

For the full list of provided cluster metrics, see our detailed blog post about Apache Spark monitoring.

Screenshots: overview of a Spark process; Spark cluster charts
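Many of these cluster-level numbers can also be cross-checked against Spark's own monitoring REST API. A minimal sketch, assuming a running application's web UI on the default port 4040; the host is a placeholder:

    import requests

    # Spark's monitoring REST API, served by the application web UI.
    base = "http://localhost:4040/api/v1"

    for app in requests.get(f"{base}/applications", timeout=5).json():
        app_id = app["id"]
        # Per-job status and task counts for each application.
        jobs = requests.get(f"{base}/applications/{app_id}/jobs", timeout=5).json()
        for job in jobs:
            print(app_id, job["jobId"], job["status"], job["numCompletedTasks"])

The same API also exposes per-stage and per-executor endpoints, which is useful when comparing against the cluster charts above.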

Access valuable Spark worker metrics

Apache Spark metrics are presented alongside other infrastructure measurements, enabling in-depth cluster performance analysis of both current and historical data.

Spark node/worker monitoring provides a range of per-worker metrics.

For the full list of provided worker metrics, see our detailed blog post about Apache Spark monitoring.
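For a quick sanity check of worker-level numbers, a Spark Standalone master also serves a JSON summary of its workers. A minimal sketch, assuming the default master web UI port 8080; the host is a placeholder, and the field names (cores, coresused, memory, memoryused) are taken from the standalone master's JSON output and may vary by Spark version:

    import requests

    # The Spark Standalone master web UI serves a JSON summary at /json.
    status = requests.get("http://spark-master:8080/json", timeout=5).json()

    for w in status["workers"]:
        print(w["id"], w["state"],
              f'{w["coresused"]}/{w["cores"]} cores',
              f'{w["memoryused"]}/{w["memory"]} MB')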

Start monitoring your Spark processes with Dynatrace!


What is Apache Spark?

Apache Spark is an open-source cluster-computing framework. Originally developed at the University of California, Berkeley’s AMPLab, the Spark codebase was later donated to the Apache Software Foundation, which has maintained it since. Spark provides an interface for programming entire clusters with implicit data parallelism and fault tolerance.
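To make "implicit data parallelism" concrete, here is a minimal PySpark sketch: the map and reduce steps below are partitioned across the cluster's executors by Spark itself, and lost partitions are recomputed from lineage rather than handled in user code. The application name and numbers are illustrative:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("parallelism-demo").getOrCreate()
    sc = spark.sparkContext

    # The RDD is split into 8 partitions that Spark schedules across executors;
    # the programmer writes no explicit threading or message passing.
    total = (sc.parallelize(range(1_000_000), 8)
               .map(lambda x: x * x)
               .reduce(lambda a, b: a + b))

    print(total)
    spark.stop()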

Dynatrace monitors and analyzes the activity of your Apache Spark processes, providing Spark-specific metrics alongside all infrastructure measurements.

With Spark monitoring enabled globally, Dynatrace automatically collects Spark metrics whenever a new host running Spark is detected in your environment.


Get the full picture with Spark monitoring!
