The cloud is complex, and the volume of dependencies has reached a scale no human can possibly understand. We're talking trillions. But what's the largest?
We often say the cloud is complex, and that the sheer volume of dependencies has reached a scale no human can possibly understand. As a result, it’s very difficult to articulate just how much scale we’re talking about. You’ll hear us say, “Their environment is massive — upwards of 50,000 hosts,” but hosts alone don’t capture the complexity that arises from the software running on each host and its many interconnections.
At Perform, we often share some fairly large numbers that our AI engine, Davis, is now analyzing.
- At Perform Las Vegas 2019, our CTO and Founder Bernd Greifeneder said the number of causations analyzed by Davis per minute was 55 billion. Five months later, at Perform Summit Barcelona, he revealed this number had grown to 2,500 billion — that’s 2.5 trillion.
- Our SVP of Product Management, Steve Tack, shared that one customer had 235 billion dependencies analyzed per problem.
Today, while interviewing a customer for an upcoming webinar, we noticed a problem detection with a dependency count that had to be seen to be believed.
That got me thinking: within our customer community, who has the largest number of dependencies analyzed per problem? A billion? A trillion? Is there a number bigger than trillions?
So now I’m on the hunt.
Please share a screenshot of your Davis dependency count with me, and I will publish the findings in a follow-up blog. And no, asking your marketing team for a Photoshopped version doesn’t count — we can verify every entry if we really need to.
And don’t worry: all information shared will be kept confidential, and no company names will be revealed without approval.
We will randomly select some submissions to receive custom Davis stickers and swag.
Get in touch with us at DynatraceSocial@Dynatrace.com using the subject line “Getting Smarter with Davis.”