I was reading an article today that discusses managing “mission-critical” applications. I really dislike that term. It’s trite, it’s dated – even nonsensical. It suggests that applications fall into two groups – mission-critical, and…optional? marginal? unnecessary? one step away from being voted off the island?
Here’s the fallacy with that view – people who run IT organizations are smart, and they invest in what matters to the business. Because they excel at cost-efficiency, they don’t run apps that don’t provide value. So the notion that relatively few apps are actually worth managing is illogical. Even email, the poster child for apps at the bottom of the food chain, is essential to the operation of a 21st-century organization – it’s how they stay organized.
Which is why it is surprising to note that, according to industry analysts, the majority of enterprises manage fewer than 25% of their apps. I acknowledge that not all apps are created equal – some have relatively greater value than others. But surely no one would argue that the next most important 10% to 20% of apps don’t merit being managed.
So – why invest in something because it’s important to your business, but stop short of the incremental investment needed to ensure it works well? It seems irrational. Would a trucking company not monitor the oil level in its trucks, or a grocer the temperature of its freezers? At some point intervention will be required, and when it isn’t timely, small problems can become big, costly ones, and business operations can be seriously disrupted.
Sure, these are not perfect metaphors, but I think my point is obvious, and obviously valid. Generally, when assets are important to the successful operation of the business, organizations invest to ensure that they keep operating effectively.
Unless they are software applications. So while people who run IT are smart, in this respect their behavior seems irrational. Attitudes towards management have always seemed a bit wonky to me. Rather than being seen as additive to app value, application management has been viewed as unwanted overhead, even deleterious. Put another way, the stuff that would make apps go well has often been viewed as detracting from simply making them go.
But is the lack of APM investment irrational? On balance, service delivery is pretty darned good most of the time for most apps, or at least “good enough.” Naturally, problems occur in modern, complex IT environments. So organizations invest in management technology to minimize risk and impact for the most important apps, and handle everything else as well as humanly possible. That status quo seems to work reasonably well, except when it doesn’t – sometimes with serious business impact. And in those cases, you wrestle the problem to the ground, ask “What are the chances of THAT happening again?”, and return to the status quo until it happens again.
I believe nearly all IT professionals would say that problems are inevitable, including serious business-impacting ones. So I am back to thinking that this doesn’t make good sense. Why would you not make the incremental investment in APM (for more than 25% of your apps) to reduce the incidence and impact of these inevitable events?
I can think of a few potential reasons. It may be difficult to quantify the business risk as input to a cost-justification. It may be difficult to prioritize what applications to invest in, which impedes setting technical criteria for a solution. There may be a diverse set of stakeholders with competing priorities. Any of these challenges makes it difficult to pick a strategy and move forward with it.
But in my view, none of these is a good enough reason to settle for the status quo. There’s a bigger picture here. Those inevitable problems are affecting your business every day. Underinvesting in app management undercuts your company’s strategic focus on customer experience and its commitment to digital transformation. Apps are important – managing them cannot be viewed as optional.
Dynatrace has redefined monitoring to establish a new status quo way better than “good enough” for way more than 25% of your apps, regardless of their technology, and for all your stakeholders. And that makes very good sense.