Splunk Inc. | News release | 07/16/2024

When Applications Were Monolithic And Changes Were Rare…

The path to digitalisation, however rewarding the goal, has proven to be a difficult one for many enterprises. One thing is certain, however: visibility into application performance is vital not only to an effective technology strategy but to business success. The case study that follows outlines a series of events experienced by many companies and sets down some guideposts to help you know what kind of digital service your customers are getting, respond rapidly to outages and brownouts, and, as a result, minimise the impact on revenue and brand.

A European energy concern had deployed a second-generation application performance monitoring (APM) solution in 2015, having chosen a vendor that, at the time, was a Gartner® Magic Quadrant™ (MQ) leader. Initially, it was a good fit. The number of corporate applications written in Java according to the three-tier architectural standards of the day was growing, and it was anticipated that the greater part of the application portfolio would eventually be Java-based. Another major requirement was the ability to rapidly map a single end-to-end business transaction to the succession of underlying software component state changes, and the vendor in question featured that very functionality as a differentiator.

The Revolution

In the first years after the initial APM system deployment, things went well. In that same period, however, ideas regarding best-practice application architectures and the application development process itself underwent a revolution in response to digitalisation-driven changes in the energy market. Applications needed to be built in such a way that functionality could be frequently added, modified, or removed in the blink of an eye, with only minimal impact on the surrounding environment. To accommodate this demand, architectures became more modular and components more ephemeral, while the application development process sped up to support a continuous stream of changes flowing into the production environment. A side effect of this revolution was that application developers, heretofore unconcerned with what happened to their code once it entered production, now had to take responsibility for that code, at least in the early days after its initial release.

Things Fall Apart

With the change in application architecture and what turned out to be an order-of-magnitude acceleration in the rate at which changes were delivered, the deployed APM technology began to fail at its mission. Being a second-generation platform, the technology relied heavily on sampling, pre-defined application topology models, and deep-dive analysis of behavioural anomalies based on bytecode instrumentation. Unfortunately, sampling was too coarse to capture all of the important change-driven events in the environment; the pre-defined models were out of date almost immediately; and the focus on bytecode-level analysis distorted the understanding of transaction flows, since most of the transaction processing took place in the passing of messages between containerised components of Java code. Finally, the inadequacies of observation and analysis at the software system state change level greatly reduced the value of any end-to-end business transaction view. In short, when it came to the newer applications, the energy concern was unable to effectively understand the impact of the thousands of changes being made monthly.
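
To make that message-passing blind spot concrete, the sketch below shows how a modern, OpenTelemetry-style approach carries trace context across an asynchronous hand-off between two components, which is exactly what bytecode-only instrumentation of each JVM in isolation tends to miss. This is a minimal illustration, not the customer's actual code: the service, topic, and span names are invented, and it assumes an OpenTelemetry SDK or agent is already configured (otherwise the API calls are no-ops).

```java
// Minimal sketch: propagating trace context across a message hand-off with the
// OpenTelemetry Java API. Service, topic, and span names are illustrative.
import io.opentelemetry.api.GlobalOpenTelemetry;
import io.opentelemetry.api.trace.Span;
import io.opentelemetry.api.trace.SpanKind;
import io.opentelemetry.api.trace.Tracer;
import io.opentelemetry.context.Context;
import io.opentelemetry.context.Scope;
import io.opentelemetry.context.propagation.TextMapGetter;

import java.util.HashMap;
import java.util.Map;

public class MessageTracePropagation {

    private static final Tracer tracer =
            GlobalOpenTelemetry.getTracer("energy-billing-demo");

    // Producer side: start a PRODUCER span and inject its context into the
    // message headers so the trace survives the asynchronous hand-off.
    static Map<String, String> publish(String payload) {
        Span span = tracer.spanBuilder("publish meter-readings")
                .setSpanKind(SpanKind.PRODUCER)
                .startSpan();
        Map<String, String> headers = new HashMap<>();
        try (Scope ignored = span.makeCurrent()) {
            GlobalOpenTelemetry.getPropagators().getTextMapPropagator()
                    .inject(Context.current(), headers, Map::put);
            // send(payload, headers) would hand the message to the broker here
        } finally {
            span.end();
        }
        return headers;
    }

    // Consumer side: extract the upstream context from the headers and start a
    // CONSUMER span linked to it, so both hops appear in one end-to-end trace.
    static void consume(Map<String, String> headers, String payload) {
        TextMapGetter<Map<String, String>> getter =
                new TextMapGetter<Map<String, String>>() {
                    @Override
                    public Iterable<String> keys(Map<String, String> carrier) {
                        return carrier.keySet();
                    }

                    @Override
                    public String get(Map<String, String> carrier, String key) {
                        return carrier.get(key);
                    }
                };
        Context upstream = GlobalOpenTelemetry.getPropagators().getTextMapPropagator()
                .extract(Context.current(), headers, getter);
        Span span = tracer.spanBuilder("process meter-readings")
                .setSpanKind(SpanKind.CONSUMER)
                .setParent(upstream)
                .startSpan();
        try (Scope ignored = span.makeCurrent()) {
            // business logic for the consumed message goes here
        } finally {
            span.end();
        }
    }

    public static void main(String[] args) {
        consume(publish("reading:42"), "reading:42");
    }
}
```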

The Tool Portfolio Grows

In response to the deteriorating situation, the APM solution was supplemented by an increasing number of tools. Application logs, provided by Splunk's log management system, proved to be a particularly rich source of information, confirming and enhancing the output of the second-generation technology. Costs were increasing, however, as was the toil associated with maintaining and choreographing a portfolio of unintegrated technologies. Most problematic of all was the loss of any possibility of putting together an end-to-end view of digital system behaviour that would be directly relevant to business, as opposed to technology, decision makers.
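
One common way to stitch such a portfolio back together, sketched below under assumed names, is to stamp the active trace identifiers into each log event so that log and APM data can be joined after the fact. The logger, field names, and request handler here are illustrative, not taken from the customer environment; the sketch assumes SLF4J with an MDC-capable backend and an already-configured OpenTelemetry SDK or agent.

```java
// Minimal sketch: stamping the active trace ID into the logging context so log
// events can be correlated with APM traces downstream. Logger and field names
// are illustrative; assumes SLF4J with an MDC-capable backend and an
// OpenTelemetry SDK or agent already configured.
import io.opentelemetry.api.trace.Span;
import io.opentelemetry.api.trace.SpanContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class TraceAwareLogging {

    private static final Logger log = LoggerFactory.getLogger(TraceAwareLogging.class);

    static void handleRequest() {
        SpanContext ctx = Span.current().getSpanContext();
        if (ctx.isValid()) {
            // Put the identifiers where the log pipeline can index them.
            MDC.put("trace_id", ctx.getTraceId());
            MDC.put("span_id", ctx.getSpanId());
        }
        try {
            log.info("processing tariff recalculation request");
            // ... business logic ...
        } finally {
            MDC.remove("trace_id");
            MDC.remove("span_id");
        }
    }
}
```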

Back To The Market

The decision was made to review what the market had to offer, not with an eye to replacing the existing APM technology across the board, but to put in place a division of labour in which a new platform would handle the newer generation of modular, ephemeral applications, leaving the legacy portions of the portfolio (still over 50%) to the incumbent vendor. There might be a gradual migration away from that vendor over time, but there was no need for a radical, across-the-board replacement at present.

Why Splunk Won

Splunk's Observability platform quickly rose to the top among the vendors being considered for four reasons:

  • First, the scope of the Splunk platform's functionality meant that it could, in an integrated manner, handle the broad range of requirements now being addressed by many disparate tools.
  • Second, it was clear that, unlike most of the other technologies in the competition, Splunk's Observability platform was designed from the beginning for the world of modular, ephemeral applications. It was not a retrofit of second-generation APM onto the brave new world of cloud-based, microservice, and function-driven computing.
  • Third, the energy concern recognised that handling telemetry at virtually unlimited scale was critical to understanding the impact of change, and Splunk's unquestioned log management prowess demonstrated a proven track record in exactly this kind of data management.
  • Finally, Splunk's integration of Digital Experience Management (DEM) functionality into the platform provided a perspective on application behaviour that was not only essential for capturing the 'last mile' of a transaction's progress but also supported a business-meaningful view of the services the monitored application was providing.


New Realms And Beyond: What's Next

The implementation was a success, and the Splunk Observability platform has now become a key enabler of the company's IT operations and development. In the future, Splunk's footprint is expected to grow in tandem with the deployment of new-generation applications. Beyond that, more attention will focus on using the AI functionality provided by Splunk's IT Service Intelligence (ITSI) system to understand, and even predict, the impact of system-level changes on business process execution.

The successes achieved by this European company were not the result of unique circumstances. Any company with a focus on digitalisation and a strategic approach to data can get there with an assist from the Splunk technology portfolio. Unfortunately, until now, the reach of Splunk's observability functionality into the German marketplace has been somewhat limited by the lack of a German-geography realm that satisfies regulatory requirements. Hence, Splunk was not able to support many German companies as they coped with the accelerating pace of digital environment change, even when Splunk was already the strategic choice for on-premises IT operations log management and security information and event management. Now, however, with the new realm, our German customers will be in a great position to build out a unified, AI-enabled approach to development, operational, and security data management.

Want more details on the Splunk realms and our solutions? Don't hesitate to get in touch.