Waking up to the news this week that Blue Prism has been bought by private equity firm Vista for approximately $1.5 billion did not come as much of a surprise. For several years now, many of us in the RPA industry have not questioned whether Blue Prism and the other big RPA firms (Automation Anywhere, UiPath, and Pegasystems) would be bought in such a manner, only when it might occur. What may come as a surprise is the relatively low price for the once-leading RPA vendor. $1.5 billion is far below the current market valuation of competitor UiPath, at $28 billion, and it even pales in comparison to Automation Anywhere's more modest estimated value of $6.8 billion.
What is going on here? Is such a low valuation a reflection of Blue Prism's ongoing hemorrhage of money, compounded by its difficulty growing revenue? Certainly, the losses have narrowed, from $73 million down to $30 million in the six months ending 30 April 2021 versus the same period the prior year, and revenues have grown, $108 million versus $87 million over the same time frames. But neither change represents a sustainable business model or a clear path out of the woods for the embattled software vendor.
Rolling with the good times
I would suggest there is more to this deal than meets the eye, and the price Vista is paying may be indicative of some fundamental flaws in the RPA industry caused by its Justin Bieber/Lindsay Lohan-esque upbringing. RPA vendors suffered the same instant popularity as social media influencers, which often leads to an inappropriate and overinflated sense of self-worth, if not straight-up narcissism bordering on sociopathy.
For a good five years, starting around 2014, anyone who was anyone was buying RPA software. Buying RPA, but not necessarily using it. RPA vendors were overnight sensations, the rock stars of the software world. Adoption rates for RPA rapidly climbed into the high 90s, in percentage terms, amongst the Fortune 1000. It seemed the vendors and their delivery partners had stumbled upon a money-making machine on a par with bitcoin, and investors and employees licked their collective chops at the prospect of future IPO valuations.
But, while adoption rates were extremely high, adoption volumes were lackluster at best. It was not unusual to find an enormous Fortune 100 company on a vendor's customer list, only to discover that the same company had bought just a handful of licenses for its inevitable proof of concept (PoC). At $10,000-$20,000 per license, achieving 100% adoption by the Fortune 1000 would still yield at most $200 million in revenue if each customer bought only ten licenses. Early industry expectation was that such clients would each rapidly grow their bot populations into the hundreds and thousands, producing tens of millions in revenue per client. But such large adopters were most notable by their absence.
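As a back-of-envelope check (assuming, purely for illustration, exactly 1,000 customers, ten licenses each, and the top of that price range), the revenue ceiling works out as follows:

```python
# Back-of-envelope ceiling on RPA license revenue under the adoption
# pattern described above. Illustrative assumptions, not vendor data.
customers = 1_000            # every Fortune 1000 firm adopts
licenses_per_customer = 10   # a handful of PoC-scale licenses each
price_per_license = 20_000   # top of the $10k-$20k range

annual_revenue = customers * licenses_per_customer * price_per_license
print(f"${annual_revenue / 1e6:.0f} million")  # -> $200 million
```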
Cracks in the facade
What caused the slowdown in RPA adoption? I discussed this extensively in my book, "The Care and Feeding of Bots," but in a nutshell, the number one problem was missed expectations. Promises of ease of use, instantaneous cost recovery, and nearly maintenance-free ownership sounded great during the sales process, but those outcomes proved as scarce as hen's teeth after implementation.
Initial PoCs typically proved that the concept of RPA worked. It was the return-on-investment calculators built in Excel that were woefully inadequate. These models invariably failed to reflect the realities of bot design, development, deployment, and ownership, and those customers that obtained early ROI wins in their RPA adoption usually did so by accident.
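To make that gap concrete, here is a minimal, purely illustrative sketch; the cost categories and figures are my own placeholders, not taken from any vendor's calculator. It contrasts the naive "labor saved versus license cost" view with one that also counts building and owning the bot:

```python
# Illustrative only: contrast a naive "labor saved vs. license cost" ROI
# with one that also counts design, deployment, and ongoing ownership.
# All figures are hypothetical placeholders.

labor_saved_per_year = 120_000   # e.g. roughly two FTEs of manual work
license_cost_per_year = 20_000

# Naive spreadsheet view: savings divided by the license cost alone.
naive_roi = labor_saved_per_year / license_cost_per_year

# Fuller view: add amortized build cost plus ongoing maintenance,
# exception handling, and infrastructure.
build_cost_amortized = 60_000
ongoing_ownership = 50_000
full_cost = license_cost_per_year + build_cost_amortized + ongoing_ownership
realistic_roi = labor_saved_per_year / full_cost

print(f"naive ROI: {naive_roi:.1f}x, fuller ROI: {realistic_roi:.1f}x")
# naive: 6.0x, fuller: ~0.9x. The same bot can look great or underwater
# depending on which costs the model bothers to include.
```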
I fought the law and the law won
This is not to say that RPA doesn't work; far from it. The 4-5% success rate noted by most research and consulting firms clearly shows "success" with RPA is possible, if improbable. Why is this? Several factors contribute to the challenges with RPA, but I am left believing one of the greatest limitations to RPA success is best articulated by Ashby's Law of Requisite Variety (or Complexity). Without going too deeply into this "Law," the principle states that an effective solution to a given problem must be at least as complex as the problem itself. Any "solution" not as complex as the problem is, at best, only an approximation of a real solution.
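For readers who want the formal version, Ashby's law is usually stated in terms of "variety," the number of distinguishable states a system can take. In one commonly quoted form, the variety of outcomes a regulator can guarantee is bounded below by the variety of disturbances divided by the variety of responses available to the regulator:

\[
V_{\text{outcomes}} \;\ge\; \frac{V_{\text{disturbances}}}{V_{\text{regulator}}}
\]

In RPA terms: a bot can only hold a process to its intended outcome if the bot commands at least as much variety as the process throws at it.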
As such, any solution less complex than the underlying problem is a catastrophe waiting to happen. It may appear to solve the problem at hand, but this is only an appearance, and it is only a matter of time until its shortcomings reveal themselves. As a systems engineer building — and later troubleshooting — spacecraft, I learned this lesson early in my career. The shuttle disasters, Apollo 1, Landsat, Telstar 402, Mars Observer, etc., are all examples of engineering solutions which underestimated the complexity of the missions they were intended to complete.
What does all of this have to do with the valuation of Blue Prism? By 2017, the big three RPA vendors were noticing things in their industry beginning to go awry. New customer acquisition was slowing, adoption was stagnant, and customer satisfaction was tanking. The practical reality of RPA was not measuring up to the promise, and industry leaders were starting to worry.
Again, this was fundamentally a problem of Ashby's Law. By working through the user interface, RPA software attempted to circumvent the cost, complexity, and difficulty of traditional systems integration. If a human could navigate the logic and architecture of complex IT systems, why couldn't a macro do it? This perspective completely discounted the fact that the human performing these tasks was applying cognition to their clickstream. They weren't just clicking away randomly; they were performing tasks they had learned to perform in a certain order, for specific reasons.
If designers did not accurately and comprehensively capture that user knowledge, it would not be replicated in the RPA code and would therefore be absent from its operation. When this occurred, it was only a matter of time until the code failed to perform as the process required, and it hit the wall of Ashby's Law: the requisite complexity had not been met.
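As a purely hypothetical sketch (the task, field names, and business rules below are invented for illustration), the gap between a recorded clickstream and the knowledge behind it might look like this:

```python
# Hypothetical invoice-entry task: the recorded happy path vs. the
# unrecorded judgment a human operator applies. All names, fields, and
# thresholds are invented for illustration.

def bot_enter_invoice(invoice):
    """What a clickstream capture sees: type the values, submit."""
    return {"action": "submit", "vendor": invoice["vendor"],
            "amount": invoice["amount"]}

def human_enter_invoice(invoice, known_vendors):
    """The knowledge behind the clicks, learned on the job and rarely documented."""
    if invoice["vendor"] not in known_vendors:
        return {"action": "route_to_procurement"}   # new vendor must be set up first
    if invoice["amount"] > 10_000:
        return {"action": "queue_for_approval"}     # policy threshold for sign-off
    return {"action": "submit", "vendor": invoice["vendor"],
            "amount": invoice["amount"]}

invoice = {"vendor": "Acme Ltd", "amount": 25_000}
print(bot_enter_invoice(invoice))                             # submits regardless
print(human_enter_invoice(invoice, {"Acme Ltd", "Initech"}))  # queues for approval
```

The bot reproduces the clicks but not the variety of decisions behind them, which is exactly the shortfall Ashby's Law predicts will eventually surface.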
I’ll take what’s behind curtain C
The big three RPA vendors had three options at their disposal for dealing with Ashby's Law. First, they could attempt to address the need to manage greater complexity by increasing the complexity and functionality of their software. This was the path chosen by UiPath. Second, they could attempt to reduce the level of systemic complexity customers needed to address, specifically by assuming ownership and control of the bots and their runtime environment, aka Bots-as-a-Service. This was the path taken by Automation Anywhere.
The third approach for dealing with systemic complexity in RPA implementations was to ignore it, only more emphatically. This was the path Blue Prism followed. In the 2017 timeframe, UiPath expanded its core functionality, Automation Anywhere re-platformed to be cloud-native and delivered in a SaaS model, and Blue Prism invested in an enormous "Customer Success Organization" led by its new Chief Customer Officer. This team was tasked with helping customers achieve success, presumably by providing them with greater access to more experienced RPA resources.
These resources were measured by how long they had been working with RPA software and how many bots they had previously deployed. But by 2017, nearly everyone was attempting to implement bots the same old way: PoCs, Centers of Excellence (CoEs), governance boards, and ROI calculators. It was the same old approach from the prior decade, only more so. They pushed against the wall of missed expectations, only harder.
This customer success team rapidly scaled into a large professional services organization, burning through hundreds of millions of dollars of investment capital that could have been used for product innovation while producing remarkably little "customer success." Further, this internal services team began to compete directly with the channel of systems integrators, resellers, and the like, the traditional means by which RPA vendors sold their products and grew their revenues. Rather than engage with "experts" in the channel, Blue Prism was on the hook to use its own "experts" to solve customer problems; it needed to utilize its own bench of resources instead of filling the benches maintained by its service partners.
Within a year or so, it became obvious this approach to increasing customer adoption was not working, and it was massively expensive to boot. Blue Prism eliminated most of this new organization and attempted to refocus its effort on improving its core products. But by then, it had fallen further behind its competitors in technical capability, burned bridges with the reseller and integrator channels, and hemorrhaged investment capital in building, and then dismantling, an extensive professional services organization. By 2019, the game was arguably up and the die was cast; while it took a pandemic and a couple more years of absent "customer success" for the end to manifest itself, the eventual acquisition of Blue Prism was inevitable.
The future for RPA?
While the fate of Blue Prism, and indeed the tone of this post, may imply that I am bearish on RPA, this is not the case. I view RPA as almost inevitable in the world of enterprise software. Adoption of "intelligent automation" is effectively non-optional for major organizations. Any technology with the potential to provide 5-15% structural savings in the cost of performing core business processes MUST be adopted, whether organizations like it or not. Those who choose not to adopt RPA will simply be out-competed into oblivion, Blockbuster-style. And that calculus does not even count the other benefits of RPA, with speed, quality, and outcome reliability chief amongst them.
In effect, choosing not to use RPA in the 2020s would be equivalent to choosing not to have a website in the 2000s or choosing not to use social media for business in the 2010s. It's possible, but not recommended. This is true for the same reasons that led to the inevitable adoption of those two example technologies: it's just better. Those burned by RPA over the last decade need to reengage with the technology. But this time, they should use it differently, and better. This was the real lesson of the first decade of RPA. The technology can work, but it works differently than believed or expected, and it's harder and more costly than anticipated, which was also true of nearly all its information technology predecessors.
This was all completely predictable according to Ashby's Law, and with the clarity of hindsight we can understand why, and how we might do better going forward. The emphasis on an increased ability to handle complexity, pursued by UiPath, and the emphasis on reducing total systemic complexity, pursued by Automation Anywhere, are the more viable ways of dealing with the industry's RPA hangover. I am certain time will show them to be the better approaches to RPA adoption, and that the valuations of these companies will continue to reflect this reality.