Bursting Gartner’s ‘Algorithm Economy’ Bubble | @CloudExpo #Cloud #MachineLearning
Gartner's hype around algorithms is mostly obvious and nothing new
By: Jason Bloomberg
Jun. 17, 2016 04:07 PM
Remember Mad Libs? You'd get a book that contained paragraphs with key words missing, replaced with hints as to what might fill the blanks. You and your friends would come up with silly words to complete the sentences, with predictably hilarious results.
Well, it's time for some Gartner Mad Libs. For this game, the same word belongs in all the blanks. See if you can figure out what the missing word is (here's the source if you can't resist peeking).
The current software ecosystem allows only whole products to be commercialized - not functions or features.... While the number of existing _________ is significant, very few engineers or scientists commercialize their _________, mostly due to the limited prospects of real success.... Another profound aspect of a marketplace is its ability to allow _________ to be reused.
If this quotation came from 2005, you'd probably guess the missing word was services. Back in the day, we talked about services as ways of breaking up monolithic applications in order to support the commercialization and reuse of those services.
Perhaps this quotation came from 2012, in which case the missing word is likely to be APIs. People still talk about the API Economy, after all - what might happen when enterprises and others expose application functionality in order to commercialize and reuse it.
If you guessed either services or APIs, however, sorry, you're wrong. This quote is from 2016, and the buzzword Gartner is hyping today is algorithms.
Gartner is now treating algorithms like they are some kind of innovative addition to the modern digital discussion. Presumably the brilliant minds there have some novel insight into algorithms and, yes, the Algorithm Economy that CIOs should sit up and take notice of.
Well, it's time to let some hot air out of this balloon, and if Intellyx won't do it, who will?
Nothing New Under the Sun
The point to the Mad Libs exercise, of course, is to underscore how repetitive Gartner's thinking about algorithms actually is. If they can simply switch out one buzzword for another every few years, then are they actually bringing any new insight to the table at all?
Let's peel away the layers of hype and see if we can find such a hidden nugget of insight. Our starting point: the definition of algorithm.
According to Wikipedia, an algorithm is "a self-contained step-by-step set of operations to be performed." Your grandma's muffin recipe, therefore, is a type of algorithm, although the history of algorithms focuses primarily on mathematical algorithms.
Algorithms, therefore, aren't new. In fact, they date to antiquity, as the well-known Sieve of Eratosthenes algorithm for creating a list of prime numbers illustrates.
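The Sieve is a good reminder of just how old and how simple the idea is - here it is in a few lines of Python:

```python
def sieve(limit):
    """Sieve of Eratosthenes: return all primes up to limit."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for n in range(2, int(limit ** 0.5) + 1):
        if is_prime[n]:
            # Every multiple of a prime is composite - cross it off.
            for multiple in range(n * n, limit + 1, n):
                is_prime[multiple] = False
    return [n for n, prime in enumerate(is_prime) if prime]

print(sieve(30))  # → [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

A "self-contained step-by-step set of operations" if ever there was one - and over two thousand years old.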
The story of algorithms gets more interesting, however, with the invention of machines that could follow them - or at least, theorizing about such machines, as computing pioneers like Ada Lovelace, Charles Babbage, and Alan Turing did.
In any case, while algorithms are unquestionably fundamental to computer science generally - not to mention your grandma's cooking - we still have the question as to why Gartner is focusing their hype engine on them now. Let's go to the source and see what we can uncover.
Gartner on Algorithms
The most visible champion of Gartner's algorithm hype is Peter Sondergaard, the firm's head of research. In particular, Sondergaard distinguishes algorithms from data (in particular, big data), because "big data is not where the value is." Instead, "algorithms are where the real value lies, because algorithms define action." He then ties this notion of defining action to digital transformation by stating that "dynamic algorithms are at the core of customer interactions."
For Gartner, algorithms are central to the "post-app era," in which, the firm predicts, "smart agents" will facilitate 40% of interactions by 2020. "Agents enabled by algorithms will define the post-app era," says Sondergaard, who calls out Apple's Siri and Microsoft's Cortana as prototypes for this new era of virtual assistants and other smart agents that will replace apps.
Siri and her brethren are unquestionably innovative, and no one doubts such technology will continue to improve rapidly. But let's take a closer look at Gartner's thinking and see how well it holds water.
The Obviousness Department
By stating that data alone have no value without algorithms, Sondergaard is essentially saying that computing breaks down into data on the one hand and code on the other. This division is at the core of digital computing ("digital" in its original sense of using zeroes and ones), as it has been since ENIAC.
Seventy years later, Gartner suddenly discovered that digital computing divides the world into data and code, and that data aren't really that useful without code. Good for it. However, the fact you have to do something with data to get value out of them shouldn't come as a surprise to anyone.
Sondergaard also calls out algorithmic stock trading as an example of how algorithms can be disruptive. To be sure, using software to trade stocks disrupted the earlier, manual approach - but isn't he just saying that using computer programs to automate manual processes is disruptive? Again, true, but both obvious and nothing new.
The bigger picture here is that software continues to improve, and enterprises are becoming increasingly software-driven, in part because of such advancements. Algorithms are part of this story, but are not a particularly interesting or insightful angle on the broader trend of software-driven business transformation.
Heading in the Wrong Direction
So, do algorithm marketplaces represent a step forward? I'm afraid not - in fact, they're a big step backwards. The reason why service marketplaces were possible - and why API marketplaces are increasingly feasible - is because services and APIs are ways of abstracting software functionality into loosely coupled units that support the consumability and reusability of the underlying software.
In the case of algorithms, however, we're stripping away all such abstraction layers and even the fundamental abstraction of written code itself, leaving the underlying patterns such code must follow laid bare for all to see.
As the limited success of Web Services in the 2000s would suggest, there are perhaps better ways of packaging up algorithms for consumption and reuse than services. RESTful APIs improved matters, to be sure. What, then, would be the next step on this path of evolution? Not algorithms, but microservices.
In fact, you could think of a microservice as a properly encapsulated algorithm if you like, as they are in essence units of execution with well-constructed APIs. But strip away the encapsulation from any microservices-based system and you end up with a tightly coupled, unimplementable tangle.
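To make the encapsulation point concrete, here is a minimal sketch using only Python's standard library (the names `rank_scores` and `RankHandler` are illustrative, not from any particular product): the algorithm lives behind an HTTP endpoint, and consumers couple only to the JSON-in, JSON-out contract, never to the steps inside.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def rank_scores(scores):
    # The encapsulated "algorithm." Consumers never see these steps,
    # only the API contract of the endpoint below - which is exactly
    # what makes the unit reusable without tight coupling.
    return sorted(scores, reverse=True)

class RankHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        scores = json.loads(self.rfile.read(length))
        body = json.dumps(rank_scores(scores)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To expose the service (this call blocks):
# HTTPServer(("localhost", 8000), RankHandler).serve_forever()
```

Swap in a better ranking algorithm tomorrow and no consumer breaks - strip away the API, and every consumer couples directly to the implementation.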
Are We Nearing the ‘End of Applications’?
The crux of this issue is how we might define application. Today when we use the word app we're more likely than not referring to a mobile application, a simple bit of code we might download from an app store to our smartphone.
But of course, the word application has long meant more than that. Any tool, after all, has an application - the application of a hammer is hammering nails, for example. Early digital computers were purpose-built for a single application, like cracking Nazi codes or calculating missile trajectories.
Only with the development of programmable computers like ENIAC did the word application come to mean computer program, as these devices were essentially general purpose, and thus had many possible applications.
Today we still think of an application as a computer program, although some uses of the word are narrower than others. But certainly, wouldn't Siri or Cortana be just another kind of application?
The Intellyx Take: Are We Nearing a Post-Algorithmic Age?
This question is a philosophical one that attempts to divine the essence of modern artificial intelligence (AI) - as well as where AI is heading over the next several years.
If we look at machine learning or deep learning algorithms, we're essentially teaching computers to learn on their own so that they can come up with new ways of doing things - with resulting behavior that may look nothing like "a self-contained step-by-step set of operations" that is the essence of an algorithm.
In other words, AI broadly speaking introduces two levels of computing: the human level where people code the AI programs (which are still algorithmic), and the self-learning behavior of those programs themselves that potentially leads to novel behaviors (which I posit may become post-algorithmic).
Seen in this light, we may in fact desire post-algorithmic behavior from our virtual assistants. After all, if they're simply following recipes - even complicated ones - then they're not really doing anything more than a traditional, human-coded application can do.
Only when the agent can come up with novel behaviors based upon unpredictable, independent learning will our virtual assistants become truly useful. At that point, algorithms - and Gartner's poorly thought out opinions - will become things of the past.
Copyright © Intellyx LLC. Intellyx advises companies on their digital transformation initiatives and helps vendors communicate their agility stories. As of the time of writing, none of the organizations mentioned in this article are Intellyx customers. Image credit: Lindsey Turner.