AI Gets the Glory but ML is Quietly Making Fortunes

When we think about the future of data science, it’s easy to get carried away. For a couple of decades now, if you believe the hype, we’ve been on the verge of a revolution. A revolution in which “Artificial Intelligence,” a vaguely defined but enormously powerful force, is forever on the verge of solving the world’s problems, or at least of making data analysis easier.

The reality is, of course, more nuanced. Truly “intelligent” systems are still a few years off, no matter how one defines the term. And though the future of many industries is likely to include advanced, adaptive computer systems, these will not be AIs, except in the most limited sense of the term. Instead, they will likely be based on existing machine learning (ML) systems that are already changing our world.

In this article, we’ll define the difference between AI and ML, consider why AI gets all the attention, and explain why we shouldn’t overlook the everyday revolution that ML is already creating.

AI and ML

First, a couple of definitions. One of the reasons why AI is talked about in such inflated terms, in fact, is that we still lack a good definition of what it means for a computer to be “intelligent.” The closest we can get, and the basis for most definitions of the term, is that “artificial intelligence is where a machine seems human-like and can imitate human behavior,” according to Bethany Edmunds, associate dean and lead faculty for Northeastern’s computer science master’s program. To state the obvious, this is not a precise definition, and that vagueness is likely one of the reasons why so many AI projects fail.

Machine learning, on the other hand, refers to a fairly well-defined set of techniques and systems. Whereas AI is taken to refer to a computer system that is intelligent, ML is the most well-developed way we have of building such systems. ML refers to the various ways in which computers can be programmed to look at real-world data, extrapolate patterns, and provide actionable insights.
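To make that definition concrete, here is a deliberately tiny sketch of the workflow the paragraph describes: observe real-world data, extrapolate a pattern, and act on the prediction. The data and the least-squares line fit are illustrative inventions, not a real ML system, but the loop is the same one production systems follow at vastly larger scale.

```python
# Illustrative sketch only: a minimal "learning" routine that fits a
# straight line to historical data and extrapolates from it. Real ML
# systems are far more sophisticated, but the workflow is the same:
# look at data, extract a pattern, produce an actionable prediction.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical historical observations: ad spend vs. units sold.
spend = [1.0, 2.0, 3.0, 4.0, 5.0]
units = [12.0, 19.0, 31.0, 42.0, 50.0]

slope, intercept = fit_line(spend, units)
# Extrapolate the learned pattern to an unseen spend level.
predicted = slope * 6.0 + intercept
print(f"predicted units at spend 6.0: {predicted:.1f}")
```

The point of the sketch is only that “learning” here is statistics, not intelligence: the program has no understanding of advertising, just a pattern extracted from numbers.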

If that sounds an awful lot like the “AI” systems we already have, it’s because it is. As machine learning pioneer Michael I. Jordan recently wrote, most of the time when we talk about AI, we really mean ML. “ML is an algorithmic field that blends ideas from statistics, computer science and many other disciplines to design algorithms that process data, make predictions and help make decisions,” he wrote in the Harvard Data Science Review.

Real-World Capabilities

It’s unlikely, of course, that observations like Jordan’s will change the way AI is presented in popular culture, where there seems to be a deep-rooted fear of intelligent computers turning on humanity. Some of the best end-of-the-world movies feature scary “intelligent” computers, in fact, while overlooking the fact that ML is already changing the world and, like most technologies, is getting cheaper and more accessible all the time.

The reason this is not obvious – at least to those who work outside the software development industry – is that where ML works well, it is almost invisible. The best ML systems, and those that are most widely used, are deployed alongside humans in order to take advantage of the strengths of both.

There are many examples of this. People can perform low-level pattern matching, for instance, but only at a significant cost in time, whereas ML systems can handle the same mundane tasks cheaply, and they never get bored. Fraud detection in financial services is another example: it would be possible to have people pore over billions of transactions, but it makes far more sense to point ML systems at the problem.
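The fraud-detection example can be caricatured in a few lines. Production systems use trained models over many features per transaction; the simple standard-deviation rule below is purely a hypothetical stand-in for the idea that machines screen the full transaction volume and surface only the suspicious cases for human review.

```python
# Hypothetical sketch: flag unusual transaction amounts by z-score.
# Real fraud-detection systems use trained ML models over many
# features; this statistical rule only illustrates the division of
# labor: the machine screens everything, humans review the flags.
from statistics import mean, stdev

def flag_suspicious(amounts, threshold=2.0):
    """Return indices of amounts more than `threshold` standard
    deviations from the mean -- candidates for human review."""
    mu = mean(amounts)
    sigma = stdev(amounts)
    return [i for i, a in enumerate(amounts)
            if abs(a - mu) / sigma > threshold]

transactions = [42.0, 38.5, 40.1, 41.7, 39.9, 40.6, 2500.0, 41.2]
print(flag_suspicious(transactions))  # → [6], the 2500.0 outlier
```

Note the design: the function does not decide that index 6 is fraud; it only narrows billions of routine records down to the handful a human analyst should actually look at, which is exactly the augmentation pattern described above.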

In other words, ML is best used to augment human capabilities, not replace them. As Jordan puts it, algorithms that “process data, make predictions and help make decisions” are best deployed to support human judgment, not to substitute for human intelligence.

Making Fortunes

It’s also worth pointing out that Jordan’s observations are not speculation: in many areas of business, ML has already led to a quiet revolution. Most readers will be aware of how AI and ML contribute to software development processes, but the technology has had a massive impact in a number of other fields as well, without the fireworks expected of “AI” systems. Let’s take just a few examples:

  • For a start, the average interest rate for a conventional 15-year fixed-rate mortgage (the cheapest type of mortgage) has dropped to 2.69% over the last decade, at least partially as a result of ML systems being better able to predict risk for potential homebuyers.
  • At a broader level, researchers estimate that ML has the potential to add $2.6 trillion in value to the marketing and sales industry by 2020, as well as another $2 trillion to manufacturing and logistics fields.
  • McKinsey predicts that machine learning will help manufacturing businesses reduce material delivery times by 30% and achieve 12% fuel savings by optimizing their processes. The firm also estimates that companies can increase gross revenue by 13% if they fully integrate AI-driven technologies into their business.
  • Deloitte estimates AI-driven programs can help businesses reduce unplanned downtime by 15% to 30% and save 20% to 30% in maintenance costs.

The Bottom Line

None of these statistics match, of course, the apocalyptic tone of most movies about AI, nor do they match the utopia that many researchers invoke when they talk about the future of intelligent machines. But for most people – and certainly most businesses – reducing overhead and maximizing efficiency matter far more in the here and now. And that’s why, despite ongoing research into “true” AI (such as the search for more environmentally friendly ways to train AI models), ML is likely to be far more important in the coming decade.

About the Author

Bernard Brode has spent a lifetime delving into the inner workings of cryptography and now explores the confluence of nanotechnology, AI/ML, and cybersecurity.

Sign up for the free insideBIGDATA newsletter.

Join us on Twitter: @InsideBigData1 – https://twitter.com/InsideBigData1