Artificial intelligence vs machine learning: can you tell the difference?


Artificial intelligence (AI) and machine learning (ML) have been around since the 1950s, yet they turned into popular buzzwords only recently. Do we use these terms correctly? A global study released by Pegasystems Inc. showed that although 72% of respondents confidently claimed to understand AI, far fewer could correctly define what it is or what it can do. We therefore thought it worth explaining the difference and making you an expert on the topic in just 4 minutes of reading.

The first person ever to ask the question "Can machines think?" was the Englishman Alan Turing, who helped end World War II by cracking the code of the legendary Enigma machine. He was clearly ahead of his time: he anticipated machine learning, and concerns about AI's potential impact on jobs, at a time when the first machines were just being built.

Since then, AI has gone through a major evolution, both as a field and in its very definition. It comes in all shapes and stages of development. However, the "myth" that machines could be taught typical human traits like intuition, emotions or, as fans of the TV show Westworld know, a human-like consciousness corresponds more to the futuristic term Artificial General Intelligence (AGI).

Even though we frequently hear that AI may one day overtake humankind, our current perception of what AI is remains more down-to-earth. In fact, AI is just an umbrella term covering multiple disciplines such as machine learning, robotics, simulations or computer vision. So fear no more: you already interact with AI daily, you probably just do not know it.

So what is machine learning? 

Have you ever wondered how Netflix is able to recommend you the next show for binge-watching? The answer is simple: machine learning. Netflix's algorithms analyze your activity and, based on the movies you have previously watched, recommend a TV show that you may like. 
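To make the idea concrete, here is a toy sketch of how such a recommendation could work: rank shows by how similar they are to one the viewer just watched. The shows, genre scores and the similarity measure (cosine similarity) are all invented for illustration; Netflix's actual system is far more sophisticated.

```python
import math

# Each show described by hypothetical scores: [drama, sci-fi, comedy]
shows = {
    "Stranger Things": [0.4, 0.9, 0.2],
    "The Crown":       [0.9, 0.0, 0.1],
    "Black Mirror":    [0.5, 0.8, 0.1],
}

def cosine(a, b):
    # Cosine similarity: 1.0 means identical direction, 0.0 means unrelated
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def recommend(watched):
    # Suggest the unseen show most similar to the one just watched
    candidates = {s: cosine(shows[watched], v)
                  for s, v in shows.items() if s != watched}
    return max(candidates, key=candidates.get)

print(recommend("Stranger Things"))  # Black Mirror
```

Even this tiny example shows the core pattern: the program draws its answer from the data itself, not from hand-written rules about which show follows which.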

In a nutshell, machine learning (ML) is a subfield of AI that gives computers the ability to learn without being explicitly programmed, originally rooted in statistics and mathematical optimization. ML focuses on algorithms that allow the machine to learn how to process large amounts of data (known as "big data"), spot correlations and patterns, and report anomalies. Like any student, the program should get smarter and more efficient at completing tasks and making decisions with each set of data. Simply put, the program learns on its own and delivers more reliable results and decisions each time. You can encounter ML everywhere: from the spam filter in your mailbox, through personalized marketing newsletters, to self-driving cars. At Intelecy we use ML in the industrial sector to reduce the cost of maintenance, the carbon footprint and the waste in a production process. 
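The "spot patterns and report anomalies" part can be sketched in a few lines. Here a program "learns" the normal range of a sensor from past readings and flags anything far outside it. The temperature values and the three-standard-deviations threshold are made up for illustration; real industrial models are far more elaborate.

```python
import statistics

# Hypothetical past temperature readings the program learns from
history = [20.1, 19.8, 20.4, 20.0, 19.9, 20.2, 20.3, 19.7]
mean = statistics.mean(history)
stdev = statistics.stdev(history)

def is_anomaly(reading, threshold=3.0):
    # A reading more than `threshold` standard deviations
    # away from the learned mean is reported as an anomaly
    return abs(reading - mean) > threshold * stdev

print(is_anomaly(20.1))  # False: within the normal pattern
print(is_anomaly(25.0))  # True: far outside what was learned
```

The key point is that nobody told the program what "too hot" means; it derived that from the data, and feeding it more data would refine the answer.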

If this sounds a little too similar to deep learning, you are not mistaken. Deep learning is, in fact, a relatively new technique used in machine learning: a computer system autonomously learns to imitate human thought patterns through artificial neural networks, which are composed of cascading layers of information. Just like a human mind, the system works with large amounts of data. 
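Those "cascading layers" can be illustrated with a bare-bones forward pass: the input flows through one layer after another, each applying weights and a nonlinearity. The weights below are arbitrary made-up numbers; in real deep learning they are learned from data, not written by hand.

```python
import math

def layer(inputs, weights, biases):
    # Each neuron: weighted sum of the inputs, then a sigmoid "activation"
    return [
        1 / (1 + math.exp(-(sum(w * x for w, x in zip(ws, inputs)) + b)))
        for ws, b in zip(weights, biases)
    ]

x = [0.5, -1.0]                                      # an input with two features
h = layer(x, [[0.2, 0.8], [-0.5, 0.3]], [0.1, 0.0])  # hidden layer (two neurons)
y = layer(h, [[1.0, -1.0]], [0.0])                   # output layer (one neuron)
print(y)  # a single value between 0 and 1
```

Stacking many such layers, with millions of learned weights, is what lets deep networks recognize faces, speech or cat photos.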

ML is indeed an incredibly powerful tool: it would take us decades to process so much unstructured data from so many sources and extract the information we need. With the boom of e-commerce and social media, the market for AI technologies is growing significantly, and even small players have started to realize their potential. After all, these technologies help companies interact with each consumer individually in real time and tailor the customer experience to perfection, from a curated, personalized newsletter to an intelligent, always-available chatbot. 

Why the hype only now? 

Even though AI became a field of study as early as 1956, the real boom around AI technologies picked up steam only a few years ago. After decades of failures, errors and severe cutbacks in state funding, a breakthrough came. A paper published in 2012 by the University of Toronto showed that computers could classify 1.2 million high-resolution images into 1,000 different classes with a remarkably low error rate, thanks to deep learning techniques. 

Practically overnight, this paper sparked a global revolution, and AI became the hottest field in technology. If you can teach machines to be better than humans at object recognition, what else can they learn? The long-standing skepticism in the AI community was suddenly replaced by a wave of optimism. 

Wrapping things up: AI applications are here to stay, especially now that industries are undergoing digitalization. The term AI as we know it, though, will most likely keep changing, as it has over the past fifty years. 
