Artificial Intelligence Improves Itself

We've seen it dozens—or even hundreds—of times in movies, novels, and science fiction series: suddenly, an artificial intelligence (AI) takes control, and things go off the rails for humans. With this unsettling precedent, we move toward a world where machines gain more and more abilities.

A brief historical overview takes us back to that moment in 1997 when Deep Blue, an IBM creation, defeated none other than chess grandmaster Garry Kasparov. The next leap came in 2014, when Eugene, a simple chatbot, managed to pass the Turing Test.
Devised by Alan Turing, the renowned mathematician and father of modern computing, the test determines whether an algorithm can exhibit behavior indistinguishable from that of a human, without the human judge realizing the responses are coming from a machine. In 2017, Google’s AlphaGo defeated Ke Jie, the reigning Go champion, setting a new milestone.

These are just a few examples of how certain algorithms, by organizing data through algebraic logic and matrices, can continuously improve and achieve increasingly ambitious goals. Several factors drive this. The first is, without a doubt, ever-greater and more affordable processing power, a phenomenon accelerated and democratized by cloud architecture. Added to this is the sheer volume of data, the training fuel that makes these systems “smarter”, which has grown to a scale the human brain can no longer comprehend.

The World Economic Forum estimates that by 2025, about 450 exabytes of data will be generated daily. To grasp the scale, that is equivalent to nearly 16 billion average-sized video games. Our mobile devices track our every move 24/7, corporate systems record every transaction, content is uploaded to social media, messages are exchanged incessantly, and increasingly rich IoT sensor streams all contribute to this figure.
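As a quick sanity check of that comparison, a minimal sketch (the ~28 GB average game size is an assumption of this illustration, not a figure from the estimate itself):

```python
EXABYTE = 10**18  # one exabyte in bytes (decimal definition)

daily_bytes = 450 * EXABYTE     # WEF estimate of data generated per day by 2025
avg_game_bytes = 28 * 10**9     # assumed average game download: ~28 GB

games_per_day = daily_bytes / avg_game_bytes
print(f"{games_per_day / 1e9:.1f} billion games per day")  # prints "16.1 billion games per day"
```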

The conclusion is simple: there are more places to run those algorithms (at affordable prices, by the way) and more data to train them on. This enables technological advances that, in a virtuous circle, stimulate investment and drive further improvement. This is how we arrive at astonishing research results.

Continuous Evolution

Take, for example, the problem of matrix multiplication represented as a tensor decomposition. Researchers tasked the artificial intelligence AlphaTensor with searching for “better algorithms” using reinforcement learning, a machine-learning method that rewards desired behaviors and penalizes those that fall outside the parameters, motivating the agent to perceive and interpret its environment and learn through trial and error. The conclusion? It managed to improve on Strassen’s algorithm, the best known so far, which, after fifty years of use, was considered unbeatable as a mechanism for matrix multiplication.
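Strassen’s insight, at the 2×2 base case, is that a matrix product can be formed with seven multiplications instead of the naive eight; applied recursively to block matrices, this pushes the cost below O(n³). A minimal sketch of the 2×2 scheme (the function name is ours, for illustration):

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices using Strassen's seven products."""
    a, b, c, d = A[0][0], A[0][1], A[1][0], A[1][1]
    e, f, g, h = B[0][0], B[0][1], B[1][0], B[1][1]

    # Seven multiplications (the naive method needs eight)
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)

    # Recombine into the four entries of the product
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
# prints [[19, 22], [43, 50]]
```

AlphaTensor’s advance was at a larger base case: among other results, it found a scheme that multiplies 4×4 matrices (in modular arithmetic) using 47 multiplications, beating the 49 that recursive application of Strassen’s method requires.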

But in this world of continuous beta, AlphaTensor found a better way to program the very functions on which artificial intelligence runs, unlocking a virtuous circle: it processes data in less time, makes better use of that data, and produces better algorithms. The details can be found in the open-source release at https://github.com/deepmind/alphatensor.

We are perhaps at the beginning of a path where algorithms start to improve themselves.
Far from the threatening stance depicted in science fiction, artificial intelligence is establishing itself as an ally that helps us make the most of our human skills at work, predict climate disasters, and accelerate scientific research. It is therefore exciting to imagine a future where more and more algorithms, as Eugene once did, pass the Turing Test. That said, I confess it would scare me a little if one day they decided to intentionally “fail” it.

By Manuel Allegue – October 14, 2022
