DeepMind AI can multiply matrices quicker than ever imagined, beating a previous record

The newest version of this AI has broken a 50-year record in matrix multiplication.
Brittney Grimes
Crunching numbers. kertlis/iStock

What if there was a way to speed up computation by as much as 20 percent? Researchers have developed a quicker way to perform matrix multiplication, a core operation behind countless computing tasks.

The research was published in the journal Nature.

Fast and accurate computation

The artificial intelligence, developed by DeepMind, is speeding up this computation and has even beaten a 50-year-old record in computer science.

Its latest version, AlphaTensor, did not have knowledge of existing solutions when presented with the problem of creating a working algorithm that completed the task in the fewest possible steps. “AlphaTensor discovers from scratch many provably correct matrix multiplication algorithms that improve over existing algorithms in terms of number of scalar multiplications,” the study said. It was trained to decompose the tensor that represents matrix multiplication; each decomposition it finds corresponds to a working algorithm.

AlphaTensor itself is an upgraded version of DeepMind’s AI predecessor, AlphaZero.

“AlphaTensor is built on AlphaZero, where a neural network is trained to guide a planning procedure searching for efficient matrix multiplication algorithms,” the study said.

This discovery allows the AI to find matrix multiplication algorithms that are both faster and provably correct.

History of multiplying matrices

For centuries, it was believed that the standard method of multiplying matrices, whose number of multiplications grows with the cube of the matrix size, was the best possible, making the task much longer and more expensive for larger matrices.

The study improves on the Strassen algorithm, which in 1969 showed that two 2×2 matrices can be multiplied using seven scalar multiplications rather than the usual eight, reducing the number of steps.
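To make that saving concrete, here is a minimal Python sketch of Strassen's 2×2 scheme (an illustration, not code from the study): the seven products below replace the eight multiplications of the textbook method, and the final check confirms the result matches the ordinary product.

```python
import numpy as np

def strassen_2x2(A, B):
    """Multiply two 2x2 matrices with Strassen's seven multiplications
    (the textbook method uses eight)."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B

    # Strassen's seven products (1969)
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)

    # The result is recombined using only additions and subtractions
    return np.array([
        [m1 + m4 - m5 + m7, m3 + m5],
        [m2 + m4,           m1 - m2 + m3 + m6],
    ])

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
assert np.array_equal(strassen_2x2(A, B), A @ B)  # agrees with the textbook product
```

The same trick can be applied block-wise, treating each entry above as a sub-matrix, which is how the savings carry over to larger matrices.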

Improvements with AlphaTensor

This previous algorithm had remained the best known approach for more than 50 years. But DeepMind's AI AlphaTensor has now found an even quicker technique.

“AlphaTensor scales to a substantially larger algorithm space than what is within reach for either human or combinatorial search,” the study stated. “We also adapt the algorithm discovery procedure to finite fields, and improve over Strassen’s two-level algorithm for multiplying 4 × 4 matrices for the first time, to our knowledge, since its inception in 1969,” it continued.


For example, working in modular (finite-field) arithmetic, the AI found an algorithm for multiplying two 4×4 matrices using 47 multiplications instead of the 49 needed when Strassen's method is applied recursively; the basic textbook method would take 64. Every multiplication AlphaTensor shaves off compounds when the algorithm is applied block-wise to much larger matrices.
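For readers who want the arithmetic behind those figures, this back-of-the-envelope Python snippet reproduces the three counts quoted above (the 47-multiplication algorithm holds in modular arithmetic, per the finite-fields result in the study).

```python
# Scalar multiplications needed to multiply two 4x4 matrices,
# using the counts reported above.
naive = 4 ** 3              # textbook method: 64 scalar multiplications
strassen_recursive = 7 * 7  # Strassen on 2x2 blocks, applied at two levels: 49
alphatensor = 47            # AlphaTensor's discovered algorithm (modular arithmetic)

print(naive, strassen_recursive, alphatensor)  # 64 49 47
```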

Using this AI in the future

AlphaTensor has also proven to be the best at what it does, beating the top existing algorithms for more than 70 different matrix sizes.

As a result, researchers predict improvements in energy savings, scientific simulations, and endless other applications. “Through different use-cases, we highlight AlphaTensor’s flexibility and wide applicability,” the researchers noted. These findings can be applied to many areas of physics, engineering, and statistics, making matrix multiplication a lot quicker.

Thousands of algorithms have been discovered

AlphaTensor has already discovered 14,000 working algorithms for 4×4 matrix multiplication alone, along with thousands more for other matrix sizes.

At a time when everyone wants their information quickly and accurately, implementing these faster algorithms could be a real game-changer.
