Two mathematicians have reportedly managed to find a wholly new and faster approach to multiplying large numbers.
The pair of mathematicians, from Australia and France, have solved an algorithmic puzzle that had stumped some of the leading minds in the field for almost half a century. For fairly small numbers, most of us multiply by recalling our times tables. When the numbers get bigger, though, we fall back on long multiplication, assuming we don't have a calculator on hand.
There’s just one issue with long multiplication: it is incredibly slow. For every digit in one number, you have to compute a separate single-digit product with every digit of the other, and then add all the partial products together.
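To make that cost concrete, here is a minimal sketch of schoolbook long multiplication on digit strings (an illustration, not anyone's production code); it also counts how many single-digit multiplications were performed:

```python
def long_multiply(a: str, b: str) -> tuple[str, int]:
    """Schoolbook long multiplication of two non-negative integers
    given as digit strings. Returns (product, operation count).

    For two n-digit inputs this performs n * n single-digit
    multiplications, which is why the cost grows as n squared.
    """
    result = [0] * (len(a) + len(b))
    ops = 0
    # One single-digit multiplication per pair of digits.
    for i, da in enumerate(reversed(a)):
        for j, db in enumerate(reversed(b)):
            result[i + j] += int(da) * int(db)
            ops += 1
    # Propagate carries through the result digits.
    for k in range(len(result) - 1):
        result[k + 1] += result[k] // 10
        result[k] %= 10
    digits = "".join(map(str, reversed(result))).lstrip("0") or "0"
    return digits, ops
```

Multiplying two three-digit numbers this way, e.g. `long_multiply("123", "456")`, performs exactly nine single-digit multiplications.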
This is a tedious and laborious process for schoolchildren, who trudge through the calculations as they learn how to multiply.
It is also a problem for computers, whose multiplication routines are ultimately constrained by the algorithms human mathematicians have devised.
As mathematician David Harvey from UNSW in Australia demonstrates in the video below, multiplying two three-digit numbers (n = 3) involves nine individual single-digit multiplications, which is n².
The problem is that as the numbers get bigger, the amount of work grows much faster, always scaling as n². Inefficient as it is, long multiplication remained the fastest known multiplication algorithm until 1960, when the Russian mathematician Anatoly Karatsuba discovered a faster approach.
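Karatsuba's insight was that two split numbers can be multiplied with three half-size products instead of four. Here is a minimal sketch of the idea (an illustration of the general technique, not the researchers' new algorithm):

```python
def karatsuba(x: int, y: int) -> int:
    """Karatsuba multiplication: three recursive half-size products
    instead of four, bringing the cost below n squared
    (roughly n to the power 1.585)."""
    if x < 10 or y < 10:
        return x * y
    # Split both numbers around half the longer one's digit count.
    m = max(len(str(x)), len(str(y))) // 2
    p = 10 ** m
    xh, xl = divmod(x, p)
    yh, yl = divmod(y, p)
    # Three recursive multiplications (the naive split needs four).
    a = karatsuba(xh, yh)
    c = karatsuba(xl, yl)
    b = karatsuba(xh + xl, yh + yl) - a - c  # cross term, via one product
    return a * p * p + b * p + c
```

The saving of one multiplication per split compounds at every level of recursion, which is where the speed-up comes from.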
A few years later, the German mathematicians Arnold Schönhage and Volker Strassen devised the Schönhage–Strassen algorithm, and conjectured, but never proved, that further improvements were possible.
Multiplying two numbers with a billion digits each using long multiplication would take a computer months to finish. Using the Schönhage–Strassen algorithm, the same computation takes under 30 seconds.
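A rough back-of-the-envelope comparison of the two operation counts shows why the gap is so dramatic. This sketch omits all constant factors and real-world overheads, so it is an order-of-magnitude illustration only:

```python
import math

n = 10**9  # a billion digits

# Schoolbook long multiplication: about n^2 digit operations.
schoolbook = n ** 2

# Schönhage–Strassen: about n * log(n) * log(log(n)) operations
# (constants omitted; this is only a rough growth-rate comparison).
lg = math.log2(n)
strassen = n * lg * math.log2(lg)

print(f"schoolbook ops ~ {schoolbook:.1e}")
print(f"Schönhage–Strassen ops ~ {strassen:.1e}")
print(f"ratio ~ {schoolbook / strassen:.1e}")
```

Even with constants ignored, the ratio comes out in the millions, which is consistent with the difference between months and seconds.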
Is the new algorithm useful, though?
It is, but only when multiplying very big numbers together. “How big?” you might ask.
“We have no idea,” the researchers explain in an FAQ, although one example given in the paper equates to 10^214857091104455251940635045059417341952, which is an incredibly big number.
The findings were published in the HAL open access archive.