Why is Multiplication Harder than Addition?

I’m in the process of writing a couple of blog posts about slide rules, and an interesting question occurred to me, which spawned this post. Why is addition so much easier than multiplication? The power of slide rules comes from the fact that multiplying two numbers together is equivalent to adding their logarithms. Slide rules made life easier for scientists, engineers, and financiers for centuries by reducing multiplication problems down to simple addition.
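That equivalence is easy to check numerically. Here is a quick sketch (my own illustration, not part of the slide-rule posts) showing that adding two base-10 logarithms and exponentiating recovers the product:

```python
import math

# The slide-rule trick in one line: log(a) + log(b) = log(a * b),
# so a multiplication becomes an addition of two logarithms.
a, b = 31, 17
product = 10 ** (math.log10(a) + math.log10(b))
print(round(product))  # → 527, i.e. 31 * 17
```

A physical slide rule does the same addition mechanically, by sliding two logarithmic scales against each other.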

I decided to figure out how many operations adding and multiplying take. The model I’m using is based on Feynman’s model of a simple computer from his book Feynman Lectures on Computation. Imagine a person who doesn’t know how to add or multiply numbers, but who is very organized and very good at following directions. They have a sheet of paper in front of them with the problem they are trying to solve (either adding or multiplying numbers), and the only tools they have are an addition table, which lets them look up the answer to x + y for all numbers between 0 and 9, and a multiplication table for the same range. Every time they access the addition table or the multiplication table, it counts as a single operation. The question I want to answer is: in general, how many operations does it take to add or multiply a series of numbers together? I’m sure one of my mathematician friends could rattle the answer off the top of their head, but I wanted to see if I could figure it out on my own.
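The model is concrete enough to simulate. Below is a minimal sketch of the "human computer" (my own code, not from Feynman's book): the only arithmetic primitive is a lookup in a 10×10 addition table, and every lookup increments an operation counter.

```python
# A 10x10 addition table: the computer's only arithmetic "knowledge".
ADD_TABLE = {(x, y): x + y for x in range(10) for y in range(10)}

def table_add(a, b, counter):
    """Add two non-negative integers digit by digit using only ADD_TABLE,
    incrementing counter[0] once per table lookup."""
    da = [int(d) for d in str(a)][::-1]   # digits, least significant first
    db = [int(d) for d in str(b)][::-1]
    width = max(len(da), len(db))
    da += [0] * (width - len(da))
    db += [0] * (width - len(db))
    result, carry = [], 0
    for x, y in zip(da, db):
        s = ADD_TABLE[(x, y)]; counter[0] += 1       # one lookup per column
        if carry:                                    # a carry costs one more lookup
            s = ADD_TABLE[(s % 10, carry)] + 10 * (s // 10); counter[0] += 1
        carry, digit = divmod(s, 10)
        result.append(digit)
    if carry:
        result.append(carry)
    return int("".join(map(str, result[::-1])))

ops = [0]
print(table_add(31, 17, ops), ops[0])   # → 48 2 (two lookups, no carries)
```

The counter makes the bookkeeping in the rest of the post easy to check directly.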


Let’s start simple, with two 2 digit numbers (to hopefully avoid confusion I’ll use numerals to denote the number of digits and words to denote the number of numbers).

\begin{array}{@{}cr} & 31 \\ + & 17 \\ \cline{2-2} & \\ \end{array}

The computer (note that the word computer used to refer to a person who calculates numbers) pulls out his chart to look up 1 + 7 = 8 and writes 8 down in the first column.
\begin{array}{@{}cr}  & 31 \\  + & 17 \\  \cline{2-2}  & \ 8  \end{array}
Next he looks up 3 + 1 = 4 and writes that down.
\begin{array}{@{}cr}  & 31 \\  + & 17 \\  \cline{2-2}  & 48  \end{array}

Since the computer used the look-up tables twice, it took two operations to add two 2 digit numbers. Let’s extrapolate to adding more numbers. For simplicity, I’ll assume that all of the numbers being added have the same number of digits. Remember that your addition table only lets you add two digits together at a time, so to add three 2 digit numbers you first add the first two numbers together, then add the third number to that result, which takes at least 2 \times 2 = 4 operations (more if there are carries).
\begin{array}{@{}cr}  & 31 \\  + & 17 \\  + & 12 \\  \cline{2-2}  &  \end{array}
The first two numbers give you 48 and took two operations. Adding 12 to 48 takes three more: 8 + 2 = 10 (write down the 0 and carry the 1), then 4 + 1 = 5, then 5 + 1 = 6 for the carry. That’s five operations in total, with the extra operation coming from the carry.

We can extrapolate to see that adding M 2 digit numbers together takes at least 2(M-1) operations. You can also easily see that if you expand this even further to consider numbers with N digits, it will take N(M-1) operations to add those numbers together, which for large M is roughly N \times M. This is a lower bound on the number of operations, because frequently you will add two digits and get something 10 or larger, which requires you to add a carry digit to the next column over. Adding two random digits produces a carry a bit under half the time, and each carry costs one extra look-up, so on average adding M numbers with N digits takes roughly \frac{3}{2} N (M-1) operations.
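The scaling claim is easy to sanity-check with a simulation (my own back-of-the-envelope code, using the same lookup-counting convention as the post, including the extra lookup each carry costs):

```python
import random

def count_add_ops(numbers):
    """Count table lookups needed to add the numbers one at a time,
    column by column, including extra lookups caused by carries."""
    ops = 0
    running = numbers[0]
    for n in numbers[1:]:
        a, b, carry = running, n, 0
        while a or b or carry:
            s = (a % 10) + (b % 10)
            ops += 1                 # one lookup per column
            if carry:
                s += carry
                ops += 1             # one more lookup to fold in the carry
            carry = s // 10
            a //= 10
            b //= 10
        running += n
    return ops

random.seed(0)
nums = [random.randrange(10**5, 10**6) for _ in range(10)]  # ten 6-digit numbers
print(count_add_ops(nums), 6 * (10 - 1))  # measured lookups vs carry-free lower bound
```

Running this for various M and N shows the count staying within roughly 50% of the carry-free lower bound N(M-1), consistent with carries occurring a bit under half the time.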


Does multiplication take more operations to complete? Let’s take a look.
\begin{array}{@{}cr}  & 13 \\  \times & 21 \\  \cline{2-2}  &  \end{array}

Our computer starts off by looking up 1\times 3 = 3 and then 1\times 1 = 1. The next step is looking up 2\times 3 = 6 and 2\times 1 = 2, so we’ve completed four operations and have the following result:
\begin{array}{@{}cr}  & 13 \\  \times & 21 \\  \cline{2-2}  & 13 \\  & 260 \\  \end{array}

Now we need to add the two partial products together. The number of operations is either one or three, depending on whether you count adding a zero as an operation. I’m going to count it as an operation, since we assume our computer doesn’t understand anything about numbers and needs to look up even this simple step. This means that multiplying two 2 digit numbers together takes seven operations, which is clearly larger than the two operations it took to add two 2 digit numbers together. To come up with an equation for numbers with more digits, note that adding another digit means you now have to multiply that digit by every digit of the other number. Multiplying two 2 digit numbers requires 2^2 = 4 multiplication look-ups, so multiplying two N digit numbers will require N^2. Once you are done multiplying the digits together, you have N partial products of up to 2N digits each that you need to add together, which takes roughly another 2N(N-1) operations, so multiplying two N digit numbers takes roughly N^2 + 2N(N-1) \approx 3N^2 operations. Note that for N = 2 the equation says it should take 8 operations, but I only counted 7 in my example. This is because the formula assumes the product of two 2 digit numbers has four digits, which only happens when the factors are large enough (roughly, when both are at least 32).
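The same bookkeeping can be automated. This sketch (my own code, following the post's convention of counting x + 0 lookups) tallies the multiplication-table lookups for the partial products and the addition-table lookups for summing them:

```python
def add_ops(a, b):
    """Lookups to add a and b column by column, counting x + 0 lookups too."""
    ops, carry = 0, 0
    while a or b or carry:
        s = (a % 10) + (b % 10)
        ops += 1
        if carry:
            s += carry
            ops += 1
        carry = s // 10
        a //= 10
        b //= 10
    return ops

def mult_ops(a, b):
    """Total table lookups for long multiplication of a and b."""
    db = [int(d) for d in str(b)][::-1]       # digits of b, least significant first
    ops = len(str(a)) * len(db)               # one multiplication lookup per digit pair
    partials = [a * d * 10**i for i, d in enumerate(db)]
    running = partials[0]
    for p in partials[1:]:                    # sum the partial products
        ops += add_ops(running, p)
        running += p
    return ops

print(mult_ops(13, 21))   # → 7, matching the count in the text
```

For 13 × 21 the example product 273 has only three digits, which is why the count comes out one below the four-digit formula.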

Things start to get a little tricky when I try to extend this to multiplying more than two numbers, so I’m going to leave that for another day. The end message is that the number of operations needed to multiply numbers together scales as N^2, while addition scales as N. As the number of digits increases, multiplication rapidly starts taking more individual operations to get a result. This means that if you can somehow simplify a multiplication problem so it only requires additions, you might be able to save some time and effort. Stay tuned for my upcoming posts on slide rules.

This entry was posted in Just for Fun.

2 Responses to Why is Multiplication Harder than Addition?

  1. Gabriel Hanna says:

    When I was a kid I invented “addiplication”. It’s easier to do than describe. Let the addiplication symbol be @. So 13 @ 21 would be


    But I couldn’t figure out what this told me about 13 and 21.

    What happens to the scaling as you change your number base? What if you don’t use a positional representation, but some other kind–would multiplication still require more steps?

    • I should point out that this was a back-of-the-envelope calculation and not a proof of any sort. I think that changing the base would only affect the number of digits, so it wouldn’t change the relative “complexity” of multiplication compared to addition. On a slide rule, though, multiplication is comparable to addition, so there are representations that do change the relative “complexity” of addition and multiplication.
