# Log Rhythms

I’m nostalgic for an era that never included me: the days of the log table.

Eli Maor’s book *e: The Story of a Number* turned my attention to the history of logs. He tells the story of the Scottish wizard Napier with easy but brainy digressions into the nature and imperfections of the prototypical logarithm idea. Here is a summary, cribbed from Smith’s *History of Mathematics*, of early suggestions for a logarithm function (expressed in terms of the modern definition).

$\text{Napier:} \quad \log y = r(\ln r - \ln y), \quad r = 10^7$
$\text{Briggs:} \quad \log y = 10^{10}(10 - \log_{10} y)$
$\text{Napier (later):} \quad \log y = 10^{9}\log_{10} y$

As a sample of human flavor, relating to Napier’s relationship with Henry Briggs:  Upon their first meeting, these two men regarded each other in silent admiration for fifteen minutes before speaking.

I did not learn from Maor (but rather from Wikipedia) that Napier was also thought to carry a black spider in a small box, and that the black rooster was his familiar spirit.

But there are a few things I’ve come across that I find equally strange.  One is a quote from Hume:

“[Napier is] the person to whom the title of ‘great man’ is more justly due than to any other whom his country has ever produced.”

This is strong praise from a great historian living over a century after Napier died.

There is the well-known comment of Laplace, describing the logarithm as:

“[a]n admirable artifice which, by reducing to a few days the labour of many months, doubles the life of the astronomer, and spares him the errors and disgust inseparable from long calculations.”

Both of these remarks make clear that the advent of logarithms was something of considerable significance in the history of science, if not of civilization (the logarithm idea rapidly spread around the globe).

I have been guilty of presenting logs to my students in their now watery vestigial form as just one more purposeless graph to learn (or not) and manipulate (or not).  But they have a character of their own and a history that makes them into something more than just an interesting curve with curious properties.

The logarithm is (or was) useful. Napier was responding to a particular problem: multiplication is hard.

How hard is it?

If you multiply two six digit numbers, such as 329874 and 718765, you can see that the traditional method requires you to do 6×6 = 36 single digit multiplications, and then roughly another 36 single digit additions.

So in short, multiplying $n$ digit numbers requires about $n^2$ operations.  Anyone who has ever graphed a parabola knows that this is going to present problems as $n$ becomes large (as it very well may in problems based in the sciences).
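The $n^2$ count is easy to check by instrumenting the schoolbook method itself. A minimal sketch (the function name and the operation-counting are my own, for illustration):

```python
def schoolbook_multiply(a: int, b: int) -> tuple[int, int]:
    """Multiply a and b digit by digit, as taught in school.

    Returns (product, number of single-digit multiplications performed).
    """
    da = [int(c) for c in str(a)][::-1]  # digits, least significant first
    db = [int(c) for c in str(b)][::-1]
    product = 0
    mults = 0
    for i, x in enumerate(da):
        for j, y in enumerate(db):
            product += x * y * 10 ** (i + j)  # shift by place value
            mults += 1
    return product, mults

p, m = schoolbook_multiply(329874, 718765)
print(p == 329874 * 718765)  # True
print(m)                     # 36, i.e. 6 x 6
```

Every pair of digits meets exactly once, so two $n$-digit numbers cost $n \cdot n$ single-digit multiplications, as claimed.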

So what did Napier propose to do about it?  Roughly, he hoped that each number $n$ could somehow be encoded into a number $f(n)$ in such a way that $f(n\cdot m) = f(n)+f(m)$.  If this were true, and $f(n)$ is not much bigger than $n$, then you can compute $n\cdot m$ by decoding $f(n)+f(m)$.  If the encoding/decoding can be done quickly, then the benefit is that the arduous process of multiplication has been transformed into the relatively easy process of addition.
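The scheme is short enough to write down directly. Here is a sketch with `math.log` and `math.exp` standing in for the encoding and decoding steps that Napier's tables performed by lookup (the function name is mine):

```python
import math

def multiply_via_logs(n: float, m: float) -> float:
    """Multiply by encoding with log, adding, and decoding with exp.

    Relies on log(n * m) == log(n) + log(m); the only "hard" step
    left is addition. Napier's tables played the role of log and exp.
    """
    return math.exp(math.log(n) + math.log(m))

print(multiply_via_logs(329874, 718765))  # ~ 329874 * 718765, up to float rounding
```

With a printed table the two `math` calls become lookups, and the user does only an addition by hand.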

Napier is said to have been inspired by the trigonometric identity

$\sin(A)\sin(B) = \frac{1}{2}\left(\cos(A-B)-\cos(A+B)\right)$

which has a suggestive similarity.
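That identity was already being used for multiplication before logarithms, under the name prosthaphaeresis: to multiply two numbers in $(0, 1]$, look up angles whose sines they are, then use the identity to trade the product for a difference of cosines. A sketch, with `math.asin` and `math.cos` standing in for the trigonometric tables (the function name is my own):

```python
import math

def prosthaphaeresis(x: float, y: float) -> float:
    """Multiply x and y (both in (0, 1]) using only trig lookups,
    an addition, a subtraction, and a halving.
    """
    A = math.asin(x)  # table lookup: angle with sin(A) = x
    B = math.asin(y)  # table lookup: angle with sin(B) = y
    # sin(A)sin(B) = (cos(A - B) - cos(A + B)) / 2
    return 0.5 * (math.cos(A - B) - math.cos(A + B))

print(math.isclose(prosthaphaeresis(0.329874, 0.718765),
                   0.329874 * 0.718765))  # True
```

The logarithm does the same trade (product for sum) without the detour through angles, and without the restriction to numbers expressible as sines.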

Something that strikes one when reading Napier is the hard-as-nails functionality of what he was constructing.  Messy (and large) constants abound.  There is no easy theoretical elegance in his writing, but rather the feeling that something is being made to work.

The reduction in computational complexity accomplished by logarithms is of course not free.  Much of it is hidden in the construction of enormous lookup tables, which Napier had to compute by hand (and with the help of his other aid to computation, his eponymous rods or, corruptly, bones).

But the computational price must only be paid once, and literally for all, by the willing sacrifice of those such as Napier, Briggs, and many others who constructed the tables that were in use by scientists and engineers for centuries.

The $n^2$ complexity of multiplication is reduced to the linear complexity of addition and the (ironically) logarithmic complexity of looking up an element in a sorted list.
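That last step is worth seeing concretely. A toy log table, with the nearest-entry lookup done by binary search via the standard `bisect` module (the table's resolution and the function name are my own choices for illustration):

```python
import math
from bisect import bisect_left

# A toy log table: log10 of 1.000, 1.001, ..., 9.999 -- 9000 sorted entries.
mantissas = [round(1 + k / 1000, 3) for k in range(9000)]
table = [math.log10(x) for x in mantissas]

def lookup_log10(x: float) -> float:
    """Approximate log10(x) for 1 <= x < 10 by nearest-entry table lookup.

    bisect_left is a binary search: O(log n) comparisons against
    the sorted list of mantissas.
    """
    i = min(bisect_left(mantissas, x), len(mantissas) - 1)
    return table[i]

print(abs(lookup_log10(3.29874) - math.log10(3.29874)) < 1e-3)  # True
```

A human reader does the same binary search by eye, narrowing down the page, column, and row; either way the cost of the lookup grows only logarithmically with the size of the table.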

