The Lumber Room

"Consign them to dust and damp by way of preserving them"

Archive for April 2010

Some incredibly beautiful things


First, a totally completely absolutely incredible fact about convergents of continued fractions: the Khinchin-Lévy theorem, seen here: For almost all real numbers \alpha, the denominators q_n of the continued-fraction convergents of \alpha satisfy

\displaystyle \lim_{n \to \infty} q_n^{1/n} = e^{\pi^2/(12 \log 2)}

Why should such a thing be true?! Why should the limit even exist, why should it be the same for almost all real numbers, and where do e, \pi, \log 2 and 12 come from?!!
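(If you'd like to watch it happen numerically, here is a small Python sketch, emphatically not a proof: it expands a random number into a continued fraction, tracks the denominators with the usual recurrence q_n = a_n q_{n-1} + q_{n-2}, and prints q_n^{1/n}. The depth of 60 partial quotients and the 400 bits of randomness are arbitrary choices of mine, and individual samples fluctuate quite a bit around the constant ≈ 3.2758.)

```python
# Numerical illustration of the Khinchin-Levy constant: for a "random" real x,
# the denominators q_n of its continued-fraction convergents should satisfy
# q_n^(1/n) -> e^(pi^2 / (12 log 2)) = 3.27582...
import math
import random
from fractions import Fraction

LEVY = math.exp(math.pi ** 2 / (12 * math.log(2)))

def qn_root(x, n):
    """Return q_n^(1/n), where q_n is the denominator of the n-th convergent of x."""
    q_prev, q_cur = 1, 0                 # play the roles of q_{-2}, q_{-1}
    for _ in range(n + 1):               # consume partial quotients a_0, ..., a_n
        a = int(x)                       # floor, since x > 0
        q_prev, q_cur = q_cur, a * q_cur + q_prev
        if x == a:                       # a rational x ran out of partial quotients
            break
        x = 1 / (x - a)                  # exact Fraction arithmetic throughout
    return q_cur ** (1.0 / n)

random.seed(1)
print("Levy's constant:", LEVY)
for _ in range(5):
    # an exact random rational with 400 bits behaves like a "random real"
    # for the first few dozen partial quotients
    x = Fraction(random.getrandbits(400), 2 ** 400)
    print(qn_root(x, 60))
```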

Intrigued by these questions, and embarrassed by my near-complete ignorance of continued fractions (each time I had encountered them before, I had been intimidated by the notation and skipped pages), I started reading a couple of books on continued fractions (including Khinchin’s)… other things intervened and I had to give up for now, but I hope to return to their study soon. But from what I’ve already seen, continued fractions are awesome, and it is a real shame that they seem to be out of fashion these days and are not more widely taught and known. Neither of these books actually proves the above theorem, but an editor’s footnote in Khinchin’s book gives the reference: Khinchin proved in 1935 that the limit exists and is the same for almost all real numbers, and Lévy found the constant soon after (published 1937).

I suspect (based on nothing at all) that it bears some relation to, among other things, (some generalization of) the following well-known fact, itself quite surprising (probably the most surprising fact I know a proof of):

\displaystyle \sum_{n=1}^{\infty} \frac{1}{n^2} = \frac{\pi^2}{6} = \prod_{p \text{ prime}}{\frac1{\left(1-\frac1{p^2}\right)}}

This fact, known as the Basel problem, was proved by (who else?) Euler, and fourteen proofs of it are collected here. But let this not distract you from reading about continued fractions!
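(For the incredulous, here is a quick numerical check of both the sum and the product; just a sketch, with arbitrary cutoffs.)

```python
# Numerical check of the Basel identity: the partial sum of 1/n^2 and the
# partial Euler product over primes p of 1/(1 - 1/p^2) both approach pi^2/6.
# The cutoffs N and P are arbitrary; convergence of the sum is only ~1/N.
import math

N, P = 100_000, 10_000

partial_sum = sum(1.0 / n ** 2 for n in range(1, N + 1))

def primes_up_to(limit):
    """Sieve of Eratosthenes."""
    sieve = [True] * (limit + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            for q in range(p * p, limit + 1, p):
                sieve[q] = False
    return [p for p, is_p in enumerate(sieve) if is_p]

partial_product = 1.0
for p in primes_up_to(P):
    partial_product /= 1.0 - 1.0 / p ** 2

print("pi^2/6         :", math.pi ** 2 / 6)
print("partial sum    :", partial_sum)
print("partial product:", partial_product)
```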


Second, pointed out at John Baez’s This Week’s Finds in Mathematical Physics (Week 230), and from this paper by Sondow, are these (quoting Prof. Baez directly, just typeset in LaTeX):

eerily similar formulas for some of our favorite math constants. First, one for e:

\displaystyle e = \left(\frac{2}{1}\right)^{1/1} \left(\frac{2^2}{1\times3}\right)^{1/2} \left(\frac{2^3\times4}{1\times3^3}\right)^{1/3}  \left(\frac{2^4\times4^4}{1\times3^6\times5}\right)^{1/4} \dots

Then, one for π/2:

\displaystyle \frac{\pi}{2} = \left(\frac{2}{1}\right)^{1/2} \left(\frac{2^2}{1\times3}\right)^{1/4} \left(\frac{2^3\times4}{1\times3^3}\right)^{1/8} \left(\frac{2^4\times4^4}{1\times3^6\times5}\right)^{1/16} \dots

Then one for e^γ, where γ is Euler’s constant:

\displaystyle e^\gamma = \left(\frac{2}{1}\right)^{1/2} \left(\frac{2^2}{1\times3}\right)^{1/3} \left(\frac{2^3\times4}{1\times3^3}\right)^{1/4} \left(\frac{2^4\times4^4}{1\times3^6\times5}\right)^{1/5} \dots

He also points out Wallis’ product for π/2 and Pippenger’s for e:

\displaystyle \frac{\pi}{2} = \left(\frac{2}{1}\right)^{1/1} \left(\frac{2\times 4}{3\times3}\right)^{1/1} \left(\frac{4\times6\times6\times8}{5\times5\times7\times7}\right)^{1/1} \dots

\displaystyle  e = \left(\frac{2}{1}\right)^{1/2} \left(\frac{2\times 4}{3\times3}\right)^{1/4} \left(\frac{4\times6\times6\times8}{5\times5\times7\times7}\right)^{1/8} \dots

What does it all mean? I haven’t a clue! Another mystery thrown down to us by the math gods, like a bone from on high… we can merely choose to chew on it or not, as we wish.

Indeed!
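If you want to see the numbers for yourself, here is a rough check of the three “eerily similar” products, just a sketch. Reading off the pattern in the displayed factors, the n-th factor in each product is a_n = \prod_{k=0}^{n}(k+1)^{(-1)^{k+1}\binom{n}{k}}, raised to the power 1/n, 1/2^n and 1/(n+1) respectively; the sketch works with \log a_n to keep the intermediate numbers manageable, and the cutoff is an arbitrary choice. The products for e and e^γ converge slowly, so the agreement there is only rough.

```python
# Rough numerical check of the three products quoted above.  The n-th factor
# is a_n = prod_{k=0..n} (k+1)^((-1)^(k+1) * C(n,k)); raising it to 1/n, 1/2^n
# and 1/(n+1) gives (approximations to) e, pi/2 and e^gamma respectively.
import math

def log_a(n):
    """log of the n-th factor, as an alternating binomial sum."""
    return sum((-1) ** (k + 1) * math.comb(n, k) * math.log(k + 1)
               for k in range(n + 1))

N = 20          # pushing N much higher runs into floating-point cancellation
log_e      = sum(log_a(n) / n       for n in range(1, N + 1))
log_pi_2   = sum(log_a(n) / 2 ** n  for n in range(1, N + 1))
log_egamma = sum(log_a(n) / (n + 1) for n in range(1, N + 1))

EULER_GAMMA = 0.5772156649015329    # math.gamma is the Gamma function, not this

print("e       :", math.e,                "vs", math.exp(log_e))        # rough
print("pi/2    :", math.pi / 2,           "vs", math.exp(log_pi_2))     # good
print("e^gamma :", math.exp(EULER_GAMMA), "vs", math.exp(log_egamma))   # rough
```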


Third, my favourite way to prove the Möbius inversion formula. Well, it’s actually the only way I know, besides induction, but I think it’s beautiful. (Ignore if you already know what Dirichlet convolutions are.) It’s from Herbert Wilf’s generatingfunctionology, one of my favourite books.

Let us consider a kind of generating function known as a Dirichlet generating function. (“A generating function is a clothesline on which we hang up a sequence of numbers for display.”) For a sequence a_n, the Dirichlet generating function is defined as the (formal) series (in s)

\displaystyle G_a(s) = \sum_{n=1}^{\infty} \frac{a_n}{n^s} = a_1 + \frac{a_2}{2^s} + \frac{a_3}{3^s} + \frac{a_4}{4^s} + \dots.

(A formal series means that we can ignore issues of convergence, etc.) Now if we have the generating functions G_a(s) and G_b(s) for two sequences a_n and b_n, what is G_a(s)G_b(s)? Well, it’s easy to see that the coefficient of 1/n^s in G_a(s)G_b(s), which comes exactly from the products of terms \frac{a_i}{i^s} and \frac{b_j}{j^s} with ij=n, is \displaystyle \sum_{ij=n} a_ib_j, or equivalently \displaystyle \sum_{d \vert n} a_db_{n/d}. So the product is the generating function for a special kind of convolution, the sequence c_n defined as

\displaystyle c_n = \sum_{d \vert n} a_db_{n/d}
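(Here is a quick sanity check of this fact in Python, just a sketch with arbitrarily chosen example sequences: truncate two Dirichlet series at N terms, multiply them out, and compare the coefficient of 1/n^s with the divisor-sum formula. Nothing is lost by truncating, since every divisor of an n ≤ N is itself ≤ N.)

```python
# Sanity check that multiplying two (truncated) Dirichlet series convolves
# their coefficients in the divisor sense.  The example sequences a_n = n,
# b_n = n^2 and the truncation point N are arbitrary choices.
N = 30
a = {n: n     for n in range(1, N + 1)}
b = {n: n * n for n in range(1, N + 1)}

# coefficient of 1/n^s in G_a(s) * G_b(s): gather every product with i*j = n
product = {}
for i in a:
    for j in b:
        if i * j <= N:
            product[i * j] = product.get(i * j, 0) + a[i] * b[j]

# the claimed coefficient: c_n = sum over divisors d of n of a_d * b_{n/d}
convolution = {n: sum(a[d] * b[n // d] for d in range(1, n + 1) if n % d == 0)
               for n in range(1, N + 1)}

assert product == convolution
print("coefficients of 1/n^s agree for all n <=", N)
```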

Let us call this fact “(*)” and put it aside for now; we will use it shortly. Instead, let us look at the generating function for a very special sequence, the sequence of all 1s. It is, by definition, \displaystyle \sum_{n\ge 1} \frac{1}{n^s}. While we could call this G_1(s), it goes by the more familiar name \zeta(s) — it is (when viewed as a function instead of a formal series) the Riemann zeta function. But we also know (this is the Euler product formula, and follows simply from the unique factorisation of integers):

\displaystyle \zeta(s) = \sum_{n\ge1}\frac{1}{n^s} = \prod_{p\text{ prime}}\left(1+\frac1{p^s} + \frac1{p^{2s}} + \frac1{p^{3s}} + \dots\right) = \prod_{p\text{ prime}}\frac1{\left(1-\frac1{p^s}\right)}.

This means that

\displaystyle \frac1{\zeta(s)} = \prod_{p\text{ prime}}{\left(1-\frac1{p^s}\right)} = G_\mu(s).

Why did I write G_\mu(s) on the right? Is this a generating function of anything? Why, yes! By expanding the product, you can see that it is the generating function for the sequence \mu(n), defined as

\displaystyle \mu(n) = \begin{cases} 0 \quad \text{if }n\text{ is not squarefree,} \\ (-1)^k \quad \text{if }n = p_1p_2\dots p_k\text{ is the product of }k\text{ distinct primes}\end{cases}.

(This function “mu” is known as the “Möbius function”.) With this background, the Möbius inversion formula falls out entirely naturally: if G_f(s) is the generating function of a sequence f(n), and the function F(n) is defined as

\displaystyle F(n) = \sum_{d\vert n} f(d),

then its generating function G_F(s) is, by (*), simply

\displaystyle G_F(s) = G_f(s)\zeta(s),

which means that

\displaystyle G_f(s) = G_F(s)\frac{1}{\zeta(s)} = G_F(s)G_\mu(s),

or in other words, again by (*),

\displaystyle f(n) = \sum_{d\vert n} F(d)\mu(n/d),

which is precisely the Möbius inversion formula. How cool is that?
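(To see it in action, here is a small illustrative check in Python, with arbitrary choices of mine: take f to be Euler’s totient, so that F(n) = \sum_{d\vert n} f(d) works out to n, and recover f from F using \mu.)

```python
# Small illustration of the inversion formula: take f = Euler's totient
# (an arbitrary choice; any arithmetic function would do), form
# F(n) = sum_{d|n} f(d), and recover f from F using mu.  N is arbitrary.
import math

N = 200

def mobius(n):
    """mu(n) by trial division: 0 if n is not squarefree, else (-1)^(#prime factors)."""
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:        # a squared prime factor: not squarefree
                return 0
            result = -result
        p += 1
    return -result if n > 1 else result

# f(n) = Euler's totient, computed naively; F(n) = sum of f over divisors of n
f = {n: sum(1 for k in range(1, n + 1) if math.gcd(k, n) == 1)
     for n in range(1, N + 1)}
F = {n: sum(f[d] for d in range(1, n + 1) if n % d == 0)
     for n in range(1, N + 1)}         # (this comes out as F(n) = n)

# Mobius inversion: f(n) = sum_{d|n} F(d) * mu(n/d)
for n in range(1, N + 1):
    recovered = sum(F[d] * mobius(n // d)
                    for d in range(1, n + 1) if n % d == 0)
    assert recovered == f[n]
print("Mobius inversion checked for n = 1 ..", N)
```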


Written by S

Fri, 2010-04-30 at 01:12:02

Posted in mathematics

E&OE


Ogden Nash poem, Portrait of the Artist as a Prematurely Old Man, with characteristically long lines that leave you breathless.

It is common knowledge to every schoolboy and even every Bachelor of Arts,
That all sin is divided into two parts.

One kind of sin is called a sin of commission, and that is very important,
And it is what you are doing when you are doing something you ortant,
And the other kind of sin is just the opposite and is called a sin of omission and is equally bad in the eyes of all right-thinking people, from Billy Sunday to Buddha,
And it consists of not having done something you shuddha.

I might as well give you my opinion of these two kinds of sin as long as, in a way, against each other we are pitting them,
And that is, don’t bother your head about sins of commission because however sinful, they must at least be fun or else you wouldn’t be committing them.

It is the sin of omission, the second kind of sin,
That lays eggs under your skin.
The way you get really painfully bitten
Is by the insurance you haven’t taken out and the checks you haven’t added up the stubs of and the appointments you haven’t kept and the bills you haven’t paid and the letters you haven’t written.

Also, about sins of omission there is one particularly painful lack of beauty,
Namely, it isn’t as though it had been a riotous red-letter day or night every time you neglected to do your duty;
You didn’t get a wicked forbidden thrill
Every time you let a policy lapse or forgot to pay a bill;
You didn’t slap the lads in the tavern on the back and loudly cry Whee,
Let’s all fail to write just one more letter before we go home, and this round of unwritten letters is on me.

No, you never get any fun
Out of things you haven’t done,
But they are the things that I do not like to be amid,
Because the suitable things you didn’t do give you a lot more trouble than the unsuitable things you did.
The moral is that it is probably better not to sin at all, but if some kind of sin you must be pursuing,
Well, remember to do it by doing rather than by not doing.

[Youtube]

Written by S

Wed, 2010-04-28 at 20:19:05

Posted in procrastination, quotes

Snip


Some random cute or frivolous verses, dumped here so I can close those tabs. (Though inevitably I ended up opening more tabs…)


एकवस्तु द्विधा कर्तुम् 
          बहवः सन्ति धन्विनः ।
धन्वी स मार एवैको 
          द्वयोः ऐक्यः करोति यः ॥

eka-vastu dvidhā kartum
     bahavaḥ santi dhanvinaḥ —
dhanvī sa māra evaiko
     dvayoḥ aikyaḥ karoti yaḥ

Cute and clever! Here, māra is kāma, often depicted with an arrow (as Cupid is). Saw here, see here.
[Edit: Fixed एकवस्तुम् (eka-vastum) → एकवस्तु. ]

If you’re too lazy to click, here’s a rough translation that loses the one-two-many-one-two-one play on words:

To split a single thing in two
     There’s many an archer under the sun—
But Love’s the only bowman who
     Can start with two and make them one.

There’s no “under the sun” in the original; I just couldn’t think of a better rhyme. :P


कमले कमला शेते      हरः शेते हिमालये ।
क्षीराब्धौ च हरिः शेते   मन्ये मत्कुणशङ्कया ॥

kamale kamalā śete, haraḥ śete himālaye,
kṣīrābdhau ca hariḥ śete — manye matkuṇa-śaṅkayā !

kamalā = Lakṣmī (I think, though one source gave it as Brahmā), śete = sleeps, manye = I guess, śaṅkayā = out of suspicion/fear, of matkuṇa = bedbugs :D
Roughly: Lakṣmī sleeps on the lotus, Hara (Śiva) sleeps on the Himālaya, and Hari (Viṣṇu) sleeps on the ocean of milk; out of fear of bedbugs, I guess!
Originally saw it here [dead link], so see here or here.

Written by S

Fri, 2010-04-16 at 21:15:10

On “trivial” in mathematics


One aspect of mathematicians’ vocabulary that non-mathematicians often find non-intuitive is the word “trivial”. Mathematicians seem to call a great many things “trivial”, most of which are anything but. Here’s a joke, pointed out to me by Mohan:

Two mathematicians are discussing a theorem. The first mathematician says that the theorem is “trivial”. In response to the other’s request for an explanation, he then proceeds with two hours of exposition. At the end of the explanation, the second mathematician agrees that the theorem is trivial.

Like many jokes, this is not far from the truth. This tendency has led others to say, for example, that

In mathematics, there are only two kinds of proofs: Trivial ones, and undiscovered ones.

Or as Feynman liked to say, “mathematicians can prove only trivial theorems, because every theorem that’s proved is trivial”. (The mathematicians to whom Feynman said this do not seem to have liked the statement.)

A little examination, however, shows that all this is not far from reality, and indeed not far from the ideal of mathematics. Consider these excerpts from Gian-Carlo Rota‘s wonderful Indiscrete Thoughts:

“eventually every mathematical problem is proved trivial. The quest for ultimate triviality is characteristic of the mathematical enterprise.” (p.93)

“Every mathematical theorem is eventually proved trivial. The mathematician’s ideal of truth is triviality, and the community of mathematicians will not cease its beaver-like work on a newly discovered result until it has shown to everyone’s satisfaction that all difficulties in the early proofs were spurious, and only an analytic triviality is to be found at the end of the road.” (p. 118, in The Phenomenology of Mathematical Truth)

Are there definitive proofs?
It is an article of faith among mathematicians that after a new theorem is discovered, other simpler proofs of it will be given until a definitive one is found. A cursory inspection of the history of mathematics seems to confirm the mathematician’s faith. The first proof of a great many theorems is needlessly complicated. “Nobody blames a mathematician if the first proof of a new theorem is clumsy”, said Paul Erdős. It takes a long time, from a few decades to centuries, before the facts that are hidden in the first proof are understood, as mathematicians informally say. This gradual bringing out of the significance of a new discovery takes the appearance of a succession of proofs, each one simpler than the preceding. New and simpler versions of a theorem will stop appearing when the facts are finally understood. (p.146, in The Phenomenology of Mathematical Proof, here/here).

For more context, the section titled “Truth and Triviality” is especially worth reading, where he gives the example of proofs of the prime number theorem, starting with Hadamard and de la Vallée Poussin, through Wiener, Erdős and Selberg, and Levinson. Also p. 119, where he feels this is not unique to mathematics:

Any law of physics, when finally ensconced in a proper mathematical setting, turns into a mathematical triviality. The search for a universal law of matter, a search in which physics has been engaged throughout this century, is actually the search for a trivializing principle, for a universal “nothing but.” […] The ideal of all science, not only of mathematics, is to do away with any kind of synthetic a posteriori statement and to leave only analytic trivialities in its wake. Science may be defined as the transformation of synthetic facts of nature into analytic statements of reason.

So to correct the earlier quote, there are three kinds of proofs from the mathematician’s perspective: those that are trivial, those that have not been discovered, and those niggling ones that are not yet trivial.

Written by S

Sat, 2010-04-03 at 14:40:30

Posted in mathematics
