Wednesday, January 12, 2005

Music Fractality

Larry Solomon has a fascinating discussion of the fractal structure of music. As a classical example, he subjects Beethoven's Ecossaisen to analysis and shows its similarity to a Cantor fractal. Then there is another analysis of the composer's Piano Sonata No. 15, Op. 28, which shows a structure similar to a Sierpinski fractal.

Tuesday, January 11, 2005

Fractal structure transmits information

Self-similarity independent of scale: that's fractal behavior.

Besides clouds, trees, shorelines, earthquake magnitudes, mammalian circulatory systems ...

There are human creations that are also fractal, especially music (and not just fabricated "fractal" music). A Bach fugue is fractal. The typical form of a popular song (AABA) is fractal. A phrase (A or B) may be four measures (sixteen measures comprise a chorus). Within each measure (in 4/4) there are whole, half, quarter ... notes. Look at the musical patterns, and quite often there is repetition of note sequences (not only A against A, or B against B) at different rhythmic scales.

Then there's a chord. A major chord comprises harmonically related notes whose vibrations are compatible; they "fit" together.

Then consider an arrangement, say Nelson Riddle's charts for Sinatra in the Fifties. There is the repetition, and then background melodies that contrast with the primary melody, yet the fractal content is very much there.

Is Tolkien's Creation Fractal?

From The Silmarillion:

Then the voices of the Ainur … began to fashion the theme of Ilúvatar to a great music; and a sound arose of endless interchanging melodies woven in heights … and the music and the echo of the music went out into the Void, and it was not void. …

But as the theme progressed, it came into the heart of Melkor to interweave matters of his own imagining that were not in accord with the theme of Ilúvatar; for he sought therein to increase the power and glory of the part assigned to himself. …

Some of these thoughts he now wove into his music, and straightway discord arose about him, and … some began to attune their music to his rather than to the thought which they had at first. Then the discord of Melkor spread ever wider, and the melodies which had been heard before foundered in a sea of turbulent sound. …

Then Ilúvatar arose, and … a new theme began amid the storm, like and yet unlike to the former theme, and it gathered power and had new beauty. But the discord of Melkor rose in uproar and contended with it, and again there was a war of sound more violent than before…

Then again Ilúvatar arose, and … behold! A third theme grew amid the confusion, and it was unlike the others. … [I]t could not be quenched, and it took to itself power and profundity. And it seemed at last that there were two musics progressing at one time … and they were utterly at variance. The one was deep and wide and beautiful, but slow and blended with an immeasurable sorrow, from which its beauty chiefly came. The other had now achieved a unity of its own; but it was loud, and vain, and endlessly repeated; and it had little harmony, but rather a clamorous unison as of many trumpets braying upon a few notes. And it essayed to drown the other music by the violence of its voice, but it seemed that its most triumphant notes were taken by the other and woven into its own solemn pattern. …

… Ilúvatar spoke … ‘And thou, Melkor, shalt see that no theme may be played that hath not its uttermost source in me, nor can any alter the music in my despite.’

Tolkien, J.R.R., The Silmarillion, Christopher Tolkien, editor, Houghton Mifflin Company, Boston, 1977, pp. 15-17.


So where is the fractal? In the Music: theme, endless interchanging melodies, echo, beauty, power and profundity.

And the anti-fractal? Interwoven discord, foundered melodies, turbulent sound, variance, loud, vain, endlessly repeated, little harmony, clamorous unison.

But can't a fractal emerge from turbulence? "… its most triumphant notes were taken by the other and woven into its own solemn pattern."

Fractals Maximize Entropy

Derivation based on Pastor-Satorras, R., and Wagensberg, J., Physica A (Statistical and Theoretical Physics), Vol. 251, Nos. 3-4, March 14, 1998, pp. 291-302.

Shannon Entropy:

H(P) = −Σ_k p(k) ln p(k)

Probabilities, by definition, sum to unity:

Σ_k p(k) = 1

An element of order k has a magnitude ℓ(k). N(k) = ℓ(k) / ε indistinguishable atoms of size ε, arranged in a certain way, comprise the element. The information needed to specify the arrangement of the N(k) atoms in the element is equivalent to selecting N(k) objects with the same probability. From information theory [Shannon, 1948], this information is ln N(k). The generating information for specifying an element of order k is therefore I(k) = ln N(k) = ln [ℓ(k)/ε]. The average information over the entire iteration process P is:

⟨I⟩ = Σ_k p(k) I(k) = Σ_k p(k) ln [ℓ(k)/ε]

Maximizing the entropy subject to a fixed average information ⟨I⟩ and to normalization, with Lagrange multipliers β and β′, gives the functional

F = −Σ_k p(k) ln p(k) + β {⟨I⟩ − Σ_k p(k) ln [ℓ(k)/ε]} + β′ [1 − Σ_k p(k)]

∂F/∂p(k) = −ln p(k) − 1 − β ln [ℓ(k)/ε] − β′ = 0

p(k) = exp(−1 − β′) [ℓ(k)/ε]^(−β)

Since the occupation numbers n(k) are proportional to p(k),

n(k) = const. × [ℓ(k)/ε]^(−β)

“[T]he occupation numbers scale as a power of ℓ(k), which implies a self-similar behaviour” (Pastor-Satorras and Wagensberg, 1998).
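To see the result concretely, here is a minimal numerical sketch (my own, not from the paper): it maximizes the Shannon entropy subject to normalization and a fixed average generating information, for an assumed set of element sizes ℓ(k), and then checks that the resulting p(k) falls on a power law in ℓ(k)/ε.

import numpy as np
from scipy.optimize import minimize

# Assumed inputs (illustrative only): eight element sizes and a target <I>.
eps = 1.0
l = eps * 2.0 ** np.arange(1, 9)        # magnitudes l(k), k = 1..8
I_k = np.log(l / eps)                   # generating information of each element
I_avg = 2.0                             # imposed average information

def neg_entropy(p):
    # Negative Shannon entropy, sum p(k) ln p(k), to be minimized.
    return np.sum(p * np.log(p))

constraints = (
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},         # probabilities sum to 1
    {"type": "eq", "fun": lambda p: np.sum(p * I_k) - I_avg},  # fixed average information
)
p0 = np.full(l.size, 1.0 / l.size)
result = minimize(neg_entropy, p0, method="SLSQP",
                  bounds=[(1e-9, 1.0)] * l.size, constraints=constraints)
p = result.x

# If p(k) ~ [l(k)/eps]^(-beta), then ln p(k) is linear in ln[l(k)/eps].
slope, intercept = np.polyfit(np.log(l / eps), np.log(p), 1)
print("fitted exponent (-beta):", slope)

The fitted slope plays the role of −β in the derivation; changing the imposed ⟨I⟩ changes β, but the power-law form remains.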

Thus, fractals are evidence of entropy maximization (T-Rex's conclusion).

Information Entropy: A Derivation

Given a set of M symbols, x(i), they can be combined to make up a set of messages. The probability of occurrence of any symbol, x(i), is given by p[x(i)], with the total probability equal to unity:

Σ_{i=1,M} p[x(i)] = 1    (1)

A measure of the information gained from a message is the minimum number of yes-or-no questions required to identify the message. Thus if the message is a number less than 64, and each number has an equal probability of 1/64 of occurring, we would first ask if the number is less than or equal to 32. If the answer is "yes", we could ask if it is less than or equal to 16. If "no", we know it is greater than 16 (and, implicitly, less than or equal to 32), and we continue halving the remaining range, and so on. Thus the number of questions with two possible answers is six. Note that each question adds one unit of information, and a convenient additive measure in a binary system is the logarithm to base 2 ["log2"; 1 unit = log2 (2)]. The addition of 6 such units is obviously (^ implies exponent)

6 log2 (2) = log2 (2^6) = log2 (64) = 6.

Define a quantity called self information, S, which is the log of the reciprocal of the probability of occurrence.

S[x(i)] = log {1/p[x(i)]} = - log {p[x(i)]} (2)

Assigning base 2, for the example above S[x(i)] = 6.
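Equation (2) is easy to compute directly. A minimal Python sketch (the function name is mine, not Kanasewich's):

import math

def self_information_bits(p):
    # Self information S = -log2(p) for an outcome of probability p.
    return -math.log2(p)

print(self_information_bits(1 / 64))   # 6.0 -- the six yes/no questions above
print(self_information_bits(1.0))      # 0.0 -- a certain event carries no information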

Properties of S[x(i)] and implications for message interpretation:
a) Decreases if a symbol occurs with greater probability.
b) A rare event contains a more significant message.
c) If p[x(i)] = 1, then S[x(i)] = 0.
d) If a message consists of an infinite string of yeses, there is no information or significance to this.
e) Further, if all probabilities are bracketed, 0 < p[x(i)] < 1, then S[x(i)] > 0.

Over a long period of time, T, the number of occurrences of x(i) is p[x(i)]T; thus the total information obtained about the system is
–p[x(1)]T log2{p[x(1)]} – p[x(2)]T log2{p[x(2)]} – ….

The average information per unit time, also called the expectation value, E, of S(x), in which x = [x(1), x(2), x(3), … , x(M)], is then

E[S(x)] = Σ_{i=1,M} p[x(i)] S[x(i)]    (3)

And this is termed, by analogy with Gibbs thermodynamics, the entropy:

H = −Σ_{i=1,M} p[x(i)] log p[x(i)]    (4)

In thermodynamics, H (with a multiplicative constant) measures the degree of disorder in a physical process. Here it measures the uncertainty about the state of the system, or our "ignorance" of it. The base of the logarithm can be any convenient number, but is logically 2 if binary arithmetic is used. In the example, yes = 1 and no = 0.
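As a sketch of equation (4), here is a small Python helper (again, my own naming) that returns H in bits for a discrete distribution:

import math

def shannon_entropy_bits(probs):
    # H = -sum_i p_i log2 p_i; terms with p_i = 0 contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits (uniform)
print(shannon_entropy_bits([0.9, 0.05, 0.03, 0.02]))   # about 0.62 bits (less uncertain)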

The entropy of a discrete distribution is a maximum when the probabilities are all equal. Thus for a binary system that transmits only 0 and 1, the probabilities are p and 1 − p.

H = – p log2(p) – (1 – p) log2(1 – p) (5)

The entropy is a maximum when p = ½; H is then 1 bit.
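A quick numerical check of that claim (a sketch; the grid spacing is arbitrary):

import math

def binary_entropy(p):
    # Equation (5); H = 0 at the endpoints by the convention 0 log 0 = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

grid = [i / 1000 for i in range(1, 1000)]
p_best = max(grid, key=binary_entropy)
print(p_best, binary_entropy(p_best))   # 0.5 and 1.0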

If the variance of x is a constant, σ^2, then the entropy is a maximum if the probability has a normal or Gaussian distribution.

p(x) = (2πσ^2)^(−½) exp(−x^2 / 2σ^2)    (6)

The entropy for the Gaussian case is

H = (½) ln (2πeσ^2)    (7)
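A rough check of equation (7) (a sketch; σ = 2 is an arbitrary choice), comparing the closed form with a brute-force integration of −p(x) ln p(x):

import math

sigma = 2.0

def p(x):
    # Gaussian density, equation (6).
    return math.exp(-x * x / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

# Trapezoidal integration of -p ln p over +/- 10 sigma.
dx = 0.001
xs = [-10 * sigma + i * dx for i in range(int(20 * sigma / dx) + 1)]
vals = [-p(x) * math.log(p(x)) for x in xs]
h_numeric = sum((a + b) / 2 * dx for a, b in zip(vals, vals[1:]))

h_formula = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)
print(h_numeric, h_formula)             # both approximately 2.11 nats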


(Based on pp. 146-147 of Kanasewich, E. R., 1981, Time sequence analysis in geophysics, University of Alberta Press, Edmonton, Alberta, Canada, Third Edition, 480 p.)