Thursday 20 May 2010

StatPhys and ECCs #2: Statistical Physics Overview

For those who still remember, I have decided to talk about my work on statistical mechanics (statphys) applied to error-correcting codes (ECCs). Yes, that is the work that kept me away from writing this blog for the last three years... In the first part of this "series" I wrote about the basic concepts involved in linear error-correcting codes. The discussion was all about information theory. Today I am going to write a bit about the other side of the coin: statistical physics.

Probably many of you have already studied statistical physics and know the fundamental concepts. It is interesting, however, that statistical physics is not as popular with the broader non-specialist audience as other areas of physics. I cannot blame anyone, as I ended up studying statistical physics by chance and only realised how interesting it is afterwards.

Statistical physics was born with the theory of gases. It was a very natural step: if matter was made of atoms obeying the laws of Newtonian mechanics, then it should somehow be possible to derive the laws of thermodynamics from a microscopic description of the system based on these premises. Well, the truth is that many people did not believe in atoms at that time; Mach, for instance, was one of them.

It is well known that Boltzmann was the great name of statistical mechanics, although many other great physicists, like Maxwell and Einstein, also contributed greatly to the field. It is also known that Boltzmann, after dedicating his life to statistical physics, killed himself, worn down by the widespread disbelief in the notion of atoms, a disbelief that only faded after Einstein published his paper on Brownian motion.

Statistical physics is the area of physics (at least it started in physics...) that deals with the behaviour of a large number of interacting units and the laws that govern it. It started with gases, but the most famous model of statistical physics is the Ising model (the story of Ernst Ising himself is a curious one, worth looking up). The Ising model is a mathematical model for many units interacting among themselves. It is strikingly simple, yet surprisingly general and powerful. It is defined in terms of a Hamiltonian, the function giving the energy of a system. It was devised to explain magnetic materials, and so is a function of the spins of the electrons in those materials, symbolised by $\sigma_i$, where $i$ runs from 1 to $N$, the total number of spins. For spins, the important interaction is called the exchange interaction, and it favours spins being aligned with each other. As a spin can take two values, which we choose as +1 and -1, the interaction favours the case where the product of two spins is 1. As systems in nature tend to minimise their energy, we can then write the Hamiltonian as

\[H=-\sum_{\left\{i,j\right\}} \sigma_i \sigma_j,\]

where the symbol under the summation sign says that we must sum over all the pairs that are interacting.
It turns out that this simple model of interaction, properly generalised, can describe not only the interaction between two spins, but also between two or more persons, robots in a swarm, molecules in a gas, bits in a codeword and practically EVERY interaction between ANYTHING. And I really mean it. Of course, there is no free lunch, and the calculations become more difficult as you increase the sophistication of the system, but the fundamental idea is the same. For example, the usual way the Ising model is defined is with spins on a regular lattice, which in one dimension is a straight line with spins located at equally spaced points, in two dimensions is a square grid, in three a cubic one, and so on.
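
To make the Hamiltonian concrete, here is a minimal sketch in Python/NumPy (my own illustration, not anything from the original model) computing $H$ on a two-dimensional square grid where the interacting pairs are adjacent sites (more on that choice just below), wrapped periodically at the edges:

```python
import numpy as np

def ising_energy(spins):
    """H = -sum over neighbouring pairs of s_i * s_j on a 2D square
    lattice with periodic boundary conditions."""
    # np.roll shifts the whole grid one site down (axis 0) or right (axis 1),
    # so each term pairs every spin with one of its neighbours; the two
    # shifts together count each edge of the lattice exactly once.
    return -(np.sum(spins * np.roll(spins, 1, axis=0)) +
             np.sum(spins * np.roll(spins, 1, axis=1)))

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(16, 16))   # a random configuration
print(ising_energy(spins))                   # somewhere between -512 and 512
print(ising_energy(np.ones((16, 16))))       # all aligned: -2N = -512
```

The fully aligned configuration saturates every term of the sum, which is exactly the sense in which the ferromagnetic interaction "wants" all spins parallel.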

The simplest choice is also to consider that only first neighbours interact, which means that, for instance, in the cubic lattice spins interact only if there is an edge linking them. The one-dimensional model is exactly solvable, and so is the two-dimensional one, although it took many years and a huge mathematical effort by Lars Onsager (a Nobel laureate in Chemistry) to solve it. In three dimensions no exact solution is known, and finding ground states of the disordered versions discussed below is already NP-complete, therefore practically hopeless (until some alien comes to Earth and prints the proof that P=NP in the countryside with crop circles).
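
To make "exactly solvable" concrete: for the one-dimensional ring the partition function follows from the classic transfer-matrix trick, and for small $N$ a brute-force sum over all $2^N$ configurations confirms it. A sketch (my own illustration):

```python
import numpy as np
from itertools import product

def transfer_matrix_Z(N, beta, J=1.0):
    """Z for the 1D Ising ring: the trace of the N-th power of the 2x2
    transfer matrix T[s, s'] = exp(beta*J*s*s'), i.e. lam_+^N + lam_-^N."""
    lam_plus = np.exp(beta * J) + np.exp(-beta * J)   # 2 cosh(beta J)
    lam_minus = np.exp(beta * J) - np.exp(-beta * J)  # 2 sinh(beta J)
    return lam_plus**N + lam_minus**N

def brute_force_Z(N, beta, J=1.0):
    """Sum exp(-beta*H) over all 2^N spin configurations of the ring."""
    Z = 0.0
    for s in product([-1, 1], repeat=N):
        H = -J * sum(s[i] * s[(i + 1) % N] for i in range(N))
        Z += np.exp(-beta * H)
    return Z

print(transfer_matrix_Z(12, beta=0.5))   # the two numbers agree
print(brute_force_Z(12, beta=0.5))
```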

Then things got complicated when people started to wonder what happens when the interaction between the spins is different for each edge of the lattice. The most interesting case turned out to be when it is randomly distributed over the lattice. The Ising model, conveniently generalised, is written as

\[H=-\sum_{\left\{i,j\right\}} J_{ij} \sigma_i \sigma_j,\]

where now $J_{ij}$ controls the characteristics of the interaction. Note that if $J_{ij}$ is positive the interaction, as before, favours alignment and is called ferromagnetic, while if $J_{ij}$ is negative it favours anti-alignment and is correspondingly called anti-ferromagnetic. If we restrict ourselves to the simple situation where each $J_{ij}$ is randomly either +1 or -1 on the lattice, we discover that this is a highly complicated case, giving rise to a kind of behaviour of our system called a spin glass state.
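
Here is a sketch of one quenched sample of this $\pm J$ model on a square lattice (again my own Python illustration). With mixed signs no configuration can satisfy every bond at once, which is the frustration behind the spin glass state:

```python
import numpy as np

rng = np.random.default_rng(1)
L = 16
# Quenched couplings: one random +/-1 value for every horizontal and
# every vertical bond of an L x L periodic lattice.
J_right = rng.choice([-1, 1], size=(L, L))
J_down = rng.choice([-1, 1], size=(L, L))

def energy(spins):
    """H = -sum_{<i,j>} J_ij s_i s_j for the sample drawn above."""
    return -(np.sum(J_right * spins * np.roll(spins, 1, axis=1)) +
             np.sum(J_down * spins * np.roll(spins, 1, axis=0)))

aligned = np.ones((L, L), dtype=int)
random_config = rng.choice([-1, 1], size=(L, L))
# Unlike the ferromagnet, the aligned state is nothing special now:
# roughly half the bonds are violated either way.
print(energy(aligned), energy(random_config))
```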

This kind of randomness is called disorder. In particular, if for every measurement we fix the interactions in one configuration, do our measurements, then change the configuration, fix it and measure again, and so on, the disorder receives the name of quenched disorder. I will write a longer post about disorder soon, as it is a huge and interesting topic. There are other places in the Ising model where randomness can appear. One is in the form of the lattice: it does not need to be a regular one, it can be any configuration of points linked by as many lines as you can imagine, what we call a random graph. You can also imagine that the interaction involves more than two spins; it can involve three or, more generally, $p$ spins multiplied together, like $\sigma_{i_1}\sigma_{i_2}\cdots\sigma_{i_p}$. Finally, it can appear in the form of random local fields, a field being a variable $h_i$ multiplying the $i$-th spin, something that can be written for $p=2$ (our usual two-spin interaction) as

\[H=-\sum_{\left\{i,j\right\}} J_{ij}\sigma_i\sigma_j -\sum_i h_i \sigma_i.\]
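
Putting the ingredients together, here is a sketch of the energy of one quenched sample with random couplings and random local fields (the every-pair-interacts layout below is just my hypothetical choice to keep the indexing short):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100
# One quenched sample: a coupling J_ij for every pair {i,j} (kept in the
# upper triangle so each pair is counted once) and local fields h_i.
J = np.triu(rng.choice([-1.0, 1.0], size=(N, N)), k=1)
h = rng.normal(size=N)

def energy(s):
    """H = -sum_{i<j} J_ij s_i s_j - sum_i h_i s_i"""
    return -s @ J @ s - h @ s

s = rng.choice([-1.0, 1.0], size=N)
print(energy(s))
```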

Now, there are two important facts about statistical mechanics that one must know. First, statistical mechanics is concerned with the typical behaviour of a system, meaning its most common realisation. Second, it is interested in what happens when the number of units $N$ is really huge, which makes sense once you recall that the number of atoms in any macroscopic sample of a material is of the order of $10^{23}$. The nice thing is that in this large-$N$ limit, appropriately called the thermodynamic limit, typical and average behaviour coincide and we can always look at averages over our systems.
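
As a quick numerical illustration of this concentration, here is a sketch (my own; for simplicity it samples configurations uniformly at random, i.e. at infinite temperature) showing that the per-spin energy of the 2D Ising model fluctuates less and less from sample to sample as $N$ grows:

```python
import numpy as np

rng = np.random.default_rng(3)

def energy_per_spin(L):
    """Energy per spin of a uniformly random configuration of the 2D
    nearest-neighbour Ising model with periodic boundaries."""
    s = rng.choice([-1, 1], size=(L, L))
    e = -(np.sum(s * np.roll(s, 1, axis=0)) +
          np.sum(s * np.roll(s, 1, axis=1)))
    return e / L**2

# Sample-to-sample fluctuations of the per-spin energy shrink with N:
# typical and average merge in the thermodynamic limit.
for L in [8, 32, 128]:
    samples = [energy_per_spin(L) for _ in range(200)]
    print(L * L, float(np.std(samples)))   # std decays roughly like 1/sqrt(N)
```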

Statistical mechanics is a giant topic and I will not attempt to cover it in just one post. I will write other posts explaining parts of it with time, but the main idea of many units interacting will be our link with information theory and codes. Don't give up, we are almost there.
