The butterfly effect

19 January 2015

Edward Lorenz (23 May 1917 – 16 April 2008) was an American mathematician, meteorologist, and pioneer of chaos theory who coined the term butterfly effect. Lorenz was born in West Hartford, Connecticut. He studied mathematics at both Dartmouth College in New Hampshire and Harvard University in Cambridge, Massachusetts. From 1942 until 1946 he served as a meteorologist for the United States Army Air Corps, and after the war he decided to study meteorology. Lorenz earned two degrees in the field from the Massachusetts Institute of Technology, where he was later a professor for many years and a Professor Emeritus from 1987 until his death.

The following is an extract from Chaos by James Gleick, pages 15 to 17:

With his primitive computer, Lorenz had boiled weather down to the barest skeleton. Yet, line by line, the winds and temperatures in Lorenz’s printouts seemed to behave in a recognizable earthly way. They matched his cherished intuition about the weather, his sense that it repeated itself, displaying familiar patterns over time, pressure rising and falling, the airstream swinging north and south. He discovered that when a line went from high to low without a bump, a double bump would come next, and he said “That’s the kind of rule a forecaster could use.” But the repetitions were never quite exact. There was pattern, with disturbances. An orderly disorder. To make the patterns plain to see, Lorenz created a primitive kind of graphics. Instead of just printing out the usual lines of digits, he would have the machine print a certain number of blank spaces followed by the letter a. He would pick one variable – perhaps the direction of the airstream. Gradually the a’s marched down the roll of paper, swinging back and forth in a wavy line, making a long series of hills and valleys that represented the way the west wind would swing north and south across the continent. The orderliness of it, the recognizable cycles coming around again and again but never twice the same way, had a hypnotic fascination. The system seemed slowly to be revealing its secrets to the forecaster’s eye.

One day in the winter of 1961, wanting to examine one sequence at greater length, Lorenz took a shortcut. Instead of starting the whole run over, he started midway through. To give the machine its initial conditions, he typed the numbers straight from the earliest printout. Then he walked down the hall to get away from the noise and drink a cup of coffee. When he returned an hour later, he saw something unexpected, something that planted a seed for a new science.

This new run should have exactly duplicated the old. Lorenz had copied the numbers into the machine himself. The program had not changed. Yet as he stared at the new printout, Lorenz saw his weather diverging so rapidly from the pattern of the last run that, within just a few months, all resemblance had disappeared. He looked at one set of numbers, then back at the other. He might as well have chosen two random weathers out of a hat. His first thought was that another vacuum tube had gone bad. Suddenly he realized the truth. There had been no malfunction. The problem lay in the numbers he had typed. In the computer’s memory, six decimal places were stored: .506127. On the printout, to save space, just three appeared: .506. Lorenz had entered the shorter, rounded-off numbers, assuming that the difference – one part in a thousand – was inconsequential. It was a reasonable assumption. If a weather satellite can read ocean-surface temperatures to within one part in a thousand, its operators consider themselves lucky. Lorenz’s Royal McBee was implementing a classical program. It used a purely deterministic system of equations. Given a particular starting point, the weather would unfold exactly the same way each time. Given a slightly different starting point, the weather should unfold in a slightly different way. A small numerical error was like a small puff of wind – surely the small puffs faded or cancelled each other out before they could change important, large-scale features of the weather. Yet in Lorenz’s particular system of equations, small errors proved catastrophic.

He decided to look more closely at the way two nearly identical runs of weather flowed apart. He copied one of the wavy lines of output onto a transparency and laid it over the other, to inspect the way it diverged.

First, two humps matched detail for detail. Then one line began to lag a hairsbreadth behind. By the time the two runs reached the next hump, they were distinctly out of phase. By the third or fourth hump, all similarity had vanished.

It was only a wobble from a clumsy computer. Lorenz could have assumed something was wrong with his particular machine or his particular model – probably should have assumed. It was not as though he had mixed sodium and chlorine and got gold. But for reasons of mathematical intuition that his colleagues would begin to understand only later, Lorenz felt a jolt; something was philosophically out of joint. The practical import could be staggering. Although his equations were gross parodies of the earth’s weather, he had a faith that they captured the essence of the real atmosphere.
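Both of the phenomena Gleick describes are easy to reproduce on a modern machine. First, Lorenz’s printer graphics: the short Python sketch below is illustrative only, with a sine wave standing in for the model variable (the extract doesn’t include Lorenz’s actual output). It prints a number of blank spaces followed by the letter a, one line per time step, so the a’s march down the page in a wavy line just as Gleick describes.

```python
import math

# Recreate Lorenz's printer graphics: blank spaces followed by a
# letter, one line per time step. A sine wave stands in for the
# model variable here.
WIDTH = 60  # printable columns on the "roll of paper"

def print_wave(values, lo, hi, char="a"):
    for v in values:
        # Scale the value into [0, WIDTH] and pad with blanks.
        col = int((v - lo) / (hi - lo) * WIDTH)
        print(" " * col + char)

samples = [math.sin(t / 4.0) for t in range(80)]
print_wave(samples, lo=-1.0, hi=1.0)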
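The divergence itself is just as easy to demonstrate. Lorenz’s 1961 runs used a twelve-variable model on the Royal McBee, which the extract doesn’t reproduce, so the sketch below substitutes the three-equation system he published in 1963 – the canonical chaos-theory example, with the usual parameters σ = 10, ρ = 28, β = 8/3. Two otherwise identical runs start from initial conditions that differ only by the rounding described above, .506127 versus .506, and the printed separation shows them flowing apart.

```python
# The Lorenz (1963) system, standing in for the twelve-variable
# model of the 1961 experiment. One run starts from the six-decimal
# initial value, the other from the same value rounded to three
# decimals, as on Lorenz's printout.

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0
DT = 0.01

def deriv(s):
    """Right-hand side of the Lorenz equations."""
    x, y, z = s
    return (SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z)

def rk4_step(s, dt=DT):
    """Advance the state by one fourth-order Runge-Kutta step."""
    k1 = deriv(s)
    k2 = deriv(tuple(v + 0.5 * dt * k for v, k in zip(s, k1)))
    k3 = deriv(tuple(v + 0.5 * dt * k for v, k in zip(s, k2)))
    k4 = deriv(tuple(v + dt * k for v, k in zip(s, k3)))
    return tuple(v + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for v, a, b, c, d in zip(s, k1, k2, k3, k4))

full = (0.506127, 1.0, 1.0)   # full-precision initial condition
rounded = (0.506, 1.0, 1.0)   # rounded off, one part in a thousand

for n in range(2501):
    if n % 250 == 0:
        sep = sum((p - q) ** 2 for p, q in zip(full, rounded)) ** 0.5
        print(f"t = {n * DT:5.2f}   separation = {sep:10.6f}")
    full = rk4_step(full)
    rounded = rk4_step(rounded)
```

With these parameters the separation grows roughly exponentially – the system’s largest Lyapunov exponent is about 0.9 per time unit – until it saturates at the diameter of the attractor, at which point the two runs really are as unrelated as two random weathers out of a hat.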

I first read this extract exactly twenty years ago, and there are numerous parallels between the description of Lorenz’s equations, model, and system of prediction and the model, algorithms, and code used to produce the AgOptimizer results. Just as the weather was of immense interest to Lorenz, the optimization of grazing systems has a depth and complexity that drives the development of our work.