The intellectual legacy of Isaac Newton was a vision of the
clockwork universe, set in motion at the instant of creation
but thereafter running in prescribed grooves, like a well-oiled
machine. It was an image of a totally deterministic world-one
leaving no room for the operation of chance, one whose
future was completely determined by its present. As the great
mathematical astronomer Pierre-Simon de Laplace eloquently
put it in 1812 in his Analytic Theory of Probabilities:
An intellect which at any given moment knew all the forces that
animate Nature and the mutual positions of the beings that comprise
it, if this intellect were vast enough to submit its data to
analysis, could condense into a single formula the movement of
the greatest bodies of the universe and that of the lightest atom:
for such an intellect nothing could be uncertain, and the future
just like the past would be present before its eyes.
This same vision of a world whose future is totally predictable
lies behind one of the most memorable incidents in
Douglas Adams's 1979 science-fiction novel The Hitchhiker's
Guide to the Galaxy, in which the philosophers Majikthise and
Vroomfondel instruct the supercomputer "Deep Thought" to
calculate the answer to the Great Question of Life, the Universe,
and Everything. Aficionados will recall that after five
million years the computer answered, "Forty-two," at which
point the philosophers realized that while the answer was
clear and precise, the question had not been. Similarly, the
fault in Laplace's vision lies not in his answer-that the universe
is in principle predictable, which is an accurate statement
of a particular mathematical feature of Newton's law of
motion-but in his interpretation of that fact, which is a serious
misunderstanding based on asking the wrong question.
By asking a more appropriate question, mathematicians and
physicists have now come to understand that determinism
and predictability are not synonymous.
In our daily lives, we encounter innumerable cases where
Laplacian determinism seems to be a highly inappropriate
model. We walk safely down steps a thousand times, until
one day we turn our ankle and break it. We go to a tennis
match, and it is rained off by an unexpected thunderstorm.
We place a bet on the favorite in a horse race, and it falls at
the last fence when it is six lengths ahead of the field. It's not
so much a universe in which-as Albert Einstein memorably
refused to believe-God plays dice: it seems more a universe
in which dice play God.
Is our world deterministic, as Laplace claimed, or is it governed
by chance, as it so often seems to be? And if Laplace is
really right, why does so much of our experience indicate that
he is wrong? One of the most exciting new areas of mathematics,
nonlinear dynamics-popularly known as chaos theory-claims
to have many of the answers. Whether or not it does, it
is certainly creating a revolution in the way we think about
order and disorder, law and chance, predictability and
randomness.
According to modern physics, nature is ruled by chance
on its smallest scales of space and time. For instance, whether
a radioactive atom-of uranium, say-does or does not decay
at any given instant is purely a matter of chance. There is no
physical difference whatsoever between a uranium atom that
is about to decay and one that is not about to decay. None.
Absolutely none.
There are at least two contexts in which to discuss these
issues: quantum mechanics and classical mechanics. Most of
this chapter is about classical mechanics, but for a moment let
us consider the quantum-mechanical context. It was this view
of quantum indeterminacy that prompted Einstein's famous
statement (in a letter to his colleague Max Born) that "you
believe in a God who plays dice, and I in complete law and
order." To my mind, there is something distinctly fishy about
the orthodox physical view of quantum indeterminacy, and I
appear not to be alone, because, increasingly, many physicists
are beginning to wonder whether Einstein was right all along
and something is missing from conventional quantum
mechanics-perhaps "hidden variables," whose values tell an
atom when to decay. (I hasten to add that this is not the conventional
view.) One of the best known of them, the Princeton
physicist David Bohm, devised a modification of quantum
mechanics that is fully deterministic but entirely consistent
with all the puzzling phenomena that have been used to support
the conventional view of quantum indeterminacy.
Bohm's ideas have problems of their own, in particular a kind
of "action at a distance" that is no less disturbing than quantum
indeterminacy.
However, even if quantum mechanics is correct about
indeterminacy on the smallest scales, on macroscopic scales
of space and time the universe obeys deterministic laws, This
results from an effect called decoherence, which causes sufficiently
large quantum systems to lose nearly all of their indeterminacy
and behave much more like Newtonian systems. In
effect, this reinstates classical mechanics for most human-scale
purposes. Horses, the weather, and Einstein's celebrated
dice are not unpredictable because of quantum mechanics.
On the contrary, they are unpredictable within a Newtonian
model, too. This is perhaps not so surprising when it comes to
horses-living creatures have their own hidden variables,
such as what kind of hay they had for breakfast. But it was
definitely a surprise to those meteorologists who had been
developing massive computer simulations of weather in the
hope of predicting it for months ahead. And it is really rather
startling when it comes to dice, even though humanity perversely
uses dice as one of its favorite symbols for chance.
Dice are just cubes, and a tumbling cube should be no less
predictable than an orbiting planet: after all, both objects obey
the same laws of mechanical motion. They're different
shapes, but equally regular and mathematical ones.
To see how unpredictability can be reconciled with determinism,
think about a much less ambitious system than the
entire universe-namely, drops of water dripping from a tap.*
This is a deterministic system: in principle, the flow of water
into the apparatus is steady and uniform, and what happens
to it when it emerges is totally prescribed by the laws of fluid
motion. Yet a simple but effective experiment demonstrates
that this evidently deterministic system can be made to
behave unpredictably; and this leads us to some mathematical
"lateral thinking," which explains why such a paradox is
possible.
If you turn on a tap very gently and wait a few seconds for
the flow to settle down, you can usually produce a regular
series of drops of water, falling at equally spaced times in a
regular rhythm. It would be hard to find anything more predictable
than this. But if you slowly turn the tap to increase
the flow, you can set it so that the sequence of drops falls in a
very irregular manner, one that sounds random. It may take a
little experimentation to succeed, and it helps if the tap turns
smoothly. Don't turn it so far that the water falls in an unbroken
stream; what you want is a medium-fast trickle. If you get
it set just right, you can listen for many minutes without any
obvious pattern becoming apparent.
In 1978, a bunch of iconoclastic young graduate students
at the University of California at Santa Cruz formed the
Dynamical Systems Collective. When they began thinking
about this water-drop system, they realized that it's not as
random as it appears to be. They recorded the dripping noises
with a microphone and analyzed the sequence of intervals
between each drop and the next. What they found was short-term
predictability. If I tell you the timing of three successive
drops, then you can predict when the next drop will fall. For
example, if the last three intervals between drops have been
0.63 seconds, 1.17 seconds, and 0.44 seconds, then you can be
sure that the next drop will fall after a further 0.82 seconds.
(These numbers are for illustrative purposes only.) In fact, if
you know the timing of the first three drops exactly, then you
can predict the entire future of the system.
So why is Laplace wrong? The point is that we can never
measure the initial state of a system exactly. The most precise
measurements yet made in any physical system are correct to
about ten or twelve decimal places. But Laplace's statement is
correct only if we can make measurements to infinite precision,
infinitely many decimal places-and of course there's
no way to do that. People knew about this problem of measurement
error in Laplace's day, but they generally assumed
that provided you made the initial measurements to, say, ten
decimal places, then all subsequent prediction would also be
accurate to ten decimal places. The error would not disappear,
but neither would it grow.
Unfortunately, it does grow, and this prevents us from
stringing together a series of short-term predictions to get one
that is valid in the long term. For example, suppose I know
the timing of the first three water drops to an accuracy of ten
decimal places. Then I can predict the timing of the next drop
to nine decimal places, the drop after that to eight decimal
places, and so on. At each step, the error grows by a factor of
about ten, so I lose confidence in one further decimal place.
Therefore, ten steps into the future, I really have no idea at all
what the timing of the next drop will be. (Again, the precise
figures will probably be different: it may take half a dozen
drops to lose one decimal place in accuracy, but even then it
takes only sixty drops until the same problem arises.)
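The step-by-step loss of one decimal place can be seen in miniature with a toy chaotic system (an illustrative choice, not a model of the tap itself): the "decimal shift" map, which multiplies its state by ten and keeps only the fractional part, so that any measurement error grows by exactly a factor of ten per step.

```python
# Sensitivity to initial conditions in the simplest possible setting:
# the decimal-shift map x -> 10x mod 1, where every step multiplies any
# measurement error by exactly ten -- one lost decimal place per step,
# as in the dripping-tap illustration. (A toy system for illustration.)

def shift_map(x):
    return (10.0 * x) % 1.0

x = 0.123456789012          # the "true" initial state
y = x + 1e-10               # a measurement, correct to ten decimal places

for step in range(12):
    print(f"step {step:2d}  separation = {abs(y - x):.3e}")
    x, y = shift_map(x), shift_map(y)
```

The separation starts at 10^-10 and gains a factor of ten each step, so by around the tenth step the two histories have nothing to do with each other: the prediction has failed completely.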
This amplification of error is the logical crack through
which Laplace's perfect determinism disappears. Nothing
short of total perfection of measurement will do. If we could
measure the timing to a hundred decimal places, our predictions
would fail a mere hundred drops into the future (or six
hundred, using the more optimistic estimate). This phenomenon
is called "sensitivity to initial conditions," or more informally
"the butterfly effect." (When a butterfly in Tokyo flaps
its wings, the result may be a hurricane in Florida a month
later.) It is intimately associated with a high degree of irregularity
of behavior. Anything truly regular is by definition
fairly predictable. But sensitivity to initial conditions renders
behavior unpredictable-hence irregular. For this reason, a
system that displays sensitivity to initial conditions is said to
be chaotic. Chaotic behavior obeys deterministic laws, but it
is so irregular that to the untrained eye it looks pretty much
random. Chaos is not just complicated, patternless behavior;
it is far more subtle. Chaos is apparently complicated, apparently
patternless behavior that actually has a simple, deterministic
explanation.
The discovery of chaos was made by many people, too
numerous to list here. It came about because of the conjunction
of three separate developments. One was a change of scientific
focus, away from simple patterns such as repetitive
cycles, toward more complex kinds of behavior. The second
was the computer, which made it possible to find approximate
solutions to dynamical equations easily and rapidly.
The third was a new mathematical viewpoint on dynamics-a
geometric rather than a numerical viewpoint. The first provided
motivation, the second provided technique, and the
third provided understanding.
The geometrization of dynamics began about a hundred
years ago, when the French mathematician Henri Poincaré-a
maverick if ever there was one, but one so brilliant that his
views became orthodoxies almost overnight-invented the
concept of a phase space. This is an imaginary mathematical
space that represents all possible motions of a given dynamical
system. To pick a nonmechanical example, consider the
population dynamics of a predator-prey ecological system.
The predators are pigs and the prey are those exotically pungent
fungi, truffles. The variables upon which we focus attention
are the sizes of the two populations-the number of pigs
(relative to some reference value such as one million) and the
number of truffles (ditto). This choice effectively makes the
variables continuous-that is, they take real-number values
with decimal places, not just whole-number values. For
example, if the reference number of pigs is one million, then a
population of 17,439 pigs corresponds to the value 0.017439.
Now, the natural growth of truffles depends on how many
truffles there are and the rate at which pigs eat them: the
growth of the pig population depends on how many pigs
there are and how many truffles they eat. So the rate of
change of each variable depends on both variables, an observation
that can be turned into a system of differential equations
for the population dynamics. I won't write them down,
because it's not the equations that matter here: it's what you
do with them.
These equations determine-in principle-how any initial
population values will change over time. For example, if we
start with 17,439 pigs and 788,444 truffles, then you plug in
the initial values 0.017439 for the pig variable and 0.788444
for the truffle variable, and the equations implicitly tell you
how those numbers will change. The difficulty is to make the
implicit become explicit: to solve the equations. But in what
sense? The natural reflex of a classical mathematician would
be to look for a formula telling us exactly what the pig population
and the truffle population will be at any instant. Unfortunately,
such "explicit solutions" are so rare that it is
scarcely worth the effort of looking for them unless the equations
have a very special and limited form. An alternative is
to find approximate solutions on a computer; but that tells us
only what will happen for those particular initial values, and
most often we want to know what will happen for a lot of different
initial values.
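The text deliberately omits the equations, but the classical Lotka-Volterra predator-prey system is the standard model of this kind, and a short numerical sketch shows what "solving on a computer for particular initial values" looks like. The rate constants below are invented for illustration; only the initial population values come from the text.

```python
# A sketch of the pig/truffle dynamics using the classical Lotka-Volterra
# equations -- an assumption, since the text omits the actual equations,
# and the rate constants a, b, c, d are invented for illustration.
#   truffles: dT/dt = T * (a - b*P)   (growth, minus predation by pigs)
#   pigs:     dP/dt = P * (d*T - c)   (growth from eating truffles, minus deaths)

def step_rk4(T, P, dt, a=1.0, b=1.0, c=1.0, d=1.0):
    """One fourth-order Runge-Kutta step for the two populations."""
    def f(T, P):
        return T * (a - b * P), P * (d * T - c)
    k1 = f(T, P)
    k2 = f(T + dt/2*k1[0], P + dt/2*k1[1])
    k3 = f(T + dt/2*k2[0], P + dt/2*k2[1])
    k4 = f(T + dt*k3[0], P + dt*k3[1])
    return (T + dt/6*(k1[0] + 2*k2[0] + 2*k3[0] + k4[0]),
            P + dt/6*(k1[1] + 2*k2[1] + 2*k3[1] + k4[1]))

T, P = 0.788444, 0.017439      # the initial values from the text
trajectory = [(T, P)]
for _ in range(4000):          # integrate for 40 time units
    T, P = step_rk4(T, P, 0.01)
    trajectory.append((T, P))

truffles = [t for t, p in trajectory]
# The populations rise and fall cyclically: the point (T, P) traces a
# closed loop in phase space rather than settling to a fixed value.
print(f"truffle population: min {min(truffles):.3f}, max {max(truffles):.3f}")
```

Plotting the pairs in `trajectory` as points in the plane gives exactly the closed-loop phase portrait described next.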
Poincaré's idea is to draw a picture that shows what happens
for all initial values. The state of the system-the sizes of
the two populations at some instant of time-can be represented
as a point in the plane, using the old trick of coordinates.
For example, we might represent the pig population by
the horizontal coordinate and the truffle population by the
vertical one. The initial state described above corresponds to
the point with horizontal coordinate 0.017439 and vertical
coordinate 0.788444. Now let time flow. The two coordinates
change from one instant to the next, according to the rule
expressed by the differential equation, so the corresponding
point moves. A moving point traces out a curve; and that
curve is a visual representation of the future behavior of the
entire system. In fact, by looking at the curve, you can "see"
important features of the dynamics without worrying about
the actual numerical values of the coordinates.
For example, if the curve closes up into a loop, then the
two populations are following a periodic cycle, repeating the
same values over and over again-just as a car on a racetrack
keeps going past the same spectator every lap. If the curve
homes in toward some particular point and stops, then the
populations settle down to a steady state, in which neither
changes-like a car that runs out of fuel. By a fortunate coincidence,
cycles and steady states are of considerable ecological
significance-in particular, they set both upper and lower
limits to population sizes. So the features that the eye detects
most easily are precisely the ones that really matter.
Moreover, a lot of irrelevant detail can be ignored: for example, we
can see that there is a closed loop without having to work out
its precise shape (which represents the combined "waveforms"
of the two population cycles).
What happens if we try a different pair of initial values?
We get a second curve. Each pair of initial values defines a
new curve; and we can capture all possible behaviors of the
system, for all initial values, by drawing a complete set of
such curves. This set of curves resembles the flow lines of an
imaginary mathematical fluid, swirling around in the plane.
We call the plane the phase space of the system, and the set of
swirling curves is the system's phase portrait. Instead of the
symbol-based idea of a differential equation with various initial
conditions, we have a geometric, visual scheme of points
flowing through pig/truffle space. This differs from an ordinary
plane only in that many of its points are potential rather
than actual: their coordinates correspond to numbers of pigs
and truffles that could occur under appropriate initial conditions,
but may not occur in a particular case. So as well as the
mental shift from symbols to geometry, there is a philosophical
shift from the actual to the potential.
The same kind of geometric picture can be imagined for
any dynamical system. There is a phase space, whose coordinates
are the values of all the variables; and there is a phase
portrait, a system of swirling curves that represents all possible
behaviors starting from all possible initial conditions, as
prescribed by the differential equations. This idea
constitutes a major advance, because instead of worrying
about the precise numerical details of solutions to the equations,
we can focus upon the broad sweep of the phase portrait,
and bring humanity's greatest asset, its amazing image
processing abilities, to bear. The image of a phase space as a
way of organizing the total range of potential behaviors, from
among which nature selects the behavior actually observed,
has become very widespread in science.
The upshot of Poincaré's great innovation is that dynamics
can be visualized in terms of geometric shapes called attractors.
If you start a dynamical system from some initial point
and watch what it does in the long run, you often find that it
ends up wandering around on some well-defined shape in
phase space. For example, the curve may spiral in toward a
closed loop and then go around and around the loop forever.
Moreover, different choices of initial conditions may lead to
the same final shape. If so, that shape is known as an attractor.
The long-term dynamics of a system is governed by its
attractors, and the shape of the attractor determines what type
of dynamics occurs.
For example, a system that settles down to a steady state
has an attractor that is just a point. A system that settles down
to repeating the same behavior periodically has an attractor
that is a closed loop. That is, closed loop attractors correspond
to oscillators. Recall the description of a vibrating violin
string from chapter 5; the string undergoes a sequence of
motions that eventually puts it back where it started, ready to
repeat the sequence over and over forever. I'm not suggesting
that the violin string moves in a physical loop. But my
description of it is a closed loop in a metaphorical sense: the
motion takes a round trip through the dynamic landscape of
phase space.
Chaos has its own rather weird geometry: it is associated
with curious fractal shapes called strange attractors. The butterfly
effect implies that the detailed motion on a strange attractor
can't be determined in advance. But this doesn't alter the fact
that it is an attractor. Think of releasing a Ping-Pong ball into a
stormy sea. Whether you drop it from the air or release it from
underwater, it moves toward the surface. Once on the surface, it
follows a very complicated path in the surging waves, but however
complex that path is, the ball stays on-or at least very
near-the surface. In this image, the surface of the sea is an
attractor. So, chaos notwithstanding, no matter what the starting
point may be, the system will end up very close to its attractor.
Chaos is well established as a mathematical phenomenon,
but how can we detect it in the real world? We must perform
experiments-and there is a problem. The traditional role of
experiments in science is to test theoretical predictions, but if
the butterfly effect is in operation-as it is for any chaotic system-
how can we hope to test a prediction? Isn't chaos inherently
untestable, and therefore unscientific?
The answer is a resounding no, because the word "prediction"
has two meanings. One is "foretelling the future," and the
butterfly effect prevents this when chaos is present. But the
other is "describing in advance what the outcome of an experiment
will be." Think about tossing a coin a hundred times. In
order to predict-in the fortune-teller's sense-what happens,
you must list in advance the result of each of the tosses. But
you can make scientific predictions, such as "roughly half the
coins will show heads," without foretelling the future in
detail-even when, as here, the system is random. Nobody suggests
that statistics is unscientific because it deals with unpredictable
events, and therefore chaos should be treated in the
same manner. You can make all sorts of predictions about a
chaotic system; in fact, you can make enough predictions to
distinguish deterministic chaos from true randomness. One
thing that you can often predict is the shape of the attractor,
which is not altered by the butterfly effect. All the butterfly
effect does is to make the system follow different paths on the
same attractor. In consequence, the general shape of the attractor
can often be inferred from experimental observations.
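One concrete prediction of this kind can be sketched in code: in a deterministic chaotic series, states that were close together in the past have close successors, so a simple nearest-neighbor forecast works; on truly random data it fails. The logistic map below is a stand-in chaotic signal, an illustrative choice rather than anything from the book's experiments.

```python
import random

# Distinguishing deterministic chaos from true randomness by
# nearest-neighbor forecasting: predict each value from the successor
# of its closest match earlier in the series. The logistic map is used
# here as a stand-in chaotic signal (an illustrative assumption).

def forecast_error(series):
    """Predict each value in the second half from the successor of its
    nearest neighbor in the first half; return the mean absolute error."""
    half = len(series) // 2
    errors = []
    for i in range(half, len(series) - 1):
        j = min(range(half - 1), key=lambda k: abs(series[k] - series[i]))
        errors.append(abs(series[j + 1] - series[i + 1]))
    return sum(errors) / len(errors)

random.seed(1)
x, chaotic = 0.3, []
for _ in range(1000):
    x = 3.9 * x * (1 - x)          # logistic map in its chaotic regime
    chaotic.append(x)
noise = [random.random() for _ in range(1000)]

print(f"chaotic series forecast error: {forecast_error(chaotic):.4f}")
print(f"random  series forecast error: {forecast_error(noise):.4f}")
```

The chaotic series is forecast far more accurately than the random one, even though both look equally irregular to the eye: that gap is a testable, scientific prediction of the sort the passage describes.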
The discovery of chaos has revealed a fundamental misunderstanding
in our views of the relation between rules and the
behavior they produce-between cause and effect. We used to
think that deterministic causes must produce regular effects,
but now we see that they can produce highly irregular effects
that can easily be mistaken for randomness. We used to think
that simple causes must produce simple effects (implying that
complex effects must have complex causes), but now we
know that simple causes can produce complex effects. We
realize that knowing the rules is not the same as being able to
predict future behavior.
How does this discrepancy between cause and effect arise?
Why do the same rules sometimes produce obvious patterns
and sometimes produce chaos? The answer is to be found in
every kitchen, in the employment of that simple mechanical
device, an eggbeater. The motion of the two beaters is simple
and predictable, just as Laplace would have expected: each
beater rotates steadily. The motion of the sugar and the egg
white in the bowl, however, is far more complex. The two
ingredients get mixed up-that's what eggbeaters are for. But
the two rotary beaters don't get mixed up-you don't have to
disentangle them from each other when you've finished. Why
is the motion of the incipient meringue so different from that
of the beaters? Mixing is a far more complicated, dynamic
process than we tend to think. Imagine trying to predict
where a particular grain of sugar will end up! As the mixture
passes between the pair of beaters, it is pulled apart, to left
and right, and two sugar grains that start very close together
soon get a long way apart and follow independent paths. This
is, in fact, the butterfly effect in action-tiny changes in initial
conditions have big effects. So mixing is a chaotic process.
Conversely, every chaotic process involves a kind of mathematical
mixing in Poincaré's imaginary phase space. This is
why tides are predictable but weather is not. Both involve the
same kind of mathematics, but the dynamics of tides does not
get phase space mixed up, whereas that of the weather does.
It's not what you do, it's the way that you do it.
Chaos is overturning our comfortable assumptions about
how the world works. It tells us that the universe is far
stranger than we think. It casts doubt on many traditional
methods of science: merely knowing the laws of nature is no
longer enough. On the other hand, it tells us that some things
that we thought were just random may actually be consequences
of simple laws. Nature's chaos is bound by rules. In
the past, science tended to ignore events or phenomena that
seemed random, on the grounds that since they had no obvious
patterns they could not be governed by simple laws. Not
so. There are simple laws right under our noses-laws governing
disease epidemics, or heart attacks, or plagues of locusts.
If we learn those laws, we may be able to prevent the disasters
that follow in their wake.
Already chaos has shown us new laws, even new types of
laws. Chaos contains its own brand of new universal patterns.
One of the first to be discovered occurs in the dripping tap.
Remember that a tap can drip rhythmically or chaotically,
depending on the speed of the flow. Actually, both the regularly
dripping tap and the "random" one are following slightly
different variants of the same mathematical prescription. But as
the rate at which water passes through the tap increases, the
type of dynamics changes. The attractor in phase space that
represents the dynamics keeps changing-and it changes in a
predictable but highly complex manner.
Start with a regularly dripping tap: a repetitive drip-drip-drip-drip
rhythm, each drop just like the previous one. Then
turn the tap slightly, so that the drips come slightly faster.
Now the rhythm goes drip-DRIP-drip-DRIP, and repeats every
two drops. Not only the size of the drop, which governs how
loud the drip sounds, but also the timing changes slightly
from one drop to the next.
If you allow the water to flow slightly faster still, you get a
four-drop rhythm: drip-DRIP-drip-DRIP. A little faster still,
and you produce an eight-drop rhythm: drip-DRIP-drip-DRIP-drip-DRIP-drip-DRIP.
The length of the repetitive sequence
of drops keeps on doubling. In a mathematical model, this
process continues indefinitely, with rhythmic groups of 16,
32, 64 drops, and so on. But it takes tinier and tinier changes
to the flow rate to produce each successive doubling of the
period; and there is a flow rate by which the size of the group
has doubled infinitely often. At this point, no sequence of
drops repeats exactly the same pattern. This is chaos.
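The cascade is easy to reproduce numerically. The sketch below uses the logistic map as a stand-in for the tap (an assumption, though the text notes that both the regular and the "random" tap follow variants of the same kind of mathematical prescription): as the parameter r grows, the attractor's period doubles 1, 2, 4, 8, ... until chaos sets in.

```python
# The period-doubling cascade in the logistic map x -> r*x*(1-x),
# used here as a stand-in for the dripping tap. As r increases,
# the period of the attractor doubles: 1, 2, 4, 8, ... then chaos.

def attractor_period(r, transient=100_000, sample=256, tol=1e-6):
    """Iterate past the transient, then return the smallest p for which
    the sampled orbit repeats every p steps (None means no short period,
    i.e. chaos)."""
    x = 0.5
    for _ in range(transient):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(sample):
        x = r * x * (1 - x)
        orbit.append(x)
    for p in (1, 2, 4, 8, 16, 32):
        if all(abs(orbit[i] - orbit[i + p]) < tol for i in range(sample - p)):
            return p
    return None

for r in (2.8, 3.2, 3.5, 3.55, 3.7):
    print(f"r = {r}: period {attractor_period(r)}")
```

Each successive doubling occupies a narrower band of parameter values, which is why ever tinier adjustments of the tap are needed to reach the next rhythm.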
We can express what is happening in Poincare's geometric
language. The attractor for the tap begins as a closed loop,
representing a periodic cycle. Think of the loop as an elastic
band wrapped around your finger. As the flow rate increases,
this loop splits into two nearby loops, like an elastic band
wound twice around your finger. This band is twice as long as
the original, which is why the period is twice as long. Then in
exactly the same way, this already-doubled loop doubles
again, all the way along its length, to create the period-four
cycle, and so on. After infinitely many doublings, your finger
is decorated with elastic spaghetti, a chaotic attractor.
This scenario for the creation of chaos is called a period-doubling
cascade. In 1975, the physicist Mitchell Feigenbaum
discovered that a particular number, which can be measured
in experiments, is associated with every period-doubling cascade.
The number is roughly 4.669, and it ranks alongside π
(pi) as one of those curious numbers that seem to have extraordinary
significance in both mathematics and its relation to
the natural world. Feigenbaum's number has a symbol, too:
the Greek letter δ (delta). The number π tells us how the circumference
of a circle relates to its diameter. Analogously,
Feigenbaum's number δ tells us how the period of the drips
relates to the rate of flow of the water. To be precise, the extra
amount by which you need to turn on the tap decreases by a
factor of 4.669 at each doubling of the period.
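The same ratio can be checked by hand. The parameter values below are the standard published bifurcation points of the logistic map, not tap data (the tap would have its own flow-rate values, but the limiting ratio is the same universal constant):

```python
# Estimating Feigenbaum's delta: compare the lengths of successive
# parameter intervals between period doublings. These r-values are the
# standard logistic-map bifurcation points quoted in the literature.

bifurcations = [3.0, 3.449490, 3.544090, 3.564407, 3.568759]

gaps = [b - a for a, b in zip(bifurcations, bifurcations[1:])]
for earlier, later in zip(gaps, gaps[1:]):
    print(f"interval ratio: {earlier / later:.4f}")
```

The ratios home in on δ = 4.6692..., even though no single pair of intervals gives it exactly: the constant emerges only in the limit of the cascade.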
The number π is a quantitative signature for anything
involving circles. In the same way, the Feigenbaum number δ
is a quantitative signature for any period-doubling cascade,
no matter how it is produced or how it is realized experimentally.
That very same number shows up in experiments on liquid
helium, water, electronic circuits, pendulums, magnets,
and vibrating train wheels. It is a new universal pattern in
nature, one that we can see only through the eyes of chaos; a
quantitative pattern, a number, emerges from a qualitative
phenomenon. One of nature's numbers, indeed. The Feigenbaum
number has opened the door to a new mathematical
world, one we have only just begun to explore.
The precise pattern found by Feigenbaum, and other patterns
like it, is a matter of fine detail. The basic point is that
even when the consequences of natural laws seem to be patternless,
the laws are still there and so are the patterns. Chaos
is not random: it is apparently random behavior resulting
from precise rules. Chaos is a cryptic form of order.
Science has traditionally valued order, but we are beginning
to appreciate the fact that chaos can offer science distinct
advantages. Chaos makes it much easier to respond quickly to
an outside stimulus. Think of tennis players waiting to
receive a serve. Do they stand still? Do they move regularly
from side to side? Of course not. They dance erratically from
one foot to the other. In part, they are trying to confuse their
opponents, but they are also getting ready to respond to any
serve sent their way. In order to be able to move quickly in
any particular direction, they make rapid movements in many
different directions. A chaotic system can react to outside
events much more quickly, and with much less effort, than a
nonchaotic one. This is important for engineering control
problems. For example, we now know that some kinds of turbulence
result from chaos-that's what makes turbulence look
random. It may prove possible to make the airflow past an aircraft's
skin much less turbulent, and hence less resistant to
motion, by setting up control mechanisms that respond
extremely rapidly to cancel out any small regions of incipient
turbulence. Living creatures, too, must behave chaotically in
order to respond rapidly to a changing environment.
This idea has been turned into an extremely useful practical
technique by a group of mathematicians and physicists,
among them William Ditto, Alan Garfinkel, and Jim Yorke:
they call it chaotic control. Basically, the idea is to make the
butterfly effect work for you. The fact that small changes in
initial conditions create large changes in subsequent behavior
can be an advantage; all you have to do is ensure that you get
the large changes you want. Our understanding of how
chaotic dynamics works makes it possible to devise control
strategies that do precisely this. The method has had several
successes. Space satellites use a fuel called hydrazine to make
course corrections. One of the earliest successes of chaotic
control was to divert a dead satellite from its orbit and send it
out for an encounter with an asteroid, using only the tiny
amount of hydrazine left on board. NASA arranged for the
satellite to swing around the Moon five times, nudging it
slightly each time with a tiny shot of hydrazine. Several such
encounters were achieved, in an operation that successfully
exploited the occurrence of chaos in the three-body problem
(here, Earth/Moon/satellite) and the associated butterfly
effect.
The same mathematical idea has been used to control a
magnetic ribbon in a turbulent fluid-a prototype for controlling
turbulent flow past a submarine or an aircraft. Chaotic
control has been used to make erratically beating hearts return
to a regular rhythm, presaging the invention of the intelligent
pacemaker. Very recently, it has been used both to set up and
to prevent rhythmic waves of electrical activity in brain tissue,
opening up the possibility of preventing epileptic attacks.
Chaos is a growth industry. Every week sees new discoveries
about the underlying mathematics of chaos, new applications
of chaos to our understanding of the natural world, or
new technological uses of chaos-including the chaotic dishwasher,
a Japanese invention that uses two rotating arms,
spinning chaotically, to get dishes cleaner using less energy;
and a British machine that uses chaos-theoretic data analysis
to improve quality control in spring manufacture.
Much, however, remains to be done. Perhaps the ultimate
unsolved problem of chaos is the strange world of the quantum,
where Lady Luck rules. Radioactive atoms decay "at random";
their only regularities are statistical. A large quantity of
radioactive atoms has a well-defined half-life: a period of
time during which half the atoms will decay. But we can't
predict which half. Albert Einstein's protest, mentioned earlier,
was aimed at just this question. Is there really no difference
at all between a radioactive atom that is not going to
decay, and one that's just about to? Then how does the atom
know what to do?
Might the apparent randomness of quantum mechanics be
fraudulent? Is it really deterministic chaos? Think of an atom
as some kind of vibrating droplet of cosmic fluid. Radioactive
atoms vibrate very energetically, and every so often a smaller
drop can split off: that splitting is decay. The vibrations are so rapid that we
can't measure them in detail: we can only measure averaged
quantities, such as energy levels. Now, classical mechanics
tells us that a drop of real fluid can vibrate chaotically. When
it does so, its motion is deterministic but unpredictable. Occasionally,
"at random," the vibrations conspire to split off a
tiny droplet. The butterfly effect makes it impossible to say in
advance just when the drop will split; but that event has precise
statistical features, including a well-defined half-life.
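The vibrating-drop story can be mimicked in a few lines of code. The sketch below is a toy model under stated assumptions (the logistic map and the decay threshold are arbitrary illustrative choices, not physics): each "atom" is a chaotic iterate that "decays" the first time its state dips below the threshold. No individual decay time can be foreseen without following the orbit step by step, yet the ensemble has a perfectly respectable half-life.

```python
import random

THRESHOLD = 0.05  # an "atom" decays when its chaotic state dips below this

def decay_time(x, max_steps=10_000):
    """Iterate x -> 4x(1 - x); return the first step at which x < THRESHOLD."""
    for t in range(1, max_steps + 1):
        x = 4.0 * x * (1.0 - x)
        if x < THRESHOLD:
            return t
    return max_steps  # effectively never reached for random starting states

random.seed(1)  # reproducible ensemble of 5,000 "atoms"
times = sorted(decay_time(random.uniform(0.1, 0.9)) for _ in range(5000))

# The median decay time is the ensemble's half-life: by that step,
# half the atoms have decayed, though not any predictable half.
half_life = times[len(times) // 2]
surviving = sum(1 for t in times if t > 2 * half_life)
print("half-life (steps):", half_life)
print("fraction surviving two half-lives:", surviving / len(times))
```

Each run produces the same statistical pattern: roughly geometric decay of the survivor count, exactly the kind of regularity a deterministic but chaotic dynamic can generate.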
Could the apparently random decay of radioactive atoms
be something similar, but on a microcosmic scale? After all,
why are there any statistical regularities at all? Are they traces
of an underlying determinism? Where else can statistical regularities
come from? Unfortunately, nobody has yet made this
seductive idea work, though it's similar in spirit to the fashionable
theory of superstrings, in which a subatomic particle
is a kind of hyped-up vibrating multidimensional loop. The
main similarity is that both the vibrating loop and
the vibrating drop introduce new "internal variables" into the
physical picture. A significant difference is the way these two
approaches handle quantum indeterminacy. Superstring theory,
like conventional quantum mechanics, sees this indeterminacy
as being genuinely random. In a system like the drop,
however, the apparent indeterminacy is actually generated by
a deterministic, but chaotic, dynamic. The trick, if only we
knew how to do it, would be to invent some kind of structure
that retains the successful features of superstring theory,
while making some of the internal variables behave chaotically.
It would be an appealing way to render the Deity's dice
deterministic, and keep the shade of Einstein happy.
Do Dice Play God? : Nature's Numbers, Chapter 8
-------
This chapter is about whether we can predict the future or whether it is arbitrary and random. In ancient times, the world must have seemed pretty arbitrary. Disasters such as floods and diseases must have seemed to happen without warning or apparent reason. People must have noticed certain regularities in the behaviour of nature.
Darlene L. Hagos
Albert Einstein believed that God does not play dice with the Universe; that the world in which we live is governed by precise laws rather than chance. The branch of mathematics popularly known as Chaos Theory clarifies the question through a new paradox: precise laws may offer the appearance of randomness.
-LIAH VERTERA
According to modern physics, nature is ruled by chance on its smallest scales of space and time. For instance, whether a radioactive atom (of uranium, say) does or does not decay at any given instant is purely a matter of chance.
The discovery of chaos has revealed a fundamental misunderstanding in our views of the relation between rules and the behavior they produce, between cause and effect.
Every time I finish a single chapter, I'm able to learn something new. In this chapter I learned that a simple thing can be the cause of a complex one. Another is that the world in which we live is governed by precise laws rather than chance. Chaos is an example of the simple causes behind the assumptions we create about the world in our minds. It tells us that we don't really have a deep understanding of the universe, and some things that we thought were just random may actually be the effects of simple laws.
-Garcia
LAERALLIV ROQUE/ BSA-11M2
This chapter's main topic is the uncertainty of nature. I, for one, believe that most things happen by chance, though some would work if we want them to work as they should. A perfect example of this is real estate. Property specialists would know where they can find many potential clients; however, it is a job of chance, because the certainty of having these clients depends upon their allocated span of time. No matter how good he/she is, if the buyer went to his/her co-worker, it is basically useless. Life is all about the choices we make; however, we don't always get the chance. That's life. Nothing is certain.
We would like to believe we can know things for certain. We want to be able to figure out who will win an election, or whether the stock market will crash. Thus it seems Einstein was doubly wrong when he said God does not play dice.
Pagkalinawan Mario
In this chapter, I learned that many theories created chaos within the world of mathematics and science. There are a lot of theories circulating around the world that were also being questioned as time goes by. Quantum mechanics and classical mechanics were also discussed. I was kind of confused by what I read, so I searched it for myself, and based on the websites that I visited, quantum mechanics is the science dealing with the behavior of matter and light on the atomic and subatomic scale, while classical mechanics is the study of the motion of bodies. In conclusion, many theories were disregarded, which leads to the creation of another theory, and that is one reason why Einstein refused to believe that dice play God.
- Valdez, Nica Mae C. // BSA11M2
One of my favorite chapters of the book! It starts by introducing the concept of phase space, which is nothing but a solution space. In my understanding, it is the imaginary mathematical space that represents all possible motions, and where you plot the behavior in order to create the phase portrait. The main ideas of this chapter are the half-life period and chaos theory.
This chapter tackles phase space, how each axis corresponds to one of the coordinates, and its importance for physical systems. Phase space is nothing but a solution space, where you plot behavior in order to create a phase portrait. On the other hand, it also tackles chaos theory, a branch of mathematics that focuses on the behavior of dynamical systems.
Thankfully there is hope. Award-winning mathematician Ian Stewart reveals that, over the course of history, mathematics has given us some of the tools we need to better manage the uncertainty that pervades our lives. From forecasting, to medical research, to figuring out how to win Let's Make a Deal, Do Dice Play God? And now I understand it is a surprising and satisfying tour of what we can know, and what we never will.
-Lothrell Sarmiento
Based on my understanding of this chapter (eight), the intellectual legacy of Isaac Newton was a vision of the clockwork universe, set in motion at the instant of creation but thereafter running in prescribed grooves, like a well-oiled machine. As the great mathematical astronomer Pierre-Simon de Laplace eloquently put it in 1812 in his Analytic Theory of Probabilities, an intellect could be vast enough to submit its data to analysis. Another point: there is no physical difference between a uranium atom that is about to decay and one that is not, absolutely none. Quantum mechanics and classical mechanics are the two contexts discussed in this chapter.
Mathematics is flexible, meaning everything is possible, everything has chances, and everything can be assumed; somehow there will always be effects and causes, and problems can be solved with the help of theories related to mathematics. But at the end of the day, there will always be a chaos that will help us to learn something.
In my understanding of Chapter 7, The Rhythm of Life, the events that happen every day run on default cycles, which is why I wondered why there is climate change. But after reading this chapter I learned that some events are unpredictable, which is why some philosophers took up determinism to predict the next action or event that could possibly happen. Thank you.
In 1978, a bunch of iconoclastic young graduate students at the University of California at Santa Cruz formed the Dynamical Systems Collective. When they began thinking about this water-drop system, they realized that it's not as random as it appears to be. They recorded the dripping noises with a microphone and analyzed the sequence of intervals between each drop and the next. What they found was short-term predictability. If I tell you the timing of three successive drops, then you can predict when the next drop will fall. The geometrization of dynamics began about a hundred years ago, when the French mathematician Henri Poincare, a maverick if ever there was one, but one so brilliant that his views became orthodoxies almost overnight, invented the concept of a phase space. This is an imaginary mathematical space that represents all possible motions of a given dynamical system. To pick a nonmechanical example, consider the population dynamics of a predator-prey ecological system.
- Sherwin Oliva
Chapter 7
This chapter is entitled "The Rhythm of Life" and talks about oscillators: a unit whose natural dynamic causes it to repeat the same cycle of behavior over and over again. We follow a rhythm in life just as we follow the beat when dancing. Examples of these are the movements of horses and production machines. We people have these oscillations, like our daily routines, and we have attitudes that we keep on repeating, whether good or bad. But this chapter teaches us that math shows us many aspects of nature that we don't think of as mathematical, like our movements; there are rhythms that we follow.
Chapter 8
The chapter is entitled "Do Dice Play God?", which the book interprets as whether everything in our life rides on chance, like the chance of getting your preferred number from a die. It states that there is a branch of nonlinear dynamics called chaos theory which makes everything unpredictable, because we cannot detect chaos unless we conduct experiments. So it is with our life: sometimes it is predictable, but most of the time we get surprised by the outcome we get.
Chapter 9
In this chapter it talked about simplicity, but simplicity becomes complex when we look a little deeper at the situation. The author gave an example, "A fox chases the rabbit": it does naturally chase the rabbit, which makes it simple, but if we look deeper at the process needed to perform that act, from the fox's recognition that the rabbit is a rabbit to readying its legs to run after it, it is complex. I can relate it to our life: we may see it as simple, but it tends to be complicated. Like math: when the teacher is discussing the topic it looks easy, but when you try to solve it yourself it turns out to be difficult. Math teaches us about life; there are things that we look at as simple, but when we look carefully there is a deeper message hidden within them.
19th-century scientists thought that, if you knew the starting conditions and the rules governing any system, you could completely predict the outcomes. In the 1970s and '80s it became increasingly clear that this was wrong. It is impossible because you can never define the starting conditions with complete certainty. Thus all real-world behaviours are subject to 'sensitivity to initial conditions'. From minuscule divergences at the start point, cataclysmic differences may eventually emerge.
-Alexis P Baider III
Christine Joyce F. Magote
This book is full of information; if you really read it, you will gain a lot of knowledge. In this chapter he explains the concept of phase space developed by Henri Poincaré: an imaginary mathematical space that represents all possible motions in a given dynamic system. The phase space is the 3-D place in which you plot the behaviour in order to create the phase portrait. Instead of having to define a formula and worrying about identifying every number of the behaviour, the general shape can be determined.
In this chapter I learned that, according to modern physics, nature is ruled by chance on its smallest scales of space and time. For instance, whether a radioactive atom (of uranium, say) does or does not decay at any given instant is purely a matter of chance. Also, chaos is overturning our comfortable assumptions about how the world works. It tells us that the universe is far stranger than we think.
Doctor, Juan Miguel M
Bsca 11-m2
This chapter mentions the statement of Einstein: "You believe in a God who plays dice, and I in complete law and order." The very distinction that Einstein was trying to pinpoint is between the randomness of chance and the determinism of law. Deterministic systems can produce behavior that appears random. The statement is not so much about whether God plays dice, but how God plays dice. It is this discovery that gives us chaos, and its implications have yet to make their full impact on our thinking.
ReplyDelete-Tuang
In this chapter the only thing that captured my interest is the discussion of quantum mechanics and classical mechanics. Most of the chapter focuses on classical mechanics, but for a moment let us consider the quantum-mechanical context. It was this view of quantum indeterminacy that prompted Einstein's famous statement, "You believe in a God who plays dice, and I in complete law and order."
Aspacio, Mary Joy C.
Mathematics is flexible, meaning everything is possible, everything has chances, and everything can be assumed; somehow there will always be effects and causes, and problems can be solved with the help of theories related to mathematics.
Rana Maurine Lomarda
BSHM11-M4