THE 17 EQUATIONS THAT CHANGED THE WORLD BY: IAN STEWART
PYTHAGORAS THEOREM
LOGARITHM
CALCULUS
The formula given here is the definition of the derivative in calculus. The
derivative measures the rate at which a quantity is changing. For example,
we can think of velocity, or speed, as being the derivative of position — if you
are walking at 3 miles per hour, then every hour, you have changed your
position by 3 miles. Naturally, much of science is interested in understanding how things change, and the
derivative and the integral — the other foundation of calculus — sit at the heart of how mathematicians
and scientists understand change.
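In standard notation, the derivative of a function f with respect to t is defined as the limit of a difference quotient:

\frac{df}{dt} = \lim_{h \to 0} \frac{f(t+h) - f(t)}{h}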
LAW OF GRAVITY
Newton's law of gravitation describes the force of gravity between two objects, F, in
terms of a universal constant, G, the masses of the two objects, m1 and m2, and the
distance between the objects, r. Newton's law is a remarkable piece of scientific history
— it explains, almost perfectly, why the planets move in the way they do. Also
remarkable is its universal nature — this is not just how gravity works on Earth, or in our
solar system, but anywhere in the universe. Newton's gravity held up very well for two hundred years,
and it was not until Einstein's theory of general relativity that it would be replaced.
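Written out in symbols, the law reads:

F = \frac{G\, m_1 m_2}{r^2}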
Euler's Formula for Polyhedra
Polyhedra are the three-dimensional versions of polygons, such as a cube.
The corners of a polyhedron are called its vertices, the lines connecting the vertices are
its edges, and the polygons covering it are its faces. A cube has 8 vertices, 12 edges,
and 6 faces. If I add the vertices and faces together, and subtract the edges, I get 8 + 6 -
12 = 2. Euler's formula states that, as long as your polyhedron is somewhat well
behaved, if you add the vertices and faces together, and subtract the edges, you will always get 2. This
will be true whether your polyhedron has 4, 8, 12, 20, or any number of faces. Euler's observation was
one of the first examples of what is now called a topological invariant — some number or property shared
by a class of shapes that are similar to each other. The entire class of "well-behaved" polyhedra will have
V + F - E = 2. This observation, along with Euler's solution to the Bridges of Königsberg problem, paved
the way to the development of topology, a branch of math essential to modern physics.
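As a quick illustration of the invariant, here is a minimal sketch in Python that checks V - E + F = 2 for the five Platonic solids; the vertex, edge, and face counts used are standard reference values supplied for this example, not figures taken from the text above:

```python
# Check Euler's formula V - E + F = 2 for the five Platonic solids.
# The (V, E, F) counts below are standard reference values.
solids = {
    "tetrahedron": (4, 6, 4),
    "cube": (8, 12, 6),
    "octahedron": (6, 12, 8),
    "dodecahedron": (20, 30, 12),
    "icosahedron": (12, 30, 20),
}

for name, (v, e, f) in solids.items():
    print(f"{name:12s}: V - E + F = {v} - {e} + {f} = {v - e + f}")
```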
Normal Distribution
The normal probability distribution, whose graph is the familiar bell curve, is
ubiquitous in statistics. The normal curve is used in physics, biology, and the social
sciences to model a wide range of measured quantities. One of the reasons the normal curve shows up
so often is that it describes the behavior of sums and averages of large numbers of independent processes.
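In standard notation, the normal density with mean μ and standard deviation σ is:

f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2 / (2\sigma^2)}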
Wave Equation
Fourier Transform
The Fourier transform is essential to understanding more complex wave structures, like
human speech. Given a complicated, messy wave function like a recording of a person
talking, the Fourier transform allows us to break the messy function into a combination
of a number of simple waves, greatly simplifying analysis. The Fourier transform is at
the heart of modern signal processing and analysis, and data compression.
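One common convention writes the Fourier transform of a function f as:

\hat{f}(\xi) = \int_{-\infty}^{\infty} f(x)\, e^{-2\pi i x \xi}\, dx

Each value of the transform measures how strongly the frequency ξ is present in the original signal.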
Navier-Stokes
Like the wave equation, this is a differential equation. The Navier-Stokes equations describe the
behavior of flowing fluids: water moving through a pipe, milk being mixed into coffee, air flowing over an
airplane wing, or smoke rising from a cigarette. While we have approximate solutions of the Navier-
Stokes equations that allow computers to simulate fluid motion fairly well, it is still an open question
(with a million-dollar prize attached) whether mathematically exact, smooth solutions always exist.
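For an incompressible fluid with constant density ρ, viscosity μ, velocity field v, pressure p, and external body force f, one standard form of the equations is:

\rho\left(\frac{\partial \mathbf{v}}{\partial t} + \mathbf{v}\cdot\nabla\mathbf{v}\right) = -\nabla p + \mu\nabla^2\mathbf{v} + \mathbf{f}, \qquad \nabla\cdot\mathbf{v} = 0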
Maxwell's Equations
This set of four differential equations describes the behavior of, and relationship
between, electric fields (E) and magnetic fields (H). Maxwell's equations are to classical
electromagnetism as Newton's laws of motion and law of universal gravitation are to
classical mechanics — they are the foundation of our explanation of how
electromagnetism works on a day-to-day scale. As we will see, however, modern
physics relies on a quantum mechanical explanation of electromagnetism, and it is now
clear that these elegant equations are just an approximation that works well on human
scales.
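In free space, with no charges or currents present, the four equations can be written (in Gaussian units) as:

\nabla\cdot\mathbf{E} = 0, \quad \nabla\cdot\mathbf{H} = 0, \quad \nabla\times\mathbf{E} = -\frac{1}{c}\frac{\partial\mathbf{H}}{\partial t}, \quad \nabla\times\mathbf{H} = \frac{1}{c}\frac{\partial\mathbf{E}}{\partial t}

where c is the speed of light.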
Second Law of Thermodynamics
This states that, in a closed system, entropy (S) is always steady or increasing.
Thermodynamic entropy is, roughly speaking, a measure of how disordered a system is. A
system that starts out in an ordered, uneven state — say, a hot region next to a cold region
— will always tend to even out, with heat flowing from the hot area to the cold area until
evenly distributed. The second law of thermodynamics is one of the few cases in physics
where the direction of time matters. Most physical processes are reversible; we can run the equations
backwards without messing things up. The second law, however, only runs one way. If we put an
ice cube in a cup of hot coffee, we always see the ice cube melt, and never see the coffee freeze.
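In symbols, for a closed system the change in entropy satisfies:

dS \ge 0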
Relativity
Einstein radically altered the course of physics with his theories of special and general
relativity. The classic equation E = mc² states that mass and energy are equivalent to
each other. Special relativity brought in ideas like the speed of light being a universal
speed limit and the passage of time being different for people moving at different
speeds. General relativity describes gravity as a curving and folding of space and time themselves, and
was the first major change to our understanding of gravity since Newton's law. General relativity is
essential to our understanding of the origins, structure, and ultimate fate of the universe.
Schrödinger Equation
This is the main equation in quantum mechanics. Just as general relativity explains our
universe at its largest scales, this equation governs the behavior of atoms and subatomic
particles. Modern quantum mechanics and general relativity are the two most
successful scientific theories in history — all of the experimental observations we have
made to date are entirely consistent with their predictions. Quantum mechanics is also
necessary for most modern technology — nuclear power, semiconductor-based computers, and lasers are
all built around quantum phenomena.
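In its time-dependent form, with wave function Ψ and Hamiltonian operator Ĥ, the equation reads:

i\hbar\, \frac{\partial \Psi}{\partial t} = \hat{H}\Psi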
Information Theory
The equation given here is for Shannon information entropy. As with the thermodynamic entropy given
above, this is a measure of disorder. In this case, it measures the information content of a message — a
book, a JPEG picture sent on the internet, or anything that can be represented symbolically. The Shannon
entropy of a message sets a lower bound on the average number of bits needed to encode it, and thus a
limit on how much that message can be compressed without losing any of its content. Shannon's entropy
measure launched the mathematical study of information,
and his results are central to how we communicate over networks today.
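For a message whose symbols x occur with probabilities p(x), Shannon's entropy, measured in bits, is:

H = -\sum_{x} p(x) \log_2 p(x)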
Chaos Theory
This equation is May's logistic map. It describes a process evolving through time: x_{t+1}, the level of
some quantity x in the next time period, is given by the formula below and depends on x_t, the level of x
right now. Here k is a chosen constant. For
certain values of k, the map shows chaotic behavior: if we start at some particular
initial value of x, the process will evolve one way, but if we start at another initial
value, even one very close to the first value, the process will evolve a completely
different way. We see chaotic behavior — behavior sensitive to initial conditions — like this in many
areas. Weather is a classic example — a small change in atmospheric conditions on one day can lead to
completely different weather systems a few days later, most commonly captured in the idea of a butterfly
flapping its wings on one continent causing a hurricane on another continent.
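The map itself is x_{t+1} = k\, x_t (1 - x_t). Below is a minimal Python sketch of the sensitivity to initial conditions described above; the value k = 4 and the two starting points are illustrative assumptions chosen for this example, not values taken from the text:

```python
# Iterate May's logistic map x_{t+1} = k * x_t * (1 - x_t) from two nearby
# starting values and watch the trajectories diverge (k = 4 is a chaotic regime).
k = 4.0
x_a, x_b = 0.3, 0.3000001  # two initial conditions differing by one part in ten million

for t in range(51):
    if t % 10 == 0:
        print(f"t={t:2d}  x_a={x_a:.6f}  x_b={x_b:.6f}  |difference|={abs(x_a - x_b):.2e}")
    x_a = k * x_a * (1 - x_a)
    x_b = k * x_b * (1 - x_b)
```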