Astoundingly, I'm enjoying a book about calculus, Infinite Powers by Steven Strogatz. The subtitle is the reason: How Calculus Reveals the Secrets of the Universe.
Hey, I'm all in on having the secrets of the universe revealed to me. Especially when the cost was a mere $16.52 to have Amazon deliver the book to my doorstep.
I started this post with the word "astoundingly" because I was forced to take a calculus class in the first year of my Systems Science Ph.D. studies way back when. (I completed the course work, but then gave up on being called Dr. Hines, as appealing as that prospect was.)
All I remember about the class was that I didn't like it. I passed, but I can't recall a single deep philosophical statement about the universe during that enforced dive into the equations of calculus, introductory variety.
I've only read 25 pages of Infinite Powers, yet already I'm fascinated by calculus -- because Strogatz, a Professor of Applied Mathematics at Cornell, is both an excellent writer and a talented expositor of this difficult subject.
Here's a sampling, in Strogatz's own words, of what's intrigued me in the few pages I've perused so far. (The headlines are mine.)
Calculus is key to modern civilization.
Without calculus, we wouldn't have cell phones, computers, or microwave ovens. We wouldn't have radio. Or television. Or ultrasound for expectant mothers, or GPS for lost travelers. We wouldn't have split the atom, unraveled the human genome, or put astronauts on the moon. We might not even have the Declaration of Independence.
The universe is mathematical.
For reasons nobody understands, the universe is deeply mathematical. Maybe God made it that way. Or maybe it's the only way a universe with us in it could be, because nonmathematical universes can't harbor life intelligent enough for us to ask the question.
In any case, it's a mysterious and marvelous fact that our universe obeys laws of nature that always turn out to be expressible in the language of calculus as sentences called differential equations. Such equations describe the difference between something right now and the same thing an instant later or between something right here and the same thing infinitesimally close by.
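This isn't from the book, but Strogatz's description of a differential equation -- the difference between something right now and the same thing an instant later -- can be sketched in a few lines of code. Here's a minimal Euler-stepping example for the equation dy/dt = -y (my choice of equation, not his), whose exact solution is y(t) = e^(-t):

```python
import math

def euler(y0, rate, dt, steps):
    """Step y forward in time: each instant, y changes by rate(y) * dt."""
    y = y0
    for _ in range(steps):
        y = y + rate(y) * dt  # the value "an instant later"
    return y

# Integrate dy/dt = -y from t = 0 to t = 1 in tiny steps.
approx = euler(1.0, lambda y: -y, dt=0.001, steps=1000)
exact = math.exp(-1.0)
print(approx, exact)  # the two agree to about three decimal places
```

The smaller the instant dt, the closer the stepped answer comes to the continuous one -- which is the "infinitesimally close by" idea in miniature.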
Calculus expresses the universe's operating system.
To put this awesome assertion another way, there seems to be something like a code to the universe, an operating system that animates everything from moment to moment and place to place. Calculus taps into this order and expresses it.
Infinity is what gives calculus its power.
Calculus succeeds by breaking complicated problems down into simpler parts. That strategy, of course, is not unique to calculus. All good problem-solvers know that hard problems become easier when they're split into chunks. The truly radical and distinctive move of calculus is that it takes this divide-and-conquer strategy to its utmost extreme -- all the way out to infinity.
Instead of cutting a big problem into a handful of bite-size pieces, it keeps cutting and cutting relentlessly until the problem has been chopped and pulverized into its tiniest conceivable parts, leaving infinitely many of them. Once that's done, it solves the original problem for all the tiny parts, which is usually a much easier task than solving the initial giant problem.
The remaining challenge at that point is to put all the tiny answers back together again. That tends to be a much harder step, but at least it's not as difficult as the original problem was.
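A classic instance of this chop-and-reassemble strategy (my illustration, not a passage from the book) is Archimedes' approach to the circle: replace the curve with an inscribed regular polygon of many tiny straight sides, measure those easy pieces, and add them back up. As the number of sides grows, the perimeter of a unit circle's polygon closes in on 2*pi:

```python
import math

def inscribed_perimeter(n_sides):
    # Each side of a regular n-gon inscribed in a unit circle subtends
    # an angle 2*pi/n, so each straight side has length 2*sin(pi/n).
    return n_sides * 2 * math.sin(math.pi / n_sides)

for n in (6, 96, 10_000):
    print(n, inscribed_perimeter(n))  # values climb toward 2*pi as n grows
```

Each individual side is trivial to measure; only the limit of infinitely many of them recovers the circle exactly.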
Even the quantum world bows down to calculus.
Admittedly, some aspects of our ever-changing world lie beyond the approximations and wishful thinking inherent in the Infinity Principle. In the subatomic realm, for example, physicists can no longer think of an electron as a classical particle following a smooth path in the same way that a planet or a cannonball does.
According to quantum mechanics, trajectories become jittery, blurry, and poorly defined at the microscopic scale, so we need to describe the behavior of electrons as probability waves instead of Newtonian trajectories. As soon as we do that, however, calculus returns triumphantly. It governs the evolution of probability waves through something called the Schrödinger equation.
It's incredible but true: Even in the subatomic realm where Newtonian physics breaks down, Newtonian calculus still works. In fact, it works spectacularly well.
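For the curious, the equation Strogatz is referring to -- in its general time-dependent form -- is itself a differential equation, exactly the kind of sentence he described earlier:

```latex
% Time-dependent Schrödinger equation: the probability wave \psi
% evolves from one instant to the next under the Hamiltonian \hat{H}.
i\hbar \, \frac{\partial \psi}{\partial t} = \hat{H}\,\psi
```

The left side is the "an instant later" part; the right side says how the wave right now determines that change.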
Completed infinity is the original sin of calculus.
Should we take the plunge and say that a circle truly is a polygon with infinitely many infinitesimal sides? No. We mustn't do that, mustn't yield to that temptation. Doing so would be to commit the sin of completed infinity. It would condemn us to logical hell.
...Like the biblical original sin, the original sin of calculus -- the temptation to treat a circle as an infinite polygon with infinitesimally short sides -- is very hard to resist, and for the same reason. It tempts us with the prospect of forbidden knowledge, with insights unavailable by ordinary means.
The sin of dividing by zero.
All across the world, students are taught that division by zero is forbidden... The root of the problem is infinity. Dividing by zero summons infinity in much the same way that a Ouija board supposedly summons spirits from another realm. It's risky. Don't go there.
...The transgression that dragged us into this mess was pretending that we could actually reach the limit, that we could treat infinity like an attainable number.
...In the context of chopping a line into pieces, potential infinity would mean that the line could be cut into more and more pieces, as many as desired but still always a finite number and all of nonzero length. That's perfectly permissible and leads to no logical difficulties.
What's verboten is to imagine going all the way to a completed infinity of pieces of zero length.
That, Aristotle felt, would lead to nonsense -- as it does here, in revealing that zero times infinity can give any answer. And so he forbade the use of completed infinity in mathematics and philosophy. His edict was upheld by mathematicians for the next twenty-two hundred years.
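Aristotle's distinction can be demonstrated directly (my sketch, not the book's). Potential infinity is harmless: cut a unit line into any finite number of pieces, each of nonzero length, and they always reassemble to exactly 1. The trouble only starts at the completed limit, where zero-length pieces times infinitely many of them pins down no answer at all:

```python
import math
from fractions import Fraction

# Potential infinity: as many pieces as desired, all finite, all nonzero.
for n in (10, 1_000, 1_000_000):
    piece = Fraction(1, n)   # each piece has nonzero length 1/n
    assert n * piece == 1    # reassembly works for every finite n

# The forbidden completed infinity: pieces of length exactly zero,
# "infinitely many" of them. IEEE arithmetic flags the product as
# nan ("not a number") -- zero times infinity gives no definite answer.
print(0 * math.inf)  # nan
```

Exact fractions make the point cleanly: the finite chopping never loses anything, no matter how far you push it.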
Infinity is a dangerous temptress.
Somewhere in the dark recesses of prehistory, somebody realized that numbers never end. And with that thought, infinity was born. It's the numerical equivalent of something deep in our psyches, in our nightmares of bottomless pits, and in our hopes for eternal life.
Infinity lies at the heart of so many of our dreams and fears and unanswerable questions: How big is the universe? How long is forever? How powerful is God? In every branch of human thought, from religion and philosophy to science and mathematics, infinity has befuddled the world's finest minds for thousands of years.
Discrete and continuous become one.
Consider how an old-fashioned analog clock differs from a modern-day digital/mechanical monstrosity. On the analog clock, the second hand sweeps around in a beautifully uniform motion. It depicts time as flowing. Whereas on the digital clock, the second hand jerks forward in discrete steps, thwack, thwack, thwack. It depicts time as jumping.
Infinity can build a bridge between these two very different conceptions of time. Imagine a digital clock that advances through trillions of little clicks per second instead of one loud thwack. We would no longer be able to tell the difference between that kind of digital clock and a true analog clock.
Likewise with movies and videos: as long as the frames flash by fast enough, say at thirty frames a second, they give the impression of seamless flow. And if there were infinitely many frames per second, the flow truly would be seamless.
...So in everyday life, the gulf between the discrete and the continuous can often be bridged, at least to a good approximation. For many practical purposes, the discrete can stand in for the continuous, as long as we slice things thinly enough. In the ideal world of calculus, we can go one better.
Anything that's continuous can be sliced exactly (not just approximately) into infinitely many infinitesimal pieces. That's the Infinity Principle. With limits and infinity, the discrete and the continuous become one.
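Strogatz's digital-clock bridge can be watched happening (my example, not his). Track a falling object by discrete ticks -- add up a small jump of distance at each tick -- and as the number of ticks per second grows, the jumpy tally converges on the smooth, continuous answer from calculus, g*t^2/2:

```python
g, t_final = 9.8, 1.0
exact = 0.5 * g * t_final**2   # the continuous ("analog") answer: 4.9 m

for ticks in (30, 1_000, 1_000_000):   # clicks of the digital clock per second
    dt = t_final / ticks
    fallen = 0.0
    for k in range(ticks):
        v = g * (k * dt)       # speed at the start of this tick
        fallen += v * dt       # jump forward one discrete step
    print(ticks, fallen)       # creeps toward 4.9 as the ticks get finer
```

At thirty ticks per second the answer is noticeably short; at a million it is indistinguishable from the analog one for any practical purpose -- the discrete standing in for the continuous, just as the passage says.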
Get real. Accept limits. Embrace continuous.
From the beginning, calculus has stubbornly insisted that everything -- space and time, matter and energy, all objects that ever have been or will be -- should be regarded as continuous. Accordingly, everything can and should be quantified by real numbers.
In this idealized, imaginary world, we pretend that everything can be split finer and finer without end. The whole theory of calculus is built on that assumption. Without it, we couldn't compute limits, and without limits, calculus would come to a clanking halt.
If all we ever used were decimals with only sixty digits of precision, the number line would be pockmarked and cratered. There would be holes where pi, the square root of two, and any other numbers that need infinitely many digits after the decimal point should exist.
Even a simple fraction such as 1/3 would be missing, because it too requires an infinite number of digits (0.333...) to pinpoint its location on the number line. If we want to think of the totality of all numbers as forming a continuous line, those numbers have to be real numbers.
They may be an approximation to reality, but they work amazingly well. Reality is too hard to model any other way. With infinite decimals, as with the rest of calculus, infinity makes everything simpler.
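You can poke at one of those sixty-digit holes yourself (my demonstration, not the book's). Python's decimal module lets you cap precision at exactly sixty digits, and sure enough, 1/3 falls through: multiplying the truncated value back by 3 misses 1.

```python
from decimal import Decimal, getcontext

getcontext().prec = 60                 # sixty digits of precision, no more
third = Decimal(1) / Decimal(3)        # 0.333...3, cut off after 60 digits
print(third * 3)                       # 0.999...9, not quite 1
print(third * 3 == 1)                  # False: 1/3 fell through a hole
```

Only with infinitely many digits -- that is, with the real numbers -- does the hole close and the line become truly continuous.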