# Fourier Transform: Nature’s Way of Analyzing Data

Described as “nature’s way of analyzing data” by Yale professor Ronald Coifman, the Fourier transform is arguably the most powerful analytical tool in modern mathematics. Professor Peter Moore, a Yale structural biologist and professor of biophysics, agrees. “To form an image on your retina, the lens in your eye performs Fourier transformations on the light that enters it,” he explains. The tool is truly ubiquitous in nature: our eyes and ears have, in effect, performed Fourier transforms to interpret sound and light waves for millions of years. Hence it was only a matter of time until the human intellect caught up to our internal processing systems and described this process mathematically. After years of research, the French mathematician Baron Jean-Baptiste-Joseph Fourier uncovered this powerful tool in the early 1800s, and it now bears his name: the Fourier transform.

Fourier, a French military scientist, became interested in heat transfer in the late 1790s; in fact, many of his guests complained that he kept his home uncomfortably warm. During Napoleon’s Egyptian campaign, Fourier served in the scientific body of the Institute of Egypt, and after the French left Egypt, he turned his efforts to rebuilding France after the devastation of the 1789 Revolution. During this time, his obsession with heat transfer drove him to derive an equation describing the conduction of heat in solid bodies, and within seven years he had invented the Fourier transform to solve it.

The question itself was complicated: Fourier wanted to solve his equation to describe the flow of heat around an iron ring of the kind that attaches a ship’s anchor to its chain. He proposed that an irregular distribution of temperature around the ring could be described as the combination of many component sinusoidal waves. His premise was that the amplitude (maximum temperature) and position (phase) of each of these sinusoidal components could be derived via the Fourier transform of the original, irregular distribution of temperatures.

With today’s conceptions of mathematics and physics, these claims seem natural. Professor Coifman explains, “The time vibrations of any mechanical system is a combination of sines and cosines.” During the early 1800s, however, Fourier’s claims were radical. He proposed that discontinuous functions, such as temperature distributions, could be described by combining many continuous functions: an infinite number of sinusoids, he argued, could represent any function, including one with multiple jumps or discontinuities. Not surprisingly, these claims were met with heavy scrutiny. In fact, during one of Fourier’s research presentations, the contemporary French mathematician Joseph-Louis Lagrange reportedly exclaimed that his ideas were “nothing short of impossible.”
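Fourier’s radical claim can be demonstrated with the textbook example of a square wave, a function that jumps abruptly between −1 and +1: its Fourier series uses only smooth sines, yet the partial sums approach the jumpy target. (A minimal sketch in Python; the square wave is a standard illustration, not an example discussed by Fourier or his critics.)

```python
import math

def square_wave_partial_sum(x, n_terms):
    """Approximate a square wave (jumping between -1 and +1) by
    summing the first n_terms odd harmonics of its Fourier series:
    f(x) = (4/pi) * sum over odd k of sin(k*x)/k."""
    return (4 / math.pi) * sum(
        math.sin(k * x) / k for k in range(1, 2 * n_terms, 2)
    )

# Away from the jumps, adding more smooth sine terms pulls the sum
# ever closer to the discontinuous square wave's value of +1 here.
x = math.pi / 2  # a point where the square wave equals +1
approximations = [square_wave_partial_sum(x, n) for n in (1, 10, 100)]
```

With one term the approximation overshoots noticeably; with a hundred terms it sits within a fraction of a percent of the true value, which is exactly the behavior Lagrange doubted was possible.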

Although Lagrange himself made a variety of mathematical contributions that greatly aided modern studies of astronomy and economics, the ubiquitous power of the Fourier transform in the modern mathematical world indicates that his doubts were misguided. Today, the Fourier transform can be applied in two different ways. First, it can be used to describe continuous functions – functions providing values for every real number. In these cases, the original function is deconstructed into component sinusoidal functions at every frequency, and these are combined by the Fourier integral operation. A second type of function to which the Fourier transform can be applied is one consisting of numerous discrete values, a common form of data obtained from scientific experimentation. In these cases, a Fourier series is calculated as the sum of sine and cosine functions at discrete frequencies.

Even though Fourier derived this method of analysis in the early 1800s, its general applicability to solving scientific problems in an efficient manner was greatly limited until the advent of modern electronic computation. Before the 1960s, Fourier transforms of functions based on experimental data, or not found in reference tables, necessitated an intimidating and tedious amount of arithmetic: for every n data points (usually well over 1,000 in many studies), roughly n² multiplications, plus a comparable number of additions, were needed to perform the transform. As stated by Professor Moore, “the sheer computational difficulties” and effort involved in performing a Fourier transform on paper restricted its widespread usage in the sciences.
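The scale of that arithmetic is easy to see in a direct implementation of the discrete transform (a sketch in Python, not period-accurate hand arithmetic): the inner sum performs n multiplications for each of the n output frequencies, hence roughly n² in total.

```python
import cmath
import math

def naive_dft(samples):
    """Direct discrete Fourier transform: for each of the n output
    frequencies, sum n sample-times-sinusoid products, so the
    multiplication count grows as n * n = n^2."""
    n = len(samples)
    return [
        sum(samples[t] * cmath.exp(-2j * cmath.pi * f * t / n)
            for t in range(n))
        for f in range(n)
    ]

# Eight samples of a sinusoid completing exactly one cycle: the
# transform concentrates the energy at frequency index 1.
samples = [math.cos(2 * math.pi * t / 8) for t in range(8)]
spectrum = naive_dft(samples)
```

For 8 samples this is a trivial 64 multiplications; for the thousands of points in a typical experimental record, the quadratic growth is what made hand computation so forbidding.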

However, in 1965, James Cooley of IBM’s Watson Research Center and Princeton statistician John Tukey published the so-called “Fast Fourier Transform” (FFT) algorithm. This computational method was an integral discovery that greatly expanded the potential use of Fourier methods to solve scientific problems, and it was arguably anticipated by the work of the German mathematician Carl Friedrich Gauss in the early 1800s. The FFT drastically reduces the number of multiplication steps needed to analyze curves. For example, if a curve consisted of 8 samples, then 8², or 64, multiplication steps would be needed for analysis via traditional Fourier methods. The FFT instead breaks this curve into four irreducible sets of 2 sample points each. These are then recombined into two four-point transforms, and finally into the 8-point transform of interest. The initial two-point transforms require no true multiplications, and since each of the two recombination stages requires 8 multiplications, the total number of steps is 16, just one-fourth of the original 64; the savings grow still more dramatic as the number of samples increases. Therefore, the advent of the Fast Fourier Transform and its counterparts, such as the related Hartley transform, has allowed much more widespread application to scientific fields dealing with fluctuations or wave-like phenomena. Professor Moore agrees: “One of the reasons that the Fourier transform has become so pervasive today is because the computation has become routine.”
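The split-and-recombine scheme above can be sketched as a short recursive routine (a minimal Python illustration of the radix-2 idea, not Cooley and Tukey’s original implementation):

```python
import cmath

def fft(samples):
    """Radix-2 Cooley-Tukey FFT (length must be a power of two).
    Each level splits the problem in half: transform the even- and
    odd-indexed samples separately, then recombine them with one
    'twiddle factor' multiplication per pair of outputs."""
    n = len(samples)
    if n == 1:
        return list(samples)
    evens = fft(samples[0::2])
    odds = fft(samples[1::2])
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odds[k]
        out[k] = evens[k] + twiddle
        out[k + n // 2] = evens[k] - twiddle
    return out

# An 8-point transform is built from two 4-point transforms, which are
# in turn built from four 2-point transforms, as described above.
spectrum = fft([1, 2, 3, 4, 5, 6, 7, 8])
```

The recursion halves the problem at every level, which is precisely why the multiplication count grows far more slowly than the n² of the direct method.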

Peter Moore has been using Fourier transform methods to solve biological problems since his days as a graduate student, explaining that Fourier transforms are nothing short of “pervasive in the physical sciences.” He primarily deals with their applications to crystallography and nuclear magnetic resonance (NMR) spectroscopy, noting that Fourier transforms are essentially “built into the physics of many scientific phenomena.” Further applications lie in the realm of imaging and spectroscopy. Modern NMR methods collect data in the form of electrical signals as a function of time but display them as a function of frequency; a Fourier transform is used to pass from the time domain to the frequency domain. “I use them all the time,” Moore notes. “I can’t even remember a time when I wasn’t using them.” Professor Coifman, whose research deals with inventing new complex transformations, attests to the ubiquity of the method: “Nowadays, there isn’t a single electronic instrument that doesn’t use a Fourier transform.” Indeed, the Fourier transform today is vital to audio and video compression; without it, MP3 players and digital video cameras would not exist!
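The time-to-frequency step can be illustrated with a toy signal (a Python sketch; the two frequencies and the decay rate are invented for illustration, not data from an actual spectrometer):

```python
import cmath
import math

# A toy decaying signal recorded in time, loosely in the spirit of the
# electrical signal an NMR instrument measures: two fading cosines at
# 5 and 12 cycles per record (both values chosen arbitrarily).
n = 64
signal = [
    math.exp(-t / 32) * (math.cos(2 * math.pi * 5 * t / n)
                         + math.cos(2 * math.pi * 12 * t / n))
    for t in range(n)
]

# The discrete Fourier transform carries the data from the time domain
# to the frequency domain, where the hidden component frequencies
# stand out as peaks in the magnitude spectrum.
spectrum = [
    abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * f * t / n)
            for t in range(n)))
    for f in range(n // 2)
]
peaks = sorted(range(len(spectrum)), key=spectrum.__getitem__,
               reverse=True)[:2]
```

In the time domain the two oscillations are tangled together; in the frequency domain they appear as two clean peaks, which is exactly the display a spectroscopist reads.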

“The image formation carried out by any focusing lens is most accurately described by the Fourier transform,” concludes Professor Moore. From this, the true ubiquity of the Fourier transform reveals itself: not only does it possess fundamental applications in modern electronics, biology, chemistry, and medicine, it is rooted in the stimulus-processing mechanisms we relied upon for millions of years before the work of Jean-Baptiste-Joseph Fourier. Professor Coifman’s concluding words are therefore undeniable: the Fourier transform is “one of the most fundamental mathematical tools in today’s world.”

Bracewell, Ronald. The Fourier Transform and Its Applications. New York: McGraw-Hill, 1999.

Bracewell, Ronald. “The Fourier Transform.” Scientific American, June 1989: 62–69.

Brigham, E. Oran. The Fast Fourier Transform and Its Applications. New York: Prentice-Hall, 1988.