There are three really important things about the Fourier transform in my mind. Two are math and one is engineering.
- the Fourier Transform preserves energy (Parseval's theorem: the norm of the transformed function equals the norm of the original)
- there exists an inverse transform to get the original function back
- once you grasp that magnitude/phase describe patterns in the function you can gain powerful intuition about the transform and how to use it as an analytical and design tool.
Those first two properties tell us that the transform preserves information: it's another way of looking at the same thing, gaining more insight without loss. The third is something not harped on enough in engineering courses, and failure to teach it is, in my mind, one reason so many people think controls/signal processing is black magic.
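Those first two properties are easy to check numerically. A minimal sketch using NumPy's FFT (the discrete analogue; the random test signal is arbitrary, and NumPy's unnormalized convention puts a 1/N factor into Parseval's relation):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)            # arbitrary real "signal"
X = np.fft.fft(x)

# Parseval's theorem: energy is preserved (the 1/N comes from NumPy's
# unnormalized FFT convention)
energy_time = np.sum(np.abs(x) ** 2)
energy_freq = np.sum(np.abs(X) ** 2) / len(x)
assert np.isclose(energy_time, energy_freq)

# Invertibility: the inverse transform recovers the original signal
assert np.allclose(x, np.fft.ifft(X).real)
```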
A big followup question here is: are there other transforms that preserve energy and have an inverse? The answer is yes, infinitely many. It's the third property that makes a transform genuinely useful, which raises the question: which of those other transforms are useful?
An example of this is a cousin of the Fourier Transform called the Discrete Cosine Transform, which is critical in compression, classification, and machine learning (especially contemporary speech recognition). It's not as straightforward as Fourier, since the result isn't as obvious as breaking the energy down into patterns. What it does instead is preserve the energy while representing it in decorrelated bins. The strongest of those bins carry the most important parts, which is why compression technology works by taking the DCT and tossing out low-magnitude components (it preserves the most important energy). It also shows how the DCT can work for machine learning: it decomposes the input into an equivalent but decorrelated representation, so inputs aren't shared across different parts of something like a neural net.
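A sketch of the compression idea with a made-up toy signal, building the orthonormal DCT-II from scratch in NumPy so the energy bookkeeping is explicit:

```python
import numpy as np

def dct_matrix(n):
    # Rows are the orthonormal DCT-II basis vectors
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    C = np.cos(np.pi * k * (2 * m + 1) / (2 * n))
    C[0] *= np.sqrt(1.0 / n)
    C[1:] *= np.sqrt(2.0 / n)
    return C

n = 64
C = dct_matrix(n)
assert np.allclose(C @ C.T, np.eye(n))       # orthonormal => energy preserved

# A smooth toy signal: its DCT energy piles up in a few low bins
t = np.arange(n)
x = np.sin(2 * np.pi * t / n) + 0.5 * np.cos(4 * np.pi * t / n)
coeffs = C @ x

# "Compress" by keeping only the 8 largest-magnitude coefficients
kept = np.zeros_like(coeffs)
top = np.argsort(np.abs(coeffs))[-8:]
kept[top] = coeffs[top]
x_approx = C.T @ kept                        # inverse = transpose

# The discarded bins held little energy, so the reconstruction is close
assert np.linalg.norm(x - x_approx) < 0.1 * np.linalg.norm(x)
```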
There are other equally cool orthogonal transforms. I like the Hilbert transform myself because it can extract really useful info like signal envelopes, and it can be used to make wacky noises, like a frequency shifter.
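A sketch of the envelope trick: the Hilbert transform gives you the analytic signal x + jH{x}, whose magnitude is the envelope. It's built here directly in the frequency domain with NumPy (zero the negative frequencies, double the positive ones); the AM test signal and its parameters are made up for illustration:

```python
import numpy as np

def analytic_signal(x):
    # Analytic signal via the FFT: keep DC (and Nyquist), double the
    # positive frequencies, zero the negative ones
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

# AM test signal: 50 Hz carrier modulated by a 2 Hz envelope
fs = 1000
t = np.arange(fs) / fs
envelope = 1.0 + 0.5 * np.sin(2 * np.pi * 2 * t)
x = envelope * np.cos(2 * np.pi * 50 * t)

est = np.abs(analytic_signal(x))             # |analytic signal| = envelope
assert np.max(np.abs(est - envelope)) < 1e-6

# The frequency shifter mentioned above: multiply the analytic signal by a
# complex exponential (here shifting everything up 30 Hz), take the real part
shifted = np.real(analytic_signal(x) * np.exp(2j * np.pi * 30 * t))
```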
Don’t forget that Heisenberg’s uncertainty principle can be derived as a property (theorem?) of the Fourier transform [1]. I found the Laplace transform intriguing as well [2].
Parent is right. The Laplace transform is a generalization of the Fourier transform. There are two types of Laplace transforms, the two-sided and the one-sided. The two-sided integrates over all of R.
The s variable is a complex frequency of the form s = σ + jω. Setting σ = 0 yields the Fourier transform. The one-sided Laplace transform is equivalent to the two-sided one if the signal is causal: a causal signal f(t) is zero for all t < 0. Signals in the real world are causal. This is convenient, because the exponential e^(-st) = e^(-(σ + jω)t) blows up as t → -∞ for σ > 0, which can render the two-sided integral non-convergent. We avoid that by making sure the signal is causal.
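A quick numerical sanity check of both claims, using the textbook pair f(t) = e^(-at)u(t) ↔ 1/(s + a); the values of a and s here are arbitrary:

```python
import numpy as np

a = 2.0
t = np.linspace(0.0, 40.0, 400_001)      # causal: integrate over t >= 0 only
dt = t[1] - t[0]
f = np.exp(-a * t)

def laplace(sigma, w):
    # One-sided Laplace transform at s = sigma + j*w, by trapezoidal
    # integration of f(t) * e^(-st)
    g = f * np.exp(-(sigma + 1j * w) * t)
    return (g[:-1] + g[1:]).sum() * dt / 2

# Matches the closed form 1/(s + a)
s = 0.5 + 3.0j
assert abs(laplace(0.5, 3.0) - 1.0 / (s + a)) < 1e-4

# Setting sigma = 0 reduces it to the Fourier transform of f
assert abs(laplace(0.0, 3.0) - 1.0 / (3.0j + a)) < 1e-4
```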
Good summary. I would add the bandwidth inequality, which is akin to the Uncertainty Principle in physics: it shows a function cannot be both time- and frequency-bounded.
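A numerical illustration of that tradeoff, using Gaussian pulses (which achieve the minimum duration-bandwidth product); the widths and sample counts are arbitrary choices for this sketch:

```python
import numpy as np

def rms_width(x, axis_vals):
    # RMS width of |x|^2 treated as a probability distribution over axis_vals
    p = np.abs(x) ** 2
    p = p / p.sum()
    mean = np.sum(axis_vals * p)
    return np.sqrt(np.sum((axis_vals - mean) ** 2 * p))

n, dt = 4096, 0.001
t = (np.arange(n) - n // 2) * dt
f = np.fft.fftshift(np.fft.fftfreq(n, dt))

products = []
for sigma in (0.02, 0.05, 0.1):
    x = np.exp(-t**2 / (2 * sigma**2))       # Gaussian pulse of width sigma
    X = np.fft.fftshift(np.fft.fft(x))       # only |X| matters below
    products.append(rms_width(x, t) * rms_width(X, f))

# Halving the time width doubles the bandwidth: the product stays constant,
# at the Gaussian minimum of 1/(4*pi)
assert np.allclose(products, products[0], rtol=1e-2)
assert np.isclose(products[0], 1 / (4 * np.pi), rtol=1e-2)
```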
"The Scientist and Engineer's Guide to Digital Signal Processing" is a bit too verbose and hand wavy for my liking; looking for something more succinct and rigorous.
Two texts I keep on my desk that are a bit more rigorous than most:
- Oppenheim & Schafer Discrete Time Signal Processing (the Bible of DSP)
- Manolakis & Ingle, Applied Digital Signal Processing (good discussion of orthogonal transforms).
A lot of what I know about transform analysis comes from self study of linear algebra/vector spaces with some reading here and there in commonly cited papers. Might want to pick up a text on that subject, it's the same idea but more rigor than an engineer would use.
There's also a book I haven't worked all the way through yet. It's dated, heavily based in EE concepts (a non-negligible amount of circuit theory), and extremely rigorous: Theory of Linear Physical Systems by Ernst Guillemin. I picked it up last week actually and quite like it.
It has a lot of information and approaches with Fourier/Laplace methods, which is interesting since it predates the FFT and has so much information on concepts that engineers 50 years ago would need to build their intuition with instead of through tooling. I picked it up for the network theory/dynamical systems angle (which relates to some stuff I'm working on) but the rigor is definitely higher than what you'd see in those more digestible books.
Practical Signal Processing by Owen might be a lead worth looking into. It was recommended by Ossmann in his HackRF tutorial videos (he created the HackRF).
It's gonna fall near the extreme end of succinctness.
I had since purchased the book but haven't gone more than a couple chapters deep. It definitely looks like it allows you to dive into the meat of DSP. Don't know if it will fit the bill for rigorous.
I also didn't pay anything near what it's currently listing for on Amazon.
You could try Arfken and Weber's "Mathematical Methods for Physicists". It's a common reference for physicists. While it doesn't focus on engineering (or signal) applications, it's both fairly rigorous and succinct enough for practical use. Looks like there are PDFs online.