Posted by rbanffy 9/4/2025

What Is the Fourier Transform? (www.quantamagazine.org)
474 points | 205 comments
anyfoo 9/4/2025|
If you like Fourier, you're going to love Laplace (or its discrete counterpart, the z transform).

This took me down a very fascinating and intricate rabbit hole years ago, and is still one of my favorite hobbies. Application of Fourier, Laplace, and z transforms is (famously) useful in an incredibly wide variety of fields. I mostly use it for signal processing and analog electronics.

segfault99 9/5/2025||
When I did EE, didn't have access to any kind of computer algebra system. Have 'fond' memories of taking Laplace transform transfer functions and converting to z-transform form. Expand and then re-group and factor. Used a lot of pencil, eraser and line printer fanfold paper for doing the very basic but very tedious algebra. Youngsters today don't know how lucky.. (ties onion to belt, etc., etc.)
taneq 9/5/2025|||
Did you make sproingies from the tear-off side strips of the printer paper, though? That was the best bit. :P
segfault99 9/5/2025||
Of course!
echelon 7 days ago||
This continued with kids into the 90's. I miss that bit.

https://www.reddit.com/r/nostalgia/comments/b6dptv/folding_t...

mkipper 7 days ago||||
Was this professionally or in school? I still did this in an EE program 15 years ago and I can't imagine things have changed since then. I think kids still have to do lots of ugly math in EE classes.
segfault99 7 days ago||
Undergrad. Mid-late 1980s.

I wasn't making a point about mathematics qua mathematics. Was thinking that if I were doing EE undergrad today, I'd use SageMath or Mathematica to crunch the mechanical algebraic manipulations involved in doing a z-transform.

schlauerfox 7 days ago||||
I just recently got my Computer Engineering degree, which is the modern equivalent of Electronics Engineering, and we had a whole class on transforms. We had to do it on paper, but that professor at Cal State LA knew what the heck she was doing. We learned it good.
zwnow 9/5/2025|||
No worries, as a self-proclaimed youngster I didn't manage to understand Fourier in 2 days and never bothered again. Also had no prior knowledge of algebra, so maybe that's why I struggled. Never perceived algebra as useful in anything programming related, and I'll keep seeing it that way, as most problems are solvable without it. I'll let the degree havers do all that stuff.
Sharlin 9/5/2025|||
> Never perceived algebra as useful in anything programming related

Image, video, and audio processing and synthesis, compression algorithms, 2D and 3D graphics and geometry, physics simulation, not to mention the entire current AI boom that's nothing but linear algebra… yeah, definitely algebra isn't useful in anything programming related.

zwnow 9/5/2025||
Yea that's not what I work with my guy
Sharlin 9/5/2025||
You said "anything programming related", not "anything related to my work", "my guy".
hnuser123456 7 days ago||||
So you're a programmer but you've never assigned a number to a variable or written any math operations? Do you just do string translations or something?
segfault99 7 days ago|||
Plot twist: He's a Haskell guru juggling hylomorphisms blindfolded.
zwnow 7 days ago|||
I'm talking about algebra you need a degree for. Well, algebra you learn while getting one, that is.
perching_aix 9/5/2025||||
You might find LLMs to be a useful crutch for this to an extent, although it's very easy to take the wrong turn and go off into the deep end. But as long as you keep forcefully connecting it back to practical reality, you can get progress out of it. And of course, never actually make it calculate.
dsego 9/5/2025||||
Have I got a video for you.

gingerBill – Tools of the Trade – BSC 2025 https://youtu.be/YNtoDGS4uak

IAmBroom 7 days ago|||
"Book learnin' didn't do me no good no how!"
armanj 9/5/2025|||
Years ago, I often struggled to choose between Amazon products with high ratings from a few reviews and those with slightly lower ratings but a large volume of reviews. I used the Laplace Rule of Succession to code a browser extension to calculate Laplacian scores for products, helping to make better decisions by balancing high ratings with low review counts. https://greasyfork.org/en/scripts/443773-amazon-ranking-lapl...
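For the curious, a minimal sketch of the rule-of-succession idea applied to star ratings (mapping an average star rating to a "success" count is my own assumption here, not necessarily what the extension does):

    def laplace_score(avg_rating, num_reviews, max_stars=5):
        # Rule of succession: (successes + 1) / (trials + 2).
        # Treat the average rating as a fraction of "successful" reviews.
        successes = (avg_rating / max_stars) * num_reviews
        adjusted = (successes + 1) / (num_reviews + 2)
        return adjusted * max_stars

    print(laplace_score(4.9, 3))    # ~3.94 -- few reviews, heavily shrunk toward the midpoint
    print(laplace_score(4.6, 400))  # ~4.59 -- many reviews, barely changed
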
CuriouslyC 9/5/2025|||
Just for reference, in case you find yourself in an optimization-under-uncertainty situation again: the decision-theoretic right way to do this is to generate a Bayesian posterior over the true rating given the rating count and a prior on true ratings, add a loss function (it can just be the difference between the true rating of the selected item and the true rating of the non-selected item, for simplicity), then choose your option to minimize the expected loss. This produces exactly the correct answer.
yossarian22 7 days ago||
Can you please provide an example or link to read more? Seems very interesting.
kragen 5 days ago||
https://en.m.wikipedia.org/wiki/Decision_theory
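A minimal sketch of the recipe described a few comments up, assuming ratings are first collapsed to positive/negative counts and a uniform Beta(1, 1) prior (both simplifications of mine):

    import numpy as np

    rng = np.random.default_rng(0)

    def posterior_samples(positive, negative, n=100_000):
        # Posterior over the "true" probability of a good experience.
        return rng.beta(1 + positive, 1 + negative, size=n)

    # Product A: 12 positive, 0 negative. Product B: 480 positive, 20 negative.
    a = posterior_samples(12, 0)
    b = posterior_samples(480, 20)

    # Expected loss of picking A is E[max(B - A, 0)], and vice versa;
    # pick whichever option minimizes it.
    print("pick A loss:", np.mean(np.maximum(b - a, 0)))
    print("pick B loss:", np.mean(np.maximum(a - b, 0)))
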
Shadowmist 9/5/2025||||
I always assume that all the ratings are fake when there is a low count of ratings, since it is easy for the seller to place a bunch of orders to game the ratings when they are starting out.
anentropic 7 days ago||
A bigger problem I find is many Amazon listings having a large number of genuine positive reviews, but for a completely different product than the one currently for sale.

Recently I was buying a chromecast dongle thing and one of the listings had some kind of "Amazon recommends" badge on it, from the platform. It had hundreds of 5-star reviews, but if you read them they were all for a jar of guava jam from Mexico.

I'm baffled why Amazon permits and even seemingly endorses this kind of rating farming.

kragen 9/5/2025|||
While this is a good idea, I think it's unrelated to the Laplace transform except that they're named after the same dude?
armanj 7 days ago||
I referenced 3B1B for the name: youtube.com/watch?v=8idr1WZ1A7Q
another_twist 7 days ago|||
When I first learned the Laplace transform in university, it was my go-to for differential equations of any kind. I was even naive enough to believe this was a solved problem now. Eventually found out this wasn't the case after studying PDEs. It's still my favourite transform. Immensely useful, not to mention that the moment generating function of a random variable is basically a Laplace transform.

I don't like the Fourier transform, but for petty reasons. In the engineering exams, I messed up a Fourier transform calculation and ended up just a few points short of a perfect score. I've hated it ever since :)

Sesse__ 7 days ago||
You know that if you have the Laplace transform, you can just insert s = iω and then you have the Fourier transform, right? :-P

(Or jω, if you prefer that notation)
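
Spelling that out: for the bilateral Laplace transform, and assuming the region of convergence includes the imaginary axis,

    X(s) = \int_{-\infty}^{\infty} x(t)\, e^{-st}\, dt
    \quad\xrightarrow{\; s = i\omega \;}\quad
    X(i\omega) = \int_{-\infty}^{\infty} x(t)\, e^{-i\omega t}\, dt

which is exactly the Fourier transform.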

another_twist 4 days ago||
Tell me about it. But I like my numbers real, not imaginary!
arethuza 9/5/2025|||
When I think of Laplace Transforms I always think of control theory - poles, zeros etc.
kmarc 9/5/2025|||
Probably that's why we are learning about it in the "Control Theory" classes at university. :-)

Jokes aside, I graduated as a "Computer Engineer" (BSc) and then also did a "Master in Computer Science"; I was (young and) angry at the universe about why I had to sit through soooo many classical engineering classes and theory (Control theory, Electrical engineering, Physics), while we never learned about the cool design patterns etc etc.

Today I see that those formative years helped me a lot with how I develop intuition when looking at large (software) systems, and I also understand that those ever-changing best design patterns are something I can (could have) just look up, learn, and practice in my free time.

I wish a today-me would have told my yesterday-me all this.

arethuza 9/5/2025||
I learned about it after I graduated with a CS degree - I mean in true university degree fashion we'd been taught about Laplace and Z transforms (and related things) but with no practical applications.

After graduating I joined an academic research team based mainly in an EE department, made up mostly of Control Engineers - we were mainly doing stuff around qualitative reasoning and using it for fault diagnosis, training etc.

arethuza 9/5/2025||
To be fair (and because I've just remembered - it was ~40 years ago) we did get some practical stuff covered in the maths part of my CS degree in the application of group theory (groups, rings & fields) to coding theory.
analog31 9/5/2025|||
My control theory professor (who was also my physics advisor -- it was a small college) explained it like this: Physicists like Fourier transforms because they go from minus to plus infinity, like the universe. Control engineers like Laplace transforms because they start at zero, and a control system also has a starting point.
Sesse__ 7 days ago||
The two-sided Laplace transform would probably have made his head explode.
zozbot234 9/5/2025|||
The so-called "Z transform" for discrete sequences is really just a misnomer for the actual method of generating functions (and formal power-series/Laurent-series). You just write a discrete sequence as a power series in z^(-1).
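Spelled out (one-sided version, assuming the series converges somewhere), the Z transform is the generating function evaluated at w = z^(-1):

    X(z) = \sum_{n=0}^{\infty} x[n]\, z^{-n},
    \qquad
    G(w) = \sum_{n=0}^{\infty} x[n]\, w^{n},
    \qquad\text{so } X(z) = G(z^{-1}).
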
segfault99 9/5/2025|||
True dat. But you see there's this thing called 'Engineering Maths'. Apparently it's really bad for real mathematicians' blood pressure.
zozbot234 9/5/2025||
Analytic combinatorics (the rubric where mathematicians would want to place all the region-of-convergence, zeros-poles, etc. analysis of generating functions–formal power/Laurent series–Z transforms that engineering often focuses on) is not exactly easy-going either. Other common methods (relating convolution to multiplication, inverting transforms, etc.) would traditionally fall under the Operational Calculus of Mikusiński.
segfault99 6 days ago||
I forgot to mention the converse also applies. Mathematicians talking about stuff we engineers learned the paint by numbers way makes our heads hurt!
srean 7 days ago|||
> really just a misnomer

No. Things acquire different names if they are independently discovered by different communities.

Native Americans being called Indians. Lol! What was that about?

artyom 9/5/2025|||
Essentially that's what electrical/electronics engineering is about.
jojobas 9/5/2025||
Then there's the whole mindfuck of fractional order Fourier (and other) transforms.
yshklarov 9/5/2025||
As everyone in this thread is sharing links, I'm gonna pitch in, too.

This lecture by Dennis Freeman from MIT 6.003 "Signals and Systems" gives an intuitive explanation of the connections between the four popular Fourier transforms (the Fourier transform, the discrete Fourier transform, the Fourier series, and the discrete-time Fourier transform):

https://ocw.mit.edu/courses/6-003-signals-and-systems-fall-2...
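For quick reference, the four (sign and normalization conventions vary; these are common ones):

    \text{Fourier transform:}\quad X(\omega) = \int_{-\infty}^{\infty} x(t)\, e^{-i\omega t}\, dt
    \text{Fourier series (period } T\text{):}\quad a_k = \frac{1}{T}\int_{T} x(t)\, e^{-i 2\pi k t / T}\, dt
    \text{DTFT:}\quad X(e^{i\omega}) = \sum_{n=-\infty}^{\infty} x[n]\, e^{-i\omega n}
    \text{DFT:}\quad X[k] = \sum_{n=0}^{N-1} x[n]\, e^{-i 2\pi k n / N}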

RachelF 9/5/2025||
I wonder what happened to wavelet transforms? They were very popular years ago, and now one never hears about them.
energy123 9/5/2025|||
The use-case is slightly different. Wavelets are suited for non-stationary signals, while the Fourier transform has no time localization, so it's more for stationary signals. Although the short-time Fourier transform exists, which can handle non-stationary signals under the assumption of local stationarity.

Also, a property of wavelets is they're non-parametric, which limits their utility in knowledge discovery applications.

For ML applications, my opinion is that they're somewhat superseded by deep learning methods that apply a less restrictive inductive bias. As data grows, the restrictive prior assumptions of wavelets will hurt, sort of like how CNNs are being abandoned for ViTs, even though CNNs can outperform in situations where data is scarce.

So overall, they have a pretty small set of use cases where they're better suited than alternative tools.
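
To make the "local stationarity" point concrete, here is a minimal short-time Fourier transform sketch on a chirp (the sample rate and window length are arbitrary choices):

    import numpy as np
    from scipy.signal import chirp, stft

    fs = 1000                              # samples per second
    t = np.arange(0, 2, 1 / fs)
    x = chirp(t, f0=10, t1=2, f1=200)      # frequency sweeps from 10 Hz to 200 Hz

    # A plain FFT of x smears the sweep over many bins; the STFT keeps a
    # time axis, so the sweep shows up as a moving ridge in |Zxx|.
    f, frames, Zxx = stft(x, fs=fs, nperseg=256)
    print(Zxx.shape)                       # (frequency bins, time frames)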

yshklarov 9/5/2025||||
Really, do you think they've somehow fallen out of favor? If so, that's a surprise to me.

In any case, they are a bit more advanced, and out of scope for the undergraduate course I linked to.

acjohnson55 9/5/2025|||
They have specialized applications for sure. I think it's just not as hot an area for new applied math work as 20 years ago.
mallowdram 9/5/2025||
Excellent! Thanks!
kragen 9/5/2025||
This is maybe a good first thing to read if you've never heard of the Fourier Transform before, but it makes it sound a great deal more arbitrary and random than it actually is. It might set your understanding back by giving you the illusion that you understand things you don't actually understand, and that would be a shame, because some of those things are more beautiful than a sunrise or a hummingbird.

It would be very sad to lose those treasures by passing them by because you thought you already had them. As the song says:

> When you're young, you're always looking

> On the far side of the hill;

> You might miss the fairest flower

> Standing by your side very still.

And the flowers of Fourier analysis are as fair as the fairest flowers in this universe, or any universe.

https://news.ycombinator.com/item?id=45134843 may be a clue to the hidden beauty, for those who are puzzled as to what I might be talking about.

nerdsniper 9/5/2025|
As usual, 3 Blue 1 Brown delivers: https://youtu.be/spUNpyF58BY?si=nSqHf_3zbhyu9YGd
kragen 9/5/2025|||
This is a much better introduction than the article, and it's only 20 minutes.
BrandoElFollito 6 days ago|||
3b1b is the xkcd of maths
oh_fiddlesticks 9/5/2025||
This video [1] from a visual effects channel (Captain Disillusion) has an excellent visual illustration of how the Fourier transform works and how it's used in visual effects, in the context of blurring and unblurring ("ENHANCE!") images.

1. https://youtu.be/xDLxFGXuPEc?feature=shared

lock1 9/5/2025||
While I like CD's works, I would say "CD / Blur" is the least informative of all the CD slash series. I guess it's a fun and more accessible option, but it certainly lacks depth compared to something like 3B1B's video on the FT.
gus_massa 7 days ago||
I like his videos and most of this particular video. For example, I agree with his choice to not discuss the nasty details of the FFT. Also, I think the "blur" part is fine, but the "unblur" part is too oversimplified.

I think he could have explained how the Gaussian filter almost kills all details / high-frequency features, then rounding completely destroys them, and then they cannot be magically reconstructed to "unblur". He gives some hints about this, but they are too few and too general.

PS: There are some non-linear tricks to "enhance" the "unblurred" version, like minimizing the L1 norm of the gradient (or something like that). It helps with some images. I'm not sure what the current state of the art is.
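
A tiny numpy illustration of that point (sizes and sigma are arbitrary): by the convolution theorem, blurring multiplies the spectrum by the kernel's transform, which for a Gaussian is vanishingly small at high frequencies, and anything pushed below the 8-bit quantization step is gone for good.

    import numpy as np

    n, sigma = 256, 5.0
    x = np.arange(n) - n // 2
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()                       # unit gain at DC

    # Frequency response of the blur (convolution = multiplication in frequency).
    H = np.abs(np.fft.rfft(np.fft.ifftshift(kernel)))
    print(H[0], H[len(H) // 2], H[-1])
    # ~1.0 at DC, ~1e-14 and smaller at mid/high frequencies --
    # far below the ~1/255 floor that survives 8-bit rounding.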

alkyon 7 days ago|||
I found this video from 3Blue1Brown more informative in terms of the mathematics involved:

https://youtu.be/spUNpyF58BY?feature=shared

Edit: In fact it was already mentioned in the comments, but I hadn't noticed.

hybrid_study 9/5/2025||
The Carl Sagan bit is also an amusing tribute.
abetusk 9/5/2025||
I have a pet theory that the reason the FT, and other transforms (generating functions, Mellin/Laplace/Legendre/Haar), are so useful is that many real-world functions are sparse and lend themselves to compressed sensing.

The FT, like many other transforms, is 1-1, so, in theory, there's no information lost or gained. In many real-world conditions, looking at a function in frequency space greatly reduces the problem. Why? Pet theory: because many functions that look complex are actually composed of simpler building blocks in the transformed space.

Take the sound wave of a fly and it looks horribly complex. Pump it through the FT and you find a main driver of the wings beating at a single frequency. Take the sum of two sine waves and it looks a mess. Take the FT and you see the signal neatly broken into two peaks. Etc.

The use of the FT (or DCT or whatever) for JPEG, MP3 or the like is basically exploiting this fact: the human visual and hearing response is not uniform across frequencies, so the signal can be "compressed" by throwing away frequencies we don't care about.

The "magic" of the FT, and other transforms, isn't so much that it transforms the signal into a set of orthogonal basis but that many signals we care about are actually formed from a small set of these signals, allowing the FT and cousins to notice and separate them out more easily.

mitthrowaway2 9/5/2025||
As mentioned by other commenters, a reason for the FT's dominance in particular is that sine, cosine, and complex exponentials are the eigenfunctions of the derivative operator. Since so many real-world systems are governed by differential equations, the Fourier transform becomes a natural lens for analyzing these systems. Sound waves are one (of many) examples.

And there's another good reason why so many real-world signals are sparse (as you say) in the FT domain in particular: so many real-world systems involve periodic motion (rotating motors, a fly's wings as you noted, etc). When the system is periodic, the FT will compress the signal very effectively, because every component has to be a harmonic of the fundamental frequency.
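
Written out, the eigenfunction fact and its consequence: differentiation in time becomes multiplication by iω in the Fourier domain, which is what turns linear constant-coefficient differential equations into algebra.

    \frac{d}{dt} e^{i\omega t} = i\omega\, e^{i\omega t}
    \qquad\Longrightarrow\qquad
    \mathcal{F}\{x'(t)\}(\omega) = i\omega\, \hat{x}(\omega)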

abdullahkhalids 9/5/2025||
The question is why "so many real-world systems are governed by differential equations" and "so many real-world systems involve periodic motion".

Well, stable systems can either be stationary or oscillatory. If the world didn't contain so many stable systems, or equivalently if the laws of physics didn't allow for them, then likely life would not have existed. All life is complex chemical structures, and they require stability to function. Ergo, by this anthropic argument, there must be many oscillatory systems.

kragen 9/5/2025|||
Differential equations aren't limited to describing stable systems, though, and there are chaotic systems that are also in some sense stable.

Ordinary differential equations can describe any system with a finite number of state variables that change continuously (as opposed to instantaneously jumping from one state to another without going through states in between) and as a function of the system's current state (as opposed to nondeterministically or under the influence of the past or future or some kind of supernatural entity).

Partial differential equations extend this to systems with infinite numbers of variables as long as the variables are organized in the form of continuous "fields" whose behavior is locally determined in a certain sense—things like the temperature that Fourier was investigating, which has an infinite number of different values along the length of an iron rod, or density, or pressure, or voltage.

It turns out that a pretty large fraction of the phenomena we experience do behave this way. It might be tempting to claim that it's obvious that the universe works this way, but that's only because you've grown up with the idea and never seriously questioned it. Consider that it isn't obvious to anyone who believes in an afterlife, or to Stephen Wolfram (who thinks continuity may be an illusion), or to anyone who bets on the lottery or believes in astrology.

But it is at least an excellent approximation that covers all phenomena that can be predicted by classical physics and most of quantum mechanics as well.

As a result, the Fourier and Laplace transforms are extremely broadly applicable, at least with respect to the physical world. In an engineering curriculum, the class that focuses most intensively on these applications is usually given the grandiose title "Signals and Systems".

jcgrillo 9/5/2025|||
One amazing application of spectral theory I always harp on when this topic comes up is Chebfun[1]. Trefethen's Spectral Methods in Matlab is also wonderful.

[1] http://www.chebfun.org/

kragen 9/5/2025||
I haven't read it! Thanks for the recommendation!
kragen 7 days ago||
Apparently he uploaded it to ResearchGate: https://www.researchgate.net/profile/Hector-Carmenate/post/H...
abdullahkhalids 7 days ago|||
I agree broadly with what you say. I didn't have time to make a more comprehensive comment.
seanhunter 9/5/2025||||
That first question is a tautology. It’s like asking “Why is a screwdriver so perfect for turning screws?”

We have discovered a method (calculus) to mathematically describe continuous functions of various sorts, and within calculus there is a particular toolbox (differential and partial differential equations) we have built to mathematically describe systems that are changing, by describing that change.

The fact that systems which change are well-described by the thing we have made to describe systems which change shouldn't be at all surprising. We have been working on this since the 18th century, and Euler and many of the smartest humans ever devoted considerable effort to making it this good.

When you look at things like the chaotic behaviour of a double pendulum, you see how the real world is extremely difficult to capture precisely and as good as our system is, it still has shortcomings even in very simple cases.

EMIRELADERO 9/5/2025|||
As an aside, here's a relevant video about the (sometimes not) chaotic nature of double pendulums: https://www.youtube.com/watch?v=dtjb2OhEQcU
gsf_emergency_2 9/5/2025|||
What ought to be surprising is that the "thing" itself doesn't change.

A learning that describes chaos well enough may not want to be associated with "calculus", or even "math" (ask a friendly reverse mathematician about that)

https://www.johndcook.com/blog/2021/04/09/period-three-impli...

Somewhat tangentially, if Ptolemy I had responded (to Euclid) with anything less specific ---but much more personal--- than "check your postulate", we wouldn't have had to wait one millennium.

(Fermat did the best he could given margin & ego, so that took only a century or so (for that country to come up with a workable strategy))

Less tangentially, I'd generalize Quigley by mentioning that groups of hominids stymie themselves with a kind of emergent narcissism. After all, heuristics, rules and even values informed by experience & intuition are a sort of arrogance. "Tautology" should be outlawed in favour of "Narcissism" as a prosocial gaslighting term :)

cycomanic 9/5/2025||||
> The question is why "so many real-world systems are governed by differential equations" and "so many real-world systems involve periodic motion".

> Well, stable systems can either be stationary or oscillatory. If the world didn't contain so many stable systems, or equivalently if the laws of physics didn't allow so, then likely life would not have existed. All life is complex chemical structures, and they require stability to function. Ergo, by this anthropic argument there must be many oscillatory systems.

I would say that it's very difficult to imagine a world that would not be governed by differential equations. So it's not just that life wouldn't exist, it's that there wouldn't be anything like the laws of physics.

As a side note, chaotic systems are often better analysed in the FT domain, so even in a world of chaotic systems (and there are many in our world; I'd argue that if there weren't, life would not exist either) the FT remains a powerful tool.

im3w1l 9/5/2025||||
> Well, stable systems can either be stationary or oscillatory.

In practice this is probably true, but I can see another possibility. The system could follow a trajectory that bounces around endlessly in some box without ever repeating or escaping the box.

abdullahkhalids 7 days ago|||
You can treat that, and scientists often do treat it, as a stationary system with some error bounds.

For example, the concept of homeostasis in biology is like this. Lots of things are happening inside the living body, but it's still effectively at a functional equilibrium.

Similarly, lots of dynamic things are happening inside the Sun (or any star), but from the perspective of Earth, it is more or less stationary, because the behavior of the sun won't escape some bounds for billions of years.

srean 7 days ago|||
If this box was of a bounded size, then that trajectory would have an interesting property - there are chunks of time you can edit out such that what remains will look as if it is converging on a point.

I suspect you will find ergodicity interesting.

hackandthink 9/5/2025|||
Natura non facit saltus.

https://en.wikipedia.org/wiki/Natura_non_facit_saltus

AIPedant 9/5/2025|||
On the simplest end of that spectrum, Taylor series are useful because many real-world dynamics can be approximated as a "primarily linear behavior" + "nonlinear effects."

(And cases where that isn't true can still be instructive - a Taylor series expansion for air resistance gives a linear term representing the viscosity of the air and a quadratic term representing displacement of volumes of air. For ordinary air the linear component will have a small coefficient compared to the quadratic component.)

yatopifo 9/5/2025||
As you noted, it’s about what’s important to us. The physical function may or may not be sparse, but our brain model is guaranteed to be sparse. A note played on a violin is anything but a sine function, yet our brains associate it with a single idealized tone. Our world model is super compressed.
seanhunter 9/5/2025||
One thing I find fascinating about Fourier analysis is the way the trigonometric Fourier series played such a central role in “breaking mathematics”[1] and the crisis that led to providing a rigorous basis for limits and continuity and all the other stuff that is now called real and complex analysis.

Cauchy had just proved that the limit of the sum of an infinite series of continuous functions was itself continuous, and then along came Fourier with "are you sure about that bro?" and showed that you could take the infinite sum of very clearly continuous functions (just sine and cosine) and approximate something like a sawtooth function (which was very obviously discontinuous) as closely as you like.

[1] by which I mean making obvious the fact that they had been proceeding for 100+ years using calculus without a rigorous basis.
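
For concreteness, the standard sawtooth example (one common textbook form; normalizations vary): every partial sum is a smooth, continuous function, yet the limit jumps at t = ±π.

    f(t) = t \quad (-\pi < t < \pi), \text{ extended periodically}
    \qquad\Longrightarrow\qquad
    f(t) = 2\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n}\,\sin(nt)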

laszlokorte 9/4/2025||
Shameless plug: If you are interested in the Fourier transform and signal processing, you might enjoy my somewhat artistic 3D visualisation of the Fourier transform as well as the fractional Fourier transform [1].

(Fractional Fourier transform on the top face of the cube)

And [2] shows the short-time Fourier transform, illustrating how a filter kernel is shifted across the signal.

[1]: https://static.laszlokorte.de/frft-cube/

[2]: https://static.laszlokorte.de/time-frequency/

nblgbg 9/5/2025||
Thanks a lot for all of this ! https://tools.laszlokorte.de/
laszlokorte 9/5/2025||
I am glad you are enjoying it! :)
hovden 9/5/2025|||
If I might also plug 'the Atlas of Fourier Transforms'. If you're interested in building intuition about symmetry and phase in Fourier space, the book illustrates many structures.
laszlokorte 9/5/2025||
Looks amazing! Thank you
yshklarov 9/4/2025|||
I love the visualization! Thanks for sharing.

How do you compute the fractional FT? My guess is by interpolating the DFT matrix (via matrix logarithm & exponential) -- is that right, or do you use some other method?

laszlokorte 9/4/2025||
I am glad you like it!

Yes, the simplest way to think of it is to raise the DFT matrix to an exponent between 0 and 1 (1 being the classic DFT). But then the runtime complexity is O(n^2) (vector multiplied with a precomputed matrix) or O(n^3), as opposed to the O(n log n) of the fast Fourier transform. There are tricks to do a fast fractional Fourier transform by multiplying and convolving with a chirp signal (a rough sketch of the naive matrix-power idea follows the links below). My implementation is in Rust [1] compiled to WebAssembly, but it is based on the Matlab code of [2], whose author gladly answered all my emails asking many questions despite already being retired.

[1]: https://github.com/laszlokorte/svelte-rust-fft/tree/master/s...

[2]: https://nalag.cs.kuleuven.be/research/software/FRFT/
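
A naive sketch of the matrix-power idea, for anyone curious (my own toy version, not the implementation above and not the fast chirp method; a fractional power of the DFT matrix is also not unique, since it depends on eigenvalue branch and eigenvector choices, so intermediate exponents won't match the canonical Hermite-based fractional FT):

    import numpy as np

    N = 64
    n = np.arange(N)
    # Unitary DFT matrix: applying it four times gives the identity.
    F = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)

    # O(N^3) fractional power via eigendecomposition.
    w, V = np.linalg.eig(F)
    def frft_matrix(a):
        return V @ np.diag(w**a) @ np.linalg.inv(V)

    x = np.random.default_rng(0).standard_normal(N)
    half = frft_matrix(0.5)
    # Two "half" transforms should compose to one full DFT (up to rounding error).
    print(np.abs(half @ (half @ x) - F @ x).max())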

xphos 9/5/2025||
I made a cool Rust FFT TUI a long time ago too

https://github.com/lquinn2015/FFT-tui

laszlokorte 9/5/2025||
Looks great! I was already thinking about reimplementing mine as a TUI.
mallowdram 9/5/2025||
Fantastic! Thanks!
chamomeal 9/5/2025||
So weird, I was just reading this article yesterday. I did an undergrad in physics and really miss this stuff. Ended up getting nostalgic and watching 3 blue 1 brown videos while drinking tequila.
lutusp 9/5/2025||
Such a shame. In an otherwise well-written article, the author mentions Cooley and Tukey's discovery of the FFT, but without mentioning that Gauss discovered it first, among others who each approached the same idea from a different direction.

The Wikipedia FFT article (https://en.wikipedia.org/wiki/Fast_Fourier_transform) credits Gauss with originating the FFT idea later expanded on by others, and correctly describes Cooley and Tukey's work as a "rediscovery."

acjohnson55 9/5/2025||
As they say, everything in modern math is named for the 2nd person to discover it after Gauss.
olq_plo 9/5/2025||
Yes, bad article to omit that. It is such a cool fun fact. Gauss was unreal.
noncoml 9/4/2025|
3Blue1Brown made a video with great visualizations: https://www.youtube.com/watch?v=spUNpyF58BY
mananaysiempre 9/4/2025|
I’ve dabbled in explaining the FT some, and I think there’s an important trait of that video that needs to be highlighted: it’s a superb demonstration of what the Fourier transform does, mechanically, but it does not at all try to explain why it works in the places we usually apply it or how (having, admittedly, a thoroughly anachronistic mathematical background) you could have invented it.

To be extra clear: it’s a very good video and you should watch it if you don’t have a feel for the Fourier transform. I’m just trying to proactively instil a tiny bit of dissatisfaction with what you will know at the end of it, so that you will then go looking for more.
