
The question itself merits discussion, even though the post did not exactly answer it.


Mathematically, the Fourier transform is "simply" a way of representing time signals in a certain orthogonal vectorial basis. Vectors in the ordinary sense, e.g. a displacement vector on Earth's surface, can also be represented in several orthogonal bases: one basis could, for example, be two vectors pointing North and East; another could be a vector pointing along a certain road and one perpendicular to it. There is nothing inherently special about any of these bases; one could draw maps according to either of these two conventions or many others. (Orthogonal basis vectors are not even necessary, only convenient.)

The interesting thing about time-dependent signals (or any "pretty" function, really) is that they live in an infinite-dimensional vector space, which is hard to imagine; but (besides some important technicalities) the math works mostly the same way: signals as infinite-dimensional vectors can be represented in a lot of bases. One representation is the Fourier transform, where the basis vectors are harmonic functions. The "map" showing the shape of a signal as a combination of infinitely many harmonic functions -- i.e. the frequency domain -- is just as real as any other map with different basis vectors, e.g. the Walsh–Hadamard transform mentioned in the article. And, crucially, the original time-domain representation is also just one map showing us the signal, though it is often the most natural to us.
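In the discrete case this basis-change view can be made concrete in a few lines of numpy (a sketch, not from the article): the DFT coefficients of a signal are just its inner products with the harmonic basis vectors.

```python
import numpy as np

# Sketch: in the discrete case, the Fourier transform is literally a
# change of basis. Each row of `basis` is one harmonic basis vector,
# and the DFT coefficients are the signal's inner products with them.
N = 8
n = np.arange(N)
basis = np.exp(-2j * np.pi * np.outer(n, n) / N)  # rows: e^{-2*pi*i*k*n/N}

x = np.random.rand(N)       # an arbitrary "time-domain" vector
coords = basis @ x          # coordinates of x in the harmonic basis
reference = np.fft.fft(x)   # numpy's DFT computes exactly these numbers
```

The two results agree to floating-point precision, which is the whole point: `np.fft.fft` is just an efficient way of evaluating this matrix-vector product.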


Excellent answer and I am sure you are aware of this, but like to point out:

"Mathematically, the Fourier transform is "simply" a way of representing time signals in a certain orthogonal vectorial basis."

Not just time signals but any function that is piecewise continuous and differentiable and satisfies the Dirichlet conditions. This has many applications; just a few examples off the top of my head: image processing, solving differential equations, fast multiplication.

I'd also like to add that from a mathematical point of view these transforms are "lossless" in the sense that the transformed function has the exact same information as the original and you can get back the exact original even if all you have is the transform.
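That losslessness is easy to check numerically; a quick sketch with numpy's discrete transform:

```python
import numpy as np

# The transform is information-preserving: a round trip through the
# DFT and its inverse recovers the original samples exactly (up to
# floating-point rounding).
x = np.random.rand(1024)
x_back = np.fft.ifft(np.fft.fft(x)).real  # imaginary parts are ~1e-16 noise
max_error = np.max(np.abs(x - x_back))
```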

I feel this often gets lost when people approach the Fourier transform from a more engineering perspective, not least because we often do the transform to throw away unwanted information, like certain frequency components.

In the end it really is just one of many perspectives to look at a function.


> I feel this often gets lost when people approach the Fourier transform from a more engineering perspective, not least because we often do the transform to throw away unwanted information, like certain frequency components.

That was my problem as well. My first introduction to Fourier transforms was through more of an engineering lens. I remember having trouble with the _inverse_ Fourier transform. I was OK with a Fourier inverse of an already transformed function but I wasn't quite sure what that would mean when applied to a non-transformed, "regular" function.


The inverse Fourier transform of a non-transformed real signal gives you basically the Fourier transform with the frequency axis reversed (for a real signal, that makes the values the complex conjugates of the forward transform). Applying it a second time gives you the same result as applying the forward transform twice.

If you apply the Fourier transform 4 times you get your original function back. You can think of it as a 90-degree rotation. The inverse transform just rotates in the opposite direction.

The rotation analogy is not even far-fetched, as the fractional Fourier transform allows you to rotate by an arbitrary angle.


Having never heard of this for the Fourier transform, I needed to read up on it.

F0: original signal

F1: frequency domain signal

F2: reverse time signal

F3: inverse fourier signal

F4: original signal
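The cycle above is easy to verify numerically with the unitary (norm-preserving) form of the DFT; a sketch using numpy:

```python
import numpy as np

# With the unitary DFT (norm="ortho"), applying the transform four
# times is the identity, and applying it twice reverses time.
N = 16
x = np.random.rand(N)

F1 = np.fft.fft(x, norm="ortho")   # frequency-domain signal
F2 = np.fft.fft(F1, norm="ortho")  # time-reversed signal
F3 = np.fft.fft(F2, norm="ortho")  # same as the inverse transform of x
F4 = np.fft.fft(F3, norm="ortho")  # back to the original signal

time_reversed = x[(-np.arange(N)) % N]  # x[0], x[N-1], x[N-2], ...
```

The `norm="ortho"` normalization matters: without it, each application multiplies the result by an extra factor of N relative to the unitary transform, and the 4-cycle only holds up to scaling.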

There are also further weird generalizations I'd never heard of, the "fractional Fourier transforms" [1], which can apparently produce smooth smears between the time and frequency domains [2].

[1] https://en.wikipedia.org/wiki/Fractional_Fourier_transform

[2] https://en.wikipedia.org/wiki/File:FracFT_Rec_by_stevencys.j...


See [1] for my visualization of this 4-cycle and the fractional Fourier transform.

[1]: https://static.laszlokorte.de/frft-cube/


Thanks. It's neat being able to visualize them, and the 3D display's actually pretty cool looking for all the different functions.

The fractional DFT part, though, doesn't seem to do anything no matter the function chosen. Firefox 125.

Edit: Nvm, figured it out. Have to visualize from the top down to see the Fractional DFT portion. Haven't seen many visualization systems where each orientation shows a different type of data. Actually a pretty neat idea from a UI perspective.


As a related aside, the terms "cepstrum" and "quefrencies" [1] (cf. spectrum and frequencies) sound so hilarious that when I first heard about them I was convinced it was some kind of prank.

[1] https://en.wikipedia.org/wiki/Cepstrum
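For the curious, the real cepstrum is short to compute; a minimal sketch with numpy (the small epsilon guarding against log of zero is my own assumption, not part of any standard definition):

```python
import numpy as np

# The (real) cepstrum: the inverse transform of the log-magnitude
# spectrum -- a "spectrum of a spectrum", hence the scrambled names.
def real_cepstrum(x):
    spectrum = np.fft.fft(x)
    log_mag = np.log(np.abs(spectrum) + 1e-12)  # epsilon avoids log(0)
    return np.fft.ifft(log_mag).real

x = np.random.rand(64) + 1.0
c = real_cepstrum(x)
```

An echo in a signal shows up as a peak in the cepstrum at the "quefrency" of the echo delay, which is what makes it useful for pitch detection and echo analysis.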


This does read like a joke; I had never heard of it either, and I'm wondering if many people use this at all...

Operations on cepstra are labelled quefrency analysis (or quefrency alanysis[1]), liftering, or cepstral analysis. It may be pronounced in the two ways given, the second having the advantage of avoiding confusion with kepstrum.


It's used in speech signal processing & seismic signal analysis.


Almost all speech recognizers until this latest crop (of end-to-end deep-learning ASR) operated on cepstral coefficients (and their deltas and delta-deltas) as their feature vectors.


> [...] I wasn't quite sure what that would mean when applied to a non-transformed, "regular" function.

Have you gained some intuition/understanding for this?

I tried a few inputs in WolframAlpha, but unless I manually type in the integral for the inverse transform there's not even a graph :) (and I have no idea whether it's even the same thing without putting a `t` in the exponent and wrapping it in an f(t) = ... )

https://www.wolframalpha.com/input?i=integral+%28sin%28x%29+...


Not the parent (but the GP). Intuition can mean many things, but what helped me was keeping these in mind:

Every continuous periodic function turns into a discrete aperiodic one when transformed. Works both ways.

Continuous aperiodic stays continuous aperiodic. Discrete periodic stays discrete periodic.


A Fourier transform basically gives you an infinite number of sine waves with different amplitudes/phases at every frequency. If you add them all back together (the inverse Fourier transform), you get back your original signal. Audio compression in this case would just be excluding the sine waves that are too high in frequency to hear when you add them all back. I always hate how people try to make the Fourier transform sound more complex than it actually is (and yes, there is more nuance to compression than this, but this is the basic idea).
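A sketch of that "drop what you can't hear" idea with numpy's real FFT (the sample rate and cutoff are arbitrary assumptions for illustration):

```python
import numpy as np

# Low-pass "compression" sketch: transform, zero the high-frequency
# coefficients, transform back.
fs = 8000                          # assumed sample rate, Hz
t = np.arange(fs) / fs             # one second of signal
signal = np.sin(2*np.pi*440*t) + 0.5*np.sin(2*np.pi*3000*t)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1/fs)
spectrum[freqs > 1000] = 0         # discard everything above 1 kHz
filtered = np.fft.irfft(spectrum, n=len(signal))
# `filtered` is just the 440 Hz tone; the 3 kHz component is gone
```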


The DFT has quite severe limitations that do not appear in the Fourier transform. In particular, the Nyquist criterion, that there be zero signal energy for ALL frequencies above half the sampling frequency, can only be approximated, and must be satisfied BEFORE sampling, i.e. in the time domain.


I wonder how Fourier transforms are taught in engineering courses, because the idea that they could be "lossy" is strange and not obvious to me. They have an inverse, after all.


In my engineering courses, Fourier transforms were taught in the context of discrete Fourier transforms, because sampling is a thing that matters (computer audio is discrete data points, not an actual wave).

The Fourier transform of a discrete signal repeats in the frequency domain. For example, [1, -1, 1] could be a wave at exactly half the sampling frequency, going from 1 to -1 and back to 1 once. Or it could be a wave at 3/2 the sampling frequency that swings all the way from 1 to -1 to 1 to -1 within the gap between the first two samples, yet hits the very same sample values. Or 5/2 the sampling frequency, 7/2, and so on. The solution is to only keep the part of the transform that is below the Nyquist limit, because we don't have a sampling rate high enough to distinguish the higher frequencies, so we just assume they don't exist. This also means that if the source signal WAS in fact at 3/2 the sampling frequency, we will see a spike at 1/2 the sampling frequency in the Fourier transform, and when we re-create the signal, it will be completely wrong.
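The ambiguity is easy to reproduce: sample two very different sinusoids at the same instants and you get identical numbers (a sketch; the 100 Hz rate is an arbitrary assumption):

```python
import numpy as np

# Aliasing: a 110 Hz wave sampled at 100 Hz lands on exactly the same
# sample values as a 10 Hz wave, so the DFT cannot tell them apart.
fs = 100
n = np.arange(fs)                 # one second of samples
low = np.cos(2*np.pi*10*n/fs)
high = np.cos(2*np.pi*110*n/fs)   # 110 = 10 + fs, so it aliases to 10 Hz
```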

So unless you have analog hardware for measuring the Fourier transform (or are working purely in a non-physical mathematical domain, like "I have a sine wave", which can be represented perfectly), you are naturally going to be taking discrete samples of a signal to measure the Fourier transform, which means you are going to lose any part of the signal that doesn't fit within the sampling rate.

Because my engineering courses were so heavily focused on digital signal processing, when I hear "Fourier transform" I immediately think of "discrete Fourier transform", and loss is immediately applicable.


Don't engineers at least learn to solve the heat equation analytically using Fourier transforms? It wasn't all numerical algorithms, was it?


It can be lossy, and not just because of truncation. Convergence is somewhat finicky in Fourier analysis and was not well understood mathematically before the 1960s. Engineers made great use of it anyway.

Remember that it is an integral transform. Basically, any data on a set of vanishing measure can be lost or corrupted. Unfortunately, it can even be the case that the deviation around some points is unbounded.


Frequency of a signal and the time at which you measure it obey an uncertainty principle. https://en.wikipedia.org/wiki/Fourier_transform#Uncertainty_... You either know the frequency well but over a large time window, or you shorten the window and get larger error bars on the frequency.
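The tradeoff is visible numerically: halving a Gaussian window's width in time roughly doubles its spread in frequency (a sketch; the grid parameters are arbitrary assumptions):

```python
import numpy as np

# Time-frequency uncertainty: the RMS width of a Gaussian's power
# spectrum scales inversely with its width in time.
def spectral_width(sigma_t, n=4096, dt=0.01):
    t = (np.arange(n) - n/2) * dt
    g = np.exp(-t**2 / (2 * sigma_t**2))
    power = np.abs(np.fft.fft(g))**2
    f = np.fft.fftfreq(n, d=dt)
    return np.sqrt(np.sum(f**2 * power) / np.sum(power))

narrow = spectral_width(0.1)   # narrow in time -> wide in frequency
wide = spectral_width(0.2)     # twice as wide in time -> half as wide
```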


A square wave would take an infinite number of sinusoidal waves to perfectly reconstruct. To approximate it, one truncates the coefficients. This is done in any engineering application where memory isn't infinite. Which is all of them.
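A sketch of that truncation: summing the first few odd harmonics gets close to a square wave, but the overshoot at the jumps (the Gibbs phenomenon) never goes away.

```python
import numpy as np

# Truncated Fourier series of a square wave: only odd harmonics,
# with coefficients 4/(pi*k). More terms = better, never perfect.
t = np.linspace(0, 1, 1000, endpoint=False)
square = np.sign(np.sin(2*np.pi*t))

def truncated_series(t, n_harmonics):
    s = np.zeros_like(t)
    for k in range(1, 2*n_harmonics, 2):   # k = 1, 3, 5, ...
        s += (4/np.pi) * np.sin(2*np.pi*k*t) / k
    return s

approx = truncated_series(t, 50)
# close to +/-1 away from the jumps, but overshoots ~9% near them
```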


Of course, a discrete, finite sampling of a square wave at a set of points in time only requires a finite number of coefficients to perfectly reconstruct.


Which is the point. A discrete Fourier transform can never recreate a square wave unless you have infinitely many samples. It can only recreate the finite sampled signal (which is only an approximation of the square wave, not a real square wave).

This means that sure, the Fourier transform itself isn't lossy (garbage in, garbage out), but Fourier transforms are used in contexts where losses are introduced. If I have a real, perfect square wave and I want to take a Fourier transform of it, the sampling is going to introduce loss, so associating sampling losses with the transform itself is fair. A real square wave run through a DFT program on my computer is going to spit out an approximation of a square wave -- loss.


The good news of course is that if you were sampling a real signal, then that signal was not actually a perfect square wave. So the fact that you can't (re)construct a perfect square wave is somewhat moot...


Generally yes, but it's a perfectly reasonable assumption that a natural source could generate a signal beyond the bounds of what we can record. Any signal generated by a computer is going to fit within the constraints of what we can generate, but inevitably something like a whale or a quasar will generate a wave whose capture will be lossy.

But also, the question this is all responding to was effectively "why would engineers associate Fourier transforms with loss" and the answer is simply "because the techniques used in calculating most Fourier transforms are going to inherently put a frequency limit and anything beyond that will be lost or show up as an artifact". Engineers work with real world constraints and tend to be hyper aware of those constraints even if they often don't matter.


I think it's the discontinuous jump that causes the problem, and natural signals do sometimes have sudden jumps.


I used to think of it like another basis too, but nowadays I think this basis analogy is a bit fraught, or at least not the whole story.

In particular, for multidimensional spaces, the usual multidimensional Fourier transform only really works if you have a flat metric on that space (i.e. no curvature). That's a bit of a warning sign given that our universe itself is curved.

There was some very interesting work recently where it was shown how to generalize Fourier series to certain hyperbolic lattices [1], and one important outcome of that work is that the analog of the Fourier space is actually higher dimensional than the position space.

Furthermore, the dimensionality of the 'Fourier space' in this case depends on the lattice discretization. One 2D lattice discretization may have a 4D frequency-like domain, and another 2D lattice might have an 8D frequency-like domain.

[1] https://arxiv.org/abs/2108.09314 or https://www.pnas.org/doi/full/10.1073/pnas.2116869119


Not the whole story indeed, but you have to dive into representation theory somewhat to get more: the Fourier transform is more or less the representation theory of the (abelian) group of translations of your space, hence the homogeneity requirement. The finite-lattice version[1] (a discretized torus, basically) may serve to hint at what's in store here.

[1] https://www-users.cse.umn.edu/~garrett/m/repns/notes_2014-15... (linear algebra required at least to the degree that one is comfortable with the difference between a matrix and an operator and knows what a direct sum is)


If you like this topic, I strongly recommend you read the references I attached to my comment.

In uniformly curved 2D hyperbolic spaces, it turns out that there is a higher dimensional non-Abelian Fuchsian translation group defined on a higher genus torus.


> That’s a bit of a warning signal given that our universe itself is curved.

What does this have to do with whether it is a different basis in cases where we don't account for curvature? This seems completely irrelevant; sure, the tool can't be used in some cases, but it can be used as a basis change in other cases.


It's not an analogy. It's literally just another basis.

> In particular, for multidimensional spaces, the usual multidimensional Fourier transform only really works if you have a flat metric on that space

What the hell does the metric of space-time have to do with this? When computing a Fourier transform, we're not working in 3+1-dimensional space-time; we're working in either an N-dimensional (in the discrete case) or \infty-dimensional (in the continuous case) vector space. While that term contains the word "space", these spaces DO NOT, in this context, have anything to do with Euclidean space or the pseudo-Riemannian manifold that GR treats space-time as.


I wanted to know more about this too, and I hate to make meta comments, but I'm afraid your confrontational approach may make the other person think this conversation isn't worth the hassle.

Which would be a bad thing, reading this kind of conversation is what makes this site worthwhile


For what it’s worth, I did reply, but I probably wouldn’t have if you hadn't expressed frustration at the prospect of not reading further.


> What the hell does the metric of space-time have to do with this?

Maybe calm down for a moment and try not being such a hot-headed ass. You seem to have missed the point entirely.

I’m well aware that these functions can be described as vectors in an infinite dimensional Hilbert space.

The problem I’m bringing up is that the domains of these functions (i.e. not the vector itself) typically have geometric properties we care about.

The problem is that if one has a manifold with a non-trivial intrinsic geometry, then functions defined on that manifold cannot be faithfully Fourier transformed without losing pretty much all geometrically relevant information.

It turns out that in some cases, there are generalizations of the Fourier transform of a function on a curved manifold, but in those cases, the domain of the transformed function is very different, typically having a higher dimensionality.

This is particularly relevant and problematic in physics, where the Fourier transforms of functions on spacetime are really important and useful, but dont work in curved spacetimes.

E.g. it’s a big problem when doing QFT on a curved spacetime that one cannot separate positive frequencies of a field from negative frequencies.


In the past astronomers believed in the geocentric model of the universe with epicycles. It was extremely accurate, and if more accuracy was needed they added more epicycles. It was a completely wrong model, but they unknowingly used the Fourier series as a function approximator.


It wasn't a wrong model, it was just much more complex than needed, given better understanding of physics. Viewing the universe relative to stationary Earth is a perfectly fine exercise, even if it means you have to DFT the rest of the solar system for the math to work.


I agree overall. But a note: every orthonormal basis partitions the frequency spectrum. It doesn't go away; if you are using e.g. polynomials, then you're building functions up out of their frequency components too. The Fourier basis has every element correspond to a specific frequency, which is special in some sense. I would say, rather, that each is designed for a purpose. A basis change can rearrange the spectrum in such a way that analysis of it is complicated. Then you're analyzing something else (e.g. smoothness). Most functions of interest do have distinctive spectra, even if the Fourier basis doesn't answer all the questions.


> one basis could, for example, be two vectors pointing North and East; another could be a vector pointing along a certain road and one perpendicular to it.

And there's no requirement that they be perpendicular, is there? The second just needs 'some amount of perpendicular': North and North-East, for example, since any [n, e] can also be described as [n - e, sqrt(2)*e] in the latter. (I think that's right, but my main point is that you can do it, not the particular value; if that's way off I'll blame the fever.)
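A sketch of that with numpy: solve for the coordinates in the non-orthogonal (North, North-East) basis and check that they reproduce the vector. The example vector is an arbitrary choice.

```python
import numpy as np

# A non-orthogonal basis still works: any vector has unique
# coordinates as long as the basis vectors aren't parallel.
north = np.array([0.0, 1.0])
north_east = np.array([1.0, 1.0]) / np.sqrt(2)
B = np.column_stack([north, north_east])   # basis vectors as columns

v = np.array([3.0, 5.0])                   # 3 units East, 5 units North
coords = np.linalg.solve(B, v)             # coordinates in the new basis
# coords = [n - e, sqrt(2)*e] = [2, 3*sqrt(2)]
reconstructed = B @ coords
```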


If you can walk this way / and this way | then you can do / minus | to get your orthogonal -


Yes exactly.


You typically want an orthonormal basis, though. But yes, you don't need it.


Sure, nothing special about sine waves as basis functions for signal decomposition. Not necessarily the best either, depending on what you want to do.

Still, as pertains to whether "the frequency domain is a real place", maybe sine waves are relevant as representing resonant frequencies of physical systems.

There also seems to be something fundamental about the way multiple radio signals can simultaneously propagate through a vacuum and still be separated, as long as they are at different frequencies.


Ok. Now what's the difference between the Fourier transform and the Fourier series?

To me what you described sounds more like the Fourier series.


the fourier transform of a periodic signal is composed of a train of dirac deltas, each multiplied by some factor

the delta with smallest frequency is the fundamental frequency, and the others are harmonics

when you do the inverse fourier transform on this train, each delta becomes a sinusoid

that's how you can write any periodic function as a sum of sinusoids, all of them multiples of the fundamental frequency

and that's the fourier series: it's just the fourier transform, followed by an inverse fourier transform, macroexpanded

but the fourier series only works for periodic functions, because only periodic functions have a bunch of isolated, periodic deltas as their fourier transform

so the fourier transform is only half the step of a fourier series (to write down the series you also need the inverse fourier transform) but, at the same time, the fourier transform is a generalization of the fourier series, because it works for nonperiodic functions too


To elaborate more, when we say that a signal is the linear combination of infinitely many frequencies, if those frequencies are multiples of a fundamental frequency (that is, a countable set of frequencies), then we are talking about a Fourier series and the signal must be periodic

But if those frequencies span a whole continuum (rather than frequencies f, 2f, 3f, 4f, ...), that is, an uncountable set of frequencies, then the signal is non-periodic and we can't talk about a Fourier series anymore; we must use the Fourier transform


> the delta with smallest frequency is the fundamental frequency, and the others are harmonics

Nitpick, but this isn't quite true. If my signal is a linear combination of two sinusoids, one at frequency 3 and the other at frequency 5, then the lowest delta is at 3, but the fundamental frequency is 1 (the gcd), where there is no energy at all.


I had good experiences with nbdime.


Which koan is that?


An apocryphal one from the Russian-speaking corner of the Internet. I thought it was in the classic Rootless Root, but turns out I misremembered and it’s a different thing. I tracked down the source if you want to read more: [1][2][3][4] (some humor might be lost in translation).

    ----
A sysadmin struggles to choose a strong password for his RADIUS server. He seeks advice from Master Ha Ker Noose.

“Master, in your opinion, is password ‘诺曼底登陆’ a strong one?”

“No”, said Master Ha. “For it is a common dictionary word.”

“But it’s not in the dictionary...”

“‘Dictionary’ means the wordlists used for brute-force attacks. Such lists are compiled from all combinations of letters and symbols ever mentioned on the Internet.”

“Okay... How about ‘шьзщыышиду’?”

“I doubt it. From the dictionary.”

“But how?! it’s—”

“Just google it. See for yourself.”

The sysadmin swiftly navigates the Web.

“You are absolutely right, Master.”

After some time, the sysadmin comes back:

“Master, I have managed to generate a good password which is not present in any dictionary.”

Master Ha nods with approval.

“I checked in Google”, the sysadmin goes on. “This combination does not exist on the Internet.”

“Now it does.”

    ----
[1] https://infowatch.livejournal.com/16359.html

[2] https://infowatch.livejournal.com/16525.html

[3] https://infowatch.livejournal.com/59867.html

[4] https://infowatch.livejournal.com/110897.html


What versions of the mentioned packages is this feature included in? I.e. what should I be looking for in my distro updates?


Touchpad gesture implementation in X server has been released in version 21.1

Wayland touchpad gesture implementation in Qt widget framework has been released in version 6.2.0

X11 touchpad gesture implementation in Gtk widget framework has been released in version 4.5.0

X11 touchpad gesture implementation in Qt widget framework will be released in version 6.3.0 (March 2022)

Touchpad gesture implementation for XWayland will be released in version 22.1 (early 2022)


I really like the idea, and the site is pretty. Is there a daily RSS feed too? I cannot find it.


Thank you!

I am currently looking into how to generate an RSS feed. I personally don't use RSS, so I am not sure how you would like it to be implemented; I have two questions:

1. Do you want all articles on HackerDaily added to your RSS feed? (HackerDaily only shows the articles with 100+ points, so that's about 30-50 articles per day)

2. HackerDaily uses your local timezone to find the articles per day; would you like this for the RSS feed as well, or is UTC-based also okay?


Ah, OK, I was wondering which stories showed up. Nice work; I like the day-of-the-week selections -- I thought of maybe having today on the far right.

I also made an attempt (with far less style) to reduce my dopamine-driven visits by showing top stories only for today, so the second visit looks almost identical to the first with 'not much new' to see, then move on. Popular keywords are grouped at the bottom to help surface the rare interesting cuts.


1. Yes, I think all HackerDaily articles should be included. 30-50 articles per day is not that much.

2. I think separate RSS feeds for different timezones might be better (not _every_ timezone, but say, every whole hour offset), so everyone can choose what they want; either based on the timezone or on their daily routine.


I have been using /e/OS on a Fairphone 3 for several months now, and I am quite satisfied.

Yes, it is built on LineageOS, though I am not sure how closely it is tracked.

They have their own cloud services, but you can opt out. You can use your own Nextcloud server as well, which I think is pretty neat: you can have a convenient cloud-based experience (contact and calendar sync etc.) with the stock /e/OS installation, while also having ownership of your data in your Nextcloud instance. (Their own servers are also heavily based on Nextcloud.)

Regarding app stores: I think it is the weakest point. F-Droid is nice but not enough, the /e/OS App Store is a bit murky for me; for most of the non-FOSS apps I use the Aurora Store, which seems trustworthy to me so far.

