I'm a postgraduate researcher working on GNSS interferometric reflectometry (GNSS-IR), a related but slightly different technique.
I'll shout out this awesome open source project, where you can use GNSS-IR and your own RINEX files to measure tides or soil moisture around your antenna: https://gnssrefl.readthedocs.io/en/latest/
(Your antenna needs to be near the sea or bare soil, respectively)
You've got exactly the right idea, except "cross compare" is underselling it :)
Here's a previous thread on this topic[0].
For each (receiver, satellite) pair, you can calculate the TEC along the signal propagation path by comparing the time of flight of two carrier waves (e.g. L1 and L2)[1].
By fusing the data from every line of sight together, you can build a rough, real-time 3D (4D, once you include time) model of the ionosphere. Then you have a separate problem of identifying ionospheric anomalies in the model and relating them to phenomena like earthquakes.
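As a rough sketch of the per-path step (Python, with hypothetical pseudorange values; the 40.3 constant is the standard first-order ionospheric delay term):

    # Slant TEC from dual-frequency pseudoranges (geometry-free combination).
    # First-order ionospheric delay: I = 40.3 * TEC / f**2 (SI units).
    F_L1 = 1575.42e6  # GPS L1 carrier frequency, Hz
    F_L2 = 1227.60e6  # GPS L2 carrier frequency, Hz

    def slant_tec_tecu(p1: float, p2: float) -> float:
        """Slant TEC in TECU from L1/L2 pseudoranges (metres).

        Differencing p2 - p1 cancels the geometric range and clock terms,
        which are common to both signals, leaving only the dispersive
        ionospheric delay.
        """
        factor = (F_L1**2 * F_L2**2) / (40.3 * (F_L1**2 - F_L2**2))
        return factor * (p2 - p1) / 1e16  # 1 TECU = 1e16 electrons/m^2

    # A 5 m differential delay corresponds to roughly 48 TECU:
    print(slant_tec_tecu(20_000_000.0, 20_000_005.0))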
Aerial survey LiDAR can process multiple returns from a single laser pulse. So, some energy might be reflected back from a leaf, but some energy will pass through (or around) the leaf, hit the ground, then reflect back to the sensor. Some systems can record 5+ points from a single laser pulse.
With this information, you can filter the point cloud to only include points from the final return, which is likely to be the ground/a solid surface unless the vegetation is very dense.
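With laspy, for example, that filter is a one-liner (a sketch; "survey.las" is a hypothetical input, and the field names come from the LAS spec):

    import numpy as np
    import laspy  # pip install laspy

    las = laspy.read("survey.las")  # hypothetical input file

    # A point is a "last return" when its return number equals the total
    # number of returns recorded for its pulse.
    is_last = np.asarray(las.return_number) == np.asarray(las.number_of_returns)
    last_returns = las.points[is_last]

    print(f"{is_last.sum()} of {len(las.points)} points are last returns")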
You don't even need multi-return: typically your point cloud will have points from the tree (or whatever) plus some that returned from the structure behind it.
A raw point cloud is run through a series of processing steps to label each point with a class, e.g. "Ground", "Low/Medium/High Vegetation", "Building", "Transmission Tower", etc.
There will be a different algorithm for each feature class. For example, points that are part of a building might be identified by finding groups of points that form a very flat surface. ML models can also do this based on training data.
The final digital elevation model (DEM) then just takes the points in the "Ground" class from the classified point cloud and uses them to triangulate a surface. This differs from a digital surface model (DSM), which triangulates a surface from ground + building + vegetation points.
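A toy version of that last step (assuming the ASPRS class codes, where 2 = Ground, and synthetic points in place of a real cloud):

    import numpy as np
    from scipy.interpolate import griddata

    # x, y, z, cls are per-point arrays; ASPRS class 2 = "Ground".
    rng = np.random.default_rng(0)
    x, y = rng.uniform(0, 100, (2, 5000))
    z = 0.05 * x + rng.normal(0, 0.1, 5000)  # gently sloping terrain
    cls = rng.choice([2, 5, 6], 5000)        # Ground / High Veg / Building

    g = cls == 2                             # keep only Ground points
    xi, yi = np.meshgrid(np.arange(0, 100), np.arange(0, 100))
    dem = griddata((x[g], y[g]), z[g], (xi, yi), method="linear")

    # A DSM would instead grid the highest return per cell over all classes.
    print(dem.shape, np.nanmean(dem))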
It's a bit pricier but very capable and well documented. There is a physical switch that lets the modem take power over a separate USB port from the data connection, so the transmission power can bypass the Pi's electronics.
I noticed during testing that the 4G connection would sometimes drop and require manual intervention to reset... So I added a systemd timer to test the connection and bring the interface down/up again whenever it disconnects. No problems since then, now with two months of uptime with ~200MB upload per day :)
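The check script itself is trivial; a sketch of the idea (Python for illustration — the wwan0 interface name and the exact reset commands are assumptions that depend on your modem setup):

    import subprocess
    import sys

    IFACE = "wwan0"    # assumed modem interface name
    PROBE = "8.8.8.8"  # any reliably reachable host works

    def online() -> bool:
        # One ping with a short timeout; returncode 0 means a reply came back.
        r = subprocess.run(["ping", "-c", "1", "-W", "5", PROBE],
                           capture_output=True)
        return r.returncode == 0

    if not online():
        # Bounce the interface, then let the next timer run re-check.
        subprocess.run(["ip", "link", "set", IFACE, "down"], check=True)
        subprocess.run(["ip", "link", "set", IFACE, "up"], check=True)
        sys.exit(1)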
I'm running a Raspberry Pi based GNSS receiver from a 26 Ah SLA battery and an 80W panel. Just passed 2 weeks of uptime in a cloudy period of southern hemisphere autumn.
A Monte Carlo simulation using historical conditions said it had a ~95% chance of no downtime over 3 winter months. A slightly larger battery would bring that up to 99%.
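The simulation doesn't need to be fancy; a sketch of the idea (the winter yield distribution here is a placeholder, not my actual site data):

    import numpy as np

    rng = np.random.default_rng(42)
    CAPACITY_WH = 26 * 12 * 0.5  # usable Wh, ~50% depth of discharge for SLA
    LOAD_WH_DAY = 3.5 * 24       # ~3.5 W continuous draw
    N_RUNS, N_DAYS = 10_000, 90  # 3 winter months

    failures = 0
    for _ in range(N_RUNS):
        soc = CAPACITY_WH        # start full
        # Placeholder winter yield: lognormal spread around ~100 Wh/day.
        harvest = rng.lognormal(mean=np.log(100), sigma=0.6, size=N_DAYS)
        for h in harvest:
            soc = min(CAPACITY_WH, soc + h) - LOAD_WH_DAY
            if soc <= 0:
                failures += 1
                break

    print(f"P(no downtime) ~ {1 - failures / N_RUNS:.3f}")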
The Pi (3B+), GNSS receiver (u-blox ZED-F9P), and Waveshare 7600G 4G modem average about 3.5 W idle. The GNSS receiver is about 0.1 - 0.2 W of that. Wi-Fi would be more energy efficient, I imagine.
It's functionally equivalent to an RTK base station (the configuration script I'm using is even called "RTKbase"[0]), but it's being used for researching GPS-based soil moisture retrieval[1]. Basically the GPS signal bounces off the ground and causes an interference pattern that changes based on the wetness of the soil.
There is actually a permanent survey-grade GNSS receiver about 200 m away from the u-blox receiver. But the geography around it (too hilly) means it doesn't work for soil moisture retrieval.
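The interference pattern shows up as an oscillation in the SNR data as the satellite rises or sets, and the core retrieval step is a Lomb-Scargle periodogram of SNR against sin(elevation) — this is essentially what gnssrefl automates. A self-contained sketch with synthetic data:

    import numpy as np
    from scipy.signal import lombscargle

    LAM = 0.1903  # GPS L1 wavelength, m
    H_TRUE = 2.0  # reflector height to "recover", m

    # Synthetic SNR arc: direct + reflected signals interfere with a
    # frequency of 2*h/lambda in sin(elevation) "time".
    elev = np.radians(np.linspace(5, 25, 400))
    x = np.sin(elev)
    snr = np.cos(4 * np.pi * H_TRUE / LAM * x)

    # Scan candidate reflector heights; the periodogram peak is the estimate.
    heights = np.linspace(0.5, 6.0, 500)
    omega = 4 * np.pi * heights / LAM  # angular frequency per height
    power = lombscargle(x, snr, omega)
    print(f"estimated reflector height: {heights[np.argmax(power)]:.2f} m")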
>> They're measuring it by looking for phase differences in the received L-band (~2GHz) signals
The "L-Band signals" are GNSS signals, for example GPS L1 and L2, which use a carrier wavelength of 1575.42 MHz and 1227.6 MHz, respectively. Both L1 and L2 signals are emitted at the same time, but experience differing levels of delay in the ionosphere during their journey to the receiver. The delay is a function of total electron content (TEC) in the ionosphere and the frequency of the carrier wavelength. Since we already know precisely how carrier frequency affects the ionospheric delay, comparing the delay between L1 and L2 signals allows us to calculate the TEC along the signal path.
Another way to think of it: the equation for signal path delay looks like it has two unknowns (TEC and frequency), but the frequency is known, so TEC is the only ionospheric unknown. The catch is that the geometric range is also unknown, so you use two signals and solve the two equations simultaneously. Use additional signals (like L5) to reduce your error and check your variance.
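Spelled out with numbers (hypothetical pseudoranges, using the standard first-order delay model P_i = rho + 40.3 * TEC / f_i**2):

    import numpy as np

    K = 40.3                       # ionospheric delay constant (SI units)
    f1, f2 = 1575.42e6, 1227.60e6  # GPS L1/L2, Hz

    # Measured pseudoranges (hypothetical, metres).
    P1, P2 = 20_000_003.0, 20_000_004.9

    # Two equations, two unknowns (rho, TEC): solve the linear system.
    A = np.array([[1.0, K / f1**2],
                  [1.0, K / f2**2]])
    rho, tec = np.linalg.solve(A, np.array([P1, P2]))
    print(f"rho = {rho:.1f} m, TEC = {tec / 1e16:.1f} TECU")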
Apparently, TAI specifically defines the second (in terms of cesium transitions) at sea level (where gravitational potential is equal). I never knew that second part.