It's just nowhere near feasible. Each element would need power, orientation, precise positioning and a data link to the processing stations. For SKA-Low the raw data rate from all antennas is something like 2 Pb/s, which is more than all of Starlink combined. That gets massively stepped down to about 7 Tb/s by the central processor, a supercomputer with purpose-built signal processing hardware, and the next stage takes it down to roughly 100 Gb/s. In space you would likely have to transmit the data via radio links, which would defeat the purpose of going there. When a radio telescope is built in space it will likely be on the Moon (or in orbit), designed for lower frequencies and much less ambitious than SKA-Low.
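To make the scale of that reduction concrete, here's the arithmetic on those three figures (the rates are the ones quoted above; the stage labels are the usual SKA names):

```python
# Back-of-envelope reduction factors for the SKA-Low data pipeline,
# using the approximate rates quoted above.
raw_rate = 2e15    # raw rate from all antennas, bits/s (2 Pb/s)
after_csp = 7e12   # after the central signal processor, bits/s (7 Tb/s)
after_sdp = 100e9  # after the next processing stage, bits/s (100 Gb/s)

print(f"First stage:  {raw_rate / after_csp:,.0f}x reduction")   # ~286x
print(f"Second stage: {after_csp / after_sdp:,.0f}x reduction")  # 70x
print(f"Overall:      {raw_rate / after_sdp:,.0f}x reduction")   # 20,000x
```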
I still don't understand the case, sorry. Actually, the main thing seems to be that we're not working with the same set of facts.
> "would likely have to transmit the data via radio links"
No; you'd use free-space optical communication, which doesn't interfere at all and which Starlink has pioneered. They have working laser links at 200 Gbps per link.
The optical bandwidths potentially accessible in vacuum are much wider than those of microwave links to/from Earth.
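To put rough numbers on that (the carrier choices and 1% fractional bandwidth are my own illustrative assumptions, not figures from this thread):

```python
# Illustrative spectrum comparison at equal fractional bandwidth.
c = 3e8                  # speed of light, m/s
f_optical = c / 1550e-9  # ~193 THz carrier at 1550 nm (common telecom band)
f_ka = 30e9              # ~30 GHz Ka-band microwave carrier

frac = 0.01              # assume 1% fractional bandwidth for both
print(f"Optical:   {f_optical * frac / 1e12:.1f} THz of spectrum")  # ~1.9 THz
print(f"Microwave: {f_ka * frac / 1e9:.1f} GHz of spectrum")        # 0.3 GHz
print(f"Ratio:     ~{f_optical / f_ka:,.0f}x")                      # ~6,450x
```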
> "a supercomputer with purpose built signal processing hardware"
I don't see why you couldn't put that in space today. The SKA signal processing system you're talking about amounts to 100 petaflops, drawing 2 MW of power. That's far less power than Starlink already has in orbit right now (somewhere in the tens of megawatts). It's even within a factor of 10 in raw compute: the figures I found say each V2 Starlink has 1.2 TFLOPS of local processing.
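A quick sanity check on that factor-of-10 claim (the ~7,000-satellite fleet size is my own assumption, not a figure from this thread):

```python
# Rough comparison of SKA's signal-processing compute against the
# Starlink fleet, using the figures cited above plus an assumed fleet size.
ska_compute = 100e15  # SKA signal processing, FLOPS (100 PFLOPS)
per_sat = 1.2e12      # per V2 Starlink satellite, FLOPS (1.2 TFLOPS)
fleet = 7000          # assumed satellite count (my estimate, for illustration)

fleet_compute = per_sat * fleet
print(f"Fleet compute: {fleet_compute / 1e15:.1f} PFLOPS")   # ~8.4 PFLOPS
print(f"Shortfall:     ~{ska_compute / fleet_compute:.0f}x") # ~12x
```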
I don't understand why it'd be impractical to put an equivalent signal-processing system in orbit. At any rate, there are YC startups getting funded for space-compute proposals more ambitious than that.
Laser links would make everything more complicated and expensive. Radio antennas can share the same hardware to link with many terminals; not so with laser systems. They are point-to-point, so you need many more of them. They also require precise pointing, so you can forget passive spacecraft. And the figure of 42 PB per day is about three orders of magnitude lower than the data rates required by SKA-Low.
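For anyone checking that, the unit conversion (42 PB/day is the figure above for Starlink's laser mesh; 2 Pb/s is SKA-Low's raw rate from earlier in the thread):

```python
# Convert Starlink's quoted 42 PB/day of laser-link throughput to bits/s
# and compare against SKA-Low's ~2 Pb/s raw output.
starlink_daily = 42e15 * 8              # 42 petabytes/day -> bits/day
starlink_rate = starlink_daily / 86400  # sustained rate, ~3.9 Tb/s
ska_low_raw = 2e15                      # bits/s (2 Pb/s)

print(f"Starlink laser mesh: {starlink_rate / 1e12:.1f} Tb/s")
print(f"SKA-Low raw rate is ~{ska_low_raw / starlink_rate:.0f}x higher")  # ~500x
```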
> I don't see why you couldn't put that in space today.
Because the electronics are not built for space. Space-rated electronics are about a decade behind their ground-based siblings. Radiation is one of the biggest problems, degrading electronics and disrupting digital memory. You also need petabytes of temporary storage, which, again, doesn't exist. For reference, JWST carries a 70 GB solid state recorder, and I think Roman has about 1 TB. Then there is all the cooling, which is difficult in space. Saying there are some start-ups considering space computing is not the same as it being possible to order a system like this today.
The question is not whether it could theoretically be done, but whether it can be done on anything like the budget of a space science mission. And the answer to that is no. SKA pushes computing and technology to the limit on Earth, where these things are much more advanced.
> Cold Dark Matter, as used in simulations etc., usually has six free parameters.
That is not correct. LCDM as a cosmology is described by 6 parameters, mostly for CMB analyses, but that is much more than just CDM. These parameters are the amounts of normal matter and dark matter, the reionization optical depth, the Hubble constant (equivalently the cosmological constant, assuming flatness) and two parameters which describe the initial fluctuations. CDM itself only has one parameter in the model: its density. As you can see there aren't many knobs to turn. These parameters are also fixed to observational values. And if you think you can fit all modern cosmological data with a physically meaningful model and fewer parameters, then go ahead.
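For concreteness, a sketch of the standard six-parameter set (the values are approximate Planck 2018 best fits; exact numbers vary slightly between analyses):

```python
# The six base parameters of LCDM, with approximate Planck 2018 values.
# Note that only omega_c describes the cold dark matter itself.
lcdm_params = {
    "omega_b":    0.0224,    # physical baryon density, Omega_b * h^2
    "omega_c":    0.120,     # physical CDM density, Omega_c * h^2  <- the one CDM knob
    "theta_MC":   0.010411,  # angular scale of the sound horizon
    "tau":        0.054,     # optical depth to reionization
    "ln_1e10_As": 3.044,     # amplitude of the initial fluctuations
    "n_s":        0.965,     # tilt of the initial fluctuations
}
# H0 and the cosmological constant are derived quantities under flatness.
```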
Your suggestion about carbon is not falsifiable observationally. With real data you can only place an observational upper limit; you cannot measure the abundance to be exactly zero. Without a quantitative, calculated prediction of the carbon abundance it cannot be falsified. Similarly, you can only test direct-collapse black holes if you have some way of finding them, and their observational properties depend on the formation scenario. You also need the expected number density and redshifts of such objects to reject anything.
Dark matter is also not falsifiable observationally; every time a supposed DM effect fails to be observed, it's just assumed it's darker than expected.
One could re-postulate the theory as innumerable tiny hands of god pushing on mass in the divinely chosen direction, and nothing would really change but the name of the theory.
The hands are there, they're just smaller than the resolving power.
At some point it's time to admit fault, but so far that's not happening despite the ever-accumulating pile of evidence against DM. For a supposedly mature mainstream theory, the proponents are surprisingly fragile.
Let me start by saying there are a lot of false claims in the dark matter section.
It's also filled with self-contradiction, declaring DM wrong and later pronouncing LCDM unfalsifiable.
The pillars of modern cosmology are the ability to quantitatively describe and predict large-scale structure, the expansion history of the universe, the CMB and primordial nucleosynthesis.
Can this "model" calculate any of those things? No. What the author has here is some ideas, not a model.
To demonstrate you can even reproduce the Cosmic Web, you have to actually run some calculations or simulations.
How do you know AGN bubbles produce a universe that looks anything like ours?
The author dismisses simulations as "not science", while paradoxically using them as the only representation of the cosmic web in the article.
These simulations have a lot of value; they demonstrate that standard cosmology and normal gravity have no problem forming voids and filaments.
These simulations have been compared to countless new observations, something this model cannot do because it's purely qualitative.
The article says these simulations are worthless because they don't work from first principles, but that is a practical limitation: you cannot simulate galaxies down to the resolution of atoms on any existing computer, so you have to make some simplifications. The structure of the cosmic web is seen in all of them, even going back to very early simulations; it doesn't depend on these assumptions.
And at the end of the article we go back to the problem of dark matter, and find out the author has no explanation for rotation curves or other classical tests of DM.
So despite bashing DM cosmology, this model explains none of the pillars of evidence for dark matter. At some point in developing an idea like this you need to actually start applying physics, either with calculations or simulations. Every new hypothesis is perfect before it has been subjected to rigor and analysis.
I agree with most of what you've said here, particularly the following line which I'm copying for emphasis because I think it's incredibly important.
> These simulations have a lot of value; they demonstrate that standard cosmology and normal gravity have no problem forming voids and filaments.
That being said, I think the author intends for this article to be more of a call to action than an actual result. Simulations aren't cheap; somebody needs to actually do the work. The point that there aren't any simulations without dark matter is an important one too.
One can do simple simulations on a laptop which show the cosmic web, so it's not really an excuse for not having tried. There are lots of claims in the article which need to be justified, and in science that comes before making big claims.
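As an illustration of how little it takes, here is a toy sketch (entirely my own illustrative code, not from any paper; the power-law spectrum and "growth" factor are arbitrary choices): the Zel'dovich approximation in 2D already turns a Gaussian random field into a web of filaments and voids.

```python
# Toy 2-D Zel'dovich approximation: displace particles along the gradient
# of a Gaussian random potential. Even this crude sketch develops the
# familiar web of filaments and voids.
import numpy as np
import matplotlib.pyplot as plt

N = 256  # grid cells per side
np.random.seed(42)

# Fourier-space wavenumber grid
k = np.fft.fftfreq(N) * 2 * np.pi
kx, ky = np.meshgrid(k, k, indexing="ij")
k2 = kx**2 + ky**2
k2[0, 0] = 1.0  # avoid division by zero at k = 0

# Gaussian random density field with a simple power-law spectrum P(k) ~ k^-2
noise = np.fft.fft2(np.random.normal(size=(N, N)))
delta_k = noise / np.sqrt(k2)  # amplitude ~ sqrt(P(k)) ~ 1/k
delta_k[0, 0] = 0.0

# Zel'dovich displacement field: psi(k) = i k delta(k) / k^2
psi_x = np.real(np.fft.ifft2(1j * kx / k2 * delta_k))
psi_y = np.real(np.fft.ifft2(1j * ky / k2 * delta_k))

# Move particles from a uniform grid along the displacement field
qx, qy = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
growth = 4.0  # "growth factor": bigger = more evolved (arbitrary here)
x = (qx + growth * psi_x / psi_x.std()) % N
y = (qy + growth * psi_y / psi_y.std()) % N

plt.figure(figsize=(6, 6))
plt.scatter(x, y, s=0.05, alpha=0.3, c="k")
plt.title("Zel'dovich approximation: filaments and voids")
plt.show()
```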
These simulations take their simple initial conditions from the Cosmic Microwave Background fluctuations, but models without dark matter fail to match the observed CMB. There are no major baryon-only simulations because cosmology doesn't work without DM, so you have nothing to start from. You need a quantitative model which works on some level to even begin; people have tried with modified gravity models.
We need a model that includes electromagnetism. The author isn't the only one making this claim. When we do magnetohydrodynamic cosmological sims we consistently find surprising effects. The recent simulation showing that black hole accretion disks are supported by magnetism comes to mind.[0][1]
Apologies, I know this is typically considered bad form, but have you gotten to the following section in the article?[2] It appears to directly contradict your claims.
> MOND’s also been around since the early 1980s, but, in 2021, it finally developed a model – the Aether-Scalar-Tensor framework, or AeST – which ALSO maps perfectly onto the acoustic peaks revealed by WMAP and Planck. (It does it by proposing a new vector field and scalar field that duplicate the effects of Cold Dark Matter in the early universe...
Many state-of-the-art galaxy formation simulations use MHD; it does not, however, affect the formation of the Cosmic Web.
> MOND’s also been around since the early 1980s, but, in 2021, it finally developed a model – the Aether-Scalar-Tensor framework, or AeST – which ALSO maps perfectly onto the acoustic peaks revealed by WMAP and Planck. (It does it by proposing a new vector field and scalar field that duplicate the effects of Cold Dark Matter in the early universe
This doesn't contradict what I said. These models are not simply removing DM; I said people have tried with modified gravity models like these. First, the models which fit the CMB were engineered to do so, and each has many more free parameters than dark matter. And note they have replaced invisible matter with an invisible matter-like field; it's not a simplification. Remember that cold dark matter predicted these features with fewer parameters, and it is a physical model, not merely a fit. People have run simulations with the more basic MOND models and found it cannot form realistic structure. Generally it forms structure more quickly than standard cosmology. They find they need to add dark matter to their already modified gravity models to get something reasonable.
> Many state-of-the-art galaxy formation simulations use MHD...
> Generally [MOND] forms structure more quickly than standard cosmology.
Magnetohydrodynamics, not just hydrodynamics. It would need some type of restoring force, similar to how the black hole simulation behaved once magnetic fields were included.
> People have run simulations with the more basic MOND models and found it cannot form realistic structure.
Neither does ΛCDM though, right? It matches the CMB, sure, but it doesn't match many of the observed properties of the universe either. The observed voids and over-densities are significantly larger than predicted, nor is the observed distribution of dark matter in galaxies what we predict.
> And note they have replaced invisible matter with an invisible matter-like field; it's not a simplification.
Great point, and I'm happy to admit I missed this.
But I'm not actually a proponent of MOND, nor even an opponent of ΛCDM. It's the best we've got. I just feel it's not quite as settled as the current consensus treats it, so I'm glad there are people out here still trying to poke holes in it and find alternatives.
I think the point is that for filaments to form in simulations you have to assume dark matter to be so abundant that it stops explaining the other things it's invoked to explain. Basically you can adjust dark matter parameters to explain each thing individually, but you can't find a single set of parameter values that explains all of them at the same time.
That's not the case. The hydrodynamical simulations mentioned use the same cosmological values set by CMB data. The ratio is not the same in every galaxy, because matter and DM behave differently on small scales, but this variation comes out naturally in the simulations.