The strong decay would simply be forbidden by conservation of energy: if the mass of the Tbb state is less than the sum of the B+ and B0 masses, then that decay isn't allowed.
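A toy numeric sketch of that threshold argument, using the PDG values for the B meson masses; the example Tbb mass below is an illustrative theory-prediction-style number, not a measurement:

```python
# A strong decay Tbb -> B+ B0 is kinematically forbidden
# if the Tbb mass lies below the B+ + B0 threshold.
M_B_PLUS = 5279.34   # MeV, PDG mass of the B+ meson
M_B_ZERO = 5279.66   # MeV, PDG mass of the B0 meson
THRESHOLD = M_B_PLUS + M_B_ZERO  # 10559.0 MeV

def strong_decay_allowed(m_tbb_mev: float) -> bool:
    """True only if the Tbb is heavy enough to decay strongly to B+ B0."""
    return m_tbb_mev > THRESHOLD

# Illustrative: a Tbb predicted ~150 MeV below threshold
print(strong_decay_allowed(10400.0))  # False -> only weak decays remain
print(strong_decay_allowed(10600.0))  # True  -> strong decay open
```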
"at least several weeks" is technically correct but I think a bit of an understatement. A sector takes about 3-4 weeks to warm up and 4-5 weeks to cool down. Then it needs to undergo powering tests and probably some "training" quenches.
Since the winter shutdown is scheduled for the end of October (thanks to the energy crisis), there's a good chance we're finished with proton collisions for the year. If the leak can be fixed and the sector cooled down by mid-to-late September, we might still have time for the heavy-ion run. Last year's ion run was cancelled because the year was shortened, so losing it two years in a row would not be great.
The Future Circular Collider is looking to switch to niobium-tin from the LHC's niobium-titanium. https://en.wikipedia.org/wiki/Future_Circular_Collider Unfortunately it will still run at liquid-helium temperatures: while advanced superconductors have higher critical temperatures, the critical field (which sets the magnet strength) still increases as the magnet gets colder, and that's the figure of merit for collider designers. http://hyperphysics.phy-astr.gsu.edu/hbase/Solids/scbc.html
This is why new fusion reactor designs like SPARC which use HTS superconductors throughout still use mildly exotic cryocoolants like liquid hydrogen: not as expensive as liquid helium, but still better performing than liquid nitrogen. (Not to mention that liquid nitrogen is annoying in nuclear applications: it's easily activated by neutron radiation and deposits monoatomic carbon dust throughout your cryocooler circuit.)
They introduce their own problems. They're typically brittle ceramics, making them hard to work with. They're more expensive to manufacture. And they have a lower critical current density (i.e. they can carry less current before losing superconductivity).
It won't improve our understanding of quantum numbers in general (a quantum number is just a discrete conserved quantity). The final paragraph mentions measuring the quantum numbers of this particular particle, namely its angular momentum and parity.
That’s not the full explanation; the antiproton is stable (or at least as stable as the proton).
It’s that the net baryon number is 0 and that the flavor quantum numbers aren’t conserved by the weak force. If you could turn off the weak force, this cc ubar dbar tetraquark would be absolutely stable.
Quantum mechanics is probabilistic. The probability of producing these particular particles in these particular collisions is very small, so we need a lot of collisions in order to have a statistically significant signal.
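A toy illustration of why (the per-collision probabilities here are invented for illustration, not real LHCb numbers): with a crude significance estimate s/sqrt(b), significance grows only like the square root of the number of collisions, so rare processes need enormous datasets:

```python
import math

# Assumed, purely illustrative per-collision probabilities:
P_SIGNAL = 1e-12       # chance a collision produces the state and we see it
P_BACKGROUND = 1e-9    # chance a background event fakes the same signature

def significance(n_collisions: float) -> float:
    """Crude s/sqrt(b) estimate of the signal significance."""
    s = n_collisions * P_SIGNAL
    b = n_collisions * P_BACKGROUND
    return s / math.sqrt(b)

for n in (1e12, 1e14, 1e16):
    print(f"{n:.0e} collisions -> {significance(n):.2f} sigma")
# 100x more collisions buys only 10x more significance
```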
> but the "camera" has limitations?
Sure. While we regularly upgrade our detectors or build newer, fancier ones, there will always be some limitations.
> but measurement interpretation inefficient?
It's certainly slow. It can take months or years to infer a result from the data. The major bottleneck is people. Even the thousands-strong armies of physicists who work on the LHC experiments would take decades to fully exploit the available datasets.
Your comment kind of implies that theorists analyse the data. (They don't even have access to it.) Analysis is the job of experimental physicists (like me) and can take years.
We don't wait until we have theoretical interpretations before publishing the discovery of a new particle. In fact, our results are kept confidential until the end of the collaboration's internal review process. In the absence of leaks, the first any theorist would have heard of this new particle was the conference talk or press release.
That might be how it works in astronomy, but not in particle physics. The collaborations which build and operate the detectors are also the people who get to analyse the data. There is "open data" but it's released after a long embargo period and is much less complete. (In fact, you won't find any LHCb open data yet because CERN won't give us enough storage to host it.)
Officially, all qualified members sign all papers; the names of the people who worked on a given analysis are not publicised. There is already a preference to work only on analysis, because it's cheap and easy (no need to send people to CERN or kit out expensive labs), but analysis alone doesn't get data collected, so "service work" shouldn't be discouraged by giving extra credit to the people doing analysis.
Institutes sometimes write press releases such as this one, where they exaggerate their involvement. In this instance, only 2 of the 10 people in the analysis group are affiliated with Oxford. It's very disingenuous not to acknowledge that they're part of a larger collaboration.