It’s a very interesting article, but it’s based entirely on a flawed premise: that floating point is the only way to represent numbers in a computer. It’s true that there are numbers that can’t be represented by a computer using any system, but they are so huge that you probably couldn’t represent them any other way either. Say, a random number with a quadrillion digits: a billion digits is totally possible, a trillion is pushing it on consumer hardware, and even then it could probably be done with a million bucks of hardware.
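To make the point concrete, here's a minimal Python sketch of the alternatives floating point gets contrasted with: arbitrary-precision integers and exact rationals from the standard library. The specific sizes are just illustrative, not a claim about what the article's examples used.

```python
from fractions import Fraction

# Arbitrary-precision integers are exact and limited only by memory,
# not by a 53-bit float mantissa.
n = 2 ** 10_000            # a ~3,011-digit integer, stored exactly
assert n + 1 != n          # no rounding; note float(2**10_000) would raise OverflowError

# Exact rational arithmetic sidesteps the classic binary-float surprise:
assert 0.1 + 0.2 != 0.3                                       # floats round
assert Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10)   # exact
```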
It all hinges on the mentioned tract, “Dangers of Computer Arithmetic”.
FWIW, many of the mathematicians I knew who worked on the early iterations of the CAYLEY|Magma symbolic algebra system were equally ill-suited to multiplying large numbers and bookkeeping - if not by numerical manipulation talent, then certainly by temperament.
Nonetheless, they managed to cobble together a computer math system that could generate a headline:
Quantum encryption algorithm cracked in minutes by computer running Magma