The interesting aspect of the Cloudflare support, which is not clarified, is how they came to the risk assessment that it is okay to roll out a change globally, without a gradual rollout and without testing the procedure first. The only justification I can see is that the React/Next.js remote command execution vulnerabilities are being actively exploited. But if that is the case, they should say so.
This thing crashes on startup on Ubuntu 24.04 LTS. Apparently all these agents are unable to ensure that a desktop app even starts on a popular Linux distribution.
If Google has forgotten how to do software, then the future doesn't look bright.
In my org nobody has admin rights except in emergencies, but we have ended up with a directory full of GitHub workflows, and nobody knows which of them are currently supposed to work.
Nothing beats people knowing what they are doing and cleaning up after themselves.
The statement "An interface type A is assignable to interface type B if A’s methods are a subset of B’s." is wrong, and it is not in the current Go language specification. The author misunderstood the term "type set" from the spec: the type set of an interface is the set of types that implement the interface, not the interface's set of methods. With the correct meaning of type set, the subset relation makes sense again: A is assignable to B if A's type set is a subset of B's.
> A value x of type V is assignable to a variable of type T ("x is assignable to T") if
> ...
> T is an interface type, but not a type parameter, and x implements[1] T
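A small Go sketch of the corrected statement (all identifiers below are invented for illustration): `Reader`'s method set is a subset of `ReadWriter`'s, so every type implementing `ReadWriter` also implements `Reader`. In type-set terms, `ReadWriter`'s type set is a subset of `Reader`'s, and assignment compiles in exactly that direction.

```go
package main

import "fmt"

// ReadWriter's method set is a superset of Reader's, so ReadWriter's
// type set (the set of types implementing it) is a SUBSET of Reader's.
type Reader interface {
	Read() string
}

type ReadWriter interface {
	Read() string
	Write(s string)
}

type file struct{ data string }

func (f *file) Read() string   { return f.data }
func (f *file) Write(s string) { f.data = s }

func main() {
	var rw ReadWriter = &file{data: "hello"}
	var r Reader = rw // ok: ReadWriter's type set ⊆ Reader's type set
	fmt.Println(r.Read())
	// var rw2 ReadWriter = r // compile error: a Reader may lack Write
}
```

The commented-out last line is the direction the article's wording would wrongly allow.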
The density of a black hole decreases with the inverse square of its mass. That means massive black holes have a much lower density than small black holes, so they are more likely to form than small ones. Dark matter will have played an important role in the creation of those early black holes. If there is no dark matter and some MOND-type theory of gravity is correct, the Schwarzschild formula will require a modification for large black holes. In that case galaxy centers would not require large masses to produce the same observed effects.
> That means massive black holes have a much lower density than small black holes. So they are more likely to form than small black holes.
No, it doesn't, because you're ignoring all the physics required to get the mass into a small enough region so that it will collapse to form a black hole. The only way to make that happen that we know works is to form massive stars, where a fraction of the star's mass, in the center of the star, will eventually form a black hole. But the largest stars we know of have masses of ~ 200 times the Sun, and so can't form black holes more than a fraction of that.
If you imagine a more massive gas cloud collapsing under its own gravity, it will fragment into subclumps before getting very dense; these subclumps will themselves fragment or go on to form stars directly, but with an upper limit of, say, ~ 200 solar masses.
(It's possible that if you start with a cloud of pristine gas in the early universe -- nothing but hydrogen and helium -- that it might collapse to form a single supermassive star, or even a black hole directly. That might give you something like a 1000-solar-mass black hole. But that's still fairly speculative, and requires unusual conditions that don't exist generally.)
I think the claim is like this: You have the primordial universe, shortly after the big bang, with fluctuations in density. But the whole density is very high. A fluctuation over a large region could put the region over the threshold to become a black hole, because the density required for that to happen is lower than for a small region.
Mind you, I don't know if this actually works. What was the density of the early universe, compared to the density required to form a black hole? How large were the fluctuations? Is this scenario plausible at all?
I suppose that if you go back close enough to the big bang, then you can get a density high enough. But then, if you go back not much farther, shouldn't the whole universe have formed a black hole? And if it didn't, can we trust the logic that says that the situation should have led to the formation of giant black holes?
I suppose OP defines it as the mass of the BH, divided by the apparent volume taken up by the BH (more precisely: the apparent horizon), as seen from the outside. Put differently, for a Schwarzschild BH: Density ~ M/R³ (modulo constant prefactors) ~ 1/M², since the Schwarzschild radius is linear in M.
The event horizon is a three-dimensional null hypersurface, though, encompassing a four-dimensional spacetime "volume". You are probably referring to the two-dimensional apparent horizon, which depends on the spatial slicing.
There are two scenarios. First: Microsoft uses the JWT signing keys in memory, and the attackers were able to get at them by injecting code or by obtaining a memory image of such a process. Second: Microsoft actually uses HSMs but has to distribute the keys geographically, and the attackers were able to get access to a key that way.
The first scenario is more likely, but you cannot exclude the second either.
The comments are full of statements about the security capabilities of passkeys. But there is no public specification that even defines requirements for exchanging passkeys between devices. Google and Apple make statements on their websites regarding security, but all of it is practically unverifiable. Note that end-to-end encryption is useless if you do not control all the endpoints.
Sites could of course use the device public key extension of the WebAuthn protocol to rely on more than a private key opaquely copied between devices, but I wonder who will even know about it, let alone use it. Google has stated they support the extension, but I cannot find a statement by Apple; a question whether DevPubKey is supported by Apple is unanswered on the Apple Developer Forums.
It is telling to me that the passkey spec has provisions for attestation, which enable lock-in by providers and lock-out by websites based on your provider, while questions of backup, account recovery, and interoperability between providers get a hand-wavy "the market will figure it out" response.
"Der kluge Hans" is not directly a Brothers Grimm title. There is a story "Der kluge Knecht" ("The Clever Farmhand"), whose hero is called "kluger Hans" within the story. There is another story called "Der gescheite Hans", which means exactly the same thing. Neither is a fairy tale; both are droll stories about a simple-minded person.