Off-topic, but I really appreciate golang, so I'm always on the lookout for modern alternatives; I found age and it struck me as brilliant, for what it's worth.
But I was discussing it with some techies once, and someone mentioned to me that it had less entropy (I think they mentioned 256 bits of entropy), whereas they wanted 512 bits of entropy, which PGP supported.
I could be wrong about what exactly they talked about since it was a long time ago, so pardon me if that's the case, but are there any "issues" that you know about in age?
Another thing regarding the transparency servers: what actually happens if the servers go down? Do you have any thoughts on having fediverse-like capabilities, perhaps? And are there any issues/limitations of the transparency keyserver that you'd like to discuss?
Also your work on age has been phenomenal so thank you for creating a tool like age!
> But I was discussing it with some techies once, and someone mentioned to me that it had less entropy (I think they mentioned 256 bits of entropy), whereas they wanted 512 bits of entropy, which PGP supported.
> I could be wrong about what exactly they talked about since it was a long time ago, so pardon me if that's the case, but are there any "issues" that you know about in age?
Entropy bikeshedding is very popular among PGP / GnuPG enthusiasts, but it's silly.
age uses X25519, HKDF-SHA256, ChaCha20, and Poly1305. Soon it will also use ML-KEM-768 (post-quantum crypto!). This is all very secure crypto. If a quantum computer turns out to be infeasible to build on Earth, I predict none of these algorithms will be broken in our lifetime.
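For the curious, the Go library behind the CLI exposes X25519 recipients directly; encrypting and decrypting looks roughly like this (a minimal sketch, error handling abbreviated):

    package main

    import (
        "bytes"
        "fmt"
        "io"
        "log"

        "filippo.io/age"
    )

    func main() {
        // Generate an X25519 identity (private key) and derive its recipient (public key).
        identity, err := age.GenerateX25519Identity()
        if err != nil {
            log.Fatal(err)
        }

        // Encrypt to the recipient.
        var buf bytes.Buffer
        w, err := age.Encrypt(&buf, identity.Recipient())
        if err != nil {
            log.Fatal(err)
        }
        io.WriteString(w, "hello age")
        w.Close()

        // Decrypt with the identity.
        r, err := age.Decrypt(&buf, identity)
        if err != nil {
            log.Fatal(err)
        }
        out, _ := io.ReadAll(r)
        fmt.Println(string(out))
    }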
PGP supports RSA. That's enough reason to avoid it.
Eh. You don't really get to do this sleight of hand. If you're gonna rag on RSA support as a shibboleth for bad design, it's bad for GPG and bad for age. If it's direct evidence of bad design, age shouldn't have permitted it via its SSH key support.
I agree in principle, but I'm not looking at "what SSH dragged in". I'm looking at age as a pure isolated thing, according to the spec: https://github.com/C2SP/C2SP/blob/main/age.md
This transparency keyserver actually gives us an excellent opportunity to measure how many people use Curve25519 vs RSA, even with SSH support.
We should contrast this with actively valid public keys on a PGP keyserver in 2026 and see which uses modern crypto more. The results probably won't be surprising ;)
We've moved from "PGP supports RSA. That's enough reason to avoid it." to "We should contrast this with actively valid public keys on a PGP keyserver in 2026 and see which uses modern crypto more".
Bearer tokens should be replaced with schemes based on signing, and the private keys should never be directly exposed (if they are, there's no difference between them and a bearer token). Signing agents do just that. GitHub's API is based on HTTP, but mutual TLS authentication with a signing agent should be sufficient.
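To make the mTLS part concrete, a client presenting a certificate in Go looks roughly like this (a minimal sketch; the host and file paths are placeholders, and with a signing agent the private key would instead be a crypto.Signer backed by the agent rather than a file on disk):

    package main

    import (
        "crypto/tls"
        "log"
        "net/http"
    )

    func main() {
        // Load the client certificate; with a signing agent the private key
        // would never touch disk and would only be exposed as a signer.
        cert, err := tls.LoadX509KeyPair("client.crt", "client.key")
        if err != nil {
            log.Fatal(err)
        }

        client := &http.Client{
            Transport: &http.Transport{
                TLSClientConfig: &tls.Config{
                    Certificates: []tls.Certificate{cert},
                },
            },
        }

        // Authentication happens in the TLS handshake; no bearer token in the request.
        resp, err := client.Get("https://api.example.com/repos")
        if err != nil {
            log.Fatal(err)
        }
        defer resp.Body.Close()
        log.Println(resp.Status)
    }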
You don't need any third-party modules and can proxy based on ALPN (https://wiki.xmpp.org/web/Tech_pages/XEP-0368#nginx), thus running everything on port 443. Note that ALPN is not encrypted, AFAIK, but public Wi-Fi services don't care.
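With the stock stream module it's something like this (a sketch; the backend addresses are made up):

    stream {
        # Route on the ALPN protocol announced in the (unencrypted) ClientHello.
        map $ssl_preread_alpn_protocols $backend {
            ~\bxmpp-client\b  127.0.0.1:5222;   # XMPP clients
            default           127.0.0.1:8443;   # everything else, e.g. HTTPS
        }

        server {
            listen 443;
            ssl_preread on;       # inspect the ClientHello without terminating TLS
            proxy_pass $backend;
        }
    }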
Poor-quality analogy: should ed25519 only have been incorporated into protocols in conjunction with another cryptographic primitive? Surely requiring a hybrid with ecdsa would be more secure? Why didn't djb argue that everyone using ed25519 should use a hybrid? Was he trying to reduce security?
The reason this is a poor-quality analogy is that ecdsa and ed25519 are fundamentally similar enough that people had a high degree of confidence there was no fundamental weakness in ed25519, so it was fine. For PQC, by contrast, the newer algorithms are meaningfully mathematically distinct, and the fact that SIKE turned out to be broken is evidence that we may not have enough experience and tooling to be confident that any of them are sufficiently secure on their own; so a protocol using PQC should use a hybrid algorithm with something we have more confidence in. The counter to that is that SIKE was meaningfully different in terms of what it is and does, cryptographers apparently have much more confidence in the security of Kyber, and hybrid algorithms are going to be more complicated to implement correctly, have worse performance, and so on.
And the short answer is that a lot of experts, including several I know well and would absolutely attest are not under the control of the NSA, seem to feel that the security benefits of a hybrid approach don't justify the drawbacks. This is a decision where entirely reasonable people can disagree, and there are people other than djb who do disagree with it. But only djb has engaged in a campaign of insinuating that the NSA has been controlling the process with the goal of undermining security.
> seem to feel that the security benefits of a hybrid approach don't justify the drawbacks.
The problem with this statement, to me, is that we know at least one of the four finalists in the post-quantum cryptography competition is broken, so it's very hard to assign a high probability that the rest of the algorithms will stay secure through another decade of advances (this is not helped by the fact that, since the beginning of the contest, the lattice-based methods have lost a significant number of bits of security as better attacks have been discovered).
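For concreteness, the "hybrid" being argued about mostly comes down to deriving the session key from both shared secrets, so an attacker has to break both the classical and the post-quantum exchange. A minimal sketch in Go (the secrets are placeholders for whatever the protocol's key exchanges produced; this is not any particular standard's combiner):

    package main

    import (
        "crypto/sha256"
        "fmt"
        "io"

        "golang.org/x/crypto/hkdf"
    )

    // hybridKey derives a session key from a classical (e.g. X25519) shared
    // secret and a post-quantum (e.g. ML-KEM-768) shared secret. Both would
    // have to be broken for the derived key to be recoverable.
    func hybridKey(classicalSecret, pqSecret, transcript []byte) ([]byte, error) {
        ikm := append(append([]byte{}, classicalSecret...), pqSecret...)
        r := hkdf.New(sha256.New, ikm, nil, transcript)
        key := make([]byte, 32)
        if _, err := io.ReadFull(r, key); err != nil {
            return nil, err
        }
        return key, nil
    }

    func main() {
        // Placeholder secrets; in a real protocol these come from the two key exchanges.
        key, _ := hybridKey(make([]byte, 32), make([]byte, 32), []byte("example transcript"))
        fmt.Printf("%x\n", key)
    }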
I've used dynamic pipelines. They work quite well, with two caveats: your build process is now two-step and slower, and there are implementation bugs on GitLab's side: https://gitlab.com/groups/gitlab-org/-/epics/8205
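For anyone who hasn't tried it, the two-step shape in .gitlab-ci.yml looks roughly like this (a sketch; generate-ci.sh stands in for whatever script emits your config):

    stages: [generate, run]

    generate-pipeline:
      stage: generate
      script:
        - ./generate-ci.sh > generated-pipeline.yml   # emit the child pipeline config
      artifacts:
        paths:
          - generated-pipeline.yml

    run-generated:
      stage: run
      trigger:
        include:
          - artifact: generated-pipeline.yml
            job: generate-pipeline
        strategy: depend   # parent pipeline waits for the child's result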
FWIW, GitHub also allows creating CI definitions dynamically.
I’m talking about, in place of a fetch call, simply importing a JSON response from an endpoint, thereby bypassing the need to call fetch; you’d get the response as if it were imported.
It certainly won’t replace all GET calls, but I can think of quite a few first-load ones that could simply become import statements once this happens.
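For illustration, something like this instead of a first-load fetch (a sketch; the URL is made up, and it relies on JSON import attributes plus URL module resolution, which today means browsers/Deno rather than everywhere):

    // before: const config = await (await fetch("https://example.com/api/config.json")).json();
    import config from "https://example.com/api/config.json" with { type: "json" };

    console.log(config);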