Cheap electronics are just the feedstock, the basis function for your new creation. Why start with raw matter when you can get fully formed matter for less?
The difference is that today this trust is local and organic to a specific project. A centralized reputation system shared across many repos turns that into delegated trust... meaning, maintainers start relying on an external signal instead of their own review/intuition. That's a meaningful shift, and it risks reducing scrutiny overall.
I am still not going to merge random code from a supposedly trusted individual. As it is now, everyone is supposedly trusted enough to be able to contribute code. This vouching system will make me want to spend more time, not less, when contributing.
Trust signals change behavior at scale, even if individuals believe they're immune.
You personally might stay careful, but the whole point of vouching systems is to reduce review effort in aggregate. If they don't change behavior, they add complexity without benefit... and if they do, that's exactly where supply-chain risk comes from.
I think something people are missing here is that this is a response to the groundswell of vibecoded slop PRs. The point of the vouch system is not to blindly merge code from trusted individuals; it's to completely ignore code from untrusted individuals, permitting you to spend more time reviewing the PRs which remain.
To whom? It's not against Github's ToS to submit a bad PR. Anyway, bad actors can just create new accounts. It makes more sense to circulate whitelists of people who are known not to be bad actors.
I also like the flexibility of a system like this. You don't have to completely refuse contributions from people who aren't whitelisted, but since the general admission queue is much longer and full of slop, it makes sense to give known good actors a shortcut to being given your attention.
I don't think the intent is for trust to be delegated to infinity. It can just be shared easily. I could imagine a web of trust being shared between projects directly working together.
That could happen... but then it would end up becoming a development model similar to the one followed by sqlite and ffmpeg, i.e., open for reads, but closed (almost?) for writes from external contributors.
I don't know whether that's good or bad for the overall open-source ecosystem.
This is an educated guess, but I think the battery becomes less efficient when cold, so it heats up, and then performs better as it warms. I assume this because I charge my RC plane LiPos the same way every time, and they take the same amount of energy, but flying in the winter gives much shorter flight times. Since the battery is warm after a flight, even in the cold, I don't think the energy is still there (the battery is still discharged when I take it home), so it must just be much less efficient, wasting a lot of energy as heat.
I assume it's just that its internal resistance rises when it's cold, but I might be wrong.
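That mechanism is easy to sketch numerically: the pack dissipates I²R internally, so a higher cold-weather internal resistance directly cuts the fraction of energy that reaches the motor. All the numbers below are illustrative assumptions (a hypothetical 3S pack at 30 A, with guessed warm vs. cold resistance values), not measurements:

```python
def delivered_fraction(pack_voltage, current, internal_resistance):
    """Fraction of the battery's output power that reaches the load;
    the rest is dissipated inside the pack as I^2 * R heat."""
    loss = current ** 2 * internal_resistance       # watts lost as heat in the pack
    total = pack_voltage * current                  # watts drawn from the cells
    return (total - loss) / total

# Assumed values: 3S pack (~11.1 V nominal), 30 A draw,
# ~15 mOhm internal resistance warm vs. ~60 mOhm cold.
warm = delivered_fraction(11.1, 30, 0.015)   # ~96% of energy reaches the motor
cold = delivered_fraction(11.1, 30, 0.060)   # ~84% -- the rest heats the pack
```

Under these made-up numbers the cold pack wastes roughly four times as much energy as heat, which is consistent with both the shorter flights and the pack landing warm.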
I mean, "everyone already has an account" is already a very good reason. That doesn't mean "I automatically accept contributions from everyone", it might be "I want to make the process of contribution as easy as possible for the people I want as contributors".
Hatching a reputation-based scheme around a "Contributor Management System" and getting "the people you want as contributors" to go along with it is easier than getting them to fill in a 1/username 2/password 3/confirm-password form? Choosing to believe that is pure motivated reasoning.
> GitHub customers really are willing to do anything besides coming to terms with the reality confronting them: that it might be GitHub (and the GitHub community/userbase) that's the problem.
The community might be a problem, but that doesn't mean it's a big enough problem to move off completely. Whitelisting a few people might be a good enough solution.