No, the grandparent comment resonates with me. I have good eyesight (when last tested, four years ago, it was better than 20/20) and use monitors with standard DPI at standard resolutions, and yet it strikes me that never in my life have I had to zoom out on a web page, while I frequently zoom in (including on Hacker News). Standard font sizes really are too small.
Not sure how much is age... I notice that reading my phone is often harder than desktop. I usually have a set of reading or computer glasses with me, but don't need them except for things that are close. Stuff on the phone always seems a bit small. Maybe there should be an effort to consult designers over 40 on some of these web sites/apps.
For Hacker News specifically: Verdana at 13px (12px for body text) was a fine choice at 72 dpi.
HN didn't exist in the 1990s, but its design principles are from that era. The look itself is a nice throwback, but the type size follows a technical constraint that no longer makes sense (it only made sense when CSS was nonexistent or very limited).
Below-the-fold arguments only make sense when you can't grasp the purpose of the page without scrolling.
You're not supposed to cram every piece of information you can above the fold. The information that is getting cut off in this case is hardly pertinent to the understanding or navigation of the site.
No, it's merely a work-around for two very distinct and self-inflicted problems:
- The pain of exposing language-native compiler/language AST and indexing APIs to a JavaScript IDE.
- RMS' inability to accept that GCC might need to expose an AST to external consumers, and his paranoia regarding how that might undermine his GPL-everywhere ideals.
Using JSON for local IPC -- and requiring IDE language support to funnel itself across that high-overhead IPC connection -- is fey.
This times 1000.
AN API IS ALREADY A PROTOCOL, but it doesn't require external processes, serialization, extra failure modes, and so on.
The fact that the HN community seems to have jumped aboard this idea, "yeah let's just require a server to do something simple like format text in your editor", is completely flabbergasting. People just seem to have NO IDEA how much complexity they are adding, and don't care.
Maybe in 5 years our machines will be running 10,000 processes at boot because people will want a server for every operation...
Do you know how these IDE features are currently implemented in editors like vim? Unless there is built-in support (e.g. ctags), most plug-ins that provide language-specific features do so by running an external tool, sometimes going so far as scraping the compiler output.
This means that on every single change, a new heavyweight process is created, communication happens over unspecified textual formats, and everything is likely to break with the next update because there is no stable interface.
JSON IPC with a continuously-running process using a well-specified protocol is a huge step up in comparison.
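To make the "well-specified protocol" point concrete, here is a minimal sketch (in Python) of the Content-Length framing LSP actually uses for its JSON-RPC messages; the file URI and position are invented for illustration:

```python
import json

def frame_message(payload: dict) -> bytes:
    """Encode a JSON-RPC message with the Content-Length framing LSP uses."""
    body = json.dumps(payload).encode("utf-8")
    header = f"Content-Length: {len(body)}\r\n\r\n".encode("ascii")
    return header + body

# A goto-definition request as it would travel over the server's stdin.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "textDocument/definition",
    "params": {
        "textDocument": {"uri": "file:///src/main.rs"},
        "position": {"line": 10, "character": 4},
    },
}
wire = frame_message(request)
assert wire.startswith(b"Content-Length: ")
```

Any editor that can spawn a process and write framed JSON to a pipe can talk to such a server, regardless of what language either side is written in.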
What would a good way be? Why is having the code completion and parsing in a self-contained library a bad way? Or are you just opposed to using TCP instead of an ABI?
A separate server also has some advantages: (1) it's not going to crash your editor, and (2) it opens up the possibility of keeping the language server running outside the editor, available to other tools.
Yes, serialization and communication have a lot of extra cost compared to a function call, but consider: (1) the request rate is limited by the user's typing speed, around 10 requests per second at most, low enough that it won't matter, especially compared to the cost of running a type checker; (2) all the calls in LSP can take a very long time to complete, so you couldn't turn this protocol into blocking API calls made from the UI loop anyway.
Even with an in-process plugin you would need to run requests in a separate thread (or possibly threads) and cancel them after a timeout.
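A sketch of that threading point, assuming a hypothetical `compute_completions` function standing in for a slow type-check-and-complete pass: even in-process, completion has to run off the UI thread and be abandoned after a deadline.

```python
import concurrent.futures
import time

def compute_completions(source: str, offset: int) -> list[str]:
    # Hypothetical stand-in for a slow type-check + completion pass.
    time.sleep(0.05)
    return ["push", "pop", "len"]

executor = concurrent.futures.ThreadPoolExecutor(max_workers=1)

def completions_with_timeout(source: str, offset: int, timeout: float = 1.0) -> list[str]:
    """Run completion off the UI thread; give up (and let the user keep typing) on timeout."""
    future = executor.submit(compute_completions, source, offset)
    try:
        return future.result(timeout=timeout)
    except concurrent.futures.TimeoutError:
        future.cancel()  # best effort; the worker may still finish in the background
        return []
```

This is essentially the same machinery an out-of-process server forces on you, minus the memory protection.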
That said, LSP is a badly designed protocol. They chose to represent all offsets in UTF-16 code units (!), which is baffling, since most editors will neither be reading UTF-16 files nor representing them internally as UTF-16.
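To make that complaint concrete: an editor holding UTF-8 or code-point-indexed text has to count UTF-16 code units (where characters outside the Basic Multilingual Plane count as a surrogate pair, i.e. two units) just to report a column. A sketch, with an invented line of source:

```python
def utf16_units(text: str) -> int:
    """Length of `text` in UTF-16 code units, as LSP positions require.
    Characters above U+FFFF take a surrogate pair (two units)."""
    return sum(2 if ord(ch) > 0xFFFF else 1 for ch in text)

line = 'let crab = "🦀";'  # the crab emoji is U+1F980, outside the BMP
prefix = line[: line.index("🦀") + 1]  # caret placed just after the emoji
# Python's character count and the LSP column disagree past the emoji:
assert len(prefix) != utf16_units(prefix)
```

So every editor that isn't internally UTF-16 (i.e. most of them) pays a conversion tax on every position in every message.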
I imagine JSON serialisation overhead is pretty small when compared to parsing/typechecking a Rust program, which is probably what the Rust Language Server has to do whenever anything changes..
Not to mention that a lot of people capable of writing the tooling would struggle to export a C API. I write Scala in my day job and it would take me a while to learn how to do that - and I've done some programming in C/C++ before.
I'm a little confused as to what you propose instead.
Suppose I have vim, and the Rust compiler. I want to add RLS level of support to vim. I download some vimscript plugin, and what? Do you distribute the rust language server as a compiled plugin that you add to the address space of the editor at runtime? And if there's a bug, and it segfaults, then it takes down my entire VIM process?
It seems like there's some complexity in directly calling the code with an API too. It's actually not too bad to just open a pipe and communicate.
Maybe I'm missing something, but wrangling compiled plugins seems like it'd be a bad time.
While I do love vim, its own high level of internal implementation brokenness doesn't really have much bearing on how one implements this sort of thing in a real multi-language IDE.
And as for your question, the answer is ... yes. Sure. Why not? That's what we already do in nearly all IDEs.
An API that can be accessed from heterogeneous languages will involve IPC.
Particularly since the best API will use the compiler's symbol tables (avoiding implementing syntactic and semantic analysis twice, buggily), and compiler implementation languages are even more diverse than editor implementation languages.
> An API that can be accessed from heterogeneous languages will involve IPC.
No. If your language cannot call into a dynamic library using a well-defined C ABI for your platform, then it is already failing to speak a standard protocol. Building all kinds of crazy, complicated, slow infrastructure in order to get it to successfully speak some other protocol, is a symptom of modern-day clueless programming.
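The claim that the C ABI is already a universally spoken protocol is easy to demonstrate; here is a sketch in Python using `ctypes`, assuming a standard libc is available on the platform (the lookup would need adjusting on Windows):

```python
import ctypes
import ctypes.util

# Locate and load the C library through the platform's standard mechanism.
libc = ctypes.CDLL(ctypes.util.find_library("c") or None)

# Declare strlen's signature and call it directly: no server process,
# no serialization, just a function call across the ABI boundary.
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t
assert libc.strlen(b"hello") == 5
```

Most managed runtimes (JVM, .NET, Node.js, Python, Ruby) ship an FFI of roughly this shape out of the box.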
> Particularly since the best API will use the compiler's symbol tables (avoiding implementing syntactic and semantic analysis twice, buggily)
Yes, this is of course a good idea. Why one presumes this requires a separate running process, I have no idea.
> No. If your language cannot call into a dynamic library using a well-defined C ABI for your platform, then it is already failing to speak a standard protocol.
This also involves a marshalling cost at the ABI boundary, which may be lower overhead than parsing JSON, but is significantly more brittle. And it's less ergonomic for many plugin/editor authors. And it can't be spec'd with a schema that isn't just "read the headers."
>This also involves a marshalling cost at the ABI boundary
Only for some languages, and that cost should be far, far less than running a separate process, shipping JSON over pipes, and parsing the JSON.
> And it's less ergonomic for many plugin/editor authors
I think the fact that many modern programmers find this more ergonomic than a C ABI is part of what he is complaining about. Let's get comfortable with what is good, rather than declare good whatever we are comfortable with?
Why is it "significantly more brittle"? It is a well-specified interface. It is less brittle than talking over a socket because the kinds of points of failure involved with sockets don't exist in this case.
> And it can't be spec'd with a schema that isn't just "read the headers."
What does that even mean? It's a protocol just like any protocol, except you get the added benefit that for many languages it can be typechecked. Why are you claiming it can't be specified or that someone has to "read the headers"? What headers?
From your endorsement of "using the compiler's symbol tables" (paraphrasing) I took you to mean that you're proposing binding directly to GCC (or another tool) as a library, relying on its internal data structures as this C API. Based on this comment, it sounds like you're now suggesting that this API should still be standardized and require translating from the compiler's internals into some standardized AST/symbol format anyway. I still think the latter is bonkers for several reasons (SIGABRT being one), but it's significantly less bonkers than what I had thought you were proposing initially.
> I still think the latter is bonkers for several reasons (SIGABRT being one)...
The fact that we're typing this, and it works -- without the entire world falling apart because of crashes in complex library code -- demonstrates why this is not remotely bonkers.
Not only that, but it means if the library crashes, your editor process dies. That hardly seems better than sending some text over a socket. At least if the external process crashes, your editor can just restart it.
A library API is bound to a specific language/runtime. But every language out there can speak JSON. Language servers are mostly written in the language they are for, because that language already has the compiler APIs. The editor is often written in a different language.
Well, different failure modes, maybe. If an external process crashes, then you just have your editor restart it. But if you've linked a library into your editor, and it crashes, then your editor crashes.
I much prefer either keeping that code in a separate process, or having that code written in a memory safe language, where it won't take down your editor when something goes wrong.
Incremental recompilation isn't fast enough to wait for between keystrokes, so in-process servers would run in their own thread. Along with accounting for arbitrarily incompatible language runtimes and memory management schemes, wouldn't we be looking at badly re-implementing half of a process-and-ipc infrastructure here, just without memory protection?
The question was not which library could present an AST for all those languages, but which kind of library format can be consumed from all those languages, and ideally without FFI and writing complex wrappers. C libraries are not very convenient to use from C#, the JVM, or JS (Node.js), and sometimes it isn't even possible (e.g. for JS in the browser).
The question is important, because otherwise one would constrain editors to be written only in a language which is compatible with the library format.
Using C libraries from C# or the JVM is very convenient, even today. There are even automated systems to generate entire bindings for the JVM; I've written bindings myself for a few libraries. You can just generate the interface file for Java from the .h with JNAerator, import it, and you're done.
It does. Because there’s also libraries to automatically spawn a separate JVM and communicate with that via an IPC system. Or even spawn other things.
But if you want a system where I have to transfer gigabytes over a JSON IPC bus every minute, sure. That's totally not going to destroy performance in projects that are several million lines of code with major auto-generated assets.
The language server protocol is useless for larger interwoven projects. The same issue already appears with JetBrains Rider (a C# IDE where the C# parser is implemented in a separate process).
Because tiny changes can have major effects, for example, if you use templating (or Java annotation processing) and change a template that’s used everywhere.
The IDE has to always stay responsive, no matter what is happening, no matter how large the change is.
Such a change may have a huge impact on the internal state of the language server, but on the protocol side it's probably minor. You just fetch the one relevant piece of the new state on the next Goto Definition / autocompletion / etc. request. There's no need to stream the whole new state from the language server to the client on each update.
My current estimate is that the protocol costs are O(1) with respect to project size.
The developer of the rust plugin for KDevelop said he had to rebuild half of the rust parser because the language server protocol is so completely inadequate that it can't even be used to implement half of KDevelops features.
Same with stuff like data stream analysis, how does the language server protocol let you mark a variable and see a graph of where it comes from, and how it's transformed? How do you get proper structural autocomplete?
For many features the IDE needs actual full access to the AST. The language server protocol doesn't provide that, so you have to reimplement half of the language server in your own IDE plugin. And it's far too slow.
The language server protocol is comparable in quality to Kate's language plugins, which is okay for an editor but completely unacceptable for an IDE. You can't do proper structural refactoring either: no proper structural extraction, no proper autocomplete or AST transformations, no automated inlining across files.
For a lot of functionality you end up having to try and get the AST back from the autocomplete and cursor info that the LSP gives you to fill the IDEs internal data structures, which is ugly, painful, and slow.
I've read the article and I partly agree with the author: To get the best-possible IDE experience you will need more information than LSP provides.
However I disagree that LSP is completely unacceptable for IDEs. I work daily with VSCode and its LSP-based language support for TypeScript, C# and others. The number of IDE features that I get for TypeScript is higher than what lots of real IDEs provide for their built-in languages. The quality of the LSP-based addons seems to depend mainly on the implementation of the language servers and not on the protocol; e.g. the language server experience for TS is top-notch, the one for C# is OK-ish and the one for C++ is pretty weak.
I see no problem with integrating any kind of auto-complete support through LSP: the IDE/editor just sends the position inside a document that needs to be completed, and the language server responds with all possible suggestions. There's absolutely no need to store an AST in the editor for that; the AST can stay in the language server. It's the same for references to types, refactoring commands, etc. Yes, if someone needs some non-standard "data stream analysis", the command and results for that might not yet be standardized in LSP, but they could be added if it's worth it for the users. And I guess one can also have non-standardized optional extension APIs between a language server and an editor. In the end it's the same: the editor asks the language server (which has full access to the AST) "give me a graph of where the variable comes from, and how it's transformed", and the language server responds in a well-defined way which the editor just needs to render.
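A sketch of what that exchange looks like on the wire, simplified from the actual LSP `textDocument/completion` messages (the file URI, position and completion items here are invented):

```python
import json

# Editor -> server: "complete at this position". No AST crosses the wire.
request = {
    "jsonrpc": "2.0",
    "id": 7,
    "method": "textDocument/completion",
    "params": {
        "textDocument": {"uri": "file:///src/lib.rs"},
        "position": {"line": 42, "character": 12},
    },
}

# Server -> editor: suggestions only; the AST stays in the server.
response = {
    "jsonrpc": "2.0",
    "id": 7,
    "result": [
        {"label": "iter", "kind": 2, "detail": "fn iter(&self) -> Iter<T>"},
        {"label": "len", "kind": 2, "detail": "fn len(&self) -> usize"},
    ],
}

labels = [item["label"] for item in response["result"]]
```

Note that the response size is driven by the number of suggestions shown, not by the size of the project behind them.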
And back to the article: the point of the article is not necessarily that a Rust Language Service plugin wouldn't be good enough for 90% of all users. From my point of view it's more that the author is excited about implementing language support themselves, in their preferred way and as perfectly as possible. That's an absolutely reasonable thing. If one is interested in implementing type inference for Rust or other complex languages oneself, then I'm sure it's a great learning experience. However it will also be LOTS of work, which for a language as huge as Rust might be outside the scope of what most mortal developers can do in their private time. If the end result is a half-baked implementation for one IDE which is more buggy or has fewer features than a language-server-based solution (which could be shared between multiple IDEs), then nothing has been won. The language server approach helps here in that instead of many 30% language implementations (or even abandoned ones) in individual IDEs, work can be concentrated on one LSP implementation, which might reach a 70-80% feature set.
> The IDE/editor just sends the position inside a document that needs to be completed, and the language server responds with all possible suggestions. There's absolutely no need to store an AST in the editor for that; the AST can stay in the language server. It's the same for references to types, refactoring commands, etc. Yes, if someone needs some non-standard "data stream analysis", the command and results for that might not yet be standardized in LSP, but they could be added if it's worth it for the users. And I guess one can also have non-standardized optional extension APIs between a language server and an editor. In the end it's the same: the editor asks the language server (which has full access to the AST) "give me a graph of where the variable comes from, and how it's transformed", and the language server responds in a well-defined way which the editor just needs to render.
That would be pretty stupid, though, as every language server would duplicate a massive amount of code.
LSP--or rather the general idea of a common interface between IDEs and language-specific static analyzers--is a very good idea regardless of GCC or VSCode. Which isn't to say that LSP made universally correct design decisions (e.g. it's fair to be grumpy at JSON), and also isn't to say that it represents the be-all end-all of IDE integration (the fact that it necessarily represents a lowest-common-denominator interface is well-understood). But having such a standard to establish a baseline of support between languages and editors with the minimal amount of work possible is something that needed to be done.
I don't see that as actually being true. There are thousands of native npm modules which work just fine with Electron. There's really no problem exposing native APIs to JS.
What VSCode has pioneered, and this is something which Atom got wrong, is a multi-process model, where the various IDE components communicate via IPC. This improves stability and allows isolation of 3rd party plugins, and provides mostly foolproof parallelism. When plugins are such a big part of the experience, this is a very good thing. It's a deliberate design decision, not some attempt to overcome the failings of JavaScript.
Couple this with the fact that limiting "API" to mean "C ABI" forces every compiler author to start exposing complex C structures, unsafe pointers and the like, which, if their compiler is written in, say, Haskell or Lisp, is going to be particularly painful, vs. implementing text-based RPC over a socket in whatever way is best for them.
If Language Server had been a C API then I seriously doubt it would have got much traction, as it's just too awkward for many compiler authors to implement. Unsafe C APIs are, frankly, last century's technology.
That's true for every combination of language / platform where the languages differ. Rewriting AST parsers for every language would be nuts. Overall this vastly increases accessibility / choice when it comes to languages. That's purely a good thing. People are still free to make hyper-optimized versions for whatever case they would like to.
Regardless of those points, having a standard protocol for this kind of thing makes a lot of sense, instead of binding directly to the AST data structures of each compiler.
Shipping, along with your language, some kind of standardized API so that editors can easily add high-quality IDE-level features for new languages is a great idea.
That this is currently being done via a server plus JSON over pipes (or whatever) is a suboptimal way to do it, I agree.
Germany banned Hitler from public speaking in the 1920s; all this did was give Hitler more notoriety, and the NSDAP more propaganda material to work with.
Nazi Germany still did not emerge through democratic processes; they failed to gain a parliamentary majority despite years of street/voter harassment. Instead, they subverted democracy by exploiting a civil liberty loophole in the Weimar Constitution.
If anything, the rise of Nazi Germany demonstrates why such loopholes in protections for civil liberties are so fundamentally dangerous. For a more modern example, look at how "hate speech" laws in Russia are in fact used to silence political dissent.
> Asking for their perspectives is a sensible way to investigate what the reason may be.
Is it? Why?
When patients visit the ER, they often believe that whatever change in their diet that they're aware of and focused on -- like eating cabbage last week -- must be the cause of their ailment.
When parents had their children diagnosed with autism, they cast their net for the nearest change to be blamed ... and found vaccines.
Absolutely. People don't always know at a conscious level why they do or don't do things. It's part of the reason why user surveys are a poor substitute for telemetry on how an app is actually used in the real world.
..except that here, we haven't worked out the brain-computer interface yet.
The final diagnosis is not necessarily what the patient thinks it is, but it would be grossly negligent to not ask the patient about all information they think might be relevant.
They ask what the symptoms are. They do not ask the patient to determine what the cause is.
Rust's methodology suffers from severe self-selection bias and politically motivated thinking. It will unerringly produce the answers its creators want to hear; if it does not, the methodology and subject selection will be "corrected" to produce such answers.
And similarly, I know plenty of people who consider gender-inclusive policies a form of pseudoscience too (in that they don't give it much credence).
Diversity is quite literally the reason life still exists after billions of years and five mass extinction catastrophes. It is the foundation for the predominant economic system in human society as well as all of the technological and scientific progress we've made. Hell, you can't even really have cultural (in the broadest sense) progress without a diversity of ideas and opinions, let alone a functioning, stable democracy.
I'm skeptical of race- and gender-inclusive policies because they're often blindly implemented and detrimental to their goals but diversity has shown itself to be not just important, but vital, to the long term success of any large, complex system, whether it be the Apollo program or our planet's ecosystem. I'd love to hear rational arguments against, however.
> diversity has shown itself to be not just important, but vital, to the long term success of any large, complex system, whether it be the Apollo program or our planet's ecosystem.
No, it's causation. The probability that everything goes right in a sufficiently complex system is zero and the probability that a single failure will cause runaway feedback loops or cascading side effects is extremely high, which causes a stable system to go unstable and makes total failure almost inevitable. Without diversity, you can't recover from these failure modes (drastic changes in the ecosystem causing extinction in the case of evolution, limits of physics or economics causing a dead end in science and engineering, changes in the rest of the economy causing centrally planned economies to fall apart, etc).
That is a contrived counterexample that completely ignores the very basics of evolution and how changing environments help drive adaptation. Picking out one irrelevant trait here and asking if diversity is useful is like asking "how much value is there in diversity of spleens in the Rust Community?" They don't care about spleens, they care about people who are interested in Rust; just like natural selection favors organisms who can reproduce, not individual traits.
Natural selection will favor the animals who don't have to waste energy on pigmentation... until a billion years later, when a sinkhole opens up or one of the species evolves bioluminescence, and all the organisms that lack pigments start to reflect incoming light back at their new, hungry predators. That's the whole point: environments change all the time, and the chance of a species surviving depends on the diversity of its members, just like the survival of carbon-based life through a planetary mass extinction event depends on the diversity of species.
Evolution can only happen through random mutation so, by definition, any environment with evolving life forms is always changing unpredictably.
How do you know the (heavily politicized, myopically chosen, inescapably coarse-grained) identity groups they're targeting offer a form of diversity that actually has value in the realm of Rust? Why do you believe that these identity group traits correlate strongly with intellectual diversity that is necessary for the project's health?
If anything, this process (and the efforts from which it stems) seem purposefully designed to eliminate intellectual diversity, in favor of a rigid monoculture maintained by empowering political officers in the enforcement of right-think.
Based on your choice of words and other comments you've made, I believe you fear change and people who are different from you. I doubt there is anything I can say to convince you in the face of your prejudices, whatever they may be.
> If anything, this process (and the efforts from which it stems) seem purposefully designed to eliminate intellectual diversity, in favor of a rigid monoculture maintained by empowering political officers in the enforcement of right-think.
If your idea of a monoculture is a place where people have to be respectful of other people & cultures and not use culturally charged words then I think almost everyone in the Rust community would be happy with that outcome.
If you want to provide evidence that the Rust community is trying to eliminate intellectual diversity in favor of superficial qualities instead of sincerely trying to outreach to underrepresented communities that may have a lot to offer, I'm sure they (and we here on HN) will be happy to have a rigorous, respectful debate with the express goal of improving the experience for as many people in the community as possible, including you.
Until then, you are free to grind your axe elsewhere, perhaps somewhere without multiculturalism to make you so uncomfortable.
I don't think it's "fear of change" or "fear of people who are different" - not even "fear". I feel the sentiment felt is closer to "think poorly-of". Southern US racists certainly aren't "afraid" of black people: I believe they've been conditioned by negative racial stereotypes combined with their own sense of superiority ("blacks are lazy, no-good", "blacks are criminals", et cetera) so the idea of racial equality simply strikes them as silly - take that concept and apply it to today's debates: ("feminists are loud and unruly", "transgender people are freaks", "the other side are all fat women with purple hair who spend too much time complaining on their blogs instead of instigating real change"). I stress these are stereotypes, and certainly not representative, but doubling-down in response seems to reinforce certain negative stereotypes and make it harder to sell the idea of the "new normal".
I believe their concerns about the loss of "intellectual diversity" are genuinely felt. Frame it from the perspective of someone who genuinely believes themselves and their opinions to be level-headed: new voices telling them that their opinions are wrong will of course put them on the defensive, and it's only natural to feel a creep of thoughtcrime policing.
I hate to use a cop-out cliché but I feel that "both sides" need to apply empathy when engaging in debate with their opposition: those that feel out of place and get defensive, or simply think these are overblown matters, are not deliberately out to actually oppress anyone - and those campaigning for more equitable treatment are not being opportunistic.
Well, an open challenge that might help someone understand what I actually think, at least along one axis:
Let's say I want to objectively evaluate the notion that there is such a thing as an arbitrary, self-declared, non-binary "gender" (or "gender identity") that can range across any number of "genders".
In that case, can you specify the set of propositions used to classify something as a "gender"?
Is your definition purely self-referential (cyclic)?
Does your definition exclude other social self-identifications, such as "goth" or "emo"? Why or why not?
Does your definition rely on references to "biological sex" (e.g. male/female)? If so, what are the sexes "male" and "female"?
Matters of personal-identity are completely orthogonal to what the Rust community should be about.
Reading your posting, I think you're implying that non-traditional notions of gender are evidence of irrational thinking, and that you think the Rust community would be better off with an exclusively "rational" (by your measure) membership.
My retort is that it is completely irrelevant - I compare it to admitting open young-earth creationists simultaneously with adherents to Wahhabism into the Rust community: both of those positions (in my opinion) are as irrational and non-evidence-based as otherkin or your notion of gender-identity, and yet all of those individuals are capable of making valuable contributions to the language, the runtime, the standard library, packages and so on - accepting their work has nothing to do with condoning or endorsing their opinions (for example we still call radiation meters Geiger counters, even though Hans Geiger worked on Nazi nuclear weapons).
I won't respond to your questions posed because it's both outside the scope of this discussion and I believe poses a dangerous distraction to identify a wedge with which you can coarsely separate people into groups you think you would agree with - and more importantly: we should not be pontificating on gender-identity because none of us are subject matter experts in the field.
That's like refusing a fresh cheeseburger because you're afraid that by the time you bite into it, it'll develop botulism - even as you're minutes away from starving to death. You're nitpicking tiny details and dismissing a clear improvement because it does not conform perfectly to your idealized standards. Life is messy, people make mistakes. That just means we keep moving forward and self correcting when we need to not when we make up entirely hypothetical downsides, most of which never end up happening anyway. I repeat, again: you have provided zero evidence for your claims that the Rust team is doing the wrong thing or heading in the wrong direction.
This insistence on using hypotheticals instead of providing evidence screams fear; not a rational evaluation of the community and its plans. It's the same tired strategy used by conservatives for thousands of years to fight literacy, education, suffrage, abolition of slavery, welfare, universal healthcare, and pretty much everything good that has happened in human society. No one but its rhetorical peddlers take it seriously because it is purely self defeating: if you're too paralyzed by hypothetical issues to take the first step, then those issues will never be resolved, freeing you from facing the uncomfortable change ahead.
> I don't think it's "fear of change" or "fear of people who are different" - not even "fear". I feel the sentiment felt is closer to "think poorly-of". Southern US racists certainly aren't "afraid" of black people: I believe they've been conditioned by negative racial stereotypes combined with their own sense of superiority ("blacks are lazy, no-good", "blacks are criminals", et cetera) so the idea of racial equality simply strikes them as silly - take that concept and apply it to today's debates: ("feminists are loud and unruly", "transgender people are freaks", "the other side are all fat women with purple hair who spend too much time complaining on their blogs instead of instigating real change"). I stress these are stereotypes, and certainly not representative, but doubling-down in response seems to reinforce certain negative stereotypes and make it harder to sell the idea of the "new normal".
Prejudice, like all elements of human psychology, is complicated, but the longer you're around it, the more you start to see distinct patterns emerge, each with its own rhetorical strategies. I think in this case it is fear, because teacup50 only mentions the people Rust is targeting with its outreach in passing, and even implies that he agrees (or at least "doesn't disagree," whatever that means) with the goals of the effort. I see no evidence that he thinks of minorities, women, etc. as beneath him, so it leads me to believe that he views the explicit effort of including them as an attack on the integrity of the community and, by implication, his own identity (let's assume for the moment that he's part of the Rust community, but it could also be him lashing out because of the same thing happening elsewhere). It's not necessarily that the new people will make it worse, but that the process of bringing those people into the fold will do the actual harm. It's defensive tribalism in its most fundamental form: fear, uncertainty, and doubt. I'd hesitate to even call it prejudice - it's really more like a visceral reaction to a perceived loss of or attack on status - but in practice the two are hard to differentiate, and at some point you have to stop giving the person the benefit of the doubt and start calling a spade a "spade."
> I believe their concerns about the loss of "intellectual diversity" are genuinely felt. Frame it from the perspective of someone who genuinely believes themselves and their opinions to be level-headed: new voices telling them that their opinions are wrong will of course put them on the defensive, and it's only natural to feel a creep of thoughtcrime policing.
I agree wholeheartedly. I distinctly remember several situations on the Rust users mailing list and /r/rust where I felt that Rust team members (not the community, but the official Rust team) went way too far into the realm of thoughtcrime policing, to the detriment of the community. I've been waiting for him to bring those up as evidence of his position so that we could have a merit-based discussion on how to avoid such mistakes in the future, but he has done nothing but provide unsubstantiated opinions and hypothetical questions meant to lead someone towards his foregone conclusion (even though he frames it as doubt, another common but transparent rhetorical tactic).
> I hate to use a cop-out cliché but I feel that "both sides" need to apply empathy when engaging in debate with their opposition: those that feel out of place and get defensive, or simply think these are overblown matters, are not deliberately out to actually oppress anyone - and those campaigning for more equitable treatment are not being opportunistic.
Again, I agree wholeheartedly. This is an important conversation to have, because otherwise the entire process threatens to devolve into extreme multiculturalism for multiculturalism's sake. That is not only counterproductive but outright dangerous, because it does nothing but polarize otherwise compatible groups of people. Every few decades our culture seems to hit that political-correctness peak really hard, which just causes another backlash from those who feel they are marginalized. The current (disastrous) political situation in the United States is clear evidence of that polarization and backlash - and it's not doing anyone a lick of good.
Even if I axiomatically disagree with someone's arguments, I am happy to engage and come to a middle ground where we make as many people as happy as possible, just as I'd engage a flat-earther who presents concrete evidence, if only to show him that he is misinterpreting it. However, just like most flat-earthers, teacup50 has refused to provide any evidence other than a gut feeling, and that is in no way a genuine attempt at constructive dialogue.
Or I could be dead wrong. He could just be playing devil's advocate and be really, really bad at communicating.
The science is bad and the politics are deleterious. That has nothing to do with prejudice on my part; thinking that the methodology and behavior are naively toxic at best doesn't mean I disagree with the egalitarian aims that are claimed to be the motivating factor behind this political ideology.
What science and what politics? Why are they bad or deleterious? Why are they naively toxic? Can you provide any examples? I'd be happy with just one because even an isolated incident can be learned from and used to improve the community.
You can't claim to agree with their egalitarian aims and then absolutely refuse to provide constructive feedback, or even any evidence for your claims that their behavior is naive, myopic, counterproductive, etc. It's the logical equivalent of "I'm not racist, but..." followed by a comment about how other races have smaller brains, stated as fact with no evidence to back it up.
Because we don't benefit from half of our population if we only teach men to code. The same applies to minorities. It is, of course, essential that everyone can learn to code, so that everyone can contribute in a future where a lot of work will require coding skills.
Frankly, the white male privilege guilt message is getting old. Programming is mostly self-taught, through hours of social isolation spent researching and seeking it out. Short of writing politically neutral docs that focus on the subject at hand, there is no need to evangelize. It seems more arrogant than anything.
It's not supposed to be about guilt. It's supposed to be about empathy towards those who don't get that privilege.
(Side note: the word "privilege" is really poorly chosen, because in practice it has a strong implication of blame and guilt assignment in our culture.)
Starting when I was twelve, I worked for multiple summers doing landscaping to buy a computer; mowing yards, hauling wood bark, dirt, laying sod, ripping up sod.
It was hard, sweaty, backbreaking labor, especially for a 12 year old; I did it because I wanted to program. I used that computer to teach myself to program, and got my first real programming job from someone who I'd never previously met in person, over IRC.
According to today's identity politics, I nonetheless need to be aware of my privilege as a white male, and of my position on the coarse-grained intersectional hierarchy of privilege.
I'm not really sure what I'm supposed to do with that once aware, or why they get to decide where everyone fits on that hierarchy on the basis of traits they deem important.
> According to today's identity politics, I need to be aware of my privilege as a white male nonetheless
Yup. That's because as a white male, you are still "privileged".
I'm using quotes here because, as mentioned earlier, I don't think the word is a good fit for the concept it's supposed to describe. Privilege implies something above and beyond what you are normally entitled to; something you don't necessarily deserve (I think that's the main reason it elicits such a strong negative emotional response in people). As a white male, you are not getting such things - you're getting normal treatment, in the sense that no one is making negative assumptions about your intellect, your ability to learn, etc. on the basis of your race or gender (they may well be making them based on other traits, and you can be underprivileged on the corresponding other axes). The problem is that others do get negative points solely on account of their gender, the color of their skin, or even their name alone (http://www.nber.org/digest/sep03/w9873.html).
So you don't really have a "white privilege"; rather, they have a "non-white handicap". It's not your fault - but because of said privilege, you're in a better position to try to correct it somewhat.
> I'm not really sure what I'm supposed to do with that once aware
Try to do what you can to even things out. I'm not even necessarily talking about politics, but just day to day things. Have you ever seen a female colleague being talked over in a meeting in your presence? Try to steer the discussion to give her voice. Ever been on a hiring interview loop, and heard dismissive racial or cultural stereotyping of a candidate based on their name alone ("I wonder if he codes like an Indian" etc)? Point it out. And so on.
The 'white' part is just as problematic. There are many other things that make life hard too. Having a mental illness, for example, or being molested or abused as a child. Yet no one talks about non-mentally-ill privilege or unmolested privilege. Just because skin color is something that can't be hidden when we go out in public doesn't elevate it above all the other difficulties or challenges in life. I've known people from all those groups and would say that depressives, schizophrenics, and victims of molestation have far more challenges in life than minorities do. Which is not to say that there aren't people who have more than one set of challenges, just that the focus on race, and the implication that life is easy for someone who's white, is not productive and frustrates people with other legitimate challenges in life.
This is, to me, why many reasonable people have a problem with identity politics. In an effort to bring awareness to the struggles of some, it actually ends up making others feel marginalized and their experiences minimized and is confrontational in nature. Empathy is about envisioning yourself in someone else's situation. The way we do identity politics today, it's the reverse. Instead of the desired, "I imagine myself in your position and I see how hard it must be" it's "I imagined myself in your position and, trust me, it's easier than mine."
I have a parent who is a psychologist. Growing up, I was taught that the right way to handle conflict is to always talk about your personal experience. Saying "you're being insensitive" is accusatory, controversial, and bound to cause an argument. Saying "I feel unappreciated" is an unequivocally correct statement that can't be argued with, because no one else can know how you experience something. The only way to sort of refute it is to say, "I don't intend to make you feel that way." The problem with the term "white privilege" is that it's not a personal-experience term. It's a term that encapsulates a projection of the white experience from the perspective of minorities. Anyone who doesn't feel they lead a privileged life will instinctively reject it. We need to be using terminology that's in line with the "I feel..." way of expressing oneself: terminology that helps convey the difficulties that some people face, rather than the lack of difficulties everyone else faces.
> The 'white' part is just as problematic. There are many other things that make life hard too. Having a mental illness, for example, or being molested/abused as a child. Yet no one talks about non-mentally-ill privilege or unmolested privilege.
Actually yes, we do talk about that stuff as well. The things that you hear most - racial, gender, wealth and religious privilege - are talked about more simply because they affect proportionally more people.
The university I attended had females comprising 90% of all students in computer science, and over half of all students in engineering fields. Anecdotes are fun.
Maybe there's some magical place where you can actually learn how to program from someone else, but for the rest of us, there's no substitute for solitary hours grinding away actually doing the work.
Or create problems, since they give a huge amount of leverage to those interested in dominating others through purely institutional/bureaucratic power games.
Rust draws on Coraline Ada's Code of Conduct (the "Contributor Covenant"), which was straight-up designed to force politics on people. She has aggressively pushed it in as many places as possible so that people can later be banned for essentially saying what people like her disagree with, in any forum (not just the project itself). One example was a mob attack over a Twitter comment - it didn't work - in which she featured, with a Rust team member appearing to throw in some support. At one point, the political attackers set the maintainer up to look like he or she supports pedophiles. Dirty, dirty tactics pushing politics that don't even necessarily represent the beliefs of those they claim to protect. People in minority groups have a wide range of beliefs, many of which contradict what these "social-justice warriors" act like they believe. They'll censor them, too, if deemed necessary.
I don't follow the Rust conversations enough to tell you anything about what goes on there. But the fact that they adopt and enforce a Code of Conduct designed for censorship of non-believers in that cause, with a few strong supporters on the team, is enough to worry anti-censorship people. Whether it's the Rust project or somewhere else, we opponents block it on the grounds of fighting forced compliance with political views that have no consensus, and of fighting political domination on forums that are supposed to be about tech. That its author has hit many places, from Opal to GitHub, makes us prefer to block her CoC even more.
I'd rather have a Code of Merit with clauses for keeping things civil. Minimal to no politics: just project-focused code, docs, and support of people in project-related conversations. That's it.
> Rust draws on Coraline Ada's Code of Conduct ("Contributor Covenant")
I have no idea what you're talking about. Rust's code of conduct is older than the Contributor Covenant, and we've modified it very, very slightly since its inception.
(I also disagree with the rest of your post as well, but that's offtopic and I don't particularly care to discuss it.)
Hmm. I may have assumed you used it because it's referenced in your Code of Conduct as source material. Googling it more gives me multiple sources:
1. Node.js (semi-fitting of my post) and the Contributor Covenant (target of my post), per the current site.
2. The Citizen Code of Conduct, per Reddit. It looks just like the Contributor Covenant, with the same provision that people must follow the politically dominant group's rules on every forum or be blocked. It claims to be derived from the Django Code of Conduct (partly political, partly good) and a feminism wiki. The latter mentions things like "Verbal comments that reinforce social structures of domination", with a long list following, to be subjectively evaluated according to their politics. Just like my post again.
So, even if your people wrote one before the Contributor Covenant, it's similar to what's pushed by the same kinds of people shoving politics down everyone's throat, with some of the same content. My post still stands, with the correction that you initially borrowed from different leftist, control-freak politicians, with minimal modification from the one I mentioned. Each of the sources is very clear about its political agenda and intended censorship.
Ahh. That got tested on another forum I was on. A painful memory, and a good example of how a huge chunk of the U.S., and many computing pioneers, could possibly be censored based on their speech alone.
Yes! It is similar to how no amount of software process can protect you from one bad team member. Also, somehow that bad team or forum member always ends up being the person who uses those processes or codes of conduct to crush others.
This is an issue of eyesight, a ridiculously high DPI without corresponding scaling, or both.