Learning binary and knowing what binary is are two separate things. I don't think you need to learn binary to learn to code. Been doing this for 25 years working at some big name companies, I don't know binary other than 1 is on, 0 is off.
I'm starting to feel like this whole notion that newbies can't come in and learn, say, React or modern JavaScript without first going all the way back to that day in 1995 when the JavaScript std lib was being hashed out, and learning everything about it before their "Hello World!" in React even runs, is becoming a gatekeeping method.
> Learning binary and knowing what binary is are two separate things. I don't think you need to learn binary to learn to code. Been doing this for 25 years working at some big name companies, I don't know binary other than 1 is on, 0 is off.
You're wearing your lack of CS knowledge like some kind of badge of honor. Of course you don't need deep CS knowledge to be a competent programmer. But almost undoubtedly you'll be a better programmer if you do know CS (I'm using binary here as a proxy for CS since I can't imagine having a deep knowledge of CS without knowing something fundamental like binary). How many times over those 25 years could a problem have been more efficiently solved (in programmer time or CPU time) if you had known about problem solving techniques (perhaps those related to binary knowledge for example) that you don't know from the world of CS? You'll never know... You may guess, but you'll never know what you don't know.
It's not gatekeeping to say that to give someone a complete education to be a software developer we should provide them a knowledge of binary. We can teach programming in an approachable fashion AND teach binary in an approachable fashion. We do it every day at my college.
Why are you talking about CS and nonexistent "problem solving related to binary"? By "knowing binary" we are not talking about knowing machine code instructions or the details of how they are executed, but literally knowing how to read and work with binary numbers (using bitwise operations). Which isn't necessary for problem-solving or implementing most algorithms.
(Yes, there are algorithms that use bitwise operations. They're technically expendable and it doesn't make you any less of a programmer not to know everything. Especially if you're using Python or JavaScript!)
Are you joking? Without understanding binary, you can't understand:
- Numeric types, which numbers can be represented exactly, their failure modes, etc.
- Bit-field flags, e.g. for enums
- IP address masks and other bitmasks
- Anything at all about modern cryptography
- Anything at all about data compression
- The various ways color is represented in images
- Any custom binary format, MIDI, USB, anything low-level at all
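To make a couple of the list items concrete, here's a minimal Python sketch (values chosen purely for illustration) of float representability and bit-field flags:

```python
# Numeric types: 0.1 has no exact binary representation,
# so repeated addition drifts.
total = sum([0.1] * 10)
print(total == 1.0)             # False: accumulated rounding error
print(abs(total - 1.0) < 1e-9)  # True: close, but not exact

# Bit-field flags: each permission is one bit, combined with OR,
# tested with AND.
READ, WRITE, EXECUTE = 0b001, 0b010, 0b100
perms = READ | WRITE
print(bool(perms & WRITE))    # True
print(bool(perms & EXECUTE))  # False
```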
Honestly the list goes on and on. It's absolutely insane to me to hear people say that you can be a competent software engineer without understanding binary.
The average web CRUD developer never needs to touch any of this stuff.
- Numeric types? Who cares? I know min, I know max. I take number from user and insert it in database. For calculations with money, I use integer cents.
- Bit-fields? I work in Java, what are bitfields?
- IP addresses? I am web developer, not network engineer. I don't need to deal with netmasks.
- Cryptography? Me no understand. Me use Let's Encrypt. Is secure, no?
- Compression? Browser do gzip for me. Me no care.
- Colors? I pick the nice color from the color wheel.
- Binary? What is binary? I only use binary when I want users to upload a file. Then I put the files on S3.
Well I do embedded dev as a hobby now so I know this stuff. But for a long time I didn't know how many bits were in a byte simply because I never really needed that knowledge.
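For what it's worth, the "integer cents" trick from the list above is easy to demonstrate in Python (the amounts here are made up):

```python
# Floats can't represent most decimal fractions exactly, so money drifts.
print(0.1 + 0.2)  # 0.30000000000000004, not 0.3

# Keeping everything in integer cents sidesteps the problem entirely.
price_c, tax_c = 10, 20          # hypothetical amounts, in cents
total_c = price_c + tax_c
print(f"${total_c // 100}.{total_c % 100:02d}")  # $0.30, exact
```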
Look, it's fine if you want to be a hobbyist making some personal website that doesn't store PII or take user submissions. But we're talking about career software developers here. If you want to make a career out of it, this attitude is not only harmful to your career, but dangerous to your company. Besides: do you really not want to actually be good at what you do?
My attitude is: I learn whatever I need to learn to get things done.
And for a long time, I've just never needed to learn about how many bits were in a byte.
The only time I've ever needed to deal with individual bits in Java was when I was working with flags or parsing binary formats. Both of which are extremely rare when doing generic server dev.
You can even do rudimentary binary reverse engineering without any knowledge of bits. I remember running programs through a disassembler, looking for offending JMPs and then patching them out in a hex editor with 0x90.
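That kind of patch is easy to sketch. Assuming a conditional jump like JNZ (opcode 0x75, followed by a 1-byte offset), the "patch it out with 0x90" step looks roughly like this in Python (the byte string here is invented for illustration, not taken from any real binary):

```python
# Fake "machine code" with a JNZ (0x75, offset 0x05) at index 1.
code = bytearray(b"\x90\x75\x05\x90\x90")

jnz_at = code.find(b"\x75")            # locate the offending jump
code[jnz_at:jnz_at + 2] = b"\x90\x90"  # overwrite opcode + offset with NOPs

print(code.hex())  # 9090909090: the jump is gone, execution falls through
```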
Not having knowledge is not a problem as long as you know that you are missing that knowledge.
You have an artificially high bar so you can gatekeep people from being smart and be the arbiter of who's smart and who's not. What you don't realize is most people don't give a crap and easily hundreds of billions worth of software is sold every year by developers who don't know about anything you mentioned.
Your attitude is also inappropriate for a place called "hacker news", where people are resourceful and try to do the most with whatever they have. Maybe you want to go to /r/compsci.
You don't have to be a "competent software engineer" to be a developer, and in fact, we were never talking about being a "competent software engineer".
These developers do not get jobs as "competent software engineer"s, do not train to be a "competent software engineer", and do not care about being a "competent software engineer". And yet they make things that work perfectly fine! I'm sorry that you think handling PII or having a career (ooo) has anything to do with being a "competent software engineer".
> But we're talking about career software developers here.
I don't remember the name of this fallacy but it sucks, knock it off.
It's absolutely insane to pretend most software engineers need to do any of these things. We're making billions of dollars in aggregate without knowing this stuff. Most people aren't writing their own data compression algorithms, and we sure as shit aren't writing cryptographic algorithms because we're told to leave that one to the experts (i.e., mathematicians).
I'm guessing 80% don't even know bit-field flags and never have to use them, despite them being relatively simple. It would also take someone moderately capable at understanding math less than an hour to "learn." I learned them when I was 13 because I fucked around with MUD codebases, and I don't think I am special for it.
Do you realize that once you write code to solve a problem, that exact problem never needs to be solved again? Either you're solving new problems (and "new" can be slight -- new situation, new context, new hardware, new people, new company, whatever), or you're doing the compiler's job. If most people aren't solving new problems, then their job is bullshit. I frankly don't even understand how you can be confident that code copy-pasted from Stack Overflow actually does what you need it to do without understanding the fundamentals.
> we sure as shit aren't writing cryptographic algorithms because we're told to leave that one to the experts.
Shouldn't we all be striving to be an expert in something? If you're not working your way toward expertise, what are you doing? Why are the absolute fundamental basic building blocks for understanding how computers work and what programming languages are doing something that only "they" need to bother to learn?
Most of the problems software engineers are solving today are business problems defined by stakeholders in a business.
And yes, I agree we should be an expert in something. Harping on binary seems like a waste of time, however. I would certainly like that people are interested enough in the field that they spend their time learning as much as they can about CS, but I'm under no illusion that I'm going to automatically be more productive or better as a software engineer because I know bit-fields.
We're "harping" on it because it's so basic. There are a hundred equally basic things you should also know.
> Most of the problems software engineers are solving today are business problems defined by stakeholders in a business.
Even understanding the business problems usually requires a lot of fundamental knowledge in a wide variety of subjects. Being a good software engineer is hard.
And regardless of the problem, if the solution is going to be in code, you can't get away from binary. I actually don't think most programmers should be learning assembly language, because you can actually completely abstract away from it (and you should, because you don't know which assembly language you're programming for). But you can't abstract away from addition, the alphabet, binary, strings, algorithm complexity, and other basics.
PS: I didn't downvote you. I don't downvote people for disagreeing with me. I like disagreement, and besides, it would reduce the visibility of my pithy responses! I only downvote comments that aren't even worth arguing with. So the very fact that I'm taking the time to respond means I have at least some respect for your argument.
"Knowing" binary isn't some deep, fundamental CS knowledge, it's a party trick. Knowing about different number systems in general is nice, but hardly foundational.
The actual choice of what we consider part of the "core" CS curriculum is pretty arbitrary. Why do we consider, say, binary part of that but semantics not? Would it be "gatekeeping" to say that you can't be a good programmer without a passing knowledge of, say, operational and denotational semantics? Do you really have a "complete" CS education without it?
> "Knowing" binary isn't some deep, fundamental CS knowledge, it's a party trick.
Reread my comment: I never said binary in and of itself is "deep" knowledge. I said not knowing it is a proxy for a lack of CS knowledge more generally.
> But almost undoubtedly you'll be a better programmer if you do know CS (I'm using binary here as a proxy for CS since I can't imagine having a deep knowledge of CS without knowing something fundamental like binary). How many times over those 25 years could a problem have been more efficiently solved (in programmer time or CPU time) if you had known about problem solving techniques (perhaps those related to binary knowledge for example) that you don't know from the world of CS? You'll never know...
You're both right and wrong.
Principles are a lot of cognitive debt to take on, and not strictly necessary to be functional. Literally anybody can sit down with an IDE and write code to merge spreadsheets or whatever the actual business need is. We teach this stuff to anybody with any interest. Kids, even!
If someone thinks they want to be a plumber, they shouldn't start with a 4-year commitment to learning principles of hydraulics and mechanical engineering-- makes much more sense to apprentice, lay some pipe, and if it's the job for you, then go back and learn it at a deeper level. Doing it the other way is why college is so ridiculously expensive and most spend 5 to 6 figures on a major and then go manage at Target.
After failing discrete math three times and giving up, I managed to make it almost 20 years in this industry before needing to learn anything about binary math-- and even then, it was only so I could understand how flags in old MUD code worked, for fun. Truth tables are important, but there has never been a logic problem I couldn't solve by taking a few extra steps to be explicit where someone else might reduce the logic to a compact blob of symbols. I'll never optimize code as well as someone classically taught. I don't know what Big-O is-- and outside of a botched Google interview, not a single employer or client has ever given a shit. Nobody has ever been victimized by my code. The "CS" way implies the academic approach is the only way to do things. It's the textbook definition of gatekeeping. All academia did was appropriate and institutionalize experiences yahoos like myself have always learned through ingenuity and trial-and-error.
You've learned more than I have, so you have more tools in your bag. The only thing that sets us apart is that I won't be advancing the industry or publishing anything on arxiv anytime soon. Outside of that, you're going to struggle with quantifying how you're better than the self-taught without resorting to speculation or guild/union mentality (gatekeeping).
You don't know that. Actually, all inefficient code victimizes both the user (via performance) and the environment (via unnecessary energy usage). I'm not saying you've done anything wrong, I'm just saying we all don't know what we don't know. I'm sure my inefficient code has had many users and electric bills as victims. I've released software whose subsequent versions, written once I had better CS techniques, improved 100-fold in performance. I wasted my users' time before I learned how to do it better.
> You've learned more than I do, so you have more tools in your bag. The only thing that sets us apart is that I won't be advancing the industry or publishing anything on arxiv anytime soon. Outside of that, you're going to struggle with quantifying how you're better than the self-taught without resorting to speculation or guild/union mentality (gatekeeping).
You made a lot of assumptions about me without knowing my background. I was a self-taught programmer as a child/teenager and my undergrad degree was in economics, not CS. I went back to school to get a masters in CS which was difficult coming from a self-taught background. I did programming both paid and hobbyist for over a decade before that more formal education. And I write books read largely by self-taught programmers.
Saying a full software development education includes binary and the fundamentals of CS is not gatekeeping; it's a low bar if we don't want inefficient software wasting users' time and sucking energy. I'm not saying you have to start there, I'm saying it should be part of your education as a programmer, whether self-taught or formally taught.
That said, there's also a point in here that's often underappreciated. There's a big difference between someone who learns today's modern tools from nothing and someone who has a good foundation learning today's modern tools. I think it's fundamentally one of approach. The former treats coding as a trade where you just need to learn a few tools. The latter treats it as a vocation where fundamentals allow you to learn whatever tools you need.
The one makes sense short-term - it gets people to paying work with modern tools in a minimum of time. The other makes sense long-term - it keeps people in paying work with modern tools for decades.
When you're just getting started and looking to get that first job with the life-changing six figure paycheck, all that farting around with fundamentals that the gatekeepers are talking about seems like an absolutely massive waste of time.
It is gatekeeping, but gatekeeping can and sometimes does serve a purpose that is not just the purely selfish.
I've found in my years of mentoring and teaching that showing someone something is much better at keeping them learning and interested than "Sit down and read 30 years of outdated documentation and coding practices just so you can feel the pain and agony I had; then and only then, when you've proven yourself, can you spit out that hello world on the screen, filthy scum!"
Give them a codepen with modern React already bootstrapped so they can start just tinkering with it and changing things, man watch their EYES LIGHT UP at the possibilities... Every time I see this happen it takes me back to 1997 when I was first learning to build websites.
You're completely right. That's a wonderfully kind, empathetic, and compassionate approach that's incredibly effective for teaching people what kind of power they are starting to have access to.
I've found it's also one that is very expensive as measured by instructional time and energy. I've also found it relatively ineffectual for teaching fundamentals.
I do not have to lecture someone about how they are unworthy and useless to know that understanding a bit of discrete mathematics, HTTP fundamentals, DNS, or relational algebra will make them better software engineers. There are absolutely people in this world who, when learning the glories of React with a codepen, will ask how things work several times and learn big-O notation... but there are far more who won't ask but would benefit from knowing anyway.
Do you think it's perhaps possible that people benefit from both a solid grasp of the often-boring fundamentals as well as feeling the joy of tinkering?
> Do you think it's perhaps possible that people benefit from both a solid grasp of the often-boring fundamentals as well as feeling the joy of tinkering?
I don't think they were explicitly excluding the former, but rather saying it's important to get someone interested before they even become interested in learning the fundamentals.
This seems like a good context for the quote, “if you want to build a ship, […] teach them to yearn for the vast and endless sea.” I don’t think that Saint-Exupery intended to suggest that the yearning was enough on its own, but it makes the process so much more effective.
The trick is doing so without demeaning the value of basic carpentry. Which would be obviously silly in building a ship, but in computing we frequently have people looking to become software engineers without encountering or learning the fundamentals of the field.
This particular project comes from people who regard fundamentals as optional.
There is a difference between going all the way back to 1995 / reading old documentation vs learning the basics of CS.
This strawman is used in several comments. CS is not "some old knowledge you might never use". Knowing that it's way faster to search by key in a hashmap than to iterate through a whole array is useful. Knowing why it's bad practice not to have a primary key (and other DB knowledge) is useful. Knowing the stages of an HTTP request is useful.
You can get a job and actually do some productive work without any of that, but at some point not knowing those basics is going to harm your work.
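The hashmap-vs-array point is easy to verify yourself; here's a quick (unscientific) timing sketch in Python, with sizes picked arbitrarily:

```python
import timeit

n = 100_000
as_list = list(range(n))
as_set = set(as_list)

needle = n - 1  # worst case for the list: the item is at the very end

# Membership in a list scans every element (O(n)); a set/dict hashes
# straight to the right bucket (O(1) on average).
t_list = timeit.timeit(lambda: needle in as_list, number=100)
t_set = timeit.timeit(lambda: needle in as_set, number=100)

print(t_set < t_list)  # True: the hash lookup wins handily
```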
+1, and also, CS fundamentals absolutely can be learned & exercised in the use of high-level tools. One of the beauties of that knowledge is that its concepts are transferable across domains and layers of the computational stack.
Pro tip for anyone working w/ junior devs, especially those who came through bootcamps and the like: you can point them to CS knowledge without actually calling it CS.
haha you caught that before I fixed it, that my friend is the result of my failure of math in my great edumacation in the US public school system hahaha.
I've literally had to use knowledge of binary and number representation just last month in order to implement a lower level protocol. You may not use it in your simple CRUD app jobs but it's absolutely not "some old thing people only had to use back in the day."
I occasionally have to manipulate binary numbers too. But, aside from bitwise operations, I almost always forget what I had learned last time and have to re-read the material again.
I've gotten to use bitwise operations like twice in a 20-year career.
Trivial applications of things that might go in the first couple weeks of an algo class, about the same rate of use.
I always get a little excited when I get to use those things. Like spotting an old acquaintance in a restaurant in another city. "Hey! It's that thing they said was really important but in fact my career would basically be identical if I never learned it at all! Long time no see!"
[EDIT] Still waiting to use math past what I learned in 6th grade, aside from extremely rare use of very simple linear algebra or plugging in stats formulas that I looked up anyway because I don't trust my recollection since I use them so rarely. Doubting I'll even once need any of it before I retire, at this point. Which is great, because I've entirely forgotten all of it, on account of never needing it.
> My point is that most jobs are simple CRUD app jobs or positioning divs on a screen...
It's really not "most jobs." Although I do agree that a CS or software engineering degree is overkill for that type of stuff.
Also, "knowing binary" is a strawman, and not a very good one. A newbie developer getting confused by bit flags isn't a big deal. Point them to Wikipedia and let them read about it.
The much bigger problem is when inexperienced developers go off and write a ton of bad spaghetti code, re-invent a bunch of wheels they never learned about, and generally just write crap because they're clueless about best practices (or even any practices at all). Now the clueless newbie is slowing down everybody else and creating a maintenance nightmare.
TBH, most new developers are pretty bad (self taught, university taught, or whatever). The important thing is having experienced people around to point them in the right direction and help them get "real world" experience.
> I don't think you need to learn binary to learn to code. Been doing this for 25 years working at some big name companies, I don't know binary other than 1 is on, 0 is off.
I... guess?
But the full explanation of binary is only a paragraph long. At a certain point that seems like something you'd have to avoid on purpose.
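For reference, that paragraph-long explanation fits in a few lines of Python: each binary place is a power of two, exactly as each decimal place is a power of ten.

```python
bits = "1101"
# Rightmost digit is the 1s place, then 2s, 4s, 8s, ...
value = sum(int(b) << i for i, b in enumerate(reversed(bits)))
print(value)           # 13 = 8 + 4 + 0 + 1
print(int("1101", 2))  # 13, the built-in way
print(bin(13))         # '0b1101', and back again
```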
> Learning binary and knowing what binary is are two separate things.
Really? Binary isn't a language like Python, it's a notation; either you understand it or you don't (you could say it's 'binary', I suppose). If you don't understand it, you don't know what it is.
Most people, in fact, do not have to do this. That’s an exceptionally rare thing to do that I would estimate a fraction of a percent of developers will ever encounter.
I would have put the number at 75% rather than a fraction of a percent. What do all these people do, these "developers" that never have to use bitstrings in python, deal with encodings, endianness, interface with hardware, etc.? Are they all UI people?
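A couple of the things listed there (endianness, encodings) are only a few lines of stdlib Python, for anyone curious what they actually look like:

```python
import struct

# The same 16-bit value serialized little-endian vs big-endian.
print(struct.pack("<H", 0x1234).hex())  # '3412': low byte first
print(struct.pack(">H", 0x1234).hex())  # '1234': high byte first

# The same character in two encodings.
print("é".encode("utf-8").hex())    # 'c3a9': two bytes in UTF-8
print("é".encode("latin-1").hex())  # 'e9': one byte in Latin-1
```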
Most developers interacting with files are going to be doing it through a higher level API, so while sure, technically they're stored like that, it's not like that's what the developer actually has to deal with.
And most developers don't interact with hardware day to day, no. That's an exceptionally small set of people.
But also... everyone interacts with hardware whenever they use a computer.
I think our difference of opinion has to do with abstraction layers here. Just so you know where I'm coming from, I work on web apps in java, python, and js, C++ desktop applications, and I operate a mars rover using various domain specific languages. Before switching to engineering I was a scientist and had to deal with data collection in the lab and field from all sorts of instrumentation which often required understanding low-level protocols and the movement of bits and bytes.
It's hard for me to imagine the world you're describing... sounds like it's full of script kiddies.
And hex, or any other base for that matter. Fundamental concepts and not difficult... I mean, hopefully people learned about place value in elementary school...?
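In Python terms, since every base works the same way (digit times base raised to the power of its position):

```python
print(int("ff", 16))  # 255 = 15*16 + 15
print(int("777", 8))  # 511 = 7*64 + 7*8 + 7
print(hex(255))       # '0xff'
print(0xff == 0b11111111 == 255)  # one value, three notations
```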