

Hello HN! I started this project a few years ago when I was unfortunately rejected by Codecademy for a software engineering position for the 3rd time. I love coding, and I love helping others learn to code. In my spare time I volunteer at Justice Through Code (https://centerforjustice.columbia.edu/justicethroughcode), which helps previously incarcerated people re-enter the workforce. If you're passionate about helping others learn to code, it's a great program to check out.

Recently, I've added a ChatGPT-powered assistant to Codeamigo, to help learners when they get stuck. Someone in the ed-tech space once told me that over 95% of learners quit once they reach an error that they can't recover from. I hope that we can use AI to assist experts and learners alike.

If you're interested in trying out the demo: https://codeamigo.dev/v2/lesson/hello-codeamigo/step/intro


Do you have any recent figures for job placement?

The market is very tough right now, and even well-qualified seniors can't find work. The general consensus is that if you don't have a degree you shouldn't even bother (unless you have many YoE, and then only maybe) and should go to college or leave the industry. Is that true in your experience?


Most companies don't let you interview more than once per year or once every six months, so if they've been rejected three times, I would expect their earliest interview to have been over 1.5 years ago.


I've never heard of this rule. If a company has three positions, for which 15 equally excellent candidates are shortlisted, and then three months later another position opens up, one of the remaining 12 will have a shot at it. (Of course, the not-shortlisted ones, not likely.)


Google, Facebook (Meta), and Amazon all had a minimum six month re-apply period when I last interviewed with them.


When was that? 2 years ago I was interviewing with Facebook and when I didn't get picked up for a position they offered to put me in for another one to interview a couple weeks later (I opted out, it was clear from the interviews and discussions WFH was considered temporary and I didn't want to switch jobs just to have to quit later since I had no intention of moving to a high cost of living area).


That probably speaks more to the fact that they are lax with the rule for reasonably well-qualified candidates whom they may actually want to hire.

The rule in the first place is probably targeted at the bulk of lower-quality candidates, to keep them from constantly reapplying.


I was definitely low quality so that could be true.


Looks like the FB one was in Oct, 2020. Google was in Nov, 2020, Amazon was in 2022.


The FAANG-type companies are probably barraged by unskilled wannabes and fakes who want good-paying jobs, so they have an explicit rate-limiting policy (so they can have automated application systems enforce it?). Don't know about mid-size and small companies.


can you cite that statistic?


> Today's developers didn't learn binary before learning Python, why should you learn how to code without the most modern tools?

This phrasing makes me wary. There's a difference between being self-taught, and not even bothering to teach yourself the absolute fundamentals of computing, like binary...


Learning binary and knowing what binary is are two separate things. I don't think you need to learn binary to learn to code. Been doing this for 25 years working at some big name companies, I don't know binary other than 1 is on, 0 is off.

I'm starting to feel like this whole notion - that newbies can't come in and learn, say, React or modern JavaScript without going all the way back to that day in 1995 when they were hashing out the std lib for JavaScript, and learning everything about it before that "Hello World!" in React starts - is becoming a gatekeeping method.


> Learning binary and knowing what binary is are two separate things. I don't think you need to learn binary to learn to code. Been doing this for 25 years working at some big name companies, I don't know binary other than 1 is on, 0 is off.

You're wearing your lack of CS knowledge like some kind of badge of honor. Of course you don't need deep CS knowledge to be a competent programmer. But almost undoubtedly you'll be a better programmer if you do know CS (I'm using binary here as a proxy for CS since I can't imagine having a deep knowledge of CS without knowing something fundamental like binary). How many times over those 25 years could a problem have been more efficiently solved (in programmer time or CPU time) if you had known about problem solving techniques (perhaps those related to binary knowledge for example) that you don't know from the world of CS? You'll never know... You may guess, but you'll never know what you don't know.

It's not gatekeeping to say that to give someone a complete education to be a software developer we should provide them a knowledge of binary. We can teach programming in an approachable fashion AND teach binary in an approachable fashion. We do it every day at my college.


Why are you talking about CS and nonexistent "problem solving related to binary"? By "knowing binary" we are not talking about knowing machine code instructions or the details of how they are executed, but literally knowing how to read and work with binary numbers (using bitwise operations). Which isn't necessary for problem-solving or implementing most algorithms.

(Yes, there are algorithms that use bitwise operations. They're technically expendable and it doesn't make you any less of a programmer not to know everything. Especially if you're using Python or JavaScript!)


> nonexistent "problem solving related to binary"

Are you joking? Without understanding binary, you can't understand:

- Numeric types, which numbers can be represented exactly, their failure modes, etc.

- Bit-field flags, e.g. for enums

- IP address masks and other bitmasks

- Anything at all about modern cryptography

- Anything at all about data compression

- The various ways color is represented in images

- Any custom binary format, MIDI, USB, anything low-level at all

Honestly the list goes on and on. It's absolutely insane to me to hear people say that you can be a competent software engineer without understanding binary.
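
(For what it's worth, here's a minimal sketch in Python of two items from that list - bit-field flags and an IPv4 netmask check. The names READ/WRITE and in_subnet are made up purely for illustration.)

    # Bit-field flags: each permission occupies its own bit.
    READ, WRITE, EXECUTE = 0b001, 0b010, 0b100

    perms = READ | WRITE              # combine flags with bitwise OR
    can_write = bool(perms & WRITE)   # test a flag with bitwise AND -> True

    # IPv4 netmask check: an address is "in" a network iff the masked bits match.
    def in_subnet(ip: str, network: str, prefix: int) -> bool:
        def to_int(addr: str) -> int:
            a, b, c, d = (int(x) for x in addr.split("."))
            return (a << 24) | (b << 16) | (c << 8) | d
        mask = (0xFFFFFFFF << (32 - prefix)) & 0xFFFFFFFF
        return (to_int(ip) & mask) == (to_int(network) & mask)

    print(in_subnet("10.0.0.42", "10.0.0.0", 24))   # True
    print(in_subnet("10.0.1.42", "10.0.0.0", 24))   # False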


The average web CRUD developer never needs to touch any of this stuff.

- Numeric types? Who cares? I know min, I know max. I take number from user and insert it in database. For calculations with money, I use integer cents.

- Bit-fields? I work in Java, what are bitfields?

- IP addresses? I am web developer, not network engineer. I don't need to deal with netmasks.

- Cryptography? Me no understand. Me use Let's Encrypt. Is secure, no?

- Compression? Browser do gzip for me. Me no care.

- Colors? I pick the nice color from the color wheel.

- Binary? What is binary? I only use binary when I want users to upload a file. Then I put the files on S3.

Well I do embedded dev as a hobby now so I know this stuff. But for a long time I didn't know how many bits were in a byte simply because I never really needed that knowledge.


Look, it's fine if you want to be a hobbyist making some personal website that doesn't store PII or take user submissions. But we're talking about career software developers here. If you want to make a career out of it, this attitude is not only harmful to your career, but dangerous to your company. Besides: do you really not want to actually be good at what you do?


My attitude is: I learn whatever I need to learn to get things done.

And for a long time, I've just never needed to learn about how many bits were in a byte.

The only time I've ever needed to deal with individual bits in Java was when I was working with flags or parsing binary formats. Both of which are extremely rare when doing generic server dev.

You can even do rudimentary binary reverse engineering without any knowledge of bits. I remember running programs through a disassembler, looking for offending JMPs and then patching them out in a hex editor with 0x90.

Not having knowledge is not a problem as long as you know that you are missing that knowledge.


You're being facetious.

You have an artificially high bar so you can gatekeep people from being smart and be the arbiter of who's smart and who's not. What you don't realize is most people don't give a crap and easily hundreds of billions worth of software is sold every year by developers who don't know about anything you mentioned.

Your attitude is also inappropriate for a place called "hacker news", where people are resourceful and try to do the most with whatever they have. Maybe you want to go to /r/compsci


> competent software engineer

Interesting term to conflate with "developer".

You don't have to be a "competent software engineer" to be a developer, and in fact, we were never talking about being a "competent software engineer".

These developers do not get jobs as "competent software engineer"s, do not train to be a "competent software engineer", and do not care about being a "competent software engineer". And yet they make things that work perfectly fine! I'm sorry that you think handling PII or having a career (ooo) has anything to do with being a "competent software engineer".

> But we're talking about career software developers here.

I don't remember the name of this fallacy but it sucks, knock it off.


It's absolutely insane to pretend most software engineers need to do any of these things. We're making billions of dollars in aggregate without knowing this stuff. Most people aren't writing their own data compression algorithms, and we sure as shit aren't writing cryptographic algorithms because we're told to leave that one to the experts (i.e., mathematicians).

I'm guessing 80% don't even know bit-field flags and never have to use them, despite them being relatively simple. It would also take someone moderately capable at understanding math less than an hour to "learn." I learned them when I was 13 because I fucked around with MUD codebases, and I don't think I am special for it.


Do you realize that once you write code to solve a problem, that exact problem never needs to be solved again? Either you're solving new problems (and "new" can be slight -- new situation, new context, new hardware, new people, new company, whatever), or you're doing the compiler's job. If most people aren't solving new problems, then their job is bullshit. I frankly don't even understand how you can be confident that code copy-pasted from Stack Overflow actually does what you need it to do without understanding the fundamentals.

> we sure as shit aren't writing cryptographic algorithms because we're told to leave that one to the experts.

Shouldn't we all be striving to be an expert in something? If you're not working your way toward expertise, what are you doing? Why are the absolute fundamental basic building blocks for understanding how computers work and what programming languages are doing something that only "they" need to bother to learn?


Most of the problems software engineers are solving today are business problems defined by stakeholders in a business.

And yes, I agree we should be an expert in something. Harping on binary seems like a waste of time, however. I would certainly like that people are interested enough in the field that they spend their time learning as much as they can about CS, but I'm under no illusion that I'm going to automatically be more productive or better as a software engineer because I know bit-fields.

PS: Thank you for downvoting me.


> Harping on binary

We're "harping" on it because it's so basic. There are a hundred equally basic things you should also know.

> Most of the problems software engineers are solving today are business problems defined by stakeholders in a business.

Even understanding the business problems usually requires a lot of fundamental knowledge in a wide variety of subjects. Being a good software engineer is hard.

And regardless of the problem, if the solution is going to be in code, you can't get away from binary. I actually don't think most programmers should be learning assembly language, because you can actually completely abstract away from it (and you should, because you don't know which assembly language you're programming for). But you can't abstract away from addition, the alphabet, binary, strings, algorithm complexity, and other basics.

PS: I didn't downvote you. I don't downvote people for disagreeing with me. I like disagreement, and besides, it would reduce the visibility of my pithy responses! I only downvote comments that aren't even worth arguing with. So the very fact that I'm taking the time to respond means I have at least some respect for your argument.


"Knowing" binary isn't some deep, fundamental CS knowledge, it's a party trick. Knowing about different number systems in general is nice, but hardly foundational.

The actual choice of what we consider part of the "core" CS curriculum is pretty arbitrary. Why do we consider, say, binary part of that but semantics not? Would it be "gatekeeping" to say that you can't be a good programmer without a passing knowledge of, say, operational and denotational semantics? Do you really have a "complete" CS education without it?


> "Knowing" binary isn't some deep, fundamental CS knowledge, it's a party trick.

Reread my comment; I never said binary in and of itself is "deep" knowledge. I said not knowing it is a proxy for a lack of CS knowledge more generally.


> But almost undoubtedly you'll be a better programmer if you do know CS (I'm using binary here as a proxy for CS since I can't imagine having a deep knowledge of CS without knowing something fundamental like binary). How many times over those 25 years could a problem have been more efficiently solved (in programmer time or CPU time) if you had known about problem solving techniques (perhaps those related to binary knowledge for example) that you don't know from the world of CS? You'll never know...

You're both right and wrong.

Principles are a lot of cognitive debt to take on, and not strictly necessary to be functional. Literally anybody can sit down with an IDE and write code to merge spreadsheets or whatever the actual business need is. We teach this stuff to anybody with any interest. Kids, even!

If someone thinks they want to be a plumber, they shouldn't start with a 4-year commitment to learning principles of hydraulics and mechanical engineering-- makes much more sense to apprentice, lay some pipe, and if it's the job for you, then go back and learn it at a deeper level. Doing it the other way is why college is so ridiculously expensive and most spend 5 to 6 figures on a major and then go manage at Target.

After failing discrete math three times and giving up, I managed to make it almost 20 years in this industry before needing to learn anything about binary math-- and even then, it was only so I could understand how flags in old MUD code worked, for fun. Truth tables are important, but there has never been a logic problem I couldn't solve by taking a few extra steps to be explicit where someone else might reduce the logic to a compact blob of symbols. I'll never optimize code as well as someone classically taught. I don't know what Big-O is-- and outside of a botched Google interview, not a single employer or client has ever given a shit. Nobody has ever been victimized by my code. The "CS" way implies the academic approach is the only way to do things. It's the textbook definition of gatekeeping. All academia did was appropriate and institutionalize experiences yahoos like myself have always learned through ingenuity and trial-and-error.

You've learned more than I have, so you have more tools in your bag. The only thing that sets us apart is that I won't be advancing the industry or publishing anything on arxiv anytime soon. Outside of that, you're going to struggle with quantifying how you're better than the self-taught without resorting to speculation or guild/union mentality (gatekeeping).


> Nobody has ever been victimized by my code.

You don't know that. Actually, all code that is inefficient victimizes both the user (via performance) and the environment (unnecessary energy usage). I'm not saying you've done anything wrong, I'm just saying we all don't know what we don't know. I'm sure my inefficient code has had many users and electric bills as victims. I've released software whose subsequent versions, once I had better CS techniques, improved 100-fold in performance. I wasted my users' time before I learned how to do it better.

> You've learned more than I do, so you have more tools in your bag. The only thing that sets us apart is that I won't be advancing the industry or publishing anything on arxiv anytime soon. Outside of that, you're going to struggle with quantifying how you're better than the self-taught without resorting to speculation or guild/union mentality (gatekeeping).

You made a lot of assumptions about me without knowing my background. I was a self-taught programmer as a child/teenager and my undergrad degree was in economics, not CS. I went back to school to get a masters in CS which was difficult coming from a self-taught background. I did programming both paid and hobbyist for over a decade before that more formal education. And I write books read largely by self-taught programmers.

Saying a full software development education includes binary and the fundamentals of CS is not gatekeeping; it's a low bar if we don't want inefficient software wasting users' time and sucking energy. I'm not saying you have to start there, I'm saying it should be part of your education as a programmer, whether self-taught or formally taught.


You're obligated to define exactly what you mean when you talk about "knowing" binary.


You're absolutely right, it's definitely gatekeeping.

That said, there's also a point in here that's often underappreciated. There's a big difference between someone who learns today's modern tools from nothing and someone who has a good foundation learning today's modern tools. I think it's fundamentally one of approach. The former treats coding as a trade where you just need to learn a few tools. The latter treats it as a vocation where fundamentals allow you to learn whatever tools you need.

The one makes sense short-term - it gets people to paying work with modern tools in a minimum of time. The other makes sense long-term - it keeps people in paying work with modern tools for decades.

When you're just getting started and looking to get that first job with the life-changing six figure paycheck, all that farting around with fundamentals that the gatekeepers are talking about seems like an absolutely massive waste of time.

It is gatekeeping, but gatekeeping can and sometimes does serve a purpose that is not just the purely selfish.


I've found in my years mentoring and teaching that showing someone something is much better at keeping them learning and interested than "Sit down and read 30 years of outdated documentation and coding practices just so you can feel the pain and agony I had; then and only then, when you've proven yourself, can you spit out that hello world on the screen, filthy scum!"

Give them a codepen with modern React already bootstrapped so they can start just tinkering with it and changing things, man watch their EYES LIGHT UP at the possibilities... Every time I see this happen it takes me back to 1997 when I was first learning to build websites.


You're completely right. That's a wonderfully kind, empathetic, and compassionate approach that's incredibly effective for teaching people what kind of power they are starting to have access to.

I've found it's also one that is very expensive as measured by instructional time and energy. I've also found it relatively ineffectual for teaching fundamentals.

I do not have to lecture someone about how they are unworthy and useless to know that understanding a bit of discrete mathematics, HTTP fundamentals, DNS, or relational algebra will make them better software engineers. There are absolutely people in this world who, when learning the glories of React with a codepen, will ask how things work several times and learn big-O notation... but there are far more who won't ask but would benefit from knowing anyway.

Do you think it's perhaps possible that people benefit from both a solid grasp of the often-boring fundamentals as well as feeling the joy of tinkering?


> Do you think it's perhaps possible that people benefit from both a solid grasp of the often-boring fundamentals as well as feeling the joy of tinkering?

I don't think they were explicitly excluding the former, but rather saying it's important to get someone interested before they even become interested in learning the fundamentals.


This seems like a good context for the quote, “if you want to build a ship, […] teach them to yearn for the vast and endless sea.” I don’t think that Saint-Exupery intended to suggest that the yearning was enough on its own, but it makes the process so much more effective.


The trick is doing so without demeaning the value of basic carpentry. Which would be obviously silly in building a ship, but in computing we frequently have people looking to become software engineers without encountering or learning the fundamentals of the field.

This particular project comes from people who regard fundamentals as optional.


There is a difference between going all the way back to 1995 / reading old documentation vs learning the basics of CS.

This strawman is used in several comments. CS is not "some old knowledge you might never use". Knowing that it's way faster to search by key in a hashmap rather than iterating through a whole array is useful. Knowing why it's a bad practice not to have primary key (and other DB knowledge) is useful. Knowing the stages of a HTTP request is useful.

You can get a job and actually do some productive work without any of that, but at some point not knowing all those basics is going to harm your work.
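
(A quick, purely illustrative sketch of that hashmap-vs-array point in Python - not a rigorous benchmark, and the numbers are made up for the example:)

    import timeit

    items = list(range(100_000))          # array: membership is an O(n) scan
    lookup = {x: True for x in items}     # hashmap: membership is O(1) on average
    target = 99_999

    scan = timeit.timeit(lambda: target in items, number=1_000)
    hashed = timeit.timeit(lambda: target in lookup, number=1_000)
    print(f"list scan: {scan:.3f}s   dict lookup: {hashed:.3f}s")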


+1, and also, CS fundamentals absolutely can be learned & exercised in the use of high-level tools. One of the beauties of that knowledge is that its concepts are transferable across domains and layers of the computational stack.

Pro tip for anyone working w/ junior devs, especially those who came through bootcamps and the like: you can point them to CS knowledge without actually calling it CS.


> I don't know binary other than 1 is on, 2 is off.

Either an epic troll or a great illustration :-D


I guess the person you replied to edited their comment. Still, "1 is on, 2 is off" made me smile.


haha you caught that before I fixed it, that my friend is the result of my failure of math in my great edumacation in the US public school system hahaha.

Also it's 9am and I just woke up a bit ago.


You should have left it as it was :)


> Also it's 9am and I just woke up a bit ago.

You should try waking up at 6 AM


I woke up at 8AM Chicago time. Which is 6AM San Francisco time. Same thing.


I guess jokes are no longer a thing in our culture?

Let me spell it out: You should have woken up at 6 AM. The error would have been even funnier.


> I guess jokes are no longer a thing in our culture?

Nah I just think people here like to be liberal with their downvotes. "I didn't find that funny, downvote"

While I didn't get the joke, I still appreciate the attempt.


I've literally had to use knowledge of binary and number representation just last month in order to implement a lower level protocol. You may not use it in your simple CRUD app jobs but it's absolutely not "some old thing people only had to use back in the day."
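
(To illustrate, not to speak for the parent: this is the flavor of bit-and-byte work that kind of protocol code involves. The header layout below is invented for the example.)

    import struct

    frame = bytes([0x05, 0x00, 0b1010_0000])    # hypothetical wire bytes
    length, flags = struct.unpack("<HB", frame) # little-endian u16 length + u8 flags
    print(length)                               # 5
    print((flags >> 4) & 0xF)                   # 0b1010 -> the top four bits as flags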


I occasionally have to manipulate binary numbers too. But, aside from bitwise operations, I almost always forget what I had learned last time and have to re-read the material again.


I've gotten to use bitwise operations like twice in a 20-year career.

Trivial applications of things that might go in the first couple weeks of an algo class, about the same rate of use.

I always get a little excited when I get to use those things. Like spotting an old acquaintance in a restaurant in another city. "Hey! It's that thing they said was really important but in fact my career would basically be identical if I never learned it at all! Long time no see!"

[EDIT] Still waiting to use math past what I learned in 6th grade, aside from extremely rare use of very simple linear algebra or plugging in stats formulas that I looked up anyway because I don't trust my recollection since I use them so rarely. Doubting I'll even once need any of it before I retire, at this point. Which is great, because I've entirely forgotten all of it, on account of never needing it.


> You may not use it in your simple CRUD app jobs but it's absolutely not "some old thing people only had to use back in the day."

My point is that most jobs are simple CRUD app jobs or positioning divs on a screen, not deep-seated CS stuff.


> My point is that most jobs are simple CRUD app jobs or positioning divs on a screen...

It's really not "most jobs." Although I do agree that a CS or software engineering degree is overkill for that type of stuff.

Also, "knowing binary" is a strawman, and not a very good one. A newbie developer getting confused by bit flags isn't a big deal. Point them to Wikipedia and let them read about it.

The much bigger problem is when inexperienced developers go off and write a ton of bad spaghetti code, re-invent a bunch of wheels they never learned about, and generally just write crap because they're clueless about best practices (or even any practices at all). Now the clueless newbie is slowing down everybody else and creating a maintenance nightmare.

TBH, most new developers are pretty bad (self taught, university taught, or whatever). The important thing is having experienced people around to point them in the right direction and help them get "real world" experience.

Gate keeping isn't always a bad thing.



> I don't think you need to learn binary to learn to code. Been doing this for 25 years working at some big name companies, I don't know binary other than 1 is on, 0 is off.

I... guess?

But the full explanation of binary is only a paragraph long. At a certain point that seems like something you'd have to avoid on purpose.


> Learning binary and knowing what binary is are two separate things.

Really? Binary isn't a language like Python, it's a notation; either you understand it or you don't (you could say it's 'binary', I suppose). If you don't understand it, you don't know what it is.


> Learning binary and knowing what binary is are two separate things.

Knowing what binary is lets you generate smooth sounding sentences about it, which someone else who also only knows what binary is accepts as true.

Learning binary lets you actually exploit its properties in a solution.


So when you're reading code and you get to a bitmask... what do you do?


Let’s hope you never have to parse an archaic binary format then


Most people, in fact, do not have to do this. That’s an exceptionally rare thing to do that I would estimate a fraction of a percent of developers will ever encounter.


I would have put the number at 75% rather than a fraction of a percent. What do all these people do, these "developers" that never have to use bitstrings in python, deal with encodings, endianness, interface with hardware, etc.? Are they all UI people?


> use bitstrings in python, deal with encodings, endianness, interface with hardware

is nowhere even remotely close to

> parse an archaic binary format

The goalposts are on the other side of the field.


In my experience, most files in userspace are in an archaic binary format. And most hardware talks archaic binary.


Most developers interacting with files are going to be doing it through a higher level API, so while sure, technically they're stored like that, it's not like that's what the developer actually has to deal with.

And most developers don't interact with hardware day to day, no. That's an exceptionally small set of people.


It hurts to be called exceptionally small XD

But also... everyone interacts with hardware whenever they use a computer.

I think our difference of opinion has to do with abstraction layers here. Just so you know where I'm coming from, I work on web apps in java, python, and js, C++ desktop applications, and I operate a mars rover using various domain specific languages. Before switching to engineering I was a scientist and had to deal with data collection in the lab and field from all sorts of instrumentation which often required understanding low-level protocols and the movement of bits and bytes.

It's hard for me to imagine the world you're describing... sounds like it's full of script kiddies.


I didn't know there were computer nerds who can't at least count in binary.


And hex, or any other base for that matter. Fundamental concepts and not difficult... I mean, hopefully people learned about place value in elementary school...?


> There's a difference between being self-taught, and not even bothering to teach yourself the absolute fundamentals of computing, like binary

Been doing this for almost 30 years now, about 15 of those professionally. I think the last time I used binary was in a microcontrollers class in high school.

Even in college, yeah we talked a lot about binary, we learned the single most useful thing in my career – truth tables – and we dived deep into how CPUs toss bits around. Did we ever use binary for anything practical? No of course not, that’s what the compiler is for.

I mean I guess the real question is: What does “learn binary” even mean? Knowing it exists? That’s easy. Knowing that your code is eventually all binary? Yeah great. Knowing how NAND gates and such work? Well that’s discrete mathematics, Boolean algebra, quantum physics, circuit design, and whatever field of math talks about translating problems from one to another, not “binary”. Being able to calculate, by hand, numbers from binary to octal and decimal? Meh you can have a computer do that or google the algorithm when you need it. Does “learning binary” mean memorizing the ASCII table so you can read hex dumps or whatever? Maybe, doubt a lot of modern engineers still do that tho.


>Did we ever use binary for anything practical? No of course not, that’s what the compiler is for.

How to tell people you have never implemented anything performance-sensitive in your life without spelling it out bluntly, in a nutshell. The code of any popular image-processing/decoder/encoder/compression/cryptographic library is littered with bit operators, because they operate at the fundamental building-block level of computing and are the most efficient option, and the "sufficiently smart compiler" that supposedly always produces the best interpretation is a lie. You merely need to skim through any implementation of JPEG, of H.264, or of anything else that actually matters in this world to see the practical application of working on bits in the real world.

But sure, understanding computer architecture is meaningless. Trust the compiler. Thank gods I can still see stutters scrolling a web page on a 16-core CPU with 64 GB of RAM. I don't know how I could live my life if people actually knew how to make proper programs!
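
(A tiny sketch of the kind of bit manipulation such libraries lean on - packing and unpacking an RGBA colour into a single 32-bit integer. Illustrative only, not lifted from any actual codec.)

    def pack_rgba(r, g, b, a=255):
        # shift each channel into its own byte and OR them together
        return (r << 24) | (g << 16) | (b << 8) | a

    def unpack_rgba(pixel):
        # mask and shift each byte back out
        return (pixel >> 24) & 0xFF, (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF

    print(hex(pack_rgba(0xFF, 0xC2, 0x1A)))   # 0xffc21aff
    print(unpack_rgba(0xFFC21AFF))            # (255, 194, 26, 255)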


> How to tell people you have never implemented anything performance sensitive in your life without spelling it out bluntly in a nutshell

And that’s okay. Millions of engineers work on software that leverages those fine-tuned optimizations. Hundreds of engineers make them.

Plus I grew up in an era of cheap compute. Everything got faster every 18 months for free. Even now computers as a whole keep getting faster and faster despite the cap on single core performance.

Even at my relatively high level, the performance concerns I spent months learning about in 2005, just don’t matter anymore when you can get a single server with terabytes of ram and petaflops of compute.

99% of [user facing] code just isn’t performance bound these days. What does it matter if your loop takes 10ms or 20ms when the data is a 300ms network call away


I'll argue this one - bitwise operations are fundamentals too, but if you aren't working close to hardware, actually needing them can be really rare.

I don't think I met a single problem that required them during my first few years as an SE.

I had a CS degree and years of experience and still wasn't proficient with them; I had to write things out step by step. Then I started working closer to hardware, where those operations are common, and I learned them.

I don't even remember whether we covered them in school.


Boolean logic is close to bitwise operations and it is the knowledge without which nobody should dare to call themselves a software engineer.


It's close, yet not the same.

There's a difference between "do you understand this if ladder" or "can you simplify this boolean expression"

and "clear bits 3, 25, 26, 27 of this register and set bits 25, 26, 27 to value 0b101"

I'm not saying that this is hard or anything; it's just that the third thing is really rare unless you work close to hardware or on some other specific tasks.
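
(For anyone curious, that register example sketched in Python - hypothetical register contents, with bit 0 being the least significant bit:)

    reg = 0xDEADBEEF                                   # some existing register value

    clear_mask = (1 << 3) | (1 << 25) | (1 << 26) | (1 << 27)
    reg &= ~clear_mask                                 # clear bits 3, 25, 26, 27
    reg |= 0b101 << 25                                 # set bits 27..25 to 0b101 (they were just cleared, so OR is enough)

    print(f"{reg:#010x}")                              # 0xdaadbee7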


I think the generalization skill that allows you to go from one to the other is essential. If someone has only a basic understanding of boolean expressions but cannot apply this knowledge to binary numbers, that is a good characterization of their programming skills in general.


I work in networking, not particularly close to the hardware. I have colleagues who manipulate IP addresses by looking at the string representation of the address, splitting on the `.`, casting each part back to an integer, etc. Their code breaks as soon as we use a netmask other than /24 or (shock, horror!) we run in an IPv6 environment.
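
(For contrast, a minimal sketch of letting the standard library do the bit work: Python's ipaddress module handles arbitrary prefixes and IPv6 without any string splitting.)

    import ipaddress

    print(ipaddress.ip_address("10.1.2.3") in ipaddress.ip_network("10.1.0.0/20"))        # True
    print(ipaddress.ip_address("2001:db8::1") in ipaddress.ip_network("2001:db8::/32"))   # True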


I don't really understand the hurdle to "learning binary". It's not necessary to understand how binary works to complete a hello world program, but I think it's something that you want to get a handle on pretty quickly.

If I recall correctly, we only spent a few minutes on it in a larger lesson about different schemes for representing numbers in Comp Sci I. I don't think I've performed binary arithmetic since that class, but it's good to know how it works and I could always look up/figure out how to do those calculations again if I needed to.


It's good to know how to eat healthy, avoid processed foods, maintain healthy sleep patterns, avoid PUFAs, maintain strong mind-muscle connection in the body, communicate effectively and succinctly and empathetically with others, and many other life skills as well. And they'd all make you a healthier, happier, more robust, performant human. Which translates to better productivity. Better code.

So, should this new programmer start there, or start with binary?


Honestly, learning binary isn't something that will take any significant amount of time. This is not a tradeoff you need to make.


I mean, you definitely should be doing all that. 2 hours at the gym + supplements + cutting out processed food + getting at least 150 grams of protein a day + drinking purified water have all contributed significantly to keeping my brain active and capable at 38.


How do you know that you wouldn't have the same capabilities if you didn't do those things?


3 years ago I weighed 210 and had persistent fatigue and chronic pain. Now I'm 187, and feel better than when I was in my 20s. It's made a difference.


38 here as well. Agreed on all counts except the need for supplements if you're eating grassfed beef, lamb, organs (liver, etc) and other high quality whole foods. Also 180g protein for me, since I weigh 205lbs.


> So, should this new programmer start there, or start with binary?

The new programmer should start with the life skills. As a toddler. They should also learn the alphabet and how to wipe their own ass. Why do you think this is a gotcha question?


By your own admission you have never used the knowledge you learned.

Why exactly is it “good to know how it works” if you literally have never used that knowledge? Why is it “something you want to get a handle on pretty quickly” if you don’t touch binary?

Are there places where it would come up? Most certainly. Is it required learning for every single dev out there? Highly debatable.


I've certainly benefited from knowing about floating point error. I likely would have spent a lot of time confused about why certain kinds of math kept coming out wrong without know about the underlying representation of floats, how it results in error, why this tradeoff is good for most scenarios, and other options.

The problem is that this is the sort of foundational knowledge that isn't easily gained through the learn-as-needed approach that applies to higher level things. Most people can notice when they don't know how to use a library. It's probably not obvious to most people who don't already know about it that their computer can handle numbers wrong.


> It's probably not obvious to most people who don't already know about it that their computer can handle numbers wrong.

0.1+0.2=0.30000000000000004 is not obvious? When a floating point error happens, it’s quite plainly obvious. At that point, someone would look up a SO article like https://stackoverflow.com/questions/588004/is-floating-point... and learn about it. And from there, FPE mitigations.

Would you get to the mitigations faster if you knew binary? Sure, the first time you ever hit it. But that seems to be bottom of the barrel optimization, IMO.
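
(The usual mitigations that search leads to, sketched in Python:)

    from decimal import Decimal
    import math

    print(0.1 + 0.2 == 0.3)                                    # False: binary floats can't represent 0.1 exactly
    print(math.isclose(0.1 + 0.2, 0.3))                        # True: compare with a tolerance
    print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))   # True: decimal arithmetic sidesteps it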


Most manifestations will be in the middle of something more nuanced than a demo statement, the people in question will often lack the vocabulary to describe what they're seeing, and it's rarely anyone's first or even fifth thought that addition is going wonky. With that in mind, I think it's perhaps a stretch to call floating point error plainly obvious.

You're right, of course. This would be only marginally faster if the person knew binary. That said, there's a large swath of very similar things that crop up where the person benefits from a familiarity with the fundamentals of computing. There's enough of these things that a reasonable person might conclude that a software engineer benefits from such a knowledge base inside their head. That way a person can benefit from all those marginal gains at once.


The example is overly simplified, sure. But even if it’s in the middle of a swath of other operations, it will still result in very similar behavior. You’re likely never to get a clean number once a floating point error happens, and the result will be slightly off and seem to not be rounded.

Searching “why is my arithmetic operation not rounded”, I got https://docs.python.org/3/tutorial/floatingpoint.html as the third answer. I obviously can’t unlearn what floating point arithmetic is, but it feels like someone without any knowledge of it would likely be able to get a similar result relatively quickly as long as they are good at searching for answers (a much more important skill, IMO, which should be considered foundational)

> That said, there's a large swath of very similar things that crop up where the person benefits from a familiarity with the fundamentals of computing.

We’re talking specifically about binary, not fundamentals in general. Some fundamental knowledge is more important than others, and I posit that binary is on the lower end of that spectrum.


IMO, being good at searching is a skill that's only really useful when you have some idea what kind of question you're looking for an answer to.

I don't think we're actually talking specifically about binary. I am treating the reference to binary as a stand-in for the mathematical fundamentals of computing, rather than a narrow comment on understanding binary and bitwise operations.


> IMO, being good at searching is a skill that's only really useful when you have some idea what kind of question you're looking for an answer to.

"why is my arithmetic operation not rounded" seems like something anyone facing the problem would ask. That's pretty much the root issue in words.

> I don't think we're actually talking specifically about binary.

I mean, that is literally what the thread you replied to was discussing.

The comment I replied to explicitly talks about binary and their use of it, the comment they replied to originally also specifically calls out the OP’s quote which talks directly about binary. You can choose to deviate from that if you want to make a point, but the thread has always explicitly been about binary and that is what we should actually be discussing.


> the OP’s quote which talks directly about binary

The OP's quote is "fundamentals of computing, like binary..." It seems reasonable for a person to think the discussion is about "fundamentals of computing" more generally than the specific example given of "binary". More precisely, the fact that a thread caught on to one specific aspect of the discussion doesn't mean that a commenter can't keep in mind the greater context.


> Highly debatable.

For the sake of debate: call me old-fashioned, but I don't ever want to rely on code from a "developer" who isn't familiar with binary notation.


How does knowing binary produce meaningfully better code? Most of us aren't working at a low enough level for it to be substantial.

You should be judging the code, not the person who wrote it.


For nearly all of the software I rely on, I haven't examined the code. I'm not in a position to judge it.

For most of that software, of course, I'm not in a position to judge the developer either; but if all I know is that the developer isn't familiar with binary and hex, then I wouldn't expect her to be competent to write, for example, a brochure website, let alone a webserver. URL encoding depends on hex. Debugging often depends on hex. Arithmetic overflow and carry are binary. Twos-complement notation for signed integers is a binary convention.

I wouldn't hire a developer who couldn't explain binary notation. In fact I don't think I've ever met one like that.

As I suggested, perhaps I'm old-fashioned.
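
(Two of those points sketched in Python, purely as illustration: percent-encoding is hex underneath, and two's complement is just a signed reading of the same bits.)

    from urllib.parse import quote

    print(quote("hello world!"))                          # hello%20world%21 -- %20 is hex for 32
    print(int.from_bytes(b"\xff", "big", signed=True))    # -1: 0xFF read as two's complement
    print(int.from_bytes(b"\xff", "big", signed=False))   # 255: the same bits read unsigned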


I could see there being a future course about binary on codeamigo actually. I'm not saying people shouldn't learn the fundamentals of computing, rather, it's not knowledge that's a requirement to have before building most modern applications.

Before it was "learn C before learning Python" but some people didn't love that either...I guess my point is, we've been moving to higher and higher abstractions ever since computer programming was invented, the next abstraction is probably going to be talking to an AI to write some code that you need to vet, the above is just marketing speak for that.


How are courses created? Who creates them?

>Made with in

How did all the haze treat you yesterday?


My friend/co-founder and I wrote these courses.

Thanks for asking. I didn't leave my apartment. Hope everyone who is forced to be outside for work is safe, and this passes soon.


It would be unsettling for a software engineer to have little knowledge of the fundamentals.

But software engineers aren't the only people using python. I work with data scientists - with degrees in data engineering from computer science departments in very good universities - and I am certain that they believe a computer to be a magical box. I know for sure they're terrified of binary. Honestly, I'm looking forward to the day that they actually use functions and classes properly.

I wish I was exaggerating, I really do. It'd make my life easier. And it is no surprise - I've seen the supplementary material attached to papers that come out of those departments. I won't go into too much detail, but I don't know how any codebase could more closely resemble a house of cards and still function.

They still have successful careers in what they're good at. After all, one of the main reasons that python is so successful is that it can be used by people who don't know much - or care much - about programming. It can obviously be used by far more capable hands to do many more things, but for applied tasks it takes the pain out of learning something that they consider tangential.


Realistically, one can achieve quite a lot without needing to ever think about binary, and today’s languages/frameworks explicitly enable this.

I’m not arguing that someone shouldn’t eventually teach themselves more fundamentals as they mature their skillset, but most modern languages are so many abstractions above binary that it’s more of a distraction while learning about the basics of code in the context of real world use cases.

Understanding assembly on some level is in a similar category.

I think of this more as an avenue for specialization. One need not learn these things to get started, but they may very well need these things if they want to continue their journey past a certain point.


Seconded. And I'd posit that learning CS theory and fundamentals like binary shouldn't be too difficult if the individual is savvy enough to grok the finer points of Python.


I dislike this perspective.

I taught myself programming at age 12, well before undergrad.

If I'd been forced to learn binary before I could make websites and games, I would have given up. I might have even avoided programming forever.

I think the very first thing someone should learn is the fastest path to build something that interests and delights them. Theory can come later, when they're ready to appreciate it.


You could have made learning binary a game. Learning binary isn't difficult and should only take a few days to a week of lectures to grasp sufficiently.


> should only take a few days

This doesn't work for everybody!

Let people learn in the direction that interests them. The highest energy reward function first.

Once they feel rewarded, then let them learn theory. They'll have a deeper appreciation and the stamina to press deeper.


Hard pill to swallow: Doing things that are good for us that we don’t necessarily like is a part of growing up.


Forcing someone to learn something is a quick way to turn them off forever.


Yes, I can see that now that I’ve tried to teach you this point.


Unsubscribe [1].

I get your point. But you can't convince me that teaching binary before the fun parts of programming is the most effective way to bring more people into the subject matter.

It's like trying to onboard game developers with linear algebra before they play around with pygame.

Fun first, rigor once they're hooked on the magic.

[1] (Tongue in cheek, obviously.)


It's not about learning hard things before fun; it's about learning it at all. Many folks, like OP, say that they should never want/need to understand such things at all in their career. That they are superfluous.

Sure, you can do that, but you can't convince me that that's not troubling.


Don't you learn about number bases in maths lessons in school in the US? We did that in the late 60s in England.


We do, but it's taught as trivia. Applications and various "tricks" are what matter for binary in CS, and they don't teach that. We're taught about non-10 bases, but not taught why we should give a shit about them.

Same as most of the rest of primary and (especially) secondary school math, really. I doubt 1% of recent high school grads can tell you a single reason why anyone should care about quadratic equations, even though they likely spent months of their lives jacking around with them (for unclear reasons). Most of it's just taught as extremely-painful-to-learn trivia. Trig and calc get a little bit of justification & application, but not much. Stats probably comes off the best, as far as kids having even half a clue WTF they can do with it after the class ends.


> Most of it's just taught as extremely-painful-to-learn trivia.

That's weird. When I was in school in the UK (1960..1974) mathematics was illustrated with practical applications. This was especially so as we progressed into more sophisticated physics and chemistry.


You'd get examples sometimes, and (infamously) contrived word problems, but not enough and most of it wasn't at all relatable. Some whole topics were totally lacking in anything but horribly-contrived motivations, including covering bases other than 10. You might get "computers use binary!" which... OK, cool, so what?

But yes, we'd see some applications of usually limited and relatively simple math from e.g. calculus or algebra in other classes, which typically amounted to plugging values into a handful of formulas (which, to be fair to those classes, is what the vast majority of "using math" is in the adult world, aside from basic arithmetic). Not in math class, though, and not at all for many topics.


In my case, I started with the FET junction (electronics).

My first program was Machine Code, designed on a pad of lined paper, and punched directly into RAM, via a hex keypad.

These days, I write Swift. It's nice to not need to deal with the way Machine Code works, but I'm glad I learned it.

That said, everything we learn takes up NVR, so there's a strong argument to be had, against the need to learn below a certain fundamental floor.


> My first program was Machine Code, designed on a pad of lined paper, and punched directly into RAM, via a hex keypad.

Yup. That's also where I started. In fact, I still own one of these in mint condition:

https://www.hewlettpackardhistory.com/item/making-a-case/

I truly believe the low level fundamentals are crucially important. Even today, in the days of Python and Javascript.

Not to go too far, about a year ago I was working on an embedded board we designed which used MicroPython. The communications protocol required IBM CRC-16 to be calculated. MicroPython could not run this calculation fast enough. Since MicroPython allows you to insert code in assembler, I just wrote the CRC routine (and many others) in ARM assembler. The performance boost was, as one would expect, massive. Writing the code wasn't a problem at all given my background.
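
(For readers who haven't seen one, here is a rough reference sketch of a bit-by-bit CRC-16 in plain Python, using the common CRC-16/ARC parameters often labelled "IBM CRC-16" - polynomial 0x8005, reflected to 0xA001, zero initial value. The parent's actual parameters and ARM assembler routine are theirs; this is only an illustration of the bit-level work involved.)

    def crc16_arc(data: bytes) -> int:
        crc = 0x0000
        for byte in data:
            crc ^= byte
            for _ in range(8):
                if crc & 1:
                    crc = (crc >> 1) ^ 0xA001   # reflected form of polynomial 0x8005
                else:
                    crc >>= 1
        return crc

    print(hex(crc16_arc(b"123456789")))   # 0xbb3d, the standard CRC-16/ARC check value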

Having this knowledge also allows you to reach for optimizations that someone who has only seen high level languages and (as posters elsewhere in the thread have revealed) don't understand low level code or even binary. A simple example of this is that comparing to zero is faster than comparing to a value. The simple explanation being that you have to juggle registers and perhaps even read and store that value from memory every time you go around a loop. All processors have the equivalent of a JZ and JNZ (jump if zero, jump if not zero) instruction, which does not require anything to be fetched from memory or register swapping. Of course, this is processor and task dependent.

And then I wonder about such things as DeMorgan, bit-wise operations, masking, etc. I was surprised to read some of the comments on this thread about people not being comfortable with binary beyond 1=true and 0=false. I don't understand how someone can develop non-trivial software without having a solid foundation. I mean, even for CSS you should understand what "FFC21A" or "3FE" means, where it comes from and how to work with these numbers.


The pathways formed by all those fundamental and low level operations create capabilities and reasoning strategies that are very valuable. Perhaps they are even unachievable in other ways.


> Today's developers didn't learn binary before learning Python

I didn't learn binary before Python. I learned Lua before Python. But now I'm messing around with low-level C++, and Rust.

Where you started doesn't necessarily have anything to do with where you are now.


"Today, almost 50% of code is written by AI" whats the source for this?


It's roughly a figure GitHub has stated with Copilot: https://www.microsoft.com/en-us/Investor/events/FY-2023/Morg...

> Scott Guthrie: I think you're going to see – I mean, I guess the way to look at it would be what is the productivity win you're giving to the business, whether it's around making an employee more productive or making a specific business process more effective. Take the example of, say, GitHub Copilot, since that's a product that's GA today. We're now seeing the developers using GitHub Copilot are 55% more productive with it on tasks, and that's sort of measured by independent studies. And 40% of the code they're checking in is now AI-generated and unmodified.


So, not 50% (which I think we all knew), but an estimated 40% of the code checked in by people using Copilot (themselves some percentage of devs), with that figure being heavily dependent on how the study was done. Are people self-reporting these figures? Because you'd have to think that there'd be a sampling bias there.

Anecdotally, copilot has gotten in my way more than it has been useful, so I'm having trouble buying these figures.


That probably sources back to a blog post from GitHub saying that almost half of new code was being written by copilot.

https://github.blog/2023-03-22-github-copilot-x-the-ai-power...


Apparently the site's own chatbot disagrees with the number.

> How much of all code in general is written by AI?

Answer: Very little code is written by AI. Most code is written by people.


Yep, I find it hard to believe


94% of most statistics on the internet are made up.


One "today" I'll be spot on.


Can you please use an ADA-compliant color palette generator? I don't want to have to squint or copy text to Notepad to use a website...


Can someone turn the lights on? I'm light theme everything, I want that thang to sear my retinas when I load the page.

About Codecademy: Has it changed there a lot recently? I had problems getting them to understand that I was not interested in the position a few years ago.


I was about to suggest dark reader, which has a "Light mode". But it turns out that this website has poor colour contrast and most words just "disappear" with both dark or light mode.


>About Codecademy: Has it changed there a lot recently?

Well, they did get acquired roughly a year and a half ago (I think it was late 2021), so things might have changed from a cultural perspective within Codecademy or the parent company.


Likewise, my vision is not very good so black-on-white is easier for me to see than gray-on-black.


If you don't mind me asking - is it generally easier for you to read black-on-white or white-on-black? Or does it change e.g. based on the environment (bright room during day vs. darker room during night)?


Black-on-white is easier - lots of brightness makes the iris contract, giving me sharper vision during the day and when looking at a bright screen. At night, I try to keep my lights on and not sit in a dark room with a screen, or else my eyes get fatigued very quickly.


Understood, thank you! I've mostly focused on dark mode for my stuff, but your experience is a very good reason to spend equal time on light and dark mode. I'd hate for someone not to be able to use whatever I build simply due to "bad design" for that person.

One last question if you don't mind - is a light mode with good foreground/background contrast enough for you to see well, even if there are elements with bright colors (e.g. here the orange top bar)? Or is it easier to clearly see everything with fewer colors (and thus higher contrast)?


It should be fine to use colors for elements on the light theme, as far as accessibility is concerned. Colored text is not a very good idea, but everything else is up to your artistic vision.

On the topic of the theme and colors, I think the trick for keeping elements distinct despite low contrast, is to have fewer "levels" of elements. Modern "flat" designs keep very low contrast between the actual page background and any organization elements (containers, panels etc) and only give high contrast to the "main" elements, such as the primary text content and whatever element is currently selected. For example, on a page with a "download" button it's a good idea to give that one button a high contrast scheme, instead of all buttons on the page. Contrast can be done between light and dark, between gray and color, or between color and a different color.

Another thought I had about light/dark theme is that dark theme is often modeled after a colorful console terminal, where technicolor colors are used to draw attention to different types of messages, or as blinking LEDs on a device front panel, so that gives extremely high contrast between the (basically invisible) bg and the object. Light mode, otoh, is often modeled after a book or a newspaper, or some other physical medium, where there is one basic color for text and the background does the heavy lifting - bright for bulk text, colored for special notices or announcements, pale for uninteresting stuff, etc.


Thank you very much for replying and sharing your thoughts! :)


All I ask is that you consider the os/browser preference settings for light/dark.


I do! I use the OS/browser preference as a default and add an additional toggle with persistence. @vueuse/core has a really nice implementation for Vue3 projects: https://vueuse.org/core/useColorMode/


"The snippet `<!DOCTYPE html>` is an instruction to the web browser that the document is written in HTML5. This is important because it tells the browser how to interpret the code and display the content correctly."

So.. what happens when the DOCTYPE tag isn't there?

"Answer: Without the DOCTYPE tag, the browser won't know which version of HTML to use to display the page. This could cause the page to look different than it was intended."

Well.. yeah that's _sort of_ what happens, but I wouldn't call that teaching. I (and students for that matter) can infer this from context. I know that my intro course to web development specifically used the DOCTYPE tag as a segue to talk about different HTML versions, the HTML standardization process and specifically, that the browser enters quirks mode when it's not present.

Right now it feels like the AI is more on the level of a university student that's slightly ahead of you in the material.


I worked at Codecademy - you're right, the getting-stuck part hurts people moving forward. We spent a lot of time trying to help people without outright spitting it out - it was less about runtime errors and more about the concept. I'm not sure that an AI spitting out an answer is the right solution to help people learn - we also thought about different forms (projects, embeds in articles, tutors, etc.)

Another Codecademy alum is building a different learn-to-code platform - he wrote many courses and was always interested in teaching and providing resources.

https://www.codedex.io/

All in all, the problems were learning and education problems rather than technical ones.


Thanks Deltacoast, that's an interesting insight. I agree that being given the answer is the wrong approach, and the AI needs to be fine-tuned to not do that, similar to KhanAcademy's Khanmigo bot.

I do think having a knowledgeable assistant can be helpful though. I've watched some users literally get unstuck with the AI, and it's amazing.

Great work on Codecademy by the way, I'm (obviously) a big fan.


The site doesn't work for me in Safari. Tried 01. Setting Up and clicking run doesn't do anything but shake the Terminal box. Even if I put in garbage it does nothing.


The paragraph under the headline was almost not readable for me, had to highlight it. I think "text-neutral-500" blends too well with the dark background.


Updated


Please verify colour contrast using a tool like contrastchecker[1]

Also, there are some accessibility issues with the page; use one of the Web Accessibility Evaluation Tools[2] to fix them.

[1] https://webaim.org/resources/contrastchecker/

[2] https://www.w3.org/WAI/ER/tools/


Thanks, I'll check it out.


"Fuck them" driven development


"Learn to Code like a Developer" is a funny tag line to me. Isn't the point to become a developer? Or are you saying: "you're going to learn with modern tools, just like how a may use tools to facilitate coding?"

Maybe highlight the fact that you have an AI bot helping you through it. I would not have known it unless I read your HN comment.


Great looking project. IMO it's good to see AI being used to teach as opposed to just plucking out the answer or generating a first draft (better if AI can add jobs rather than destroy them!). I hope you're on to something...


> Today's developers didn't learn binary before learning Python

I'm sure it's rare, but we had C in 10th grade, and then assembly and electronics in 11th grade at my high school. I remember my first assembly class 21 years ago. The teacher introduced mov, add, and some registers. It was mesmerizing.


Don't get it. Codecademy has Python and JS courses already. What does it do different?


OP gets to build it, unlike not getting to build it at Codecademy.


Ah yes, I can tell it's a good way to learn how to be a developer by teaching you how to write in an authoritative "intuition as fact" tone even before the course starts.


My browser, Firefox mobile, isn't supported.


Impressive - don't worry about Codecademy; with this project you can land a job most places.



