
If you eliminated 100% of my code typing time with perfect effectiveness, I think that'd make me maybe 10% or 20% more productive? Turning ideas of what the code should be doing into code just isn't a bottleneck for me in the first place. Are there people who just add a net 1000 lines of code to whatever they're working on every single day or something?


I keep seeing this point made, but AI tools don't just save you typing time. They save the time you would previously spend looking up documentation and searching the web and Stack Overflow. They save the time it takes to navigate and understand a codebase, write boilerplate code and tests, suggest implementation approaches, etc.

Dismissing them on the basis that they just save you typing time is not seeing their full potential.
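To make the boilerplate/tests point concrete, here's a rough sketch of what that workflow can look like with the OpenAI Python client. (This is my own illustration, not from any particular tool; the function under test and the prompts are placeholders, and it assumes the pre-1.0 openai package with OPENAI_API_KEY set in the environment.)

    import openai  # assumes `pip install openai` and OPENAI_API_KEY in the environment

    # Hypothetical function we want boilerplate tests for
    source = '''
    def slugify(title: str) -> str:
        return "-".join(title.lower().split())
    '''

    response = openai.ChatCompletion.create(
        model="gpt-4",  # placeholder; any chat model works here
        messages=[
            {"role": "system", "content": "You write concise pytest unit tests."},
            {"role": "user", "content": f"Write pytest tests for this function:\n{source}"},
        ],
    )

    # The generated test file, to be reviewed (not blindly trusted) by a human
    print(response.choices[0].message.content)

The point isn't that the output is always right; it's that the first 80% of the scaffolding appears in seconds instead of minutes of typing and doc lookups.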


There are people who code the whole thing in their head and have perfect recall of all language/API docs/references/syntax/features, because that's how their brains work, i.e. they don't even type code until that step has already happened for them.

I think that's a small percentage of the dev community, so for me and people who don't operate like that, these tools are a game changer, as you point out. I don't take what ChatGPT says at face value; I've got 15 years of experience I'm weighing results against as well...

ChatGPT's version of the above: Coding entirely in one's head is rare. Most developers need external resources, making development tools invaluable. While ChatGPT's input is valuable, it should be balanced against personal experience and expertise.


Navigating and understanding a codebase is the only one of those that would excite me, but it's also something I've never even seen someone propose using ChatGPT for. Do you have an example of what that would look like?


There have been a few announcements here just in the last week:

- https://news.ycombinator.com/item?id=35236275

- https://news.ycombinator.com/item?id=35248704

- https://news.ycombinator.com/item?id=35228808

And this is with the GPT-4 limitation of 32k input tokens. Imagine what will be possible in the next generation that increases the context size.
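The basic pattern behind tools like those, as I understand it, is retrieval: embed the source files, pull the chunks most similar to your question, and stuff only those into the context window. A minimal sketch of the idea (my own illustration, not code from any of the linked projects; paths, the question, and the top-3 cutoff are arbitrary, and it assumes the pre-1.0 openai package):

    import glob
    import numpy as np
    import openai

    def embed(text):
        # text-embedding-ada-002 was the embedding model available at the time
        resp = openai.Embedding.create(model="text-embedding-ada-002", input=[text])
        return np.array(resp["data"][0]["embedding"])

    # Index: one embedding per source file (real tools chunk files and cache this)
    files = {path: open(path).read() for path in glob.glob("src/**/*.py", recursive=True)}
    index = {path: embed(code) for path, code in files.items()}

    question = "Where is request authentication handled?"
    q = embed(question)

    # Retrieve the few files closest to the question by cosine similarity
    def score(p):
        return np.dot(index[p], q) / (np.linalg.norm(index[p]) * np.linalg.norm(q))
    top = sorted(index, key=score, reverse=True)[:3]
    context = "\n\n".join(f"# {p}\n{files[p]}" for p in top)

    answer = openai.ChatCompletion.create(
        model="gpt-4",
        messages=[{"role": "user", "content": f"{context}\n\nQuestion: {question}"}],
    )
    print(answer.choices[0].message.content)

Bigger context windows mostly just mean you can skip or coarsen the retrieval step and hand the model larger slices of the repo at once.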
