Hacker News | tantricked's comments

All of the recent LLM advancements have basically been training the model to talk to itself, forcing it to see the problem clearly.

I hate to be a “proompter,” but I used this prompt and got the right answer without thinking:

  Before answering, do the following:

  Clearly restate the user’s actual objective.
  Identify what must physically or logically change for the objective to be achieved.
  Check for hidden assumptions or trick framing.
  Ask: “Does my answer actually accomplish the stated goal?”
  If multiple interpretations exist, briefly list them and choose the most logically consistent one.
  Do not optimize for surface efficiency if it conflicts with the core objective.
  Use strict common sense before answering.
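The checklist above can be dropped straight in as a system message ahead of the user's question. A minimal sketch (the helper function and the example question are my own illustration, not from the comment):

```python
# The checklist from the comment, used verbatim as a system prompt.
CHECKLIST = """Before answering, do the following:

Clearly restate the user's actual objective.
Identify what must physically or logically change for the objective to be achieved.
Check for hidden assumptions or trick framing.
Ask: "Does my answer actually accomplish the stated goal?"
If multiple interpretations exist, briefly list them and choose the most logically consistent one.
Do not optimize for surface efficiency if it conflicts with the core objective.
Use strict common sense before answering."""

def build_messages(question: str) -> list[dict]:
    """Prepend the checklist as a system message for a chat-style API call."""
    return [
        {"role": "system", "content": CHECKLIST},
        {"role": "user", "content": question},
    ]

# Example: shape a request for any chat-completions-style endpoint.
msgs = build_messages("If I triple a recipe that serves 4, how many does it serve?")
```

The resulting `msgs` list is the standard system/user message shape accepted by most chat-completion APIs; only the model call itself is omitted here.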

Every programmer is basically a manager. Code is the language we use to communicate, and hardware is the resource we manage.

As a junior, I feel that most complexity in software is manufactured. LLMs simplify that mess for me, making it easier to get things done. But I'm constantly hit with imposter syndrome, like I'm less skilled because I rely on AI to handle the tricky stuff. And Gemini is better than me!


Factorio is just programming in disguise.


I feel the prerequisite for simulations, game engines, and rendering is also math.


It will surely sell like hot cakes, just because of the novelty.

