You just tell it the problem and it'll fix it. It's almost never been an issue for me in Zig.

Do you really think the user didn't try explaining the problem to the LLM? Do you not see how dismissive the comment you wrote is?

Why are some of you so resistant to admitting that LLMs hallucinate? A normal response would be "Oh yeah, I have issues with that sometimes too, here's how I structure my prompts." Instead you act like you've never experienced this very common thing before, and it makes you sound like a shill.
