Everyone keeps telling me that LLMs are good for bash scripts, but I've never had real success with that.
Here's an example from today. I wanted to write a small script to grab my Google Scholar citations, and since I'm terrible with web stuff I ask it the best way to parse the curl output. First off, it suggests I use a Python package (seriously? For one line of code? No thanks!), and then it gets the grep wrong. So I pull up the page source, copy-paste some of it in, and try to parse it myself. By this point I already have a better grep command, and for the second time it's telling me to use Perl regex (why does it love -P as much as it loves "delve"?). Then I paste in my new command along with its output, asking for the awk and sed parts while I google the awk syntax I always forget. It messes up the sed part, so I fix it myself, which means tweaking the awk slightly, but I already had the SO post open that I needed anyway. So I saved maybe one minute total?
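For context, the kind of pipeline I was fishing for looks roughly like the sketch below. The gsc_rsb_std class is what Scholar profile pages seem to use for the citation-stats cells, but treat that, the user id, and the URL as placeholder assumptions rather than my actual command (and Scholar may also want a browser-ish User-Agent before it serves you anything):

```bash
#!/usr/bin/env bash
# Sketch: pull the total citation count from a Google Scholar profile.
# The user id and the gsc_rsb_std selector are placeholders/assumptions.
url='https://scholar.google.com/citations?user=XXXXXXXX'

curl -s "$url" \
  | grep -o '<td class="gsc_rsb_std">[0-9]*</td>' \  # grab the stats cells
  | sed 's/<[^>]*>//g' \                             # strip the HTML tags
  | head -n 1                                        # first cell = total citations
```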
Then I give it a skeleton of a script file with the variables I wanted added, fully expecting it to be a simple cleanup job. No. Its bash is definitely below average; I've never seen an LLM produce bash functions without being explicitly told to (not that the same isn't also true of the average person). But hey, it saved me the while loop for the args, so that was nice. In the end it cost about as much time as it gave back.
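(For the curious, the args loop in question is just the standard while/case idiom; the flag names below are made up for illustration, not my actual script:)

```bash
#!/usr/bin/env bash
# Standard argument-parsing loop; flag names are examples only.
user_id=""
output_file="citations.txt"

while [[ $# -gt 0 ]]; do
  case "$1" in
    -u|--user)   user_id="$2"; shift 2 ;;
    -o|--output) output_file="$2"; shift 2 ;;
    -h|--help)   echo "usage: $0 -u USER_ID [-o FILE]"; exit 0 ;;
    *)           echo "unknown option: $1" >&2; exit 1 ;;
  esac
done
```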
Don't get me wrong, I find LLMs useful, but they're nowhere near as game-changing as everyone says. I'm maybe 10% more productive? And I'm not even convinced that's true. Sure, I might have been able to do less handholding by using agents and having them build test cases, but for a script that took 15 minutes to write? Feels like serious overkill. And this is my average experience with them.
Is everyone just saying it's so good at bash because no one is taking the time to learn bash? It's a really simple language that every Linux user should know the basics of...