
Last night, I used GPT-4 to help me design a stereo matrix mixer circuit.

First, I used it to help me make sense of the datasheet for a crosspoint matrix IC. When "we" determined that the IC I was planning to use didn't support some functions that were critical to my design goals, it suggested a number of alternative ICs that might work, along with the potential tradeoffs that could impact my design.

In the process of doing this, I had it make suggestions on how I could use various combinations of resistors and capacitors to buffer and filter out noise that might impact my signal. At one point, it generated a schematic so that I could see what it was talking about, and it was correct.
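(A minimal sketch of the kind of RC math that came up: the standard first-order low-pass cutoff, f_c = 1 / (2 * pi * R * C). The component values below are illustrative placeholders, not the ones from my actual design.)

    import math

    def rc_cutoff_hz(resistance_ohms: float, capacitance_farads: float) -> float:
        # -3 dB cutoff frequency of a simple first-order RC low-pass filter
        return 1.0 / (2.0 * math.pi * resistance_ohms * capacitance_farads)

    # Placeholder values: 10 kOhm with 100 nF gives roughly 159 Hz.
    print(round(rc_cutoff_hz(10_000, 100e-9)))  # 159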

At one point, it imagined some functionality on an IC that does not exist. When I asked it, "on a scale of 1 to 11, how confident are you that the AD75019 supports EVR?" (essentially, variable resistance across all 256 crosspoints), it went back to the datasheet to correct itself, saying "on a scale of 1 to 11, I am 100% confident that it does not support EVR", which is about as sassy as you can get while still being obsequiously polite.
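(For anyone unfamiliar with crosspoint switches: the 256 crosspoints are just a 16x16 grid of simple on/off switches, which is why per-crosspoint variable resistance would be a different feature entirely. A minimal sketch of that idea; the bit ordering and load protocol below are placeholder assumptions for illustration, not taken from the AD75019 datasheet.)

    # 16x16 routing matrix: matrix[y][x] == True routes input x to output y.
    matrix = [[False] * 16 for _ in range(16)]

    matrix[0][3] = True    # input 3 -> output 0
    matrix[5][3] = True    # the same input can feed multiple outputs

    # Flatten to 256 on/off bits, one per crosspoint; each bit is a switch
    # state, not a resistance value. (Bit ordering here is an assumption.)
    bits = [int(matrix[y][x]) for y in range(16) for x in range(16)]
    assert len(bits) == 256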

During the entire conversation, it not only suggested that I verify our conclusions with a qualified EE, but also kept recommending that I check out existing commercial products. Not because it didn't understand I was building a device, but because it kept telling me that buying an existing product would spare me the time, expense, and difficulty of building my own.

I believe that it was (strongly) implying that my time is valuable and that I should stop while I'm ahead. I ended up ordering an Erica Synths Matrix Mixer today, though I still might build my dream device. I call that productive.



I think this is interesting, because it points out the ways in which a future GPT might be subtly trained to embed advertising. "It looks like you're doing X; have you considered a commercial solution, such as Y or Z?".

While you did wind up with a device that will probably suit your needs, you also wound up out of a potentially fun hobby project. Not everyone will call that a win :)


It actually worries me what's going to happen to tools like GPT when they start being influenced by commercial interests and manipulating people. What's going to happen when a sports drink manufacturer pays GPT to never, ever mention water when people ask about dehydration, but instead to pitch the sports drink? Are we gonna block all kinds of knowledge just because it interferes with some corporation making a profit? What happens when GPT starts promoting one political candidate or demoting another? Who controls which candidates GPT prefers? What are we doing to protect GPT from this kind of outcome?


While I don't think you're wrong to be concerned, I would point out that we already live in a society where teens are held in rapture by TikTok, Google serves custom results full of ads, and Amazon is only barely trying to purge fake reviews. Kellyanne Conway can describe lies as "alternative facts" on television and be lauded for being "good" at her "job".

We're already in the storm.


I think I once got it to get out of "buy" mode by lying to it and telling it I'm in a sanctioned country. Maybe it's a trick that could work for you :)



