Hacker News

There's a dev here who is using ChatGPT extensively in his work. The rest of the team is just waiting for him to get caught and fired. Sharing company data with unapproved external entities is very definitely a firing offense.


Glad I work for a company where the CEO pays for ChatGPT Plus for all the devs. If you think your code is special, you're wrong.


But you created a throwaway account specifically to reply in this thread?

Unless your company really has nothing to hide, it's easy to accidentally dump a company secret or an API key in a chat session. Of course if everyone is aware of this and constantly careful then you may be OK.


That's because accounts get shadow banned all the time: people get upset when you point out hard truths.

If you're copy-pasting API keys or the like into ANYTHING, you probably shouldn't be a programmer to begin with.

It's like people who use root account key/secret credentials in their codebase. It's not AWS's fault you got a large bill or got hacked; it's because you're dumb.
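The safer pattern the comment is gesturing at is simply keeping credentials out of the codebase and reading them from the environment at runtime. A minimal sketch (the variable name `OPENAI_API_KEY` is just a conventional example, not something specified in this thread):

```python
import os

def get_api_key(env_var: str = "OPENAI_API_KEY") -> str:
    """Read a secret from the environment instead of hardcoding it in source."""
    key = os.environ.get(env_var)
    if not key:
        # Fail loudly rather than falling back to a key embedded in the code.
        raise RuntimeError(f"{env_var} is not set")
    return key
```

With this, the secret lives in the deployment environment (or a secrets manager), so pasting the source file anywhere leaks nothing.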


I regularly say shit that pisses people off here and I have never been shadow banned. It sounds like your "hard truths" are something other than just "hard truths", and/or you have a persecution complex.


Your karma is over 7000; even if you get downvoted, your stuff is still visible.


Getting downvoted to gray isn't a "shadow ban" at all. It is however a signal that others didn't find your comment worthwhile.


if you didn't use throwaway accounts your karma would presumably be much higher?


I posted my OpenAI token into a GitHub issue today, thinking I'd just kill it right away. I did, but there was already an email from OpenAI letting me know that my token had been noticed in public and was thus revoked.
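That revocation works because GitHub scans public content for known token formats and notifies the issuing provider. You can catch the same mistake before it leaves your machine with a crude local check. A hedged sketch (the `sk-` prefix is the real OpenAI convention, but these patterns are illustrative; a real scanner like gitleaks or trufflehog covers far more formats):

```python
import re

# Rough patterns for two common credential formats.
SECRET_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9]{20,}"),  # OpenAI-style API token
    re.compile(r"AKIA[0-9A-Z]{16}"),     # AWS access key ID
]

def find_secrets(text: str) -> list[str]:
    """Return substrings that look like leaked credentials."""
    hits = []
    for pattern in SECRET_PATTERNS:
        hits.extend(pattern.findall(text))
    return hits
```

Wired into a pre-commit hook, a non-empty return value would block the commit before the token ever reaches GitHub.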


Nice, so avoiding getting shadowbanned on Hacker News is fine, but avoiding getting sued is petty?


If your code has API keys in it, you have bigger problems than ChatGPT.


My code is "special" in that the act of sharing it can carry civil and criminal liabilities for me, essentially threatening my well-being and freedom.

It's not my place, legally or ethically, to share code I've been paid to read and write with third parties.


If the contract says the code is special, then the code is special.


> If you think your code is special then you're wrong

Your code is not special, but customer data may be. Also, some companies need to comply with various certifications, and a proven leak of source code into some third-party tool may be grounds to revoke such a certification. That can cause serious financial harm to a company, e.g. by costing it government clients.

This is just the tip of the iceberg.


You are still transferring your business data to an external entity, but on top of it you pay for it.

And if you think that there is no special code then you're wrong.


If you think random snippets of code are special you really don't understand the business you're writing code for. So no, your code is not special, and pasting code snippets is not transferring business data.


What do you know about the code written by the people you are replying to?

Lots of code expresses business strategy that is a competitive advantage or otherwise sensitive.

My org has done a risk assessment and accepted the risk of using such tools; arguing there is no risk is short sighted.


> pasting code snippets is not transferring business data.

It literally is exactly that. You don't think the code a business creates is "business data"?


You forgot IANAL


That entirely depends on the code. It's not that the code is special, it's that the code can reveal things that are competitive advantages (future plans, etc.)


Code isn't special but what you're working on can be and also it's not your decision if you're not the shareholder.


Your code might not be special, but I think plenty of hedge funds and prop trading firms beg to differ.


Does ChatGPT Plus collect data for training, or does it have more privacy than the free offering?


Replying to myself: it seems your data is still used unless you fill in a Google form to opt out: https://help.openai.com/en/articles/6950777-chatgpt-plus


ChatGPT & DALL-E (non-API products) are opt-out while their API is opt-in https://help.openai.com/en/articles/7039943-data-usage-for-c...


This Google form requires an organization ID so may not apply to personal GPT+ accounts.


Mine is personal and I just filled out this form with the ID from the docs. Easy, worked fine. Also thanks to the grandparent comment for surfacing this!


If you really care about your company's security, you should report it, otherwise you are just complicit.


Why not talk to the employee first?


This could be valid... but with something as powerful as ChatGPT, if it is providing huge benefits to their productivity, the employee is unlikely to drop it based on a co-worker's suggestion. Also, unless managing security is within your roles and responsibilities, this approach would likely turn messy from an interpersonal standpoint. Lastly, the security issue has already happened, so if this is truly a security concern, the security team should know that (a) something is already out there, and (b) this could become a widespread problem.

FWIW, I don't think the employee should be fired for this or anything; if anything, a company could embrace these new technological advances and provide training on using ChatGPT in a more secure manner (i.e. don't paste your customers' PII into a prompt, etc.).
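That kind of training could even come with tooling: a pre-paste scrubber that masks obvious PII and credentials before a snippet leaves your machine. A hypothetical sketch (the patterns and placeholders here are illustrative, not exhaustive, and not anything the thread prescribes):

```python
import re

# (pattern, placeholder) pairs applied in order to a snippet before sharing.
REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "<EMAIL>"),      # email addresses
    (re.compile(r"sk-[A-Za-z0-9]{20,}"), "<API_KEY>"),        # OpenAI-style tokens
]

def scrub(snippet: str) -> str:
    """Mask likely PII/credentials in a code snippet before pasting it anywhere."""
    for pattern, placeholder in REDACTIONS:
        snippet = pattern.sub(placeholder, snippet)
    return snippet
```

A simple wrapper like this won't catch everything, but it turns "be careful" from a policy slide into a habit.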


> if it is providing huge benefits for employee productivity

With this particular employee, using ChatGPT has not increased his productivity or the quality of his work to any noticeable degree.

> I don't think the employee should be fired for this or anything

The problem isn't using the technology. The problem is sharing confidential information with an unapproved entity. That is specifically and clearly spelled out as a firing offense, for pretty obvious reasons.

Even if some people feel that it's an overly tight policy, it's the stated policy, and the company has every right to set and enforce whatever rules it wishes about the use of its own data.


I agree.


If they're more productive by doing it, I'd say there's an equal chance said dev gets promoted.


He's not more productive, but even if he were, it wouldn't affect his getting fired.

Productivity isn't everything.


Why?

Uploading code to ChatGPT can be done by trainees.


Yeah, but the code coming out of ChatGPT is generally not in a state that you want to commit straight into your repo. Making adjustments (and writing the original prompt) is where your expertise comes in.


You can just submit the trash code and let a senior fix it for you in code review.

