Generally I use it for rewriting emails for clarity... but I found another neat use of GPT-4.
For public APIs, I ask to make sure it's aware of the API. Then I ask for endpoints, and I find the endpoint I want. Then I ask it to code a request to that endpoint in language X (Ruby, Python, Elixir). It then gives me a starting point to jump off from.
Thirty seconds of prompt writing saves me about 20 minutes of getting set up. Yes, I have to edit it, but generally it is pretty close.
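To give a sense of the shape of it, the starting point it hands back looks roughly like this (the endpoint, parameter, and key name here are made up for illustration, not any real API):

    require "net/http"
    require "json"

    # Hypothetical endpoint and credential -- stand-ins for whatever the real API documents.
    uri = URI("https://api.example.com/v1/rates?currency=USD")
    request = Net::HTTP::Get.new(uri)
    request["Authorization"] = "Bearer #{ENV["EXAMPLE_API_KEY"]}"

    response = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) do |http|
      http.request(request)
    end

    puts JSON.parse(response.body)

From there it's mostly a matter of swapping in the real endpoint and auth scheme.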
You reminded me: I discovered that ChatGPT had invented an API for me. Has that happened to you yet?
Since it went to the trouble of writing code against the API as well, I contacted the developers to follow up. The code it gave was kind of a hand-wave anyway, so I'd have needed to polish it up.
The developers were surprised to hear they had an API. In truth, there was no such thing.
I then found myself in one of those awkward "welp, guess I can keep my job" conversations...good for them, but for me: Go home, no API here. A disappointment with some meta-commentary sprinkled on top.
Yeah, I was coding up a fairly complicated payment form for a Stripe-like processor the other day. I thought I'd give ChatGPT a go, and it confidently gave me the example code I needed, told me how to use it, etc. I was blown away until about 30 seconds later, when I realised it was all complete bull crap. It was quite bizarre, because this company didn't really have any public docs out before the point when ChatGPT supposedly harvested its data, but it knew about the company and knew a couple of funny keywords this company uses in its form, so it was almost believable.
One shining use case though: I typically live in Ruby, and the example code for this company was all Java and Python. Getting ChatGPT to convert the boring encryption methods into Ruby was amazing.
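To give a flavor of the kind of Ruby involved, something like this generic AES-CBC sketch (an illustration only, not the processor's actual scheme):

    require "openssl"
    require "base64"

    # Generic AES-256-CBC encrypt -- a stand-in for the "boring" routine being ported,
    # not the real processor's scheme.
    def encrypt(plaintext, key)
      cipher = OpenSSL::Cipher.new("aes-256-cbc")
      cipher.encrypt
      cipher.key = key                 # expects a 32-byte key
      iv = cipher.random_iv
      ciphertext = cipher.update(plaintext) + cipher.final
      Base64.strict_encode64(iv + ciphertext)
    end

The translation itself is the boring part, which is exactly why handing it off was so nice.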
This has improved significantly between 3, 3.5, and now 4. It used to invent a lot of Apple frameworks, classes, and methods, many of which would have been useful if they actually existed.
Yeah, even asking for common Node library/SDK usage has been off for me: it calls functions with options that are not accepted, or with what it thinks they should be.
Yeah, I have found I need to be careful. When I have used it, there has been no confidential information in the email. I do pay attention to that.
That said, I think it will be interesting as Microsoft introduces this into Office 365. You bring up a great point. Most people will not realize they are sending potentially confidential information to Microsoft.
Perhaps it's no different than Grammarly... But I think you are right that legal departments are going to be all over this.
Not only do you likely have access to all the other Microsoft stuff if your company is using Teams; Teams uses SharePoint for file sharing. If you use only Teams for two years and one day log in to office365.com, you'll probably be surprised by the main screen showing the files you've shared (without context, they're just sitting there), and you'll probably also be able to see what files your colleagues are sharing and working on.
It means quite literally what it says: if you have Office 365, you most likely have all your data in the MS cloud (SharePoint). MS also has a separate government cloud.
I think companies are fine with sending confidential data to Microsoft (Office, GitHub, Azure...). It's just unclear so far with ChatGPT whether that data can come back out. It has apparently already leaked some user queries, so that was a very reasonable concern.
If they put it in Office and guarantee that information is siloed, the legal departments will just have a regular contract to review and approve.
This is one of the reasons there's a push to run your own large language models: if you run your own service, you can control the environment, the data, and reproducibility.
This is exactly what my employer is doing: they pay so that our internal data (from employee queries) does not become part of the model. They've blocked the public ChatGPT, etc.
Great extension! I used it recently, and had some trouble drafting email reminders (to respond to an email). Do you have any tips on how I could do that with the extension?
This is exactly the kind of thing I hope LLM chatbots will be genuinely useful for. Though, how often do you find it completely hallucinating endpoints, parameters, etc.?
I use it for similar things as GP, and find its strengths to be similar too.
ChatGPT hallucinates SVG path attributes. Ask it to make an SVG of a unicorn - it will give you markup that looks okay, but if you look at the values of the paths, it's clearly gibberish.
(SVG is a particularly interesting case because it's XML on the outside, but several attributes are highly structured, especially g.transform and path.d. Path.d is basically a string in a Logo-like drawing language. I was specifically looking at these attributes for realism, and didn't find it.)
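For reference, a well-formed path.d is a compact sequence of drawing commands (moves, lines, Bézier curves), e.g.:

    <path d="M 10 80 C 40 10, 65 10, 95 80 S 150 150, 180 80" fill="none" stroke="black"/>

The hallucinated ones tend to parse fine as commands and coordinates; they just don't draw anything coherent.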
Same experience for me with 4. It doesn't seem to have the ability to conceive of something visual and map it to SVG at the moment, or only an extremely rudimentary one.
Great question. If you ask it for an API endpoint that is described online but isn't well documented publicly, it seems to fall back to what it thinks you should do. In one case, it hallucinated that you need a bearer token.
I don't know whether that's because it's a common way of doing things or because a previous prompt had responded with a bearer token... but it wasn't right.
For me, it's a leaping-off point that often saves time if I ask the right question. To your point, you have to know enough about the API to quickly deduce whether you and ChatGPT are in the same universe.
Could you mock up what might be a typical email written by you, then pass it through GPT, then post both versions here? I'd be curious to see what the difference looks like for someone else's writing. I've tried this exact use case and noticed a drop in quality and clarity, rather than an improvement.
Note: I tried with GPT-3.5 and it doesn't respond with all the same APIs available. That said, if you want to try the above: it appears that the rates API isn't available in 3.5, but if you follow the example through, it will still produce nearly identical code for the rates API even though it doesn't say that it is there.