
Bing in creative mode really struggled with this task, and I don't think I could have successfully coaxed it into producing one.

Bing in precise mode had no trouble at all.

Me - Let's generate an acrostic sentence for "jack and jill went up the hill"

Bing - Sure! Here’s an acrostic sentence for the phrase “jack and jill went up the hill”:

Just as children know and never doubt, joyfully in love, laughing while enjoying nature together under pure trees, having endless happy interactions, laughing loudly.



A truly "intelligent" AI would probably have told me that the name for this is an acrostic sentence.

Now that I know, I tried a few more prompts.

Me: "Can you generate an acrostic sentence for "hello"?"

GPT-4: "Happy elephants love leaping over obstacles."

Me: "Can you pick a secret word and generate an acrostic sentence for it?"

GPT: "Secret word: KANGAROO. Kindly assisting neighbors, generous animals roam openly outdoors."

Me: "Can you pick a secret word, and without telling me the word, generate an acrostic sentence for it?"

GPT: "Playful, endearing creatures happily yawn, turning every day into a lovely event."

Me: "What was the secret word?"

GPT: "PECHYTIALE"

It's interesting that GPT seems to need to write the word first before making an acrostic sentence for it. Seems to me like a perfect illustration of the fact that it's just generating likely responses one token at a time rather than having any awareness or thought.
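That last exchange can be checked mechanically. Here's a small sketch (the helper name is mine) that reads back the word an acrostic sentence actually spells, i.e. the first letter of each word:

```python
def acrostic_word(sentence: str) -> str:
    """Return the word an acrostic sentence spells out:
    the first letter of each word, uppercased."""
    words = sentence.replace(",", " ").split()
    return "".join(w[0].upper() for w in words if w[0].isalpha())

# GPT's "secret word" sentence from above:
print(acrostic_word(
    "Playful, endearing creatures happily yawn, "
    "turning every day into a lovely event."
))  # → "PECHYTEDIALE"
```

Run on GPT's sentence, this yields the twelve-letter "PECHYTEDIALE", not the ten-letter "PECHYTIALE" it claimed afterwards, which fits the token-at-a-time explanation: the word was never decided in advance, so the sentence and the later "reveal" don't agree.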


More evidence: ask GPT for a link to a reference for the answers it gives, and it will generate plausible-looking URLs rather than copy real ones.



