Hacker News

From my experimentation, LLMs tend to be pretty bad at rhyme and meter, and at all but the simplest forms of poetry, so even if you'd specified it, they probably wouldn't have been able to deliver.

This is definitely something they could be trained to be much better at, but I guess it hasn't been a priority.



GPT-4 is surprisingly good at it, considering BPE tokenization means it shouldn't be able to rhyme at all.
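To make the BPE point concrete, here's a toy sketch (the merge table is hand-picked for illustration, not a real tokenizer's): "through" and "threw" rhyme exactly, but the part of each word that carries the rhyme surfaces as unrelated subword tokens.

```python
# Hand-picked merges standing in for ones a BPE trainer would learn.
MERGES = [('t', 'h'), ('e', 'w'), ('o', 'u'), ('ou', 'g'), ('oug', 'h')]

def merge_word(symbols, pair):
    """Apply a single BPE merge everywhere it occurs in a symbol tuple."""
    out, i = [], 0
    while i < len(symbols):
        if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == pair:
            out.append(symbols[i] + symbols[i + 1])
            i += 2
        else:
            out.append(symbols[i])
            i += 1
    return tuple(out)

def encode(word, merges):
    """Tokenize a word by replaying the merge list in order, as BPE does."""
    symbols = tuple(word)
    for pair in merges:
        symbols = merge_word(symbols, pair)
    return symbols

print(encode("through", MERGES))  # ('th', 'r', 'ough')
print(encode("threw", MERGES))    # ('th', 'r', 'ew')
```

The shared onset ('th', 'r') survives, but the rhyming endings come out as two unrelated tokens ('ough' vs 'ew'), so the token IDs carry no signal that the words sound alike.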


Has anyone tried using phonetic tokens instead of text? I'm curious if that would help with things like rhyming.
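For a sense of what that would buy: once words are phoneme sequences, rhyme reduces to comparing everything from the last stressed vowel onward. The pronunciations below are hand-written ARPAbet-style entries for illustration; a real setup would pull them from CMUdict or a grapheme-to-phoneme model.

```python
# Toy pronunciations (hand-written, ARPAbet-style; '1' marks primary stress).
PRON = {
    "through": ["TH", "R", "UW1"],
    "threw":   ["TH", "R", "UW1"],
    "light":   ["L", "AY1", "T"],
    "night":   ["N", "AY1", "T"],
    "late":    ["L", "EY1", "T"],
}

def rhyme_part(phones):
    """Everything from the last stressed vowel (the '1' marker) to the end."""
    for i in range(len(phones) - 1, -1, -1):
        if phones[i].endswith("1"):
            return tuple(phones[i:])
    return tuple(phones)

def rhymes(a, b):
    """Two words rhyme if their final stressed-vowel-onward segments match."""
    return rhyme_part(PRON[a]) == rhyme_part(PRON[b])

print(rhymes("light", "night"))    # True
print(rhymes("through", "threw"))  # True
print(rhymes("light", "late"))     # False
```

With text tokens, none of this structure is visible to the model; with phonetic tokens it's a plain suffix match.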



