Hacker News
new
|
past
|
comments
|
ask
|
show
|
jobs
|
submit
login
int_19h
52 days ago
|
parent
|
context
|
favorite
| on:
Heretic: Automatic censorship removal for language...
If you're running a local model, in most cases, jailbreaking it is as easy as prefilling the response with something like, "Sure, I'm happy to answer your question!" and then having the model complete the rest. Most local LLM UIs have this option.
Guidelines
|
FAQ
|
Lists
|
API
|
Security
|
Legal
|
Apply to YC
|
Contact
Search: