> Outright wrong. You are allowed to handle POST requests in a safe and idempotent way.

You clearly have no idea what you are discussing. The HTTP specification defines the concepts of safe and idempotent methods precisely; you should learn them before weighing in on a discussion of semantics. If that's too much work: RFC 9110 states unambiguously that only the GET, HEAD, OPTIONS, and TRACE methods are defined to be safe. Not POST.
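To make that concrete, here is a minimal sketch of the method tables as RFC 9110 (Sections 9.2.1 and 9.2.2) lays them out; the helper names are mine:

    # Safe and idempotent method sets per RFC 9110, Sections 9.2.1-9.2.2.
    # Safe methods are read-only by contract; the idempotent methods are
    # the safe ones plus PUT and DELETE. POST appears in neither set.
    SAFE_METHODS = {"GET", "HEAD", "OPTIONS", "TRACE"}
    IDEMPOTENT_METHODS = SAFE_METHODS | {"PUT", "DELETE"}

    def is_safe(method: str) -> bool:
        return method.upper() in SAFE_METHODS

    def is_idempotent(method: str) -> bool:
        return method.upper() in IDEMPOTENT_METHODS

    assert not is_safe("POST") and not is_idempotent("POST")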

It's completely irrelevant if you somehow implemented a POST handler that doesn't change server state and returns the same response no matter how many times you invoke it. Every participant in an HTTP exchange adheres to HTTP's semantics, and so POST requests are treated as unsafe and non-idempotent by all of them. That affects caching, firewalls, even security scanners. The HTTP specs determine how all participants handle requests, not whatever one handler someone chose to write in a weekend.
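A generic cache is exactly such a participant: with no knowledge of the endpoint, the method is all it has to go on. A minimal sketch (the class and its API are hypothetical, not any particular proxy's):

    class GenericCache:
        # A generic intermediary knows only HTTP's semantics, not what
        # any endpoint actually does, so it may only reuse responses to
        # methods the spec defines as safe.
        def __init__(self):
            self.store = {}

        def fetch(self, method, url, send):
            if method.upper() not in {"GET", "HEAD"}:
                return send(method, url)  # POST always goes to origin
            key = (method.upper(), url)
            if key not in self.store:
                self.store[key] = send(method, url)
            return self.store[key]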

You clearly do not understand protocol/interface design.

The behavior of a generic interface is a lower bound on what any direct participant may rely on, and its constraints are an upper bound on what any participant may assume, when requests are handled generically. That is POST in general: the weakest usage.

A specialized interface may enhance the behavior or tighten the constraints further. That is what actually using POST for some website-specific end is. You cannot actually use a website, or reason about its usage, in anything other than a read-only fashion without knowing what the endpoints actually do and when to issue requests against which endpoint.
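For example, a client written against one site's documented contract can do what no generic client may. A sketch, where example.com and its POST /search endpoint are hypothetical, assumed to be documented by the site's operator as read-only:

    import urllib.error
    import urllib.request

    # Hypothetical specialized contract: example.com documents that
    # POST /search only reads data. A client written against that
    # tightened contract may retry freely; a generic client may not,
    # because nothing in HTTP itself makes this POST safe.
    def search(query: str, attempts: int = 3) -> bytes:
        for attempt in range(attempts):
            req = urllib.request.Request(
                "https://example.com/search",
                data=query.encode(),
                method="POST",
            )
            try:
                with urllib.request.urlopen(req) as resp:
                    return resp.read()
            except urllib.error.URLError:
                if attempt == attempts - 1:
                    raise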

Indirect or intermediary participants should preserve behavior and constraints at the lower of two levels: the specialization they are aware of, and that of the participants they sit between. If you are designing a generic intermediary, you have no access to or knowledge of the specialized interfaces, so you are less able to leverage specialized behavior or constraints. In HTTP, various headers exist to communicate common classes of interface specialization so that someone can write more powerful generic intermediaries. But you can always write a specific intermediary or handler if you control that portion of the chain, and people do so all the time.
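For instance, a generic retrying intermediary can widen what it retries only when a header advertises the tighter contract; a sketch using the IETF draft Idempotency-Key header as that advertised specialization:

    # Methods RFC 9110 defines as idempotent; retrying these is always
    # allowed on the generic contract alone.
    IDEMPOTENT = {"GET", "HEAD", "OPTIONS", "TRACE", "PUT", "DELETE"}

    def may_retry(method: str, headers: dict) -> bool:
        if method.upper() in IDEMPOTENT:
            return True
        # Advertised specialization: the draft Idempotency-Key header
        # signals that the origin deduplicates, so a resend is harmless.
        return "Idempotency-Key" in headers

    assert may_retry("POST", {}) is False
    assert may_retry("POST", {"Idempotency-Key": "abc-123"}) is True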

This applies across all classes of software interfaces and protocols, not just websites using HTTP. But I guess HTTP is such a special snowflake that it is beyond learning good practice from other fields.
