| ▲ | tmsbrg 9 hours ago |
| I'm surprised there's no mention of the SameSite cookie attribute; I'd consider that to be the modern CSRF protection, and it's easy, just a cookie flag: https://scotthelme.co.uk/csrf-is-dead/ But I didn't know about the Sec-Fetch-Site header, good to know. |
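A minimal sketch of the cookie flag being described, assuming a Flask backend (the route and cookie names are illustrative, not taken from the linked article):

    from flask import Flask, make_response

    app = Flask(__name__)

    @app.route("/login", methods=["POST"])
    def login():
        # ... authenticate the user here ...
        resp = make_response("logged in")
        # SameSite=Lax: the browser omits this cookie on cross-site
        # subresource requests and cross-site POSTs, which is what
        # blocks the classic CSRF form post.
        resp.set_cookie(
            "session",
            "opaque-session-id",  # placeholder value
            samesite="Lax",
            secure=True,
            httponly=True,
        )
        return resp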
|
| ▲ | tordrt 5 hours ago | parent | next [-] |
| Yep, SameSite=Lax, and just make sure you never perform any actions via GET requests, which you shouldn’t be doing anyway. |
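A sketch of that convention, again assuming Flask (the endpoint is hypothetical): state-changing routes only accept POST, so a plain GET navigation can never trigger them.

    from flask import Flask

    app = Flask(__name__)

    # State changes live behind POST only; a cross-site GET (where a
    # SameSite=Lax cookie would still be sent on top-level navigation)
    # can therefore never perform the action.
    @app.route("/account/delete", methods=["POST"])
    def delete_account():
        # ... perform the deletion for the authenticated session ...
        return "deleted", 200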
| |
| ▲ | paulryanrogers 4 hours ago | parent [-] | | Unsubscribe links often need to be GET, or at least start as GET | | |
| ▲ | eli 2 hours ago | parent [-] | | The List-Unsubscribe header sends a POST. Probably makes more sense to just use a token from the email anyway. | | |
| ▲ | hn_throwaway_99 an hour ago | parent [-] | | The way the List-Unsubscribe header works, it essentially must use a token when one-click unsubscribe is used (i.e., when the List-Unsubscribe-Post: List-Unsubscribe=One-Click header is also passed), and since Gmail has required one-click unsubscribe for nearly 2 years now, my guess is that all bulk mail senders support this. Relevant section from the one-click unsubscribe RFC:
> The URI in the List-Unsubscribe header MUST contain enough information to identify the mail recipient and the list from which the recipient is to be removed, so that the unsubscription process can complete automatically. Since there is no provision for extra POST arguments, any information about the message or recipient is encoded in the URI. In particular, one-click has no way to ask the user what address or from what list the user wishes to unsubscribe.
> The POST request MUST NOT include cookies, HTTP authorization, or any other context information. The unsubscribe operation is logically unrelated to any previous web activity, and context information could inappropriately link the unsubscribe to previous activity.
> The URI SHOULD include an opaque identifier or another hard-to-forge component in addition to, or instead of, the plaintext names of the list and the subscriber. The server handling the unsubscription SHOULD verify that the opaque or hard-to-forge component is valid. This will deter attacks in which a malicious party sends spam with List-Unsubscribe links for a victim list, with the intention of causing list unsubscriptions from the victim list as a side effect of users reporting the spam, or where the attacker does POSTs directly to the mail sender's unsubscription server.
> The mail sender needs to provide the infrastructure to handle POST requests to the specified URI in the List-Unsubscribe header, and to handle the unsubscribe requests that its mail will provoke. |
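A sketch of what those headers look like on an outgoing message, using Python's standard email library (the addresses and token are placeholders):

    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "news@example.com"
    msg["To"] = "user@example.org"
    msg["Subject"] = "Monthly newsletter"
    # RFC 8058 one-click unsubscribe: the URI itself carries an opaque,
    # hard-to-forge token identifying the recipient and the list, since
    # the POST must not rely on cookies or other request context.
    msg["List-Unsubscribe"] = "<https://example.com/unsubscribe/OPAQUE_TOKEN>"
    msg["List-Unsubscribe-Post"] = "List-Unsubscribe=One-Click"
    msg.set_content("Hello ...")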
|
|
|
|
| ▲ | nhumrich 4 hours ago | parent | prev | next [-] |
| This is "not allowing cross site at all" so, technically it's not "request forgery" protection.
Yes, this is very semantic, but, CSRF is a vulnerability introduced by enabling CS and CORS.
So, technically, same-site cookies are not "protection" against CSRF. |
| |
| ▲ | hn_throwaway_99 3 hours ago | parent | next [-] | | I don't understand your distinction at all. I may not quite grok your meaning here, but CORS is usually discussed in the context of allowing cross-origin AJAX calls. Cross-origin form posts, however, are and have always been permitted, and they are the main route by which CSRF vulnerabilities arise. Nothing on the client or server needs to be enabled to allow these form posts. Furthermore, the approach detailed in the article simply has the server block cross-site/cross-origin requests, so I'm not sure what the semantic difference is. | | |
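A rough sketch of that kind of server-side block, again assuming a Flask backend (this is not the article's code; how you treat a missing header is up to your threat model):

    from flask import Flask, request, abort

    app = Flask(__name__)

    @app.before_request
    def reject_cross_site_writes():
        # Browsers set Sec-Fetch-Site on requests they initiate; a
        # cross-origin form post arrives with "cross-site". Older
        # browsers may omit the header entirely, hence the "" case.
        if request.method in ("POST", "PUT", "PATCH", "DELETE"):
            fetch_site = request.headers.get("Sec-Fetch-Site", "")
            if fetch_site not in ("", "same-origin", "same-site", "none"):
                abort(403)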
| ▲ | true_religion 2 hours ago | parent [-] | | Yeah, CORS is not a safety mechanism. It’s a way of loosening the default safety mechanism, which is that response data from a cross-site request is never shared with client-side JavaScript. |
| |
| ▲ | nchmy 2 hours ago | parent | prev [-] | | CS and CORS have nothing to do with CSRF... though, yes, neither does SameSite. |
|
|
| ▲ | miguelgrinberg 9 hours ago | parent | prev | next [-] |
| The OWASP CSRF prevention cheat sheet page does mention SameSite cookies, but they consider it defense in depth: https://cheatsheetseries.owasp.org/cheatsheets/Cross-Site_Re.... |
| |
| ▲ | hn_throwaway_99 3 hours ago | parent | next [-] | | I don't understand the potential vulnerabilities listed at the linked section here: https://datatracker.ietf.org/doc/html/draft-ietf-httpbis-rfc... They give 2 reasons why SameSite cookies are only considered defense in depth:
----
> Lax enforcement provides reasonable defense in depth against CSRF attacks that rely on unsafe HTTP methods (like "POST"), but does not offer a robust defense against CSRF as a general category of attack:
> 1. Attackers can still pop up new windows or trigger top-level navigations in order to create a "same-site" request (as described in section 2.1), which is only a speedbump along the road to exploitation.
> 2. Features like "<link rel='prerender'>" [prerendering] can be exploited to create "same-site" requests without the risk of user detection.
> When possible, developers should use a session management mechanism such as that described in Section 8.8.2 to mitigate the risk of CSRF more completely.
----
But that doesn't make any sense to me. I think "the robust solution" should be to just make sure that you're only performing potentially sensitive actions on POST or other state-changing method requests, and always setting the SameSite attribute. If that is true, there is absolutely no vulnerability if the user is using a browser from the past seven years or so. The 2 points noted in the above section would only lead to a vulnerability if you're performing a sensitive state-changing action on a GET. So rather than tell developers to implement a complicated "session management mechanism", it seems like it would make a lot more sense to just say don't perform sensitive state changes on a GET. Am I missing something here? Do I not understand the potential attack vectors laid out in the 2 bullet points? |
| ▲ | tptacek 3 hours ago | parent | prev [-] | | Because of client-side JavaScript CSRF, which is not a common condition. | | |
| ▲ | nchmy 2 hours ago | parent [-] | | Client-side JS is not particularly relevant to CSRF. | | |
| ▲ | tptacek 2 hours ago | parent [-] | | I mostly agree, but that's the logic OWASP uses to argue you should still be doing explicit tokens even if you're using SameSite and Sec-Fetch. |
|
|
|
|
| ▲ | hatefulheart 3 hours ago | parent | prev [-] |
| I’m confused: how does this prevent a CSRF attack? Whether the cookie is SameSite or not is inconsequential to the check a backend does for a CSRF token in the POST. |
| |
| ▲ | hn_throwaway_99 2 hours ago | parent | next [-] | | The only reason CSRF is even possible is that the browser sends (or, well, used to send) cookies for a particular request even if that request was initiated from a different site. If the browser had never done that (and most people would argue it's a design flaw from the get-go), CSRF attacks wouldn't even be possible. The SameSite attribute makes it so that cookies are only sent if the request originates from the same site that originally set the cookie. | | |
| ▲ | hatefulheart 2 hours ago | parent [-] | | I think I understand now: the cookie just isn't present in the POST if a user clicked on, for example, a maliciously crafted post from a different origin? |
| |
| ▲ | tptacek 3 hours ago | parent | prev [-] | | No? The whole point of SameSite=(!none) is to prevent requests from unexpectedly carrying cookies, which is how CSRF attacks work. | | |
| ▲ | hatefulheart 2 hours ago | parent [-] | | What does this even mean? I’m not being rude, but what does it mean to unexpectedly carry cookies? That’s not what I understand the risk of CSRF to be. My understanding is that we want to ensure a POST came from our website, and we do so with a signed HMAC token (double submit) that is present in the form AND the cookie, and which is also tied to the session. What on earth is unexpectedly carrying cookies? |
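For reference, a minimal sketch of the session-bound token scheme described here (key handling and names are illustrative):

    import hashlib
    import hmac
    import secrets

    SECRET_KEY = b"server-side secret"  # illustrative; load from real config

    def issue_csrf_token(session_id: str) -> str:
        # Bind the token to the session with an HMAC so an attacker
        # cannot mint a valid token for someone else's session.
        nonce = secrets.token_hex(16)
        mac = hmac.new(SECRET_KEY, f"{session_id}:{nonce}".encode(), hashlib.sha256)
        return f"{nonce}.{mac.hexdigest()}"

    def verify_csrf_token(session_id: str, token: str) -> bool:
        # The same token is expected both in the form body and in a
        # cookie (double submit); verify the HMAC against the session.
        try:
            nonce, mac_hex = token.split(".", 1)
        except ValueError:
            return False
        expected = hmac.new(
            SECRET_KEY, f"{session_id}:{nonce}".encode(), hashlib.sha256
        ).hexdigest()
        return hmac.compare_digest(mac_hex, expected)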
|
|