Set-Cookie headers are handled differently on macOS and Linux

I'm working on a command-line project that calls a network endpoint which sets cookies and requires those cookies to be sent with subsequent requests. While this all runs as expected on macOS, I've been experiencing a lot of difficulties on Linux. After a lot of debugging I figured out that on Linux I only get 2 cookies, while on macOS I get 3.

Digging further into FoundationNetworking, I discovered that HTTPCookie's `cookies(withResponseHeaderFields:for:)` method only considers `Set-Cookie` headers, leaving lowercase `set-cookie` headers out. This behaviour is technically correct (the best kind of correct); however, it deviates from macOS.
Here's the file on GitHub
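One way to work around the case sensitivity is to normalize any `set-cookie` spelling to the canonical `Set-Cookie` before handing the headers to `HTTPCookie`. A minimal sketch (the header values and the helper's name are made up for illustration):

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking
#endif

/// Hypothetical helper: fold any casing of "set-cookie" into the canonical
/// "Set-Cookie" key, then let HTTPCookie do the actual parsing.
func cookies(from headerFields: [String: String], for url: URL) -> [HTTPCookie] {
    var normalized: [String: String] = [:]
    for (key, value) in headerFields {
        if key.lowercased() == "set-cookie" {
            // Merge duplicates under the canonical key, comma-separated,
            // which is how HTTPCookie expects multiple cookies to arrive.
            if let existing = normalized["Set-Cookie"] {
                normalized["Set-Cookie"] = existing + ", " + value
            } else {
                normalized["Set-Cookie"] = value
            }
        } else {
            normalized[key] = value
        }
    }
    return HTTPCookie.cookies(withResponseHeaderFields: normalized, for: url)
}

let url = URL(string: "https://example.com")!
let parsed = cookies(from: ["set-cookie": "session=abc123; Path=/"], for: url)
print(parsed.map(\.name))
```

This only helps if you can intercept the raw header fields before parsing, which is exactly what turned out to be the second problem below.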

My first thought after discovering this was to reach for the headers and save the cookies manually, but guess what: only one cookie is included in the headers (accessed through `HTTPURLResponse.allHeaderFields`), and it seems to be a random one each time the app is run. This behaviour is identical on both Linux and macOS. On macOS, all cookies appear in `allHeaderFields` joined under a single key.
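A plausible explanation for the "random one each time": HTTP allows repeated `Set-Cookie` lines, but `allHeaderFields` is a Dictionary, so only one value per key can survive, and which one wins depends on the uniquing strategy. A sketch with made-up cookie values:

```swift
// Repeated Set-Cookie lines as they might arrive on the wire.
let rawHeaders = [
    ("Set-Cookie", "session=abc"),
    ("Set-Cookie", "csrf=def"),
    ("Set-Cookie", "locale=en"),
]

// Folding them into a Dictionary keeps only one value per key;
// which one survives depends on the combine closure.
let lossy = Dictionary(rawHeaders, uniquingKeysWith: { first, _ in first })
print(lossy["Set-Cookie"]!)  // "session=abc" — two cookies are gone

// macOS's behaviour is closer to joining the values under a single key:
let joined = Dictionary(rawHeaders, uniquingKeysWith: { $0 + ", " + $1 })
print(joined["Set-Cookie"]!)  // "session=abc, csrf=def, locale=en"
```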

Seeing how other languages such as Python and JavaScript also seem to treat header field names case-insensitively, it could be a good idea to allow case-insensitive `set-cookie` headers as an option, or to simply include all cookies in `allHeaderFields` — after all, that name is a little misleading otherwise. Is there a workaround for this using URLSession, or should I venture into other HTTP libraries?


URLSession on Linux is a mess at the moment, so if you're hitting issues, your best bet is to use AsyncHTTPClient.

Hopefully things improve with the new Foundation!


Thanks @0xTim! That's what I ended up doing, and it solved the cookie parsing issue. AsyncHTTPClient does not have a cookie storage, but I wrote a primitive file-based store just to keep the auth cookies I need for the subsequent requests.
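For anyone landing here later, a file-based store like that can be sketched with Foundation alone: persist name/value pairs as JSON and replay them as a `Cookie` request header. The type name and file path are made up for illustration; the actual implementation above may differ.

```swift
import Foundation

/// Hypothetical minimal file-backed cookie store: persists name/value
/// pairs as JSON and rebuilds a `Cookie` request header from them.
struct FileCookieStore {
    let fileURL: URL

    func save(_ cookies: [String: String]) throws {
        let data = try JSONEncoder().encode(cookies)
        try data.write(to: fileURL)
    }

    func load() -> [String: String] {
        guard let data = try? Data(contentsOf: fileURL),
              let cookies = try? JSONDecoder().decode([String: String].self, from: data)
        else { return [:] }
        return cookies
    }

    /// Value for a `Cookie` request header, e.g. "session=abc; csrf=def".
    func cookieHeader() -> String {
        load().map { "\($0.key)=\($0.value)" }.joined(separator: "; ")
    }
}

let store = FileCookieStore(
    fileURL: FileManager.default.temporaryDirectory.appendingPathComponent("cookies.json"))
try store.save(["session": "abc123"])
print(store.cookieHeader())  // session=abc123
```

You'd then set the resulting string as the `Cookie` header on each AsyncHTTPClient request. Note this sketch ignores cookie attributes like `Path`, `Domain`, and `Expires`, which a real store should respect.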
