Is there an existing framework for resumable file transfers?

Outside of highly connected urban office environments, Internet links can be slow, unreliable, or both. Clients may make requests that take longer than 20 hours to complete, or that otherwise require pausing and resuming the transfer (e.g. a mobile device passing through multiple dead spots).

Back in the days of modems, we addressed this kind of problem with a variety of methods -- chunking files into multiple parts, using ZMODEM, etc. -- but as high-speed connections became more common, these seem to have fallen out of use.

Is there a framework (middleware?) already available to address this issue? I'm thinking about situations where a mobile device might need to upload a large file (such as a photograph) or download a large file, while losing/regaining network connectivity every few minutes.

Try Apple's Downloading Files in the Background documentation.
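As a minimal sketch of what that article describes: a background `URLSession` hands the transfer to a system daemon, so it survives app suspension and brief connectivity loss. The session identifier, URL, and destination handling below are placeholders; a real app would also wire up the AppDelegate completion-handler plumbing the article covers.

```swift
import Foundation

final class BackgroundDownloader: NSObject, URLSessionDownloadDelegate {
    // Background sessions run transfers out-of-process, so they
    // continue while the app is suspended or terminated.
    private lazy var session: URLSession = {
        let config = URLSessionConfiguration.background(withIdentifier: "com.example.downloads")
        config.isDiscretionary = false          // start promptly rather than waiting for ideal conditions
        config.sessionSendsLaunchEvents = true  // relaunch the app when transfers finish
        return URLSession(configuration: config, delegate: self, delegateQueue: nil)
    }()

    func start(_ url: URL) {
        session.downloadTask(with: url).resume()
    }

    // The system downloads to a temporary file; move it somewhere
    // permanent before this delegate method returns.
    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask,
                    didFinishDownloadingTo location: URL) {
        let name = downloadTask.originalRequest?.url?.lastPathComponent ?? "download"
        let dest = FileManager.default.temporaryDirectory.appendingPathComponent(name)
        try? FileManager.default.moveItem(at: location, to: dest)
    }
}
```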

You may want to look into an FTP framework. FTP has start, stop, and resume functionality if you choose to use it (the REST command lets a client restart a transfer from a given byte offset).

You should get this relatively easily with Vapor, as long as you support byte-range downloads. Apple clients can use URLSession's URLSessionDownloadTask to download, and will (usually) be given a resume-data blob on cancellation or error that they can use to resume the download, provided the server supports the requested byte range. You just need to respond with the proper slices according to the Range header, and Vapor will parse the header and ranges for you.
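A sketch of the client-side pause/resume flow described above, using the real URLSession APIs (`cancel(byProducingResumeData:)` and `downloadTask(withResumeData:)`). This only works when the server honors Range requests, as the answer says; the class and property names here are illustrative.

```swift
import Foundation

final class ResumableDownload: NSObject, URLSessionDownloadDelegate {
    private lazy var session = URLSession(configuration: .default,
                                          delegate: self, delegateQueue: nil)
    private var task: URLSessionDownloadTask?
    private var resumeData: Data?

    func start(_ url: URL) {
        task = session.downloadTask(with: url)
        task?.resume()
    }

    // Pause by capturing a resume-data blob; it records the bytes
    // received so far, and resuming issues a ranged request.
    func pause() {
        task?.cancel(byProducingResumeData: { [weak self] data in
            self?.resumeData = data
        })
    }

    func resume() {
        guard let data = resumeData else { return }
        task = session.downloadTask(withResumeData: data)
        task?.resume()
    }

    // Errors (e.g. dropped connectivity) can also carry resume data.
    func urlSession(_ session: URLSession, task: URLSessionTask,
                    didCompleteWithError error: Error?) {
        if let data = (error as NSError?)?.userInfo[NSURLSessionDownloadTaskResumeData] as? Data {
            resumeData = data
        }
    }

    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Move the finished file out of its temporary location here.
    }
}
```

Note that resume data is only produced when the response included a validator (such as ETag or Last-Modified) alongside range support, so the server side matters as much as the client.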