Outside of highly connected urban office environments, Internet links can be slow, unreliable, or both. Clients may make requests that take longer than 20 hours to complete, or that otherwise require pausing and resuming the transfer (e.g. a mobile device passing through multiple dead spots).
Back in the days of modems, we addressed this kind of problem with a variety of methods -- chunking files into multiple parts, using ZMODEM, etc. -- but as high-speed connections became more common, these seem to have fallen out of use.
Is there a framework (middleware?) already available to address this issue? I'm thinking about situations where a mobile device might need to upload a large file (such as a photograph) or download a large file, while losing/regaining network connectivity every few minutes.
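For context, the standard building block for this on the web is HTTP range requests (the `Range`/`Content-Range` headers), which let a client resume a transfer from the byte offset it has already received. Here is a minimal sketch of that resume-on-reconnect logic; `flaky_server` and `resumable_download` are hypothetical stand-ins for a real network fetch, not any particular framework's API:

```python
import random

PAYLOAD = bytes(range(200))  # stand-in for the "remote" file

def flaky_server(start, rng):
    """Yield PAYLOAD[start:] in small chunks, randomly dropping the
    connection mid-transfer (simulates a mobile dead spot)."""
    for i in range(start, len(PAYLOAD), 16):
        if rng.random() < 0.3:
            raise ConnectionError("link dropped")
        yield PAYLOAD[i:i + 16]

def resumable_download(fetch, size):
    """Retry until complete, resuming each attempt from the byte offset
    already received -- the same idea as an HTTP Range request."""
    buf = bytearray()
    while len(buf) < size:
        try:
            for chunk in fetch(len(buf)):
                buf.extend(chunk)
        except ConnectionError:
            pass  # connectivity lost: keep received bytes, retry from offset
    return bytes(buf)

rng = random.Random(42)  # seeded so the simulated failures are repeatable
result = resumable_download(lambda start: flaky_server(start, rng), len(PAYLOAD))
assert result == PAYLOAD  # full file reassembled despite repeated drops
```

Uploads need cooperation from the server side (it has to report how many bytes it already has), which is presumably why the question asks for an existing framework rather than rolling this by hand.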