curl-library
Re: Question: How to guarantee server data intact before resume download from it?
Date: Tue, 29 Nov 2005 09:47:25 -0800
On Wed, Nov 30, 2005 at 12:21:03AM +0800, yu kai wrote:
> I have some questions regarding libcurl resuming downloads over HTTP/FTP.
> 1. If there was a broken download earlier and the client calls libcurl to
> resume it, how can the client guarantee that the server file (on either an
> HTTP or FTP server) has not changed (the worst case being the same file
> size but different content) before resuming from the last successfully
> downloaded byte of the local copy? What's the recommended way? Is it
> different for HTTP and FTP downloads?
With HTTP, looking at the ETag, Last-Modified and Content-Length headers
should give you a fairly reliable indication of whether the content has
changed. The Content-MD5 header would give you an (almost) perfect
indication, but it's an optional header and I've never actually seen it
used in practice.
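For illustration, here's a rough (untested) sketch of that check using
libcurl's easy interface. The saved_etag/saved_modified parameters are an
assumption about how you'd store the validators alongside the partial file,
and the header parsing is deliberately minimal:

#include <stdio.h>
#include <string.h>
#include <strings.h>
#include <curl/curl.h>

/* Validators captured from the server's reply headers. */
struct validators { char etag[128]; char modified[128]; };

static void store(char *dst, size_t dstsz, const char *src, size_t len)
{
  while (len && (*src == ' ' || *src == '\t')) { src++; len--; } /* skip OWS */
  while (len && (src[len-1] == '\r' || src[len-1] == '\n')) len--; /* CRLF */
  if (len >= dstsz) len = dstsz - 1;
  memcpy(dst, src, len);
  dst[len] = '\0';
}

static size_t header_cb(void *ptr, size_t size, size_t nmemb, void *userdata)
{
  struct validators *v = userdata;
  size_t len = size * nmemb;
  const char *line = ptr;
  if (len > 5 && !strncasecmp(line, "ETag:", 5))
    store(v->etag, sizeof v->etag, line + 5, len - 5);
  else if (len > 14 && !strncasecmp(line, "Last-Modified:", 14))
    store(v->modified, sizeof v->modified, line + 14, len - 14);
  return len;
}

/* saved_etag/saved_modified are assumed to have been recorded when the
   partial download started; local_size is the size of the partial copy. */
int resume_if_unchanged(const char *url, const char *saved_etag,
                        const char *saved_modified, FILE *out,
                        curl_off_t local_size)
{
  struct validators v = { "", "" };
  CURLcode rc;
  CURL *curl = curl_easy_init();
  if (!curl) return -1;

  /* Step 1: a HEAD request to fetch the current validators. */
  curl_easy_setopt(curl, CURLOPT_URL, url);
  curl_easy_setopt(curl, CURLOPT_NOBODY, 1L);
  curl_easy_setopt(curl, CURLOPT_HEADERFUNCTION, header_cb);
  curl_easy_setopt(curl, CURLOPT_HEADERDATA, &v);
  if (curl_easy_perform(curl) != CURLE_OK ||
      strcmp(v.etag, saved_etag) || strcmp(v.modified, saved_modified)) {
    curl_easy_cleanup(curl);
    return -1;   /* changed or unknown: restart from scratch instead */
  }

  /* Step 2: validators match, so resume from the end of the local copy. */
  curl_easy_setopt(curl, CURLOPT_NOBODY, 0L);
  curl_easy_setopt(curl, CURLOPT_HTTPGET, 1L);
  curl_easy_setopt(curl, CURLOPT_RESUME_FROM_LARGE, local_size);
  curl_easy_setopt(curl, CURLOPT_WRITEDATA, out); /* default cb fwrite()s */
  rc = curl_easy_perform(curl);
  curl_easy_cleanup(curl);
  return rc == CURLE_OK ? 0 : -1;
}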
With FTP, you might be able to use one of the commands 'SITE EXEC md5sum
file', 'SITE CHECKSUM' or 'SITE CHECKMETHOD' to get an MD5 digest of a file.
These are non-standard, though.
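Something along these lines might work, although since the commands are
non-standard, the exact command and the reply format depend entirely on the
server. libcurl has no dedicated API for reading replies to quoted commands,
so this (untested) sketch fishes the reply out of the debug callback, where
FTP control-channel responses show up as CURLINFO_HEADER_IN:

#include <stdio.h>
#include <curl/curl.h>

/* Debug callback: incoming FTP control-channel lines arrive as
   CURLINFO_HEADER_IN, including the reply to our SITE command. */
static int debug_cb(CURL *handle, curl_infotype type, char *data,
                    size_t size, void *userptr)
{
  if (type == CURLINFO_HEADER_IN)
    fwrite(data, 1, size, stderr);   /* parse the digest out here */
  (void)handle; (void)userptr;
  return 0;
}

int main(void)
{
  struct curl_slist *cmds = NULL;
  CURL *curl = curl_easy_init();
  if (!curl) return 1;

  /* Hypothetical server-specific command; SITE CHECKSUM/CHECKMETHOD are
     other possibilities, none of them standardized. */
  cmds = curl_slist_append(cmds, "SITE EXEC md5sum file.bin");

  curl_easy_setopt(curl, CURLOPT_URL, "ftp://ftp.example.com/");
  curl_easy_setopt(curl, CURLOPT_QUOTE, cmds);  /* run before transfer */
  curl_easy_setopt(curl, CURLOPT_NOBODY, 1L);   /* no file transfer */
  curl_easy_setopt(curl, CURLOPT_VERBOSE, 1L);  /* needed for debug cb */
  curl_easy_setopt(curl, CURLOPT_DEBUGFUNCTION, debug_cb);
  curl_easy_perform(curl);

  curl_slist_free_all(cmds);
  curl_easy_cleanup(curl);
  return 0;
}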
> 2. I also have a similar question about uploading: how can the client
> guarantee that the last successfully uploaded copy on the server is intact
> before going on?
You could use one of the commands above to generate an MD5 digest and compare
it to what you expect. Alternatively, you could just download the file and
compare it!
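Once you're satisfied the remote part is intact, libcurl can continue an
interrupted FTP upload by itself: setting CURLOPT_RESUME_FROM_LARGE to -1
makes it start sending from the current end of the remote file. Another
untested sketch:

#include <stdio.h>
#include <curl/curl.h>

/* Resume an interrupted FTP upload. With CURLOPT_RESUME_FROM_LARGE set to
   -1, libcurl asks the server how much of the file it already has and
   continues from there; verifying the content of that prefix is still up
   to you, e.g. by comparing MD5 digests as described above. */
int resume_upload(const char *url, const char *localpath)
{
  FILE *in;
  CURLcode rc;
  CURL *curl = curl_easy_init();
  if (!curl) return -1;

  in = fopen(localpath, "rb");
  if (!in) { curl_easy_cleanup(curl); return -1; }

  curl_easy_setopt(curl, CURLOPT_URL, url);
  curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);
  curl_easy_setopt(curl, CURLOPT_READDATA, in); /* default cb fread()s */
  curl_easy_setopt(curl, CURLOPT_RESUME_FROM_LARGE, (curl_off_t)-1);
  rc = curl_easy_perform(curl);

  fclose(in);
  curl_easy_cleanup(curl);
  return rc == CURLE_OK ? 0 : -1;
}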
Dan
--
http://www.MoveAnnouncer.com              The web change of address service
          Let webmasters know that your web site has moved

Received on 2005-11-29