

Re: volunteering file size fixer?

From: Daniel Stenberg <>
Date: Mon, 14 Jan 2002 09:22:51 +0100 (MET)

On Sun, 13 Jan 2002, SM wrote:

Thanks for grabbing this issue.

> >We need to add a few test cases and verify that libcurl treats big files
> >(>2GB and >4GB) as we want it to. Bug report #500977 already mentions one
> >problem with FTP download resume with big files.
> Libcurl uses the fstat() function to determine the file size. That would
> have to be changed to build large file support into libcurl. Win32 has the
> fstati64() function.

I'm not very good at this, but AFAIK, the unix way of dealing with big files
is exactly the same as before: fstat(). The file size is given in a type
called 'off_t' (the st_size field of the struct), which is a 64-bit type on
systems that support this.

For Linux systems, I believe you need glibc 2.2 and a 2.4 kernel.

I would appreciate it if someone with more knowledge of this could correct me
if I'm wrong.

> Currently, I don't have sufficient disk space to run test cases for >2GB
> files. I will modify libcurl to implement large file support.

The only stat() or fstat() in libcurl is in lib/file.c, for file:// URLs.
There are also three calls to stat() in src/main.c.

What also needs to be addressed in libcurl is other code parts that deal with
file sizes:

 o The code that receives the reply from a FTP server after 'SIZE' has been
   issued.
 o The code that parses the FTP reply after 'RETR' has been issued.
 o The code that parses the HTTP header "Content-Length:".

... and code around those parts that might deal with sizes and estimates
based on those sizes.

    Daniel Stenberg -- curl groks URLs --
Received on 2002-01-14