
curl-library

libcurl and 'streamed' transfers

From: Apurva Mehta <apurva_at_mathmeth.com>
Date: Tue, 11 Nov 2008 15:20:03 -0800

Hi,

   I am considering using libcurl to do some HTTP heavy lifting, and
am wondering whether libcurl will actually meet my requirements:

   I will be working with two URLs. I need to retrieve a file from
one URL using an HTTP GET, and I need to upload that same file to
the second URL using an HTTP PUT. Storing the entire file in a
temporary buffer and copying that entire buffer around between
independent GETs and PUTs is not acceptable for performance
reasons. The size of my files could be in the gigabyte range, and I
would not like to write to disk if I can avoid it. But I don't want
a huge memory overhead either.

   I know that the function specified with CURLOPT_WRITEFUNCTION is
called multiple times as data is retrieved. It would be ideal if I
could pair it with a function specified with CURLOPT_READFUNCTION so
that the two callbacks behave as a producer/consumer on a shared
buffer. Additionally, it would be desirable that if either of the two
transfers fails, the other is aborted as well.

    All of this would mean that I could manage a transfer of arbitrary
size without writing to disk, while keeping the memory overhead tight.
Is it possible to achieve this behavior with libcurl and the
curl_multi interface?

Thanks,
Apurva

Received on 2008-11-12