
curl-library

Re: re-use data connection for FTP transfers

From: Daniel Stenberg <daniel-curl_at_haxx.se>
Date: Wed, 19 Jan 2005 23:34:14 +0100 (CET)

On Wed, 19 Jan 2005, Ashish Nigam wrote:

> OK, I started a new thread this time :-)

Great!

> I have to send about 100 MB of data to a remote FTP server. I populate
> this data in memory on the client side.

...

> It is not a good idea to allocate 100 MB of heap memory and then send it
> all at once.

I agree it sounds like a lot of memory.

> So I send it in chunks of 5 MB or so.

Hold it. What does the amount of data in RAM have to do with what you send?

Why can't you just read data from a file in the libcurl callback? Then you can
send insanely large files without wasting RAM.
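
Something along these lines would do it (untested sketch; the file name and
URL are made up for the example):

  #include <stdio.h>
  #include <curl/curl.h>

  /* libcurl calls this whenever it wants more data to send, so only a
     small buffer is ever held in memory at once */
  static size_t read_cb(char *ptr, size_t size, size_t nmemb, void *userp)
  {
    FILE *f = (FILE *)userp;
    return fread(ptr, size, nmemb, f);
  }

  int main(void)
  {
    CURL *curl = curl_easy_init();
    FILE *f = fopen("bigfile.dat", "rb");   /* made-up input file */
    if(!curl || !f)
      return 1;

    curl_easy_setopt(curl, CURLOPT_URL, "ftp://ftp.example.com/bigfile.dat");
    curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);
    curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_cb);
    curl_easy_setopt(curl, CURLOPT_READDATA, f);

    /* one transfer, one data connection, no matter how big the file is */
    CURLcode res = curl_easy_perform(curl);

    fclose(f);
    curl_easy_cleanup(curl);
    return (res == CURLE_OK) ? 0 : 1;
  }

libcurl just keeps calling the callback until it returns 0, so the upload
never has to fit in memory.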

> Now the data in the file should be in the right order, and since all 100 MB
> is part of one file, I have to send the data in append mode so that
> subsequent chunks are appended to the same file.

I understand what you're saying, I just don't understand why you do it this
way.

> I use the same curl handle for all transactions, but for data transfers it
> creates a new data connection for every chunk. And I wanted to avoid the
> creation of a new data connection for every chunk.

That is impossible. FTP is just not designed to support that: each upload
command (STOR or APPE) gets its own data connection, which is closed when that
transfer ends.

> So I wanted to know if there is a way to tell curl that even if I have sent
> 5 MB of data at this time, there is more data ahead, and to re-use the same
> data connection. I hope I am able to convey the correct scenario this time.

I think you should think more about how you can send one large file without
disconnecting in between and doing multiple appends.

Or live with the data connections having to get created and closed once for
every chunk.
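
If you go with the append approach, an untested sketch (made-up URL) that
re-uses the same easy handle for each chunk could look like this:

  #include <string.h>
  #include <curl/curl.h>

  struct chunk {
    const char *data;   /* points into your in-memory buffer */
    size_t len;
    size_t off;
  };

  static size_t chunk_read_cb(char *ptr, size_t size, size_t nmemb, void *userp)
  {
    struct chunk *c = (struct chunk *)userp;
    size_t room = size * nmemb;
    size_t left = c->len - c->off;
    size_t n = (left < room) ? left : room;
    memcpy(ptr, c->data + c->off, n);
    c->off += n;
    return n;   /* returning 0 ends this chunk's transfer */
  }

  /* call once per chunk, re-using the same easy handle every time */
  static CURLcode send_chunk(CURL *curl, const char *buf, size_t len)
  {
    struct chunk c = { buf, len, 0 };
    curl_easy_setopt(curl, CURLOPT_URL, "ftp://ftp.example.com/bigfile.dat");
    curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);
    curl_easy_setopt(curl, CURLOPT_FTPAPPEND, 1L);   /* append with APPE */
    curl_easy_setopt(curl, CURLOPT_READFUNCTION, chunk_read_cb);
    curl_easy_setopt(curl, CURLOPT_READDATA, &c);
    return curl_easy_perform(curl);
  }

The control connection stays up between calls on the same handle, but the
server still sees one APPE and one data connection per chunk.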

-- 
      Daniel Stenberg -- http://curl.haxx.se -- http://daniel.haxx.se
       Dedicated custom curl help for hire: http://haxx.se/curl.html
Received on 2005-01-19