curl / Mailing Lists / curl-users / Single Mail


Re: Need some help about large data from web

From: Daniel Stenberg <>
Date: Thu, 10 Nov 2016 08:35:38 +0100 (CET)

On Thu, 10 Nov 2016, wrote:

> size_t write_data(void *ptr, long unsigned int size, long unsigned int nmemb, void *stream)
> {
> strcat(stream, (char *)ptr);
> return size*nmemb;
> }

First, note that the curl-library mailing list is a much more suitable list to
discuss libcurl stuff on.

Then, allow me to quote a section from the CURLOPT_WRITEFUNCTION man page that
is highly relevant for you:

   "The data passed to this function will not be zero terminated!"

So strcat() is completely inappropriate there. And frankly, even if the data
had been zero terminated, using strcat() like that is terribly inefficient
(every call rescans the whole buffer from the start to find the end) and will
make the transfer unnecessarily slow and CPU hogging.

> static unsigned char str[64000] = {""};

And this is like begging for problems down the line when something changes and
you get more than 64000 bytes downloaded.

Let me instead suggest that you look into the getinmemory.c example that we
provide which basically does what you want, but in a way that is faster and
that won't break if you grow beyond 64000 bytes:
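As a rough sketch of the approach getinmemory.c takes (the struct and function
names below are my own for illustration, not necessarily the exact ones in the
example): keep the buffer and its current length together, grow it with
realloc() for every chunk, memcpy() the data in, and zero terminate it
yourself:

```c
#include <stdlib.h>
#include <string.h>

/* hypothetical names, modeled on the getinmemory.c example */
struct MemoryStruct {
  char *memory;   /* grows as data arrives */
  size_t size;    /* bytes stored so far */
};

static size_t write_callback(void *contents, size_t size, size_t nmemb,
                             void *userp)
{
  size_t realsize = size * nmemb;
  struct MemoryStruct *mem = (struct MemoryStruct *)userp;

  /* grow the buffer; +1 so we can keep it zero terminated ourselves */
  char *ptr = realloc(mem->memory, mem->size + realsize + 1);
  if(!ptr)
    return 0; /* out of memory: returning a "short" count aborts the transfer */

  mem->memory = ptr;
  memcpy(&(mem->memory[mem->size]), contents, realsize);
  mem->size += realsize;
  mem->memory[mem->size] = 0; /* NOT counted in size, just a convenience */

  return realsize;
}
```

You would then install it with something like

   curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_callback);
   curl_easy_setopt(curl, CURLOPT_WRITEDATA, (void *)&chunk);

where 'chunk' is a struct MemoryStruct initialized to { NULL, 0 }. No fixed
64000 byte limit, and no strcat() rescanning on every chunk.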

See the online version at:
