RE: curl --data slow with big file
Date: Thu, 12 Sep 2019 09:57:35 +0200
Here is a modified file2string
ParameterError file2string(char **bufp, FILE *file)
{
  char *ptr;
  size_t alloc = 512;
  size_t alloc_needed;
  char *string = malloc(alloc);
  if(!string)
    return PARAM_NO_MEM;
  string[0] = '\0'; /* make *bufp a valid empty string even when the file
                       is NULL or has no lines */
  if(file) {
    char buffer[256];
    size_t stringlen = 0;
    while(fgets(buffer, sizeof(buffer), file)) {
      size_t buflen;
      ptr = strchr(buffer, '\r');
      if(ptr)
        *ptr = '\0';
      ptr = strchr(buffer, '\n');
      if(ptr)
        *ptr = '\0';
      buflen = strlen(buffer);
      alloc_needed = stringlen + buflen + 1;
      /* double the allocation until it fits, so realloc is called
         O(log n) times instead of once per line */
      if(alloc < alloc_needed) {
        while(alloc < alloc_needed)
          alloc *= 2;
        ptr = realloc(string, alloc);
        if(!ptr) {
          Curl_safefree(string);
          return PARAM_NO_MEM;
        }
        string = ptr;
      }
      strcpy(string + stringlen, buffer);
      stringlen += buflen;
    }
  }
  *bufp = string;
  return PARAM_OK;
}
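The doubling-growth idea can also be shown as a standalone sketch, for anyone who wants to try it outside the curl tree. `slurp_lines` is a hypothetical name, not part of curl; unlike file2string it keeps line endings, skips the curl-specific error types, and reports how many times realloc was actually called:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Read every line of 'file' into one heap buffer, doubling the
   allocation only when it is too small. Returns the string (caller
   frees) or NULL on allocation failure; *nreallocs, if non-NULL,
   receives the number of realloc calls made. */
static char *slurp_lines(FILE *file, size_t *nreallocs)
{
  size_t alloc = 512;
  size_t stringlen = 0;
  size_t reallocs = 0;
  char buffer[256];
  char *string = malloc(alloc);
  if(!string)
    return NULL;
  string[0] = '\0';
  while(file && fgets(buffer, sizeof(buffer), file)) {
    size_t buflen = strlen(buffer);
    if(alloc < stringlen + buflen + 1) {
      char *ptr;
      while(alloc < stringlen + buflen + 1)
        alloc *= 2;  /* geometric growth: O(log n) reallocs total */
      ptr = realloc(string, alloc);
      if(!ptr) {
        free(string);
        return NULL;
      }
      string = ptr;
      reallocs++;
    }
    memcpy(string + stringlen, buffer, buflen + 1); /* include the NUL */
    stringlen += buflen;
  }
  if(nreallocs)
    *nreallocs = reallocs;
  return string;
}
```

With 1000 short lines the buffer only grows through a handful of doublings (512, 1024, 2048, ...), so realloc runs a few times rather than a thousand.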
-----Original Message-----
From: Daniel Stenberg [mailto:daniel_at_haxx.se]
Sent: Thursday, September 12, 2019 09:29
To: Gilles Vollant via curl-library
Cc: Gilles Vollant
Subject: Re: curl --data slow with big file
On Thu, 12 Sep 2019, Gilles Vollant via curl-library wrote:
> And in src/tool_paramhlp.c, the function file2string calls realloc very often
Oh right. Once *per line* of the file. For a 100MB json file, I bet that can
end up in quite a lot of reallocs!
> So I suggest modifying file2string to call realloc less often, like
> file2memory.
That's a very good idea. Are you willing to work on it?
--
 / daniel.haxx.se
 | Get the best commercial curl support there is - from me
 | Private help, bug fixes, support, ports, new features
 | https://www.wolfssl.com/contact/
-------------------------------------------------------------------
Unsubscribe: https://cool.haxx.se/list/listinfo/curl-library
Etiquette: https://curl.haxx.se/mail/etiquette.html
- text/plain attachment: tool_paramhlp.c