curl-users
Re: Re: Re: Re: cURL TODO : 15.3 prevent file overwriting
From: Deepak Singh <deepak.sn_at_samsung.com>
Date: Thu, 05 Jun 2014 10:57:23 +0000 (GMT)
Dear Daniel,
I have tried to address the problems you pointed out; please find attached the patch with the fixes.
I will try to follow these guidelines in my future contributions.
Thanks & Regards,
Deepak Singh
------- Original Message -------
Sender : Daniel Stenberg <daniel@haxx.se>
Date : Jun 01, 2014 03:31 (GMT+05:30)
Title : Re: Re: Re: cURL TODO : 15.3 prevent file overwriting
On Thu, 29 May 2014, Deepak Singh wrote:
> Please find attached the patch with above features.
Lovely. Some remaining problems though:
1) I'd like you to use "configure --enable-debug" as it'll switch on some
picky compiler warnings you want to fix:
tool_operate.c:210:50: error: conversion to 'int' from 'size_t' may alter its
value [-Werror=conversion]
tool_operate.c:220:5: error: passing argument 1 of 'curl_dofree' discards
'const' qualifier from pointer target type [-Werror]
tool_operate.c:227:3: error: return discards 'const' qualifier from pointer
target type [-Werror]
(they're errors for me since I also use --enable-werror to better catch these
problems)
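To illustrate the kind of change those warnings usually ask for, here is a minimal sketch (copy_name() and its names are made up for illustration, not taken from the patch): keep lengths in size_t instead of narrowing them to int, and return a freshly allocated copy rather than casting the const away:

#include <stdlib.h>
#include <string.h>

static char *copy_name(const char *name)
{
  size_t len = strlen(name) + 1; /* keep the length as size_t, no narrowing */
  char *dup = malloc(len);
  if(!dup)
    return NULL;
  memcpy(dup, name, len);
  return dup; /* a non-const pointer is returned, no qualifier discarded */
}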
2) try 'make checksrc' in the root dir to check for source code style
compliance:
../src/tool_getparam.c:1536:6: warning: Trailing whitespace
../src/tool_operate.c:203:3: warning: Trailing whitespace
3) Your use of '_url' will make the code reference a NULL pointer if
--no-clobber is used before -o or -O:
$ curl --no-clobber -O localhost/moo
And does it really work if we provide multiple URLs? I figure we'd require one
for each -o/-O then!
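As a sketch of what I mean (safe_outfile() and its parameters are invented for illustration, not names from the patch), the per-URL output name needs a NULL check before a clobber-safe variant is derived from it, and with several URLs that has to happen once per -o/-O output:

#include <stdbool.h>

static char *newname(const char *ofile); /* the helper suggested below */

/* return the name to actually write to: unchanged when clobbering is fine
   or when no output name exists yet, otherwise a not-yet-existing variant */
static char *safe_outfile(char *outfile, bool noclobber)
{
  if(!noclobber || !outfile)
    return outfile;
  return newname(outfile); /* takes ownership of 'outfile', may return NULL */
}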
4) in your new function returnnewfilename(), why do you end up strdup()ing the
already fine file name? It seems like an extra malloc+copy operation you can
skip. It doesn't check for a NULL return from malloc, and I find your variable
names in that function to be overly long to the level that they actually make
the function hard to read. The function is 20 lines and you have 4 variables,
please use short names for them. The comment is also funnily formatted. I
would suggest the function is much easier to read like this:
static char *newname(const char *ofile)
{
  int i;
  size_t newlen = strlen(ofile) + EXTRACHARS;
  char *onew = malloc(newlen);
  if(!onew)
    return NULL;
  snprintf(onew, newlen, "%s", ofile);
  for(i = 1; (fileexists(onew) && i < INT_MAX); i++)
    snprintf(onew, newlen, "%s.%d", ofile, i);
  if(ofile)
    free((char *)ofile); /* this function takes ownership of 'ofile' */
  if(i >= INT_MAX && fileexists(onew)) {
    free(onew); /* don't leak the buffer on failure */
    return NULL; /* only filename.(INT_MAX-1) names are possible */
  }
  return onew;
}
5) Your change to using open() instead of fopen() did not fix the race
condition I mentioned, since you don't create the new file in that same
operation the way I suggested. Why is that better? If you run 100 scripts in
parallel, all trying to download to the same file name, you'll see that your
current solution will still end up overwriting files in spite of the
--no-clobber option, since another process can create the file between the
check and the creation of it.
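What I have in mind is roughly this sketch (open_noclobber() and its arguments are invented names, and a real version would still have to hand the descriptor over to the existing output handling): let the kernel do the existence check and the creation in one atomic step, and move on to the next ".N" name whenever another process wins the race:

#include <errno.h>
#include <fcntl.h>
#include <limits.h>
#include <stdio.h>

static int open_noclobber(const char *base, char *name, size_t len)
{
  int i;
  snprintf(name, len, "%s", base);
  for(i = 0; i < INT_MAX; i++) {
    int fd = open(name, O_CREAT | O_EXCL | O_WRONLY, 0644);
    if(fd >= 0)
      return fd; /* we created the file, nobody can sneak in and clobber it */
    if(errno != EEXIST)
      return -1; /* some other error, give up */
    snprintf(name, len, "%s.%d", base, i + 1); /* taken, try the next suffix */
  }
  return -1;
}

The point is that open() with O_CREAT|O_EXCL fails with EEXIST when the file already exists, so the check and the creation can never be separated by another process.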
--
/ daniel.haxx.se
-------------------------------------------------------------------
List admin: http://cool.haxx.se/list/listinfo/curl-users
FAQ: http://curl.haxx.se/docs/faq.html
Etiquette: http://curl.haxx.se/mail/etiquette.html
- application/octet-stream attachment: my-fixes.diff