curl-users
Re: Empty FTP path components support
Date: Sun, 05 Oct 2003 14:29:16 +0200
Daniel Stenberg wrote:
>
> On Mon, 29 Sep 2003, DAVID BALAZIC wrote:
>
> > /* we skip empty path components, like "x//y" since the FTP command CWD
> > requires a parameter and a non-existant parameter a) doesn't work on many
> > servers and b) has no effect on the others. */
> >
> > c) "CWD " returns to the login directory on some servers.
>
> Right. I guess it could also do something else on certain servers. RFC959
> doesn't tell.
>
> > I say it should be implemented :
> >
> > a) "doesn't work on many servers"
> > If the user supplies a nonworking URL that is his fault.
>
> True.
>
> > On the other hand, some servers act on "CWD " and curl now fails to retrieve
> > perfectly valid URLs from such servers ( URLs that have empty components,
> > example:
>
> I disagree a bit. RFC1738 defines that each part should be sent off with CWD,
> but RFC959 states that CWD *requires* a path. So, sending a CWD line without
> one means we break RFC959-compliance.
RFC959 does not say that it can't be an empty string (it says "system dependent
file group designator"), while RFC1738 section 3.2.2 makes it crystal clear
that "...//..." sends a "CWD ". I quote:
On the other hand, <URL:ftp://myname@host.dom//etc/motd>, would "CWD " with
a null argument, then "CWD etc", and then "RETR motd".
--end quote--
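For concreteness, here is a minimal sketch of the mapping RFC1738 describes
(in Python rather than curl's C, and the helper name ftp_commands is my own,
not anything from curl): one CWD per path segment, empty segments included,
then RETR for the last one.

```python
from urllib.parse import urlsplit

def ftp_commands(url):
    """Map an FTP URL path to the command sequence RFC 1738 section 3.2.2
    describes: one CWD per path segment (empty segments kept), then RETR
    for the final segment.  Sketch only, not curl's implementation."""
    path = urlsplit(url).path
    if path.startswith("/"):
        path = path[1:]          # the first "/" is only the host/path separator
    segments = path.split("/")   # "/etc/motd" after "//" keeps a leading "" segment
    cmds = ["CWD %s" % seg for seg in segments[:-1]]
    cmds.append("RETR %s" % segments[-1])
    return cmds

# The RFC's own example: //etc/motd yields a CWD with a null argument.
print(ftp_commands("ftp://myname@host.dom//etc/motd"))
# → ['CWD ', 'CWD etc', 'RETR motd']
```

Note how the empty segment between the two slashes survives the split and
becomes exactly the "CWD " with a null argument that the RFC spells out.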
>
> The question is then, the "perfectly valid URL" that contains a blank path
> part between two slashes, what is that REALLY expecting the user-agent to do?
To respect published standards?
> I figure that you by "perfectly valid" mean that the URL works with your
> favourite browser? I acknowledge that we should strive at behaving similarly.
It means "compliant with the definition" instead of compliant with my own idea.
We all love it when MS interprets standards in its own weird way, don't we?
> > b) "has no effect on the others"
> > So what is the problem if empty components are supported ?
>
> The "problem" we fixed with this, is that a) we again support the //-prefix to
> get to the root dir that was broken and b) we don't issue non-compliant FTP
> commands to servers.
>
> > To conclude, I request supporting empty path components, because :
> > - otherwise some valid URLs do not work
>
> But are those REALLY valid URLs to start with?
yes.
> If so, can you please tell me what standards you base this on?
RFC 1738
> And do they work like that using the common browsers?
Last time I checked, MS IE 6.0 could retrieve only 2 URLs out of the 40 I tried,
failing on simple ones like "ftp://user:pass@server/test.txt", so I can't
say anything about IE in this regard.
Others (mozilla, lynx, wget) fail on ones like
"ftp://username:password@server.domain.net/subdir_of_login_dir/some_file_there".
So they can't really be used as a goal or example IMO.
>
> > - there is no harm caused, there aren't any downsides of supporting them
>
> I would say that the breakage of the //-prefix caused a lot of noise all over
> and turned out to be a really bad idea. So, no matter what we decide to do
> with //-pieces in FTP URLs, the initial one needs to be dealt with specifically
> to make it possible to specify the root dir that way.
Or you could add a "unix-like ftp url, familiar to all unix users", like
the one used in ncftp:
hostname:path , for example(s):
ftp.luth.se:/pub/something
ftp.uni-kl.de:index.html
It's been a long time since I used ncftp, so I don't recall all the details...
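A rough sketch of that shorthand (the helper name ncftp_to_url is my own
invention, and since I don't recall ncftp's exact rules, the details here are
assumptions): an absolute path with a leading "/" naturally produces the "//"
root form, because the URL contributes its own "/" separator in front of it.

```python
def ncftp_to_url(shorthand):
    """Convert an ncftp-style "hostname:path" shorthand into an ftp:// URL.

    Sketch under assumptions: a leading "/" in the path means an absolute
    path, which comes out as the "//" form discussed above; anything else
    stays relative to the login directory."""
    host, sep, path = shorthand.partition(":")
    if not sep:                  # bare hostname, no path given
        return "ftp://" + host
    return "ftp://%s/%s" % (host, path)

print(ncftp_to_url("ftp.luth.se:/pub/something"))  # ftp://ftp.luth.se//pub/something
print(ncftp_to_url("ftp.uni-kl.de:index.html"))    # ftp://ftp.uni-kl.de/index.html
```

So the shorthand would give users a familiar way to say "absolute path"
without having to know the "//" convention at all.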
Regards,
David Balazic