Re: curl vs wget
Date: Fri, 5 Oct 2007 16:20:20 +0200 (CEST)
On Fri, 5 Oct 2007, Doug McNutt wrote:
> bash, the shell, behaves like plain old sh when it's called that way. It's
> all in the magic of file system links and a startup-time determination by
> the tool as to how it was called.
> Wouldn't it be nice to have a tool - purl - which does an upload by
> magically inserting a -T before the URL and perhaps some .netrc password
> gimmicks? curl and purl would link to the same executable.
> I have never wanted to do an upload and a download using a single command
> line with a -T url followed by a simple url with a -O or -o. Are things like
> that part of the concept?
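The argv[0] trick Doug describes (bash acting like plain sh when invoked as
'sh') can be sketched in a few lines of shell. This is only an illustration of
the mechanism, not anything curl does; the 'purl' name, the "localfile"
placeholder, and the dispatch function are all assumptions for the sketch:

```shell
#!/bin/sh
# Hypothetical sketch of name-based dispatch: one program, two names.
# A tool installed under both names (e.g. via a symlink) inspects the
# name it was invoked under and adjusts its behavior accordingly.

dispatch() {
  # $1 simulates argv[0], $2 is the URL.
  case "$(basename "$1")" in
    purl) echo "curl -T localfile $2" ;;  # upload mode: insert -T
    *)    echo "curl -O $2" ;;            # normal download mode
  esac
}

dispatch /usr/local/bin/purl ftp://example.com/dir/
dispatch /usr/local/bin/curl http://example.com/file
```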
As I see it, there are numerous arguments against this concept:
#1 - documentation. It would look like two different tools and both would need
documentation, and if they aren't two separate docs we'd need to specify all
over which parts concern which tool.
#2 - symlink dependency. This concept works well only on (file) systems that
can do symlinks, as otherwise the user will need two separate copies. Symlinks
are not universally present.
#3 - taking away focus. Related to the documentation issue. All of a sudden
we'd split the audience across two (seemingly slightly) different tools, and
there'd be less focus on the single actual tool that does the work.
Besides, it is dead easy for someone to either implement this kind of 'purl'
with a simple shell script, or by writing another libcurl-using application...
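For instance, such a wrapper could be little more than this. The 'purl' name
and argument order are assumptions taken from the thread; the only real curl
option used is -T (upload a file to the given URL):

```shell
#!/bin/sh
# Hypothetical 'purl' wrapper sketch: upload a file with curl -T.

purl() {
  if [ $# -ne 2 ]; then
    echo "usage: purl <file> <url>" >&2
    return 1
  fi
  # curl's -T/--upload-file option uploads the local file to the URL.
  curl -T "$1" "$2"
}
```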
So no, I really have no plans to move towards such a concept.
-- Commercial curl and libcurl Technical Support: http://haxx.se/curl.html
Received on 2007-10-05