
Location problems (was Re: release alert)

From: Daniel Stenberg <daniel_at_haxx.se>
Date: Wed, 6 Jun 2001 08:58:40 +0200 (MET DST)

On Tue, 5 Jun 2001, Pierre Z. wrote:

> I'm having problems using cURL (Windoze98/cURL 7.7.3) to fetch data from
> a secure site using https and cookies, which causes cURL to enter an
> infinite loop while trying to access a specific page (relocation from
> previous page accessed). I've read that a similar problem has been
> identified recently and apparently solved, so I'm wondering if the fix
> will make it to this next release.

Yes, there are several Location related fixes included in the upcoming 7.8.

> curl -A "Mozilla/4.75 [fr] (Win98; U)" -b empty.txt -L -D
> Server_Headers1.txt -o Server_login.htm
> https://secure.server.com:43/folder/website.shtml
>
> curl -A "Mozilla/4.75 [fr] (Win98; U)" -v -L -b Server_Headers1.txt -D
> Server_Headers2.txt -d "ID=abc123&PWD=12345&Login=Login" -o
> MyAccount.htm https://secure.server.com/folder/website/Login
>
> This seems to work fine in that it does pass the login (since I see it
> getting to the next page: /MainMenu)

Well, that depends on what you mean by "works fine". The fact that curl didn't
fail doesn't necessarily mean that you got the page you wanted.

> but then it enters an endless loop and is never able to actually fetch
> the contents of the MainMenu page.

A nitpick here, but is 50 laps really endless?
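(The 50 comes from curl's built-in cap on how many redirects it will follow. As a
hedged sketch, assuming a curl build that has the --max-redirs option, you can
lower that cap so a redirect loop fails fast instead of running 50 laps; the host
and path below are the poster's own:)

```shell
# Stop after 5 redirects instead of curl's default limit,
# so a redirect loop is reported quickly as an error.
curl -L --max-redirs 5 -o MainMenu.htm \
     "https://secure.server.com/folder/website/MainMenu"
```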

> What I see in the MSDOS window during execution (this gets repeated
> several times) looks like:
> ....
> Follows location to: new URL:
> 'https://secure.server.com/folder/website/MainMenu'
> * DISABLES POST
> * Re-using existing connection! (#0)
> * Connected to https://secure.server.com/folder/website/MainMenu
> etc...
>
> Contents of Server_Headers2.txt:
>
> HTTP/1.1 200 OK
> Server: Netscape-Enterprise/3.6
> Date: Thu, 31 May 2001 23:43:25 GMT
> Content-type: text/html
> Content-length: 3531
>
> This block gets repeated several times (50?) until it finally stops.

Well, this is next to impossible to understand without more detailed headers and
the requests curl issued. Did the server use cookies? Did curl send back the
correct cookies? Did curl follow the Location: headers correctly?

Why does it say it follows locations if the Server_Headers2.txt contains a
200-code with no Location:?

In 99% of these cases, curl behaves correctly, and all you need to do is
understand what headers you get back and how to tell curl to react to them
correctly.

We might be able to point out some obvious flaws if you provide the facts.

-- 
     Daniel Stenberg -- curl dude -- http://curl.haxx.se/
Received on 2001-06-06