
curl-users

Re: Fwd: Re: cURL usage dilemma

From: Ralph Mitchell <rmitchell_at_eds.com>
Date: Tue, 30 Apr 2002 01:03:29 -0500

Sorry, I didn't read what you wrote carefully enough.

Have you tried saving the individual pages, starting from the top (https://domain/) and working down? Check them for JavaScript, for example, because as far as I know curl doesn't do JavaScript... Your browser will happily run script code that curl ignores.
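A quick check along these lines will flag script-driven pages. This is just a sketch: here a saved page is faked with a heredoc so the check itself is visible; in practice front.html would come from `curl -s -o front.html https://DOMAIN/` (DOMAIN being a placeholder).

```shell
# Fake a saved page for illustration; in practice:
#   curl -s -o front.html https://DOMAIN/
cat > front.html <<'EOF'
<html><head><script src="login.js"></script></head><body>front page</body></html>
EOF

# curl fetches the raw HTML and never runs scripts, so any hit here
# is behaviour the browser adds that curl will silently skip.
grep -ci '<script' front.html
```

Any non-zero count means the browser may be doing work behind your back (redirects, cookie setting, form population) that curl won't reproduce.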

I've run into JavaScript issues before, which is why I take it a page at a time, using "-b xxx -c xxx" to keep the cookies fresh and circulating.
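Something like the following is the page-at-a-time pattern. It is a sketch only: DOMAIN, User:Pass, and the form page name are placeholders taken from the quoted mails, not a tested script.

```shell
JAR=cookies.txt
: > "$JAR"                      # start each run with an empty jar

# -b reads cookies from the jar, -c writes any Set-Cookie back into it,
# so the session cookie from the front page survives into the next fetch.
curl -s -b "$JAR" -c "$JAR" -u User:Pass \
     -o 01-front.html https://DOMAIN/

# -e sends the front page as the Referer, mimicking a browser click.
curl -s -b "$JAR" -c "$JAR" -u User:Pass -e https://DOMAIN/ \
     -o 02-form.html https://DOMAIN/baseline/EquipmentAdd.do
```

Saving each page to its own file also lets you inspect exactly what the server returned at every step, instead of guessing from the final output.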

Ralph

David Withnall wrote:

> From what I've seen in the manual and on the site, if you set up a command with multiple sessions using only -b (file), it will collect and send any cookies it comes across.
>
> I have since modified my command to show all the headers, and as far as I can tell it seems to be technically correct:
>
> curl -vi -L -b null -u User:Pass -e https://DOMAIN/ https://DOMAIN/baseline/EquipmentAdd.do? https://DOMAIN/baseline/EquipmentAdd.do?
>
> Again, from what I've seen this should set up a continuous session which follows any links sent to it and handles all the cookies.
> From the output it does seem to do this (except the link following, as there don't seem to be any links). However, the first and second GET requests return pages I don't want.
>
> As I said earlier, I believe it might have something to do with the referrer used, though I have no idea how to find out what the browser is sending.
> I don't know this for certain, however. I have tried setting the referrer to every page on the site that I come in contact with through the various frames etc., without any success.
>
> Regards
> D.
>
> ------------------------------------------------------------------------------
> David Withnall Ph: 07 3406 8079
> Biomedical Engineer Mb: 0405 131 087
> Asset Management and Technical Audit Program
> Biomedical Technology Services
> Queensland Health
>
>
> >>> Ralph Mitchell <rmitchell_at_eds.com> 30/04/02 3:07:11 pm >>>
> Using "-b empty.txt" just starts the cookie engine and, if there is anything in the file, sends those cookies. To save cookies between invocations of Curl, you need to use "-c empty.txt" as well.
>
> i.e. curl -b cookies.txt -c cookies.txt .......
>
> Ralph Mitchell
>
> David Withnall wrote:
>
> > Greetings,
> >
> > I'm new to all this, but I think I've figured out most of my problems except this one.
> > I'm trying to use cURL to connect to an https server to upload form data for work. I've managed to get it to connect to the server; however, when you try to connect directly to the page with the form on it, you get redirected to the front page. This happens in both cURL and IE/Netscape. The difference is that when you go directly to the form page a second time in IE/Netscape, you end up there with the form staring you in the face. I have not managed to get cURL to do the same.
> >
> > I'm using the syntax:
> > curl -v -A "Mozilla/4.5 (compatible; MSIE 5.0; Windows NT)" -u User:Pass https://domain/page.html
> > (I'd pass on the username/password etc., but I've signed confidentiality agreements and I need the money from my job :) )
> >
> > This connects without any problems to the front page (https://domain/). I then tried fetching the page twice in one command, hoping to simulate reloading the page twice in a browser:
> >
> > curl -v -A "Mozilla/4.5 (compatible; MSIE 5.0; Windows NT)" -u User:Pass https://domain/page.html https://domain/page.html
> >
> > Still the same problem.
> >
> > So next I tried cookies, as the site sets a session cookie covering the whole site. As per the manual:
> >
> > curl -v -b empty.txt -L -A "Mozilla/4.5 (compatible; MSIE 5.0; Windows NT)" -u User:Pass https://domain/page.html https://domain/page.html
> >
> > I found a discussion in the list archive where someone had a similar problem, and his solution turned out to be setting the referrer to the page he wanted to go to.
> >
> > Basically, I've tried everything I can think of. I believe it has something to do with headers or cookies, but I'm not sure. I can't find anything that will show me the HTTP headers being transferred by the browser to make sure.
> >
> > This is all being done in a Windows environment and I don't have access to any *nix boxes. If anyone has any suggestions I'd be most appreciative. I'd rather automate the entry of the several thousand items we have to type in, if possible.
> >
> > Cheers
> > D.
> >
> > ------------------------------------------------------------------------------
> > David Withnall Ph: 07 3406 8079
> > Biomedical Engineer Mb: 0405 131 087
> > Asset Management and Technical Audit Program
> > Biomedical Technology Services
> > Queensland Health
> >
> > **********************************************************************
> > This e-mail, including any attachments sent with it, is confidential
> > and for the sole use of the intended recipient(s). This confidentiality
> > is not waived or lost if you receive it and you are not the intended
> > recipient(s), or if it is transmitted/ received in error.
> >
> > Any unauthorised use, alteration, disclosure, distribution or review
> > of this e-mail is prohibited. It may be subject to a statutory duty of
> > confidentiality if it relates to health service matters.
> >
> > If you are not the intended recipient(s), or if you have received this
> > e-mail in error, you are asked to immediately notify the sender by
> > telephone or by return e-mail. You should also delete this e-mail
> > message and destroy any hard copies produced.
> > **********************************************************************
>
> ------------------------------------------------------------------------
>
> Subject: Re: cURL usage dilemma
> Date: Tue, 30 Apr 2002 15:34:25 +1000
> From: "David Withnall" <David_Withnall_at_health.qld.gov.au>
> To: <curl-users_at_lists.sourceforge.net>
>
Received on 2002-04-30