Subject: Re: How do I get around a captcha challenge?
From: bruce via curl-users <curl-users_at_lists.haxx.se>
Date: Mon, 11 Aug 2025 00:03:50 -0400
Just saw this. I don't know your background, but if you load the URL in a
browser (Chrome, Firefox, etc.), you can open the developer tools and see
approximately what the browser sends, which should help you reproduce the
request with curl.

Hope this helps
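To make that suggestion concrete: in Chrome or Firefox, open the Network tab of the developer tools, load the page, then right-click the request and choose "Copy as cURL" to get the full command the browser would issue. A trimmed sketch of the result is below — the URL, `-L`, `--connect-timeout`, and user-agent string come from this thread, while the `Accept` header value is illustrative, not taken from the browser:

```shell
# Build the request the way the browser's "Copy as cURL" output shows it.
# Stored as a bash array so it can be inspected before running.
cmd=(curl -L --connect-timeout 20
     -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0'
     -H 'Accept: text/html,application/xhtml+xml'   # illustrative browser header
     --output eraseme4.html
     'https://www.eset.com/us/home/internet-security/download/#download-manually')

printf '%s\n' "${cmd[*]}"   # print the assembled command for inspection
# "${cmd[@]}"               # uncomment to actually perform the fetch
```

Printing the command first makes it easy to compare against what the dev tools show before firing the real request.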
On Sun, Aug 10, 2025, 11:35 PM ToddAndMargo via curl-users <
curl-users_at_lists.haxx.se> wrote:
> On 8/10/25 8:02 PM, Paul Gilmartin via curl-users wrote:
> > On 8/10/25 04:18, ToddAndMargo via curl-users wrote:
> >>
> >>
> >> From the command line, I am now getting
> >>
> >> curl -v --connect-timeout 20 https://www.eset.com/us/home/internet-
> >> security/download/#download-manually --output eraseme3.htm
> >>
> >> "permanently moved"
> >> ...
> > 729 $ man curl
> >        -L, --location
> >               (HTTP) If the server reports that the requested page has
> >               moved to a different location (indicated with a Location:
> >               header and a 3XX response code), this option makes curl
> >               redo the request to the new place. If used together with
> >               -i, --show-headers or -I, --head, headers from all
> >               requested pages are shown.
> >
>
> That did the trick!
>
> curl -L -v --user-agent "Mozilla/5.0 (X11; Linux x86_64; rv:52.0)
> Gecko/20100101 Firefox/52.0" --connect-timeout 20
> https://www.eset.com/us/home/internet-security/download/#download-manually
> --output eraseme4.html
>
>
> Without the user agent, I got back about a thousand
> web site addresses. The site was messing with me when I
> used the default curl user agent because it thought I was
> a bot.
>
> Thank you all for the help!
>
> -T
> --
> Unsubscribe: https://lists.haxx.se/mailman/listinfo/curl-users
> Etiquette: https://curl.se/mail/etiquette.html
>
--
Unsubscribe: https://lists.haxx.se/mailman/listinfo/curl-users
Etiquette: https://curl.se/mail/etiquette.html

Received on 2025-08-11