Re: How do I get around a captcha challenge?
From: ToddAndMargo via curl-users <curl-users_at_lists.haxx.se>
Date: Sun, 10 Aug 2025 20:35:12 -0700
On 8/10/25 8:02 PM, Paul Gilmartin via curl-users wrote:
> On 8/10/25 04:18, ToddAndMargo via curl-users wrote:
>>
>>
>> From the command line, I am now getting
>>
>> curl -v --connect-timeout 20 \
>>   https://www.eset.com/us/home/internet-security/download/#download-manually \
>>   --output eraseme3.htm
>>
>> "permanently moved"
>> ...
> 729 $ man curl
>        -L, --location
>               (HTTP) If the server reports that the requested page has
>               moved to a different location (indicated with a Location:
>               header and a 3XX response code), this option makes curl
>               redo the request to the new place. If used together with
>               -i, --show-headers or -I, --head, headers from all
>               requested pages are shown.
>
That did the trick!
curl -L -v \
  --user-agent "Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0" \
  --connect-timeout 20 \
  https://www.eset.com/us/home/internet-security/download/#download-manually \
  --output eraseme4.html
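
Side note, in case it helps anyone searching the archives later: the man page
snippet above also mentions that -L combined with -I (or -i) shows the headers
from every page requested, which is a handy way to watch the redirect chain
itself. A minimal sketch against the same URL (some servers answer HEAD
differently than GET, so results may vary):

curl -I -L https://www.eset.com/us/home/internet-security/download/

Each hop's status line and Location: header print in turn, so you can see
exactly where a "permanently moved" answer is sending you.
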
Without the --user-agent flag, I got back about a thousand
web site addresses. The site was messing with me when I
used the default curl user agent because it thought I was
a bot.
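
For reference, curl's default User-Agent is just "curl/<version>", which some
sites flag as a bot. A rough sketch for comparing the two behaviors is to
check the status code with and without a browser-style agent string (the exact
response depends on the site):

curl -s -o /dev/null -w '%{http_code}\n' -L https://www.eset.com/us/home/internet-security/download/
curl -s -o /dev/null -w '%{http_code}\n' -L \
  -A "Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0" \
  https://www.eset.com/us/home/internet-security/download/

(-A is the short form of --user-agent.)
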
Thank you all for the help!
-T