curl-users
Re: Using curl
Date: Thu, 12 Mar 2015 21:13:13 +0900
Hi,
I am afraid you posted your question to the wrong mailing list :-).
Try wget. It supports recursive download from web sites.
Example:
wget -v -rl0 -np -nd -Ajpg -P your_directory \
     "http://epaper.eenadu.net/pdf/2015/03/08/"
If you have further questions, try the wget mailing list:
https://lists.gnu.org/mailman/listinfo/bug-wget
2015-03-12 18:48 GMT+09:00, bala pothula <balan.pothula_at_gmail.com>:
> Dear Users,
> This is Balaji. I want to curl all images from a web directory, like
> below.
>
> curl --get http://epaper.eenadu.net/pdf/2015/03/08/[*].jpg -o
> F:\EENADU\NEWS\.
>
> I want all images from "http://epaper.eenadu.net/pdf/2015/03/08/" by using
> wildcard entries like *, and I also want to save them in some folder.
> Thanks
>
> Regards,
> Balaji.
>
-------------------------------------------------------------------
List admin: http://cool.haxx.se/list/listinfo/curl-users
FAQ: http://curl.haxx.se/docs/faq.html
Etiquette: http://curl.haxx.se/mail/etiquette.html
Received on 2015-03-12