
curl-library

Re: writing bot with libcurl

From: Lars Nilsson <chamaeleon_at_gmail.com>
Date: Tue, 13 Jul 2004 23:29:04 -0400

If you enable cookie handling with a few lines of libcurl options, you
get session handling more or less for free: libcurl will send back any
cookies the webserver has set, which I assume is how the webserver
maintains a session for you (rather than via GET requests with extra
parameters). Enabling the cookies is the key (using options such as
CURLOPT_COOKIEJAR and the like; read up on those in the documentation,
there aren't too many). Again, you only need a couple of lines setting
the appropriate options, and it'll work more or less out of the box.

Lars Nilsson

On Tue, 13 Jul 2004 20:21:30 -0700, Ryan <rcdetert_at_ucdavis.edu> wrote:
> So basically I need to login, by posting some form data for username and
> password fields. Then parse the response of the server. I need to keep
> track of a sessionid though don't I?
>
>
>
>
> On Tue, 2004-07-13 at 20:13, Lars Nilsson wrote:
> > On Tue, 13 Jul 2004 19:22:29 -0700, Ryan <rcdetert_at_ucdavis.edu> wrote:
> > > What I want to do is write a program to log into a site and parse some
> > > html pages so that I can import the data into a database, will libcurl
> > > allow me to do this easily?
> >
> > Yes, very easily (in my opinion). It will be up to you to maintain
> > your state and parsing the pages (libcurl for the most part only gives
> > you the content). I believe I've written between 15 and 20 different
> > programs for similar purposes, FTP and HTTP alike, and each one
> > usually took no more than a day to complete. Most of the time will be
> > spent figuring out what to do with the data, not how to make libcurl
> > give it to you. Personally, I use libxml (http://www.xmlsoft.org) for
> > my HTML / XML parsing needs in conjunction with libcurl.
> >
> > Regards,
> > Lars Nilsson
>
>
Received on 2004-07-14