curl-library
RE: Increasing connection time and CPU usage with the number of simultaneous curl_easy_perform
Date: Wed, 10 Jul 2002 20:34:28 +0200
To this message, I would like to add the following:
- If I launch the 50 request threads with a 200 ms delay between each, some requests have time to complete before all the threads are launched.
Then everything goes fine (even with 50 simultaneous threads) up to a point where the CPU usage grows from 2% to 100%.
The Windows task manager also shows a lot of switching between threads.
Should I queue my requests so that only a fixed number (5, 10, 15?) of simultaneous requests is handled by libcurl, along the lines of the sketch below?
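Something like this is what I mean by queueing. It is only a sketch: the names (InitRequestSlots, GatedPerform) and the cap of 10 are placeholders I made up, using a Win32 counting semaphore to gate the transfers:

#include <windows.h>
#include <curl/curl.h>

/* arbitrary cap on concurrent transfers, to be tuned (5? 10? 15?) */
#define MAX_SIMULTANEOUS 10

/* counting semaphore shared by all request threads */
static HANDLE g_requestSlots;

/* call once, right after curl_global_init() */
void InitRequestSlots(void)
{
    g_requestSlots = CreateSemaphore(NULL, MAX_SIMULTANEOUS,
                                     MAX_SIMULTANEOUS, NULL);
}

/* each thread would wrap its transfer like this */
CURLcode GatedPerform(CURL *curl)
{
    CURLcode res;

    /* block until one of the MAX_SIMULTANEOUS slots is free */
    WaitForSingleObject(g_requestSlots, INFINITE);
    res = curl_easy_perform(curl);
    ReleaseSemaphore(g_requestSlots, 1, NULL);
    return res;
}

The threads would still all be created up front; only the transfers themselves would be limited to MAX_SIMULTANEOUS at a time.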
Arnaud.
> -----Original Message-----
> From: Arnaud VALLEE
> Sent: Tuesday, 9 July 2002 20:58
> To: 'curl-library_at_lists.sourceforge.net'
> Subject: Increasing connection time and CPU usage with
> the number of simultaneous curl_easy_perform
>
> Hello all,
>
> My problem is the following:
> I have a multithreaded application. The threads are
> independent, and each calls libcurl to fetch
> resources on the net.
> As I am using curl_easy_perform, I noticed that the
> connection time increases considerably with the number of threads.
> Up to 5 threads it is OK (a few seconds).
> But with 30 threads, it can take more than a minute.
> Furthermore, the CPU is always at 100%.
> Here is the code that each thread executes:
>
> Before creating the threads (curl_global_init is not
> thread-safe, so it must be called before any thread starts):
> curl_global_init(CURL_GLOBAL_DEFAULT);
>
> each thread will execute the following code:
> CURL_STATUS status;
>
> CURL *curl;
> char buf[] = "Expect:";
> struct curl_slist *headerlist=NULL;
>
>
> CURL_Write_UserData *my_userdata = new CURL_Write_UserData;
> my_userdata->object = this;
> my_userdata->req = request;
> CURLcode res;
> struct HttpPost *formPost=NULL;
>
> //Get a handle for the request
> curl = curl_easy_init();
>
> //Set all options
>
> //***************************************************
> //Common options
> //***************************************************
>
> //To follow the location in case of redirection
> curl_easy_setopt(curl,CURLOPT_FOLLOWLOCATION,1);
>
> //Setup cookies (note: all threads share this one cookie file)
> CURLcode code2 =
> curl_easy_setopt(curl, CURLOPT_COOKIEFILE,
>   "D:\\DocumentAccessAccelerator\\cookies.txt");
>
> curl_easy_setopt(curl, CURLOPT_COOKIEJAR,
>   "D:\\DocumentAccessAccelerator\\cookies.txt");
>
>
> //How to write the received data
> curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, CSD_CURL_Write);
> curl_easy_setopt(curl, CURLOPT_WRITEDATA,my_userdata);
>
> //How to retrieve the headers sent by the web server
> curl_easy_setopt(curl, CURLOPT_HEADERFUNCTION, CSD_CURL_WriteHeader);
> curl_easy_setopt(curl, CURLOPT_WRITEHEADER, request);
>
> /* initialize custom header list (stating that "Expect:
> 100-continue" is not wanted) */
> headerlist = curl_slist_append(headerlist, buf);
> curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headerlist);
>
>
> //***********************************
> //End of Common Options
>
>
> string l_newURL = "";
> string FormatedParam;
>
> if(request->get_MimeType()=="multipart/form-data")
> {
> formPost = FormatPostMultipart(request);
> curl_easy_setopt(curl, CURLOPT_HTTPPOST, formPost);
> curl_easy_setopt(curl, CURLOPT_URL, request->get_m_URL().c_str());
>
> }
> else if(request->isGet()==false)
> {
> curl_easy_setopt(curl,CURLOPT_POST,1);
> FormatedParam = FormatParam(request);
>
> curl_easy_setopt(curl,CURLOPT_POSTFIELDS,FormatedParam.c_str());
> curl_easy_setopt(curl, CURLOPT_URL, request->get_m_URL().c_str());
> }
> else
> {
> FormatedParam = FormatParam(request);
> l_newURL = request->get_m_URL();
> if(!FormatedParam.empty())
> {
> if(l_newURL.find("?")==string::npos)
> {
> l_newURL.append("?");
> }
> l_newURL.append(FormatedParam);
> }
> curl_easy_setopt(curl, CURLOPT_URL, l_newURL.c_str());
>
>
> }
>
>
> //Convert the request timeout from milliseconds to seconds,
> //keeping a 10% safety margin
> long l_timeout = request->get_m_timeout();
> l_timeout = l_timeout/1000;
> l_timeout = (long) (l_timeout*9/10);
> curl_easy_setopt(curl,CURLOPT_TIMEOUT,l_timeout);
>
>
>
> (*p_currentLog)<<start_msg(LEV_2)<<"Starting Download"<<end_msg;
>
> //Perform the request
>
> res = curl_easy_perform(curl);
> (*p_currentLog)<<start_msg(LEV_2)<<"res perform =
> "<<res<<end_msg;
>
>
> if(res==CURLE_OK)
> {
> status = CURL_NO_ERROR;
> }
> else
> {
> status = CURL_ERROR;
> }
>
> double l_totalTime;
> CURLcode code;
>
>
> long httpcode;
>
> code = curl_easy_getinfo(curl,CURLINFO_HTTP_CODE,&httpcode);
>
> if(code==CURLE_OK)
> {
> (*p_currentLog)<<start_msg(LEV_2)<<"HTTP return
> code = "<<httpcode<<end_msg;
> p_returnedHTTPCode = httpcode;
> }
>
>
> code = curl_easy_getinfo(curl,CURLINFO_TOTAL_TIME,&l_totalTime);
> (*p_currentLog)<<start_msg(LEV_2)<<"Total Downloading Time: "<<l_totalTime<<end_msg;
>
>
> (*p_currentLog)<<start_msg(LEV_2)<<"Cleaning CURL
> Session"<<end_msg;
>
>
> curl_easy_cleanup(curl);
>
> /* then cleanup the formpost chain */
> if(formPost!=NULL)
> {
> curl_formfree(formPost);
> }
>
> /* free slist */
> curl_slist_free_all (headerlist);
> delete my_userdata;
>
>
>
> Here is my configuration:
> - OS: Windows 2000.
>
> I am not using a debug build (I noticed it can be slower
> with debug information, as mentioned in the docs).
>
> The request object contains all the parameters for the
> request (URL, parameters...).
> - Am I doing something wrong?
> - Is there a limitation that any of you have noticed when making
> more than x requests without using the multi handle? Or is
> there something to pay attention to? (A sketch of the multi
> interface follows these questions.)
> - My write function simply copies the received content to a
> file (also sketched below). I do not think my problem comes
> from there, since it looks like a connection issue.
> - Does anybody know if there could be some limit on the
> number of simultaneous connections on Windows?
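>
> For the multi-handle question, here is roughly what I understand
> the single-threaded alternative would look like. This is only a
> sketch: fetch_all is a made-up name, and it assumes a libcurl
> that has the multi interface (7.9.6 or later):
>
> #include <curl/curl.h>
> #ifdef WIN32
> #include <winsock2.h>
> #endif
>
> /* drive 'count' already-configured easy handles from one thread */
> void fetch_all(CURL *handles[], int count)
> {
>     CURLM *multi = curl_multi_init();
>     int i, running = 0;
>
>     for(i = 0; i < count; i++)
>         curl_multi_add_handle(multi, handles[i]);
>
>     do {
>         /* let libcurl advance every transfer as far as it can */
>         while(curl_multi_perform(multi, &running) ==
>               CURLM_CALL_MULTI_PERFORM)
>             ;
>         if(running) {
>             /* wait (at most 1 second) until a socket is ready */
>             fd_set r, w, e;
>             int maxfd = -1;
>             struct timeval tv = {1, 0};
>             FD_ZERO(&r); FD_ZERO(&w); FD_ZERO(&e);
>             curl_multi_fdset(multi, &r, &w, &e, &maxfd);
>             select(maxfd + 1, &r, &w, &e, &tv);
>         }
>     } while(running);
>
>     for(i = 0; i < count; i++)
>         curl_multi_remove_handle(multi, handles[i]);
>     curl_multi_cleanup(multi);
> }
>
> And for completeness, my write function is essentially of this
> shape (WriteToFile is a placeholder name; the FILE* comes in
> through CURLOPT_WRITEDATA):
>
> #include <stdio.h>
>
> /* matches the CURLOPT_WRITEFUNCTION prototype; must return the
>    number of bytes actually taken care of */
> size_t WriteToFile(void *ptr, size_t size, size_t nmemb, void *stream)
> {
>     return fwrite(ptr, 1, size * nmemb, (FILE *)stream);
> }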
>
> Thanks for your help.
>
> Regards,
>
> Arnaud.
>
>
>
> --------------------------------------------------------------
> << File: Arnaud VALLEE.vcf >>
Received on 2002-07-10