curl-library

Re: A Question On Libcurl Performance

From: Thomas Dineen <tdineen_at_ix.netcom.com>
Date: Sat, 31 Aug 2013 17:13:23 -0700

Daniel:

>> I am using libcurl and some of the curl website example code as calling
>> routines for a multi-platform project where a webpage is read from
>> finance.yahoo.com.
>
> ...
>
>> Now when I perform the exact same access on Fedora 14 the read
>> performance is very slow. When the read access is executed by sending
>> the URL
>
> I don't understand this part. The read access is executed?
>
>> there seems to be a pause of one or two minutes to get a response.
>
> What!? Let me check I understand you correctly: You get an up to 120
> seconds pause when you use libcurl on this Fedora that you don't see
> if the same program runs on Windows or Solaris?
>
> 120 seconds is an insanely long time!

Yes, I counted 60 seconds of delay between url_fopen and url_fgets.
The code section is shown below and the entire function is attached.
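
To narrow down where that minute goes, a small standalone diagnostic (not the
project code) can run one plain easy transfer against the same site and print
libcurl's own timing counters; the URL below is only a placeholder for the
actual URL_String, and the response body is discarded:

/* Standalone diagnostic sketch: one plain easy transfer, reporting where    */
/* the time goes. The URL is only a placeholder. Build with -lcurl.          */
#include <stdio.h>
#include <curl/curl.h>

/* Discard the response body; only the timings are of interest. */
static size_t discard ( char *ptr, size_t size, size_t nmemb, void *userdata )
{
   (void) ptr;
   (void) userdata;
   return size * nmemb;
}

int main ( void )
{
   CURL *curl;
   CURLcode res;
   double namelookup = 0.0, conn = 0.0, starttransfer = 0.0, total = 0.0;

   curl_global_init ( CURL_GLOBAL_DEFAULT );
   curl = curl_easy_init ();
   if ( !curl )
      return 1;

   curl_easy_setopt ( curl, CURLOPT_URL, "http://finance.yahoo.com/" );
   curl_easy_setopt ( curl, CURLOPT_WRITEFUNCTION, discard );
   curl_easy_setopt ( curl, CURLOPT_VERBOSE, 1L );

   res = curl_easy_perform ( curl );
   if ( res == CURLE_OK )
      {
      curl_easy_getinfo ( curl, CURLINFO_NAMELOOKUP_TIME, &namelookup );
      curl_easy_getinfo ( curl, CURLINFO_CONNECT_TIME, &conn );
      curl_easy_getinfo ( curl, CURLINFO_STARTTRANSFER_TIME, &starttransfer );
      curl_easy_getinfo ( curl, CURLINFO_TOTAL_TIME, &total );
      printf ( "namelookup %.3fs  connect %.3fs  starttransfer %.3fs  "
               "total %.3fs\n", namelookup, conn, starttransfer, total );
      }
   else
      fprintf ( stderr, "transfer failed: %s\n", curl_easy_strerror ( res ) );

   curl_easy_cleanup ( curl );
   curl_global_cleanup ();
   return 0;
}

If the name lookup time accounts for most of the pause, the stall is in name
resolution on that Fedora host rather than in the transfer itself.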

>
>> Also the libcurl version varies in all three environments. However,
>> this morning I updated the Fedora 14 environment to the newest libcurl
>> version, 7.32, and the performance did not improve.
>
> From what version did you upgrade?
4.2.0

> Do you only transfer plain HTTP?
Yes, the application requires only HTTP.

> Have you tried it on more than one specific Fedora host?
No, not so far.

> Can you tell us which of the example codes shows this problem if you run
> it against the yahoo http server? Are you using the example unmodified or
> can you provide us with the code you're trying?
Yes: see the entire source code file, which is attached.

My read function is based on url_fopen.c
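
For anyone without that example handy, the calling pattern it provides looks
roughly like the fragment below; URL_FILE and the url_* helpers are defined in
url_fopen.c, so this assumes they are declared somewhere visible (the header
name here is made up). My actual code section follows after it.

/* Sketch of the url_fopen.c calling pattern; URL_FILE and the url_*         */
/* helpers come from that example file. The header name is hypothetical.     */
#include <stdio.h>
#include "url_fopen_decls.h"   /* hypothetical: declares URL_FILE, url_fopen(),
                                  url_fgets(), url_feof() and url_fclose() */

static void dump_csv ( const char *url )
{
   char buffer [ 256 ];
   URL_FILE *handle;

   handle = url_fopen ( url, "r" );
   if ( !handle )
      {
      fprintf ( stderr, "url_fopen() failed for %s\n", url );
      return;
      }

   while ( !url_feof ( handle ) )
      {
      /* Stop on NULL so a stale buffer is not processed again at EOF. */
      if ( url_fgets ( buffer, sizeof ( buffer ), handle ) == NULL )
         break;
      fputs ( buffer, stdout );
      }

   url_fclose ( handle );
}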

/******************************************************************************/
/*                                                                            */
/*     Read The Data From The Yahoo Web Site:                                 */
/*                                                                            */
/******************************************************************************/

     if ( Print_Head->Print_Headers_Read == TRUE )
        printf ( "Start The Yahoo Web Site Read\n" );

     handle = url_fopen ( URL_String, "r" );

     if ( Print_Head->Print_Headers_Read == TRUE )
        printf ( "Handle = %d\n", handle );

     if( !handle )
        {
        printf( "Couldn't Open URL: url_fopen() %s\n", url );
        printf ( "Handle = %d\n", handle );
        fclose( outf );
        return -1;
        }

/******************************************************************************/
/*                                                                            */
/*     Read The Price / Volume Data:                                          */
/*                                                                            */
/******************************************************************************/

     if ( Type == 'd' || Type == 'w' || Type == 'm' )
        { /* Start Type Price Data Block */

        if ( Print_Head->Print_Headers_Read == TRUE )
           printf ( "Start The Price Volume Read\n" );

        curr_ptr = Stock_Data_Head;
        prev_ptr = Stock_Data_Head;
        prev_ptr->high = 1000000.0;
        First_Record = TRUE;
        Entry_Count = 0;
        Price_High = 0.0;
        Price_Low = 1000000000.0;
        Split_Divisor = 1.0;
        Divisor = 0.0;
        Int_Divisor = 1.0;

        while( !url_feof( handle ))
           { /* Start While */
           url_fgets( buffer, sizeof( buffer ), handle);

           if ( strstr ( buffer, "404 Not Found" ) != NULL )
              {
              printf( "Error 404 Symbol Not Found\n" );
              fclose( outf );
              return ( -1 );
              }

           if ( Print_Head->Print_Data == TRUE )
              printf ( "%s", buffer );

           if ( First_Record == FALSE )
              { /* Start Not First_Record */
              sscanf ( buffer, "%4d", &year );
              sscanf ( buffer+5, "%2d", &month );
              sscanf ( buffer+8, "%2d", &day );
              sscanf ( buffer+11, "%f,%f,%f,%f,%lli,%f", &open_read,
                 &high_read, &low_read, &close_read, &volume_read,
                 &adj_close_read );
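
For reference, each line of the download is a CSV record in the order
Date,Open,High,Low,Close,Volume,Adj Close, with the date written as
YYYY-MM-DD, which is what the fixed offsets 0, 5, 8 and 11 above rely on.
A standalone illustration of just that parsing, using a made-up data line:

/* Standalone illustration of the CSV parsing above; the data line is        */
/* made up.                                                                   */
#include <stdio.h>

int main ( void )
{
   const char *buffer =
      "2013-08-30,184.50,185.22,183.58,184.16,2731800,184.16";
   int year, month, day;
   float open_read, high_read, low_read, close_read, adj_close_read;
   long long volume_read;

   sscanf ( buffer, "%4d", &year );        /* YYYY at offset 0 */
   sscanf ( buffer+5, "%2d", &month );     /* MM at offset 5   */
   sscanf ( buffer+8, "%2d", &day );       /* DD at offset 8   */
   sscanf ( buffer+11, "%f,%f,%f,%f,%lli,%f", &open_read,
      &high_read, &low_read, &close_read, &volume_read, &adj_close_read );

   printf ( "%04d-%02d-%02d  open %.2f  high %.2f  low %.2f  close %.2f"
            "  volume %lld  adj close %.2f\n",
            year, month, day, open_read, high_read, low_read, close_read,
            volume_read, adj_close_read );
   return 0;
}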
