curl-library
Re: problems with downloads after hibernation or sleep on Windows
From: michael haase via curl-library <curl-library_at_cool.haxx.se>
Date: Sun, 10 Mar 2019 12:19:45 +0100
Sorry for being a bit short in my first message and taking some time to respond.
I am happy to provide an overview of the service and the web services it feeds on:
The service traverses a list of FTP or HTTP web services at predefined time intervals (e.g. every 5 minutes).
For each entry in the list it checks whether it is time to retrieve data again from the referenced FTP or HTTP
site; see the sketch below.
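For illustration only, a minimal sketch of such a polling pass (the Entry type and helper names are placeholders, not the actual service code):
<<
#include <stdbool.h>
#include <time.h>

/* illustrative entry type - not the actual service structures */
typedef struct {
    const char *url;      /* FTP or HTTP site to poll     */
    time_t interval;      /* seconds between retrievals   */
    time_t last_fetch;    /* when data was last retrieved */
} Entry;

/* one pass over the list: fetch every entry whose interval has elapsed */
static void PollEntries(Entry *entries, int n, bool (*fetch)(const char *url))
{
    time_t now = time(NULL);
    for (int i = 0; i < n; i++)
        if (now - entries[i].last_fetch >= entries[i].interval)
            if (fetch(entries[i].url))
                entries[i].last_fetch = now;
}
>>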
(i) Most of the accessed FTP sites (weather forecasts) require a login (user and password). Also, most of the
sites are set up in such a way that the latest file is the only file in a separate directory which the service
accesses. That file is replaced by the latest version of the weather forecast at least once a day, each time with
a different file name, so the next day the previous day's file is gone. A sketch of such a download follows below.
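For illustration, a minimal libcurl sketch of fetching one file from such a password-protected directory; the URL, credentials and helper name are placeholders (since the real file name changes daily, the directory listing - an FTP URL ending in '/' - would have to be fetched first to discover it):
<<
#include <stdio.h>
#include <curl/curl.h>

/* fetch one file from a password-protected FTP directory into dest;
   url and userpwd are placeholders for illustration only            */
static CURLcode FetchFtpFile(const char *url,     /* e.g. "ftp://host/latest/forecast.dat" */
                             const char *userpwd, /* "user:password"                       */
                             const char *dest)
{
    CURLcode ret = CURLE_FAILED_INIT;
    CURL *curl = curl_easy_init();
    FILE *fp = fopen(dest, "wb");
    if (curl && fp) {
        curl_easy_setopt(curl, CURLOPT_URL, url);
        curl_easy_setopt(curl, CURLOPT_USERPWD, userpwd);
        /* with no CURLOPT_WRITEFUNCTION set, libcurl fwrite()s to WRITEDATA */
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, (void *) fp);
        ret = curl_easy_perform(curl);
    }
    if (fp)
        fclose(fp);
    if (curl)
        curl_easy_cleanup(curl);
    return ret;
}
>>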
(ii) The HTTP sites also require a login (user, password). If the login is successful, the HTTP site issues a
“session ID” which is sent back by the service when requesting data to download. This “session ID” expires after
a certain time (in most cases 1 hour, in one case 4 hours) if it is not used for downloads. Most HTTP sites give
no feedback at all when data is requested with an expired “session ID”; one way to guard against that is sketched
below.
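For illustration, a defensive pattern (all names here are hypothetical, not the service's actual code): record when the session ID was last used and log in again well before it can expire:
<<
#include <stddef.h>
#include <time.h>

/* illustrative session bookkeeping - all names are hypothetical */
static char   g_sid[64];       /* current session ID         */
static time_t g_sid_last_use;  /* last time the SID was used */

/* return a usable SID, logging in again well before expiry
   (here at 80% of the lifetime, since expired SIDs give no feedback) */
static const char *GetFreshSID(time_t lifetime_secs,
                               int (*login)(char *sid, size_t len))
{
    time_t now = time(NULL);
    if (g_sid[0] == '\0' || now - g_sid_last_use > lifetime_secs * 8 / 10) {
        if (login(g_sid, sizeof(g_sid)) != 0)
            return NULL;   /* login failed */
    }
    g_sid_last_use = now;
    return g_sid;
}
>>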
The service documents its activities in daily logfiles. I have noticed that although it seems to work fine on one
day, it may hang on the next day - it does not crash - as there are simply no further entries in the logfile.
However, I can still stop the service from within the Services window. My assumption is that it freezes somewhere
in the libcurl environment. The service may work fine for 3 or 4 days and then hang on the 4th or 5th day, which
makes it hard for me to trace. One way to turn such a silent hang into a reportable error is sketched below.
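For illustration, a sketch of defensive timeouts (these are standard libcurl options; the helper name and the values are only examples):
<<
#include <curl/curl.h>

/* defensive timeouts (example values): make curl_easy_perform() return
   CURLE_OPERATION_TIMEDOUT instead of blocking forever, e.g. on a
   connection that silently died during hibernation                     */
static void SetCurlTimeouts(CURL *curl)
{
    curl_easy_setopt(curl, CURLOPT_CONNECTTIMEOUT, 30L);  /* max 30 s to connect      */
    curl_easy_setopt(curl, CURLOPT_LOW_SPEED_LIMIT, 1L);  /* abort when slower than   */
    curl_easy_setopt(curl, CURLOPT_LOW_SPEED_TIME, 120L); /* 1 byte/s for 120 seconds */
}
>>
Note that because the code below calls curl_easy_reset() for every entry, such options would have to be re-applied after each reset (e.g. inside SetCurlCommons).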
I would very much appreciate any assistance on this issue.
Thank you, Michael
A sample piece of code for accessing the HTTP sites is listed hereafter - the results are then inserted into a database:
<<
int last_year, last_month, last_day, last_hour, last_minute, last_second;
int no_tup = 0;
char options[BUF_SIZE];
char dest[BUF_SIZE];
char aggr_dt[VERY_TINY_BUF_SIZE];
char date[DATE_SIZE];
char fname[BUF_SIZE];
FILE *fp = NULL;
bool ok = true;
bool ok2 = true;
CURLcode ret;
CURL *curl = NULL;
Db *db = NULL;
struct data config;

if (GetDetail() >= REP_DEBUG)
    config.trace_ascii = 1;

curl = GetCURL();
db = GetDb();
if (curl && db)
{
    for (int i = 0; (i < GetNumDbEntries()) && ServiceIsRunning(); i++)
    {
        if (bas[no].Base_ID == db[i].Base_ID)
        {
            ok2 = true;
            /* reset clears all previously set options; everything below
               is therefore set anew for each entry                      */
            curl_easy_reset(curl);
            strcpy(date, db[i].last_import);
            ConvertForm1ToForm2(date, 1);
            /*
            set options for http service
            ----------------------------
            */
            strcpy(options, "sid=");
            strcat(options, GetServiceSID());
            strcat(options, "&submit=text");
            strcat(options, "&workspace=");
            strcat(options, db[i].workspace);
            strcat(options, "&station=");
            strcat(options, db[i].station);
            strcat(options, "&sensor=");
            strcat(options, db[i].sensor);
            strcat(options, "&start=");
            strcat(options, date);
            strcat(options, "&end=");
            strcat(options, GetPresentTime(date, 2));
            if (db[i].aggr_dt > 0)
            {
                strcat(options, "&aggr_dt=");
                sprintf(aggr_dt, "%d", db[i].aggr_dt);
                strcat(options, aggr_dt);
            }
            /*
            curl options
            ------------
            */
            curl_easy_setopt(curl, CURLOPT_POSTFIELDS, options);
            curl_easy_setopt(curl, CURLOPT_URL, bas[no].IP_address);
            SetCurlCommons(curl, bas[no].scheme, NULL);
            if (GetDetail() >= REP_DEBUG)
            {
                CountCurlEasyPerform("DoGetHTTPData", bas[no].IP_address);
                fprintLog("DoGetHTTPData: %s\n", options);
            }
            strcpy(dest, GetPath());
            strcat(dest, GetTmp());
            strcat(dest, "data.txt");
            /*
            download data with curl
            -----------------------
            */
            SetCurlDownloadActive(true);
            fp = fopen(dest, "wb");
            if (fp)
            {
                curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_data);
                curl_easy_setopt(curl, CURLOPT_WRITEDATA, (void *) fp);
                if (GetDetail() >= REP_DEBUG)
                {
                    curl_easy_setopt(curl, CURLOPT_DEBUGFUNCTION, my_trace);
                    curl_easy_setopt(curl, CURLOPT_DEBUGDATA, &config);
                    curl_easy_setopt(curl, CURLOPT_VERBOSE, 1L);
                }
                ret = curl_easy_perform(curl);
                if (GetDetail() >= REP_DEBUG)
                    PrintLogTime("DoGetHTTPData finished: ");
                fclose(fp);
                if (ret != CURLE_OK)
                {
                    PrintLogTime2("could not retrieve data from: ", bas[no].IP_address);
                    fprintLog("(sensor: <%s>, station: <%s>)\n", db[i].sensor, db[i].station);
                    ok = PrintCurlError(ret);
                    ok2 = false;
                }
                if (GetDetail() >= REP_DEBUG)
                    PrintLogTime2("end data retrieving from: ", bas[no].IP_address);
                if (ok2)
                {
                    /*
                    upload retrieved data into database
                    -----------------------------------
                    */
                    DoUploadToMCH(dest, bas[no].source, &db[i], &no_tup);
                    if (!IsService())
                        progadd();
                    strcpycat(fname, GetPath(), GetMap(), "", "", "", "");
                    if (!WriteFile(fname, GetMMName(), GetCommentsMMName(), 2))
                        fprintLog("could not write 'map mch' file: %s\\%s\n", fname, GetMMName());
                }
            }
            else
            {
                fprintLog("DoGetHTTPData: could not open file: %s\n", dest);
                ok = false;
            }
            SetCurlDownloadActive(false);
        }
    }
    fprintLog("in total %d tuples inserted from %s\n", no_tup, bas[no].IP_address);
}
>>
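The write_data and my_trace callbacks referenced above are not shown. For completeness, a minimal write_data as in the libcurl examples could look like this (one possible definition, not necessarily the one used by the service):
<<
#include <stdio.h>

/* minimal write callback: hand the received bytes straight to the
   FILE * that was installed via CURLOPT_WRITEDATA                  */
static size_t write_data(void *ptr, size_t size, size_t nmemb, void *stream)
{
    return fwrite(ptr, size, nmemb, (FILE *) stream);
}
>>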
Sent: Thursday, 07 February 2019 at 23:29
From: "Daniel Stenberg" <daniel@haxx.se>
To: "michael haase via curl-library" <curl-library@cool.haxx.se>
Cc: "michael haase" <hrmichael.haase@web.de>
Subject: Re: problems with downloads after hibernation or sleep on Windows
On Tue, 5 Feb 2019, michael haase via curl-library wrote:
> This service runs fine but I noticed that in some cases in the aftermath of
> an "hibernation" or "sleep" command the service is freezed.
It would be most helpful if you can elaborate and debug a little around
*exactly* what "the service is freezed" means. What fails and how?
--
/ daniel.haxx.se
-------------------------------------------------------------------
Unsubscribe: https://cool.haxx.se/list/listinfo/curl-library
Etiquette: https://curl.haxx.se/mail/etiquette.html
Received on 2019-03-10