I apologize for the lack of information provided.
Anyway, I already found the problem. It turned out someone didn't close the
file descriptor properly after fetching.
Thanks!
-Irvs
On Fri, Nov 20, 2009 at 4:34 PM, Guenter <lists_at_gknw.net> wrote:
> Hi,
> Irvs schrieb:
>
>> I need help...
>>
>> In my testing I successfully retrieved the http://www.google.com page
>> and saved it to disk. Everything was fine until I tried to parse the
>> file and noticed that it seemed incomplete. I displayed the file size at
>> runtime and noticed that it differed from the actual size on disk
>> reported by "ls -la": the size printed at runtime is smaller than the
>> size ls reports.
>> Have you guys encountered this problem? I am quite sure I successfully
>> retrieved the www.google.com page, since I print the data before writing
>> it, just to confirm that everything is written to the file. And the
>> output file, when read, is complete.
>>
>> What puzzles me is why the size reported at runtime is smaller than the
>> actual file on disk.
>>
>>
> Besides the URL, you didn't give us any information about what you do and
> how; that is in any case insufficient if you expect any help.
> You need to tell us:
> - the platform you use
> - the libcurl version you use
> - the programming language / binding you use
> - a small code example which shows the problem
> Also, I recommend that you test / compare against a static resource rather
> than the Google front page, which is created dynamically and whose size
> might change with every request.
>
> Günter.
>
>
> -------------------------------------------------------------------
> List admin: http://cool.haxx.se/list/listinfo/curl-library
>
> Etiquette: http://curl.haxx.se/mail/etiquette.html
>
Received on 2009-11-23