[Mldonkey-bugs] [bug #13712] ed2k_hash never completes for large files
From: me
Subject: [Mldonkey-bugs] [bug #13712] ed2k_hash never completes for large files
Date: Sun, 10 Jul 2005 13:15:22 +0000
User-agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.8) Gecko/20050511
URL:
<http://savannah.nongnu.org/bugs/?func=detailitem&item_id=13712>
Summary: ed2k_hash never completes for large files
Project: mldonkey, a multi-networks file-sharing client
Submitted by: skaus
Submitted on: Sun 07/10/2005 at 13:15
Category: None
Severity: 3 - Normal
Item Group: Program malfunction
Status: None
Assigned to: None
Open/Closed: Open
Release: 2.5.30.17
Platform Version: Linux
Binaries Origin: CVS / Self compiled
CPU type: Intel x86
_______________________________________________________
Details:
Hello,
I'm running
ed2k_hash "file" >file.hash 2>&1
for all my files. For several files this command never completes; the affected
files are usually large, 1 GB or more. I noticed this behaviour before with
older releases, but only now had time to investigate it further.
ed2k_hash is running and outputs:
Current locale of the target machine is ANSI_X3.4-1968
Current language of the target machine is EN
ed2k://|<<snip>>|/
sig2dat://|<<snip>>|/
Hash: <<snip>>
then the program stalls and the urn_bitprint line is missing. It has now been
running for 2h30m real time and 79m CPU time, and it consumes all remaining CPU
(idle percentage is approx. 0). I have been running "strace -olog -p<<pid>>"
for 20 minutes now, and the "log" file is still empty. The file in question is
~1.1 GB.
BTW: Why is the "Current ..." output printed to stdout, but the hashes
themselves to stderr?
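For context, a sketch of the chunking scheme ed2k hashing is based on: the file is split into fixed-size chunks, each chunk is hashed, and for multi-chunk files the concatenated chunk digests are hashed again. The real algorithm uses MD4; md5 is used below only as a stdlib stand-in, so the digests will not match real ed2k hashes. This is purely illustrative and not mldonkey's actual implementation.

```python
import hashlib

CHUNK = 9728000  # ed2k chunk size in bytes


def ed2k_style_hash(data: bytes, chunk_size: int = CHUNK, algo=hashlib.md5) -> str:
    """Sketch of the ed2k chunking scheme (real ed2k uses MD4, not md5)."""
    # Split the input into fixed-size chunks; an empty file is one empty chunk.
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)] or [b""]
    digests = [algo(c).digest() for c in chunks]
    if len(digests) == 1:
        # Single-chunk files: the chunk digest is the file hash.
        return digests[0].hex()
    # Multi-chunk files: hash the concatenation of the chunk digests.
    return algo(b"".join(digests)).hexdigest()
```

A stall like the one reported would mean the tool never finishes this second-level pass (or never finishes reading the chunks) for files past a certain size.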
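One way to confirm which stream each line goes to is to redirect stdout and stderr to separate files. A minimal sketch, using printf/echo as a stand-in for ed2k_hash (the stream split claimed in the comments follows the behaviour described above, not verified against the tool itself):

```shell
# Stand-in for: ed2k_hash "file" >out.txt 2>err.txt
# Here the "locale" line goes to stdout and the hash line to stderr,
# mirroring the behaviour described in this report.
{ echo "Current locale of the target machine is ANSI_X3.4-1968"
  echo "ed2k://|<<snip>>|/" >&2
} >out.txt 2>err.txt
```

With the real binary, `>file.hash 2>&1` as used above merges both streams into one file, which is why the mixed output appears there.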
_______________________________________________________
Reply to this item at:
<http://savannah.nongnu.org/bugs/?func=detailitem&item_id=13712>
_______________________________________________
Message sent via/by Savannah
http://savannah.nongnu.org/