bug#9321: repeated segfaults sorting large files in 8.12

From: Andras Salamon
Subject: bug#9321: repeated segfaults sorting large files in 8.12
Date: Thu, 18 Aug 2011 15:30:05 +0100
User-agent: Mutt/1.5.21 (2010-09-15)

I am seeing repeated (but not reliably repeatable) segmentation faults
sorting datasets in the 100MB-100GB range on a 64-bit Debian system
using GNU sort 8.12 (and also 8.9).  Stack traces seem to indicate
problems during the merge phase, usually when the temporary files
are being combined.

This may or may not be related to the recent discussion of #9307, but
I am definitely using 8.12.  I rebuilt with CFLAGS=-g, since several
indicative values were otherwise optimised out, and configured with
--disable-nls --disable-threads.  I am running with a fixed buffer
(-S 100M) and with --parallel=1 to isolate the problem from possible
threading issues.  I was also seeing these crashes with a vanilla
build.
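For reference, the kind of invocation that crashes looks roughly like
the sketch below.  The input file here is a stand-in generated from
shuffled integers, not my real dataset; only the -S and --parallel
flags match my actual runs.

```shell
# Stand-in input: shuffled integers (my real data is 100MB-100GB).
seq 1 100000 | shuf > testdata.txt

# Same options as my crashing runs: fixed 100M buffer, one thread.
LC_ALL=C sort -S 100M --parallel=1 testdata.txt > sorted.txt

# Verify the output really is sorted under the same collation.
LC_ALL=C sort -c sorted.txt && echo "output is sorted"
```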

At least one crash occurred when comparing the very last entry in
the memory buffer to a non-existent entry, when merging large files.

There was also a crash with total_lines=851122 in mergelines_node,
where node->hi contained what appeared to be garbage.
The repository changelog seems to indicate that the current development
release of sort has not changed since 8.12.  Will attempting to track
the problem down with 8.12 be useful?  If so I can post stack traces
and values of relevant variables from the core dump, or post a new
issue in the tracker, or reopen #9307.  If not, please suggest some
specific actions I should take to generate useful information.
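In case it helps, this is roughly how I would pull the stack trace and
variable values out of a core dump; the binary and core file paths are
placeholders for wherever the rebuilt sort and its dump actually live.
This is a command fragment that needs a real core file, so I have not
shown its output.

```shell
# Allow core dumps before re-running the crashing sort.
ulimit -c unlimited

# Placeholder paths: substitute the rebuilt sort binary and the
# core file produced by the crash.
gdb ./sort ./core -batch \
    -ex 'bt full' \
    -ex 'frame 0' \
    -ex 'info locals'
```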


-- Andras Salamon                   address@hidden
