Hi!
32-bit x86 GNU/Hurd recently switched from a 2 GiB/2 GiB user/kernel
address-space split to a 3 GiB/1 GiB one. Since then, I see test cases like
libstdc++-v3/testsuite/21_strings/basic_string/modifiers/insert/char/1.cc
allocate huge amounts of memory. Looking into it, the allocation comes from:
[...]
  csize_type csz01, csz02;
  const std::string str01("rodeo beach, marin");
[...]
  csz01 = str01.max_size();
  try {
    std::string str04(csz01, 'b');
    str03 = str04;
    csz02 = str02.size();
    try {
      str03.insert(0, str02, 0, 5);
      VERIFY( false );
    }
    catch(std::length_error& fail) {
      VERIFY( true );
    }
    catch(...) {
      VERIFY( false );
    }
  }
  catch(std::bad_alloc& failure){
    VERIFY( true );
  }
  catch(std::exception& failure){
    VERIFY( false );
  }
[...]
csz01, the result of calling std::string's max_size, evaluates to
2147483647 (0x7fffffff, nearly 2 GiB); the value is the same on x86
GNU/Linux. A new std::string object, str04, is then constructed with that
huge size and initialized to "bbb[...]". Given the 3 GiB of user address
space now available, the kernel serves this request, albeit with a lot of
swapping. (The next huge string, created as a copy, then exhausts the
3 GiB user address space.)
I also have to note that GNU/Hurd doesn't have a working RLIMIT
implementation (it is difficult to implement), so the call to
__gnu_test::set_memory_limits is effectively a no-op. On systems where the
limit is enforced (I tested x86 GNU/Linux), such a huge allocation results
in a std::bad_alloc exception, which is expected and caught.
I have not yet looked in detail at other test cases that use max_size
similarly.
Assuming such usage of std::string's max_size is considered fine, I
suppose I should mark any such tests as skipped on GNU/Hurd while there is
no functional RLIMIT implementation?