Re: [Chicken-hackers] SRFI-69 compatibility problem and hash table segfaults
Jörg F. Wittenberger
Re: [Chicken-hackers] SRFI-69 compatibility problem and hash table segfaults
08 Jan 2012 16:13:49 +0100
On Jan 7 2012, Peter Bex wrote:
The thought just popped into my head that my hash table patch breaks
SRFI-69 compatibility. When the user passes a custom hash procedure, [...]
That would be bad indeed, in the end.
For fixing the extra argument problem, here are some options I see:
1) Pass the randomization value via a SRFI-39 parameter.
Of course, all approaches have disadvantages. Option 1 seems the least
invasive and least error-prone, but it's just ugly.
IMHO too ugly.
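For context, option 1 might look roughly like this sketch; all names here (`current-hash-randomization`, `base-hash`, `param-hash`) are hypothetical, not CHICKEN's actual API:

```scheme
;; Option 1 sketch: the randomization value travels in a SRFI-39
;; parameter instead of an extra argument, so user-supplied hash
;; procedures keep the plain SRFI-69 signature (obj #!optional bound).
(define current-hash-randomization (make-parameter 0))

;; toy base hash, standing in for the real default string hash
(define (base-hash s)
  (let loop ((i 0) (h 0))
    (if (= i (string-length s))
        h
        (loop (+ i 1) (+ (* 31 h) (char->integer (string-ref s i)))))))

(define (param-hash s #!optional (bound 536870912))
  (modulo (+ (base-hash s) (current-hash-randomization)) bound))
```

The table machinery would then wrap each lookup in `(parameterize ((current-hash-randomization <this table's value>)) ...)` around the call to the hash procedure.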
2) Get rid of randomization-per-hashtable and have one global
randomization value instead, like low-level hash tables.
Symbol tables could still have per-table randomization.
Users must take care of randomization themselves.
Haven't had time to read up on this. A question: is randomization
the only option to fix the issue? To my understanding (read:
guesswork), additional randomization would fix any vulnerable
hash function. But aren't there any good hash functions to begin with?
If we could have non-vulnerable hash functions both as the defaults
and as the exported hash-* functions, then I'd feel much more
comfortable with the next topic.
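As I understand the general argument (my reading, not from this thread): any fixed, publicly known hash function lets an attacker precompute colliding keys offline, so what makes a hash function "non-vulnerable" is a secret input, i.e. some form of randomization. A toy illustration (a deliberately simple mixer, nothing like a real keyed hash such as SipHash):

```scheme
;; Toy keyed hash: the secret key chosen at startup determines the
;; collision set, so an attacker cannot precompute collisions.
;; (Illustration only -- not a cryptographically sound construction.)
(define (keyed-hash key s #!optional (bound 536870912))
  (let loop ((i 0) (h key))
    (if (= i (string-length s))
        (modulo h bound)
        (loop (+ i 1) (+ (* 31 h) (char->integer (string-ref s i)))))))
```

Two processes with different keys hash the same string differently, which is exactly the property the randomization patch is after.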
3) Assume the user knows what they're doing when passing a custom
hash function and let them handle the randomization themselves.
This will mean when re-using existing core hash procedures
the user must remember to pass their own value to that
procedure. If the user wants per-hash-table randomization
they must take care of this themselves by passing a different
closure for each table.
Given fixed defaults, I'd vote for (3):
3 means that in some cases the
user might want to use the CHICKEN-supplied ones but needs to
remember to use the randomization factor. 3 is of course the easiest
since it basically just punts on the issue and lets the user sort it
out (but only when passing a custom procedure!).
I'd assume that whoever passes a custom procedure either re-uses
the default hash-* functions on some value selector or really, really
knows what they are doing. For the latter, a footnote in the manual
to remind them of the issue would do even better than a well-meant
solution which doesn't fit their needs perfectly.
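Option 3 in practice might look like the following sketch: the user bakes their own randomization value into a closure, one fresh closure per table if per-table randomization is wanted. `make-randomized-string-hash` is a hypothetical helper, not part of CHICKEN; a real program would draw the seed from a random source rather than hard-coding it.

```scheme
;; Option 3 sketch: the custom hash procedure closes over its own
;; randomization seed, so no extra argument and no table reference
;; are needed.  A fresh closure per table gives per-table randomization.
(define (make-randomized-string-hash seed)
  (lambda (s #!optional (bound 536870912))
    (let loop ((i 0) (h seed))
      (if (= i (string-length s))
          (modulo h bound)
          (loop (+ i 1) (+ (* 31 h) (char->integer (string-ref s i))))))))

;; e.g. with SRFI-69:
;;   (make-hash-table string=? (make-randomized-string-hash 7919))
```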
[...] make use of the hash table's randomization value, this becomes
difficult, because then when you use hash-table-copy the first hash
table stays around in the closure and never gets GCed.
Such a memory leak will soon show up and spoil the fun.
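To make the leak concrete, here is a sketch with plain vectors standing in for hash tables (`make-table` and `copy-table` are illustrative stand-ins, not SRFI-69 procedures): if the hash procedure closes over the table itself to reach its randomization value, any copy that reuses the closure keeps the original table reachable forever.

```scheme
;; Sketch of the leak: a hash procedure that closes over its table
;; keeps that table live.  A copy reuses the same closure, so the
;; ORIGINAL table can never be GCed while the copy exists.
(define (make-table seed)
  (let* ((tbl (vector seed '()))          ; slot 0: seed, slot 1: hash proc
         (hashproc (lambda (s bound)
                     ;; reaches back into tbl for the randomization value
                     (modulo (+ (vector-ref tbl 0) (string-length s))
                             bound))))
    (vector-set! tbl 1 hashproc)
    tbl))

(define (copy-table tbl)
  ;; naive copy shares the closure, which still references the
  ;; original tbl -- the copy pins the original in memory
  (vector (vector-ref tbl 0) (vector-ref tbl 1)))
```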
Sorry for the late reply.