Re: [Texmacs-dev] Segmentation fault with gcc 3.2


From: Igor V. Kovalenko
Subject: Re: [Texmacs-dev] Segmentation fault with gcc 3.2
Date: Mon, 04 Nov 2002 01:55:57 +0300
User-agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.0.1) Gecko/20020809

Igor V. Kovalenko wrote:
Joris van der Hoeven wrote:

Does anybody know of a good way to debug memory corruption?


Valgrind may help to some extent, though it gives many positives in the libguile
garbage collector...



And besides the "positive" inside TeXmacs which was already
reported before, are there any others?


By "positive" I refer to a valgrind's suspictious memory access.
The one in TeXmacs that was reported before is now eliminated (thanks!).

There are about 60 reported contexts that end up in libguile (file gc.c).
If you want the log, I'll try to collect it; I don't know whether it will help.

BTW, the one I reported before should be reproducible at any site; would you
mind trying it yourself? The valgrind tool is at http://developer.kde.org/~sewardj/


To comment on the more than 60 contexts of accesses to uninitialized data:

It is well known :) that the guile internal function scm_mark_locations()
actively walks the stack space from the location of the first call to
scm_boot_guile() to the current stack pointer, treating this area as one
int[] (well, SCM[]) array.

So if some uninitialized variable is placed on the stack between the mentioned
calls, then scm_mark_locations() also examines its value. And the first such
variable is (surprise!)
  SCM_STACKITEM dummy;
right at the very beginning of scm_boot_guile() itself, whose address is taken
to be the start address of the guile stack.
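
Roughly, the mechanism looks like the sketch below. This is only a simplified
illustration of the conservative stack scan described above, assuming a
downward-growing stack; it is not the actual libguile code, and stack_base,
mark_if_heap_pointer() and the *_sketch names are made up for the example.

  typedef long SCM_STACKITEM;

  static SCM_STACKITEM *stack_base;   /* recorded at boot, used by the marker */

  static void mark_if_heap_pointer (SCM_STACKITEM word)
  {
    (void) word;  /* the real GC would check heap bounds and mark the cell */
  }

  static void boot_guile_sketch (void)
  {
    SCM_STACKITEM dummy;   /* never written: only its *address* matters,   */
    stack_base = &dummy;   /* yet the marker below still reads its value,  */
                           /* which is what valgrind complains about       */
    /* ... run the interpreter, eventually triggering a GC ... */
  }

  static void mark_locations_sketch (SCM_STACKITEM *sp)  /* sp = current stack top */
  {
    SCM_STACKITEM *p;
    /* Treat everything between the current stack pointer and the recorded
       base as one SCM[] array; every word, initialized or not, is examined
       as a potential heap pointer. */
    for (p = sp; p < stack_base; p++)
      mark_if_heap_pointer (*p);
  }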

Also, GCC may choose to align some variables it places on the stack without
writing the padding, so there may be even more "false positives".
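
For instance, a made-up fragment like the following would leave such unwritten
padding on the stack; the function name is invented for the illustration.

  static void padding_example (void)
  {
    char   tag   = 'x';   /* one initialized byte                         */
    double value = 0.0;   /* typically aligned to 8 bytes on the stack    */
    /* The padding the compiler inserts between 'tag' and 'value' to keep
       'value' aligned is never written by any code, but the conservative
       walk above still reads those bytes as part of a word, so valgrind
       flags yet another access to uninitialized data. */
    (void) tag;
    (void) value;
  }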

To conclude, I personally count most of the "positives" as errors. I simply
dislike leaving variables uninitialized; initializing them prevents many
subtle errors IMHO.

--
Regards,
Igor V. Kovalenko    mailto: iko at crec dot mipt dot ru




