[Pan-users] Pan crasher in Task window


From: Douglas Bollinger
Subject: [Pan-users] Pan crasher in Task window
Date: Mon, 28 Aug 2006 19:43:27 -0400

Could someone else try to replicate this bug?  I'd like confirmation
before I post it in Bugzilla.

I'm using pan-0.110.

Start Pan, go to a binary group, and select several complete binary posts to
queue up some download tasks.  Select all the tasks in the queue and then
press the delete button.  Pan should crash; it does every time for me.

I think I got gdb working correctly.  This is with debugging enabled for
glib, gtk+, gnome-vfs, and of course Pan.  Here's the backtrace I get:

(gdb) run
Starting program: /usr/bin/pan
[Thread debugging using libthread_db enabled]
[New Thread -1219144016 (LWP 9172)]

Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread -1219144016 (LWP 9172)]
0xb77c59bc in malloc_usable_size () from /lib/tls/libc.so.6
(gdb) thread apply all bt

Thread 1 (Thread -1219144016 (LWP 9172)):
#0  0xb77c59bc in malloc_usable_size () from /lib/tls/libc.so.6
#1  0xb77c652e in free () from /lib/tls/libc.so.6
#2  0xb77c7d5f in malloc () from /lib/tls/libc.so.6
#3  0xb7118658 in __libc_res_nquery () from /lib/libresolv.so.2
#4  0xb71187f8 in __res_nquery () from /lib/libresolv.so.2
#5  0xb7118bd8 in __libc_res_nsearch () from /lib/libresolv.so.2
#6  0xb71262be in _nss_dns_gethostbyname3_r () from /lib/libnss_dns.so.2
#7  0xb71265cb in _nss_dns_gethostbyname2_r () from /lib/libnss_dns.so.2
#8  0xb7806ddb in sched_setaffinity () from /lib/tls/libc.so.6
#9  0xb7807e87 in getaddrinfo () from /lib/tls/libc.so.6
#10 0x08184213 in (anonymous namespace)::create_channel (address@hidden,
    port=-1215850432) at socket-impl-gio.cc:181
#11 0x08184662 in pan::GIOChannelSocket::open (this=0x9f3de40,
    address@hidden, port=-1215850432, l=0xb7879840)
    at socket-impl-gio.cc:292
#12 0x080aab80 in pan::GIOChannelSocket::Creator::create_socket (
    this=0xbff5a3e0, address@hidden, port=1111490560) at socket-impl-gio.h:66
#13 0x08186ce8 in pan::NNTP_Pool::request_nntp (this=0x9f4d048)
    at basic_string.h:538
#14 0x0817b0f4 in pan::Queue::process_task (this=0xbff5a2e0, task=0x9f3dcd8)
    at stl_iterator.h:614
#15 0x0817b719 in pan::Queue::check_in (this=0xbff5a2e0, nntp=0x9f55948,
    is_ok=true) at queue.cc:542
#16 0x081600a9 in pan::Task::check_in (this=0x9f3dcd8, nntp=0x9f55948,
    is_ok=true) at task.cc:59
#17 0x081629a4 in pan::TaskArticle::on_nntp_done (this=0x9f3dcd8,
    nntp=0x9f55948, health=pan::OK) at task-article.cc:267
#18 0x08171ab9 in pan::NNTP::fire_done_func (this=0x9f55948,
    health=-1215850432) at nntp.cc:107
#19 0x081734f3 in pan::NNTP::onSocketResponse (this=0x9f55948, sock=0x9f552f0,
    address@hidden) at nntp.cc:260
#20 0x08185482 in pan::GIOChannelSocket::do_read (this=0x9f552f0)
    at string-view.h:149
#21 0x081856c1 in pan::GIOChannelSocket::gio_func (channel=0x42400000,
    cond=G_IO_IN, sock_gp=0x9f552f0) at socket-impl-gio.cc:438
#22 0xb7a7cb9d in g_io_unix_dispatch (source=0x9e9ab70,
    callback=0x8185640 <pan::GIOChannelSocket::gio_func(_GIOChannel*,
    GIOCondition, void*)>, user_data=0xb7879840) at giounix.c:162
#23 0xb7a55c8c in IA__g_main_context_dispatch (context=0x8272ad0)
    at gmain.c:1916
#24 0xb7a575d7 in g_main_context_iterate (context=0x8272ad0, block=1,
    dispatch=1, self=0x8275c18) at gmain.c:2547
#25 0xb7a578fa in IA__g_main_loop_run (loop=0x8eee258) at gmain.c:2751
#26 0xb7d5a2e3 in IA__gtk_main () at gtkmain.c:1003
#27 0x080a9650 in (anonymous namespace)::run_pan_in_window (address@hidden,
    address@hidden, address@hidden, address@hidden,
    address@hidden, window=0x88d9028) at pan.cc:131
#28 0x080aa706 in main (argc=1, argv=0xbff5a7c4) at pan.cc:271
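
My best guess (and it's only a guess) at what's going on: the segfault is
inside malloc/free, it only happens after deleting queued tasks, and frames
#14, #16 and #17 all reference the same task pointer (0x9f3dcd8) that the
delete button had just removed.  That pattern usually means a use-after-free
corrupting the heap, with the crash surfacing later in an unrelated
allocation (here, inside DNS resolution).  A minimal sketch of that kind of
bug, using made-up names rather than Pan's real classes:

// Illustrative only -- hypothetical Task/Queue, not Pan's actual code.
#include <iostream>
#include <vector>

struct Task {
    virtual ~Task() {}
    virtual void on_done() { std::cout << "task finished\n"; }
};

struct Queue {
    std::vector<Task*> tasks;

    // What the delete button might do: free the Task objects outright,
    // even if one is still talking to the server.
    void remove_all() {
        for (size_t i = 0; i < tasks.size(); ++i)
            delete tasks[i];
        tasks.clear();
    }
};

int main() {
    Queue q;
    Task* in_flight = new Task;   // a download currently in progress
    q.tasks.push_back(in_flight);

    q.remove_all();               // user selects all tasks, hits delete

    // Later the socket callback "checks the task back in" -- but the
    // object is already freed.  Undefined behavior: this typically
    // scribbles on freed heap memory, so some *later* malloc/free blows
    // up, much like the trace above.
    in_flight->on_done();
    return 0;
}

If that's right, the fix would presumably be for the queue to cancel or
detach in-flight tasks before freeing them, rather than deleting them out
from under the socket callback.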

-- 
The secret of happiness is total disregard of everybody.



