

From: Kyeong Su Shin
Subject: Re: [Discuss-gnuradio] Could This Be A Speed Problem? How Can I Make It Faster
Date: Fri, 17 May 2019 06:33:54 +0000

Hello Pete,

This is not really an answer to your list of questions, but if you want to continuously process data generated by GNU Radio in Python, you can simply write a GNU Radio Python block (using a 'Python Block' in GNU Radio Companion, or by creating a custom module). That way, you do not need a File Sink, TCP, ZMQ, or any other mechanism to transfer the data from GNU Radio to your own Python code. There are a few examples on the Internet ( https://www.google.com/search?client=firefox-b-d&q=embedded+python+block+gnuradio ).
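
A minimal sketch of such an embedded Python block (the block name and the bit-slicing operation are made up for illustration; the gnuradio import is guarded only so the numpy core also runs standalone):

```python
import numpy as np

def slice_bits(samples, threshold=0.0):
    """Hypothetical per-sample processing: hard-slice floats into 0/1 bits."""
    return (np.asarray(samples, dtype=np.float32) > threshold).astype(np.float32)

try:
    from gnuradio import gr

    class BitSlicer(gr.sync_block):
        """Embedded Python block: processes samples inside the flow graph,
        so no File Sink / FIFO / socket hop is needed."""

        def __init__(self, threshold=0.0):
            gr.sync_block.__init__(
                self,
                name='Bit Slicer',
                in_sig=[np.float32],
                out_sig=[np.float32],
            )
            self.threshold = threshold

        def work(self, input_items, output_items):
            out = output_items[0]
            out[:] = slice_bits(input_items[0][:len(out)], self.threshold)
            return len(out)
except ImportError:
    pass  # GNU Radio not installed; the pure-numpy core above still works
```

In GRC, this is the kind of class that goes in the 'Python Block' code editor; the same class can also be shipped in a custom module.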

Also, if the rate of the flow graph is limited by the incoming file stream (I am not 100% certain, but I think that could be the case), I recommend trying to drop the Throttle block from your flow graph. A Throttle is only needed when the rate of the flow graph is not limited by any other block.

In my experience, Python IS a bit slow for real-time data processing applications, IF the processing is not handled by external libraries. Writing the program in C/C++ does help if you cannot get the job done using existing libraries.


Kyeong Su Shin

From: Discuss-gnuradio <discuss-gnuradio-bounces+address@hidden> on behalf of P C <address@hidden>
Sent: Friday, May 17, 2019, 4:13:43 AM
To: GNURadio Discussion List
Subject: [Discuss-gnuradio] Could This Be A Speed Problem? How Can It Be Made Faster
I have what I think is a problem with processing speed.
I want to run the attached flow graph on my Raspberry Pi 3B.
It captures a serial bit stream.

The lower File Sink writes to a FIFO (named pipe) that is read and processed in
real time (I hope) by the Python script decoder.py (attached). Actually,
decoder.py is a dumbed-down script I am using to isolate the problem;
the script I hope to use is ~100 lines long.
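
For reference, the draining side of such a setup is usually structured to read the FIFO in large blocks rather than byte-by-byte, so the pipe's buffer does not fill while Python is busy. A minimal sketch (the float32 item type and chunk size are assumptions about the flow graph's output):

```python
import os

import numpy as np

ITEMSIZE = np.dtype(np.float32).itemsize  # File Sink writes raw 4-byte floats

def drain_fifo(fd, chunk_bytes=4096):
    """Yield float32 arrays from a file descriptor (e.g. an opened FIFO).
    os.read() returns whatever is available, so any partial trailing item
    is buffered until the rest of it arrives."""
    pending = b''
    while True:
        data = os.read(fd, chunk_bytes)
        if not data:              # the writer (the flow graph) closed its end
            break
        pending += data
        n = len(pending) - len(pending) % ITEMSIZE
        if n:
            yield np.frombuffer(pending[:n], dtype=np.float32)
            pending = pending[n:]
```

Usage would be along the lines of `fd = os.open('/tmp/capture.fifo', os.O_RDONLY)` (path hypothetical), then iterating over `drain_fifo(fd)` and handing each array to the processing routine.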

The upper File Sink just writes a file that is processed off line by a
different script.

Note that in decoder.py there is a variable delay.  If I make delay
small, like 0 or 1, I believe the flow graph runs successfully.  But if
I make delay larger things start to fall apart.  By that I mean bits are
dropped.  Also the Time Sink falls behind. I test that by manually
changing the bit stream.  What I see is, if delay is small, the Time
Sink displays the changed bit stream immediately.   If delay is 2 or 3
the change to the bit stream is delayed 5 or 6 seconds.

Also, in the larger-delay case, the file Capture.sec is also missing
bits. It seems that if decoder.py isn't taking bytes out of the FIFO fast
enough, the "backup" reflects back to the output of the Multiply
Constant. I separated the paths to the File Sinks hoping that I could
avoid this effect, but no luck.

So, I guess my questions are:
(1) Is my analysis correct?
(2) Is it reasonable that just one or two times through that "while"
loop could be such a problem?
(3) Would it be useful to rewrite the processing script as a C program?
Keeping in mind that the program I really want to use is 10 or 20 times longer.
(4) Am I doing things in a way that slows things down and could be fixed
by some manipulation?
(5) Shouldn't a FIFO be able to "absorb" the data flow?
(6) Would using something like ZMQ get around this problem?
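
On (6): GNU Radio's ZMQ PUSH Sink block hands each chunk of samples to a ZMQ socket as a whole message, which a separate process can drain at its own pace. A sketch of the receive side using pyzmq (the endpoint is an assumption; the item type is assumed float32):

```python
import numpy as np
import zmq  # pyzmq; the receive side for GNU Radio's ZMQ PUSH Sink block

def open_pull(endpoint, context=None):
    """Create a PULL socket connected to the flow graph's PUSH sink."""
    ctx = context or zmq.Context.instance()
    sock = ctx.socket(zmq.PULL)
    sock.connect(endpoint)
    return sock

def recv_floats(sock):
    """Block for the next ZMQ message and decode it as float32 samples.
    Each message arrives whole, so no partial-item buffering is needed."""
    return np.frombuffer(sock.recv(), dtype=np.float32)
```

A consumer would call `open_pull("tcp://127.0.0.1:5555")` (endpoint hypothetical, matching the sink block's address) and then loop on `recv_floats`.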


