Re: [Discuss-gnuradio] On the convolutional code performance of gr-ieee802-11


From: Ron Economos
Subject: Re: [Discuss-gnuradio] On the convolutional code performance of gr-ieee802-11
Date: Tue, 15 Sep 2015 01:38:42 -0700
User-agent: Mozilla/5.0 (X11; Linux i686; rv:24.0) Gecko/20100101 Thunderbird/24.6.0

Made a mistake. The DVB-T receiver is not part of the 3.7.8 release. It's a recent commit on the master branch (3.7.9git).

Ron

On 09/15/2015 01:31 AM, Ron Economos wrote:
The author of gr-dvbt looked at this issue when he developed the DVB-T receiver. Here's a blog entry on his findings.

http://yo3iiu.ro/blog/?p=1393

In his benchmarks, the IT++ decoder came in last:

IT++ = 2-3 Mbps
gr-trellis = 5 Mbps
Karn C = 7-8 Mbps
gr-dvbt = 39-40 Mbps

The gr-dvbt receiver is now part of GNU Radio as of the last release, 3.7.8. The SSE2 Viterbi decoder module is here:

https://github.com/gnuradio/gnuradio/blob/master/gr-dtv/lib/dvbt/dvbt_viterbi_decoder_impl.cc
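
The speedup comes largely from doing the add-compare-select (ACS) recursion on packed 8-bit metrics with SSE2, up to 16 states per instruction. As a rough scalar sketch of the ACS kernel that gets vectorized (illustrative only; names and indexing conventions are mine, not from the gr-dtv source):

#include <cstdint>
#include <cstddef>

// One ACS pass: for every state, pick the better of the two
// incoming paths and record which one survived.
void acs(const uint8_t *old_metric, uint8_t *new_metric,
         const uint8_t *branch_metric, uint8_t *decision,
         std::size_t num_states)
{
    for (std::size_t s = 0; s < num_states; ++s) {
        // Two predecessor states can reach state s
        // (shift-register-moves-right convention).
        std::size_t p0 = (s << 1) & (num_states - 1);
        std::size_t p1 = p0 | 1;

        uint8_t m0 = old_metric[p0] + branch_metric[2 * s];
        uint8_t m1 = old_metric[p1] + branch_metric[2 * s + 1];

        // Keep the smaller path metric and remember the choice
        // (real decoders also renormalize metrics to avoid overflow).
        decision[s]   = (m1 < m0);
        new_metric[s] = (m1 < m0) ? m1 : m0;
    }
}

An SSE2 version replaces the inner comparison and selection with _mm_min_epu8 / _mm_cmpeq_epi8 over 16 metrics at once, which is where most of the throughput difference comes from.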

In addition, I believe there's another decoder in gr-fec. I'm not very familiar with it, so maybe someone else can comment on it. Looks like it's in this module:

https://github.com/gnuradio/gnuradio/blob/master/gr-fec/lib/cc_decoder_impl.cc

Ron

On 09/15/2015 12:47 AM, Jeon wrote:
I've measured the time taken by convolutional decoding in gr-ieee802-11. The module uses the Punctured_Convolutional_Code class from the IT++ library (http://itpp.sourceforge.net/4.3.0/classitpp_1_1Punctured__Convolutional__Code.html)
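
For reference, here is a minimal stand-alone sketch of how that class is typically set up for the 802.11 mother code (rate 1/2, constraint length 7, generator polynomials 0133/0171 octal), with the rate-3/4 puncturing pattern as an example. This is an illustration, not the exact gr-ieee802-11 configuration:

#include <itpp/itcomm.h>

int main()
{
    itpp::Punctured_Convolutional_Code code;

    // 802.11 mother code: K = 7, polynomials 0133 and 0171 (octal).
    itpp::ivec generator(2);
    generator(0) = 0133;
    generator(1) = 0171;
    code.set_generator_polynomials(generator, 7);

    // Rate-3/4 puncturing pattern (2 rows, period 3).
    itpp::bmat puncture_matrix = "1 1 0; 1 0 1";
    code.set_puncture_matrix(puncture_matrix);

    // Encode random bits, BPSK-map to +/-1 soft values, decode.
    itpp::bvec bits = itpp::randb(100);
    itpp::bvec coded = code.encode_tail(bits);

    itpp::BPSK bpsk;
    itpp::vec soft = bpsk.modulate_bits(coded);

    itpp::bvec decoded;
    code.decode_tail(soft, decoded);

    return 0;
}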

I've used std::chrono to measure the elapsed time. You can see how I did it at the following page (https://gist.github.com/gsongsong/7c4081f44e88a7f4407a#file-ofdm_decode_mac-cc-L252-L257)
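
In essence, the measurement looks like this (a minimal stand-alone sketch; decode() stands in for the IT++ decode call in ofdm_decode_mac.cc):

#include <chrono>
#include <iostream>

void decode() { /* placeholder for the convolutional decode call */ }

int main()
{
    auto start = std::chrono::high_resolution_clock::now();
    decode();
    auto stop = std::chrono::high_resolution_clock::now();

    // Report the elapsed wall-clock time in microseconds.
    auto us = std::chrono::duration_cast<std::chrono::microseconds>(stop - start);
    std::cout << "decoding took " << us.count() << " us\n";
    return 0;
}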

I've measured the time with a loopback flow graph (without a USRP; examples/wifi_loopback.grc)

The result shows that it takes 5,000 to 30,000 us (5 to 30 ms) to decode a signal 9,000 samples long (each sample is either 1 or -1).

* Test environment: Ubuntu 14.04 on VMware, 2 CPUs and 4 GB RAM allocated
* Host environment: Windows 7 with an i7-3770 @ 3.7 GHz

Since I am not familiar with error-correcting codes, I have no sense of whether this decoding time is reasonable. But I understand that the Viterbi algorithm is one of the most efficient decoding algorithms, and that IT++ must use it.

From this I can deduce that convolutional decoding takes quite a long time even though the algorithm (Viterbi) is very efficient. Is this a natural limitation of software decoding and SDR?
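
For a rough sense of scale (assuming the rate-1/2 mother code and ignoring puncturing): with constraint length 7, Viterbi tracks 2^6 = 64 states, so decoding about 4,500 information bits (9,000 coded samples) needs on the order of 4,500 x 64 ~ 290,000 add-compare-select updates plus traceback. That alone seems like it should take well under a millisecond on a modern CPU, so perhaps the 5-30 ms I see reflects implementation overhead rather than the algorithm itself.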

Another question: commercial off-the-shelf (COTS) Wi-Fi devices achieve very high throughput, which must rely on much faster convolutional decoding. Is that because COTS devices use heavily optimized FPGAs and dedicated decoding chips?

Regards,
Jeon.


