From: Marcus Müller
Subject: Re: [Discuss-gnuradio] Microwave Link Demodulation
Date: Tue, 30 Aug 2016 09:23:46 +0200
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:45.0) Gecko/20100101 Thunderbird/45.2.0
Hi Ihab, that's great!

So, I see that others already pointed out the very handy htop/top utilities, which can give you a rough idea of where the CPU time is spent at a glance, as well as the far more involved, but also far more detailed, ControlPort approach, which can tell you exactly where your sample flow is uneven or even stalling.

On an algorithmic level, have a look at "perf", an awesome tool that actually measures how much time is spent in individual C/C++ functions. It takes a bit of experience (read: trial and error) until you get "perf report" to show exactly what you want to know, but often this information really points out which function needs to be optimized. I can't find the post right now, but please search the USRP-users and discuss-gnuradio mailing list archives for "perf report"; I'm pretty optimistic I once wrote a usable guide for this.

Now, regarding "BER bottlenecks": where the computational
bottlenecks above were a matter of software engineering, this is very much in the domain of communication engineering (and the fact that these two disciplines are so closely linked is one of the most exciting aspects of SDR, in my opinion :) ).

Then: do your packet lengths reflect your application's transmission needs well? Do they work well with your channel code? Is this a one-way system, or do you have the chance to ARQ? What's the ARQ/FEC redundancy tradeoff? Contention windows?

So this really depends on what you actually want to do with the link. Notice that different wireless standards developed at the same time use vastly different technology, based on the specifics of the channel and the wireless system they're used for. For example, DVB-S2 was developed around 2000–2005 and uses 8PSK up to 32APSK with pretty long LDPC codes, which reflects the properties of the satellite-to-ground channel and of receivers that tolerate high latency and large block sizes; 3G/UMTS-CDMA, on the other hand, typically uses much shorter turbo codes with rate 1/3, reflecting a completely different channel and a lower tolerance for latency.

Hope that points you in the right direction,

Best regards,
Marcus
On 29.08.2016 15:46, Ihab Zine wrote: