
[Discuss-gnuradio] usrp_spectrum_sense vs. usrp_fft


From: TMob
Subject: [Discuss-gnuradio] usrp_spectrum_sense vs. usrp_fft
Date: Wed, 28 Jan 2009 08:14:38 -0800 (PST)

Hi,

I am currently trying to use the USRP to sense 802.11 channels for
activity. So far I am using usrp_spectrum_sense.py to do this. Each time I
get the callback from gr.bin_statistics_f, I calculate the signal power from
the returned data vector using the following formula:

        for bin in m.data:
            signalPower += (20 * math.log10(math.sqrt(bin) / tb.fft_size)
                            - 20 * math.log10(tb.fft_size)
                            - 10 * math.log10(tb.power / tb.fft_size / tb.fft_size))
        signalPower /= tb.fft_size
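(For reference, here is the same computation as a self-contained sketch, with
the averaging done in the linear domain before the single conversion to dB.
The values for fft_size, the window power, and the bin data are made-up
stand-ins for tb.fft_size, tb.power, and m.data, just so the snippet runs on
its own:)

```python
import math

# Illustrative stand-ins for the values available in the callback:
fft_size = 256              # tb.fft_size
window_power = 256.0        # tb.power (sum of squared window coefficients)
data = [1e-3] * fft_size    # m.data: squared FFT magnitudes, one per bin

# Average the squared magnitudes linearly, then convert to dB once,
# applying the same three terms as the per-bin formula above.
mean_bin = sum(data) / fft_size
signal_power_db = (20 * math.log10(math.sqrt(mean_bin) / fft_size)
                   - 20 * math.log10(fft_size)
                   - 10 * math.log10(window_power / fft_size / fft_size))
```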

According to previous posts, this should give me the signal power at the
given center frequency in dBm.
Unfortunately, it turns out that the values I get from this code vary
considerably, e.g. with the FFT size and the gain. With the gain and FFT size
at their defaults I get values from -28 to +5 dBm, which definitely cannot be
correct dBm. Is there a mistake in the formula? Is the result really in dBm?
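(To illustrate one reason the numbers move with FFT size: the raw magnitude of
a coherent tone's bin grows linearly with N, so any dB formula that
mis-normalizes by N shifts by about 6 dB per doubling. A self-contained sketch
using a plain DFT, no GNU Radio involved:)

```python
import cmath
import math

def dft_peak_mag(samples):
    """Peak bin magnitude of a plain O(N^2) DFT (illustration only)."""
    n = len(samples)
    return max(abs(sum(samples[k] * cmath.exp(-2j * math.pi * i * k / n)
                       for k in range(n)))
               for i in range(n))

# The same unit-amplitude tone analyzed at two FFT sizes: the raw peak
# magnitude grows linearly with N, i.e. +6 dB per doubling of the FFT size.
peak_db = {}
for n in (64, 128):
    tone = [cmath.exp(2j * math.pi * 4 * k / n) for k in range(n)]
    peak_db[n] = 20 * math.log10(dft_peak_mag(tone))
```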

Because the usrp_fft.py example shows more realistic values (around -50 to
-60 dBm) than usrp_spectrum_sense.py, I was wondering if somebody could
explain how usrp_fft arrives at these values. All I can see in its source
code is that a USRP source is defined and connected to the scope, but where
is the conversion to dBm done? Can the same conversion be applied to
usrp_spectrum_sense somehow?
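(For what it's worth, my understanding is that FFT display sinks typically
normalize each bin magnitude by the FFT size and then convert to dB. Here is
a sketch of that kind of conversion; the function name is illustrative and
this is not the actual fftsink code:)

```python
import cmath
import math

def fft_bins_to_db(samples):
    """Naive DFT, then per-bin magnitude in dB; illustrative only."""
    n = len(samples)
    bins = [sum(samples[k] * cmath.exp(-2j * math.pi * i * k / n)
                for k in range(n))
            for i in range(n)]
    # Normalize by N so a unit-amplitude tone lands near 0 dB, then
    # convert magnitude to dB (tiny offset avoids log10(0)).
    return [20 * math.log10(abs(b) / n + 1e-20) for b in bins]

# A unit-amplitude complex tone at bin 1 of an 8-point DFT:
tone = [cmath.exp(2j * math.pi * 1 * k / 8) for k in range(8)]
db = fft_bins_to_db(tone)
```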

Thanks,

TMob