

From: Kunal Kandekar
Subject: Re: [Discuss-gnuradio] Re: Lots of mallocs in my gnuradio block ... is it safe?
Date: Mon, 18 Oct 2010 11:17:20 -0400

It looks like only about 7 MB of memory: (204 + 5 + 204) * 2046 * 8 + 2046 * 8 = 6,776,352 bytes. It probably won't play too well with your CPU cache, but that still shouldn't be too much, right?

Also, since you only need to calculate this once, I assume you will be mallocing these arrays only once (e.g. in the constructor). If you're not mallocing every time general_work is called, it should be even less of a problem. I haven't had to malloc more than about 1 MB of arrays in a block myself, but I think it's worth trying out.

Kunal


On Mon, Oct 18, 2010 at 1:01 AM, John Andrews <address@hidden> wrote:
Haha! I just realized that this uses a great deal of memory: 250*2046*4.

I am so dumb. :) 

I guess I will have to calculate this on each iteration where I use that data.



On Sun, Oct 17, 2010 at 9:08 PM, John Andrews <address@hidden> wrote:
Hi,
I am writing the code for a GNU Radio block and I use a lot of dynamic memory allocation. The code isn't finished yet, but I thought I would ask this question anyway. To give an idea, this is what the mallocs look like:

1. (204 + 5 + 204) gr_complex arrays of size 2046 each
2. 1 double array of size 2046

The reason I have this is that most of the data allocated in these arrays is reused, so I don't have to calculate it again and again. I thought that having this data ready when the block is initialized would speed up the processing.

Being a pessimistic programmer, I am circumspect about my method. What are your thoughts?

Thanks,
John


_______________________________________________
Discuss-gnuradio mailing list
address@hidden
http://lists.gnu.org/mailman/listinfo/discuss-gnuradio


