discuss-gnuradio

Re: Python embedded block stops working after consume()


From: Jeff Long
Subject: Re: Python embedded block stops working after consume()
Date: Wed, 27 Oct 2021 15:30:46 -0400

If you are feeding a flowgraph a finite number of samples, there is no guarantee the last samples will be processed before the flowgraph terminates. Could that be what you're seeing? Otherwise, post what you're doing and someone can try to help further.

On Wed, Oct 27, 2021 at 3:00 PM Verónica Toro Betancur <vetorobe@gmail.com> wrote:
Hi Jeff,

Thank you for your reply.

I have tried returning len(output_items[0]) and using it in the consume function and it still doesn't work.

Also, if I don't use consume() or consume_each(), it seems like the last part of the signal is dropped and I can't decode it correctly in the blocks that come afterwards.


Best regards,
Verónica

On Wed, Oct 27, 2021 at 8:16 PM Jeff Long <willcode4@gmail.com> wrote:
The input vector may contain more items than the scheduler is expecting you to return. Use len(output_items[0]) to determine how much to consume and return. For reference, here is the autogenerated code for a new module:

    def work(self, input_items, output_items):
        in0 = input_items[0]
        out = output_items[0]
        # <+signal processing here+>
        out[:] = in0
        return len(output_items[0])
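To illustrate why returning len(output_items[0]) matters, here is a small standalone sketch (plain Python/NumPy, no GNU Radio required; the buffer sizes are made up for illustration) that mimics the scheduler handing work() an input buffer larger than the output buffer:

```python
import numpy as np

def work(input_items, output_items):
    # The scheduler may offer more input items than there is output space.
    # Copy only as many items as the output buffer can hold, and return
    # that count so the scheduler knows how much was produced.
    n = len(output_items[0])
    output_items[0][:] = input_items[0][:n]
    return n

# Simulate one call where the input buffer (100 items) is larger than
# the output buffer (60 items).
inp = [np.arange(100, dtype=np.float32)]
out = [np.zeros(60, dtype=np.float32)]
produced = work(inp, out)
print(produced)  # 60
```

Returning len(input_items[0]) instead would claim more output than the buffer holds, which is why the autogenerated template uses len(output_items[0]).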

On Wed, Oct 27, 2021 at 11:46 AM Verónica Toro Betancur <vetorobe@gmail.com> wrote:
Hi,

I've been trying to implement a sync Python embedded block that processes all of input_items. At the end of the work() function, I call

output_items[0][:] = input_items[0]
self.consume_each(len(input_items[0]))
return len(input_items[0])

This works well the first time and all data is processed correctly, but then the block stops working, i.e., it doesn't process any new incoming data and input_items[0] is always filled with zeros.
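[Editorially hedged sketch, not a confirmed answer from this thread: for a GNU Radio sync block, the scheduler consumes input automatically based on the value returned from work(), so an explicit consume_each() on top of that advances the read pointer a second time. A likely-correct work() drops the manual consume and returns the produced count. The function below is written standalone (self is unused) so it can be exercised with dummy buffers:]

```python
import numpy as np

def work(self, input_items, output_items):
    out = output_items[0]
    n = len(out)                 # items the scheduler expects produced
    out[:] = input_items[0][:n]  # process/copy exactly n items
    # NOTE: no self.consume_each() here. For a sync block the scheduler
    # consumes n input items itself when work() returns n; consuming
    # manually as well would skip over unread input.
    return n

# Standalone check with dummy buffers (self is unused in this sketch):
inp = [np.arange(8, dtype=np.float32)]
out = [np.zeros(8, dtype=np.float32)]
print(work(None, inp, out))  # 8
```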

I hope someone could help me with this.

Thanks in advance.


Best regards,
Verónica
