Re: [Discuss-gnuradio] multiple asynchronous channels in parallel
From: alyafawi
Subject: Re: [Discuss-gnuradio] multiple asynchronous channels in parallel
Date: Wed, 18 Sep 2013 00:20:56 -0700 (PDT)
For a single channel (quite stable):

    USRP -- LPF -- Module --

For multiple channels (exits with a segfault):

    USRP --+-- LPF -- Module --
           +-- LPF -- Module --
           ...
           +-- LPF -- Module --
I am using xlating_fir_filter_ccf with the proper frequency offset value.
I have attached a draft of my module block; please have a look at it to see
whether I am producing/consuming samples wrongly.
Yes, I am defining ninput_items_required in forecast(). The input/output
signatures are as follows:
gr_make_io_signature(MIN_IN, MAX_IN, sizeof(gr_complex)),
gr_make_io_signature(MIN_OUT, MAX_OUT, 23)
Whenever I find a message, I convert it to a char array and send it to the
next module.
I am using gr_block since it accepts different input/output rates. But with
my current implementation the ratio is not M:N with M, N integers; it can
be any ratio (nitems:sample_number, as in the attached file).
From my understanding of the scheduler, the leftover samples in the module
(provided = X, consumed = Y, left = X-Y) will be concatenated at the front
of the new stream, regardless of how much was left.
Could it be that I have to keep the provided:consumed ratio an integer
ratio? Should I then achieve this using local buffers in the module?
I checked the timestamps of messages captured by two USRPs running the
single-channel code in parallel; the resolution was within one symbol
duration (1/R). The multi-channel code, however, showed a difference of
more than one symbol duration (> 1/R), which I thought was due to sample
disorder while splitting the branches. That led me to the first question
about "faster".
--
View this message in context:
http://gnuradio.4.n7.nabble.com/multiple-asynchronous-channels-in-parallel-tp43656p43699.html
Sent from the GnuRadio mailing list archive at Nabble.com.