From: Johannes Demel
Subject: Re: How to use 2 N310 for TX and RX
Date: Thu, 11 Feb 2021 11:04:43 +0100
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:68.0) Gecko/20100101 Thunderbird/68.10.0
Hi,

yes, I just attached a GRC file for GR 3.9 that I use to test things. It works if I specify `addr=...` or `addr0=...` with one device. If I switch to two devices with `addr0=...,addr1=...`, it fails.
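For reference, here is a minimal sketch of how such a device-args string can be assembled (this is plain string formatting, not a UHD API; the helper name `make_device_args` is made up for illustration). UHD expects index-suffixed keys (`addr0`, `addr1`, ...) for multi-device sessions, while a single device may use plain `addr`:

```python
# Hypothetical helper: build the comma-separated UHD device-args string.
# Multi-device keys get an index suffix (addr0, addr1, ...); shared
# options like master_clock_rate are appended without a suffix.

def make_device_args(addrs, **common):
    parts = []
    if len(addrs) == 1:
        parts.append(f"addr={addrs[0]}")
    else:
        for i, a in enumerate(addrs):
            parts.append(f"addr{i}={a}")
    parts.extend(f"{k}={v}" for k, v in common.items())
    return ",".join(parts)

args = make_device_args(
    ["192.168.20.213", "192.168.21.218"],
    master_clock_rate="122.88e6",
    clock_source="external",
    time_source="external",
)
print(args)
```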
It seems like UHD tries to initialize the devices twice.

----
[INFO] [UHD] linux; GNU C++ version 9.3.0; Boost_107100; UHD_3.15.0.0-62-g7a3f1516
[INFO] [MPMD] Initializing 2 device(s) in parallel with args: mgmt_addr0=192.168.20.213,type0=n3xx,product0=n310,serial0=319841B,claimed0=False,mgmt_addr1=192.168.21.218,type1=n3xx,product1=n310,serial1=3180AF3,claimed1=False,addr0=192.168.20.213,addr1=192.168.21.218,master_clock_rate=122.88e6,clock_source=external,time_source=external
[INFO] [MPM.PeriphManager] init() called with device args `time_source=external,clock_source=external,master_clock_rate=122.88e6,product=n310,mgmt_addr=192.168.20.213'.
[INFO] [0/Replay_0] Initializing block control (NOC ID: 0x4E91A00000000004)
[INFO] [MPM.PeriphManager] init() called with device args `time_source=external,product=n310,master_clock_rate=122.88e6,clock_source=external,mgmt_addr=192.168.21.218'.
[INFO] [0/Radio_0] Initializing block control (NOC ID: 0x12AD100000011312)
[...]
[INFO] [1/FIFO_3] Initializing block control (NOC ID: 0xF1F0000000000000)
[INFO] [MULTI_USRP] 1) catch time transition at pps edge
[INFO] [MULTI_USRP] 2) set times next pps (synchronously)
[INFO] [MPMD] Initializing 2 device(s) in parallel with args: mgmt_addr0=192.168.20.213,type0=n3xx,product0=n310,serial0=319841B,claimed0=True,mgmt_addr1=192.168.21.218,type1=n3xx,product1=n310,serial1=3180AF3,claimed1=True,addr0=192.168.20.213,addr1=192.168.21.218,master_clock_rate=122.88e6,clock_source=external,time_source=external
[ERROR] [RPC] Someone tried to claim this device again (From: 192.168.20.34)
----

If I use only one device, it looks like this:

----
[INFO] [UHD] linux; GNU C++ version 9.3.0; Boost_107100; UHD_3.15.0.0-62-g7a3f1516
[INFO] [MPMD] Initializing 1 device(s) in parallel with args: mgmt_addr=192.168.20.213,type=n3xx,product=n310,serial=319841B,claimed=False,addr0=192.168.20.213,master_clock_rate=122.88e6,clock_source=external,time_source=external
[INFO] [MPM.PeriphManager] init() called with device args `time_source=external,clock_source=external,master_clock_rate=122.88e6,product=n310,mgmt_addr=192.168.20.213'.
[INFO] [0/Replay_0] Initializing block control (NOC ID: 0x4E91A00000000004)
[...]
[INFO] [0/FIFO_3] Initializing block control (NOC ID: 0xF1F0000000000000)
[INFO] [MULTI_USRP] 1) catch time transition at pps edge
[INFO] [MULTI_USRP] 2) set times next pps (synchronously)
[INFO] [MULTI_USRP] 1) catch time transition at pps edge
[INFO] [MULTI_USRP] 2) set times next pps (synchronously)
----

The last four lines are suspicious because they indicate that synchronization is performed twice. Also, most of the start-up time is spent there.
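To make the difference between the two runs easier to see, here is a small sketch that parses one of the MPMD "Initializing ... with args:" strings into per-device dictionaries (plain string parsing, not a UHD API; the function name is made up). In the failing two-device run, the second MPMD pass already reports `claimed0=True,claimed1=True`, which matches the RPC "tried to claim this device again" error:

```python
# Hypothetical helper: split an MPMD device-args string into
# per-device dicts (keys with a trailing digit, e.g. claimed0) and
# shared options (keys without one, e.g. master_clock_rate).

def parse_mpmd_args(argstr):
    devices = {}
    shared = {}
    for kv in argstr.split(","):
        key, _, val = kv.partition("=")
        if key[-1].isdigit():
            devices.setdefault(int(key[-1]), {})[key[:-1]] = val
        else:
            shared[key] = val
    return devices, shared

# Args from the second MPMD init in the failing run above (abridged).
argstr = ("mgmt_addr0=192.168.20.213,serial0=319841B,claimed0=True,"
          "mgmt_addr1=192.168.21.218,serial1=3180AF3,claimed1=True,"
          "master_clock_rate=122.88e6")
devices, shared = parse_mpmd_args(argstr)
for idx in sorted(devices):
    print(f"device {idx}: claimed={devices[idx]['claimed']}")
```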
Anyways, I attached my MWE flowgraph. I'd be happy if you could tell me how to fix my issue.
Cheers
Johannes

On 10.02.21 22:52, Marcus D Leech wrote:
What happens if you just use a single N310 for both TX and RX? Just trying to figure out where the problem might be. Also, please share a minimal flow graph that shows the problem.

Sent from my iPhone

On Feb 10, 2021, at 1:25 PM, Johannes Demel <demel@ant.uni-bremen.de> wrote:

Hi all,

I have a flowgraph where I want to use two N310s for TX and RX. If I run `benchmark_rate`, everything works fine.

```
./benchmark_rate --pps external --ref external --rx_channels "0,4" --tx_channels "2,6" --rx_rate 61.44e6 --tx_rate 61.44e6 --args="addr0=192.168.21.218,addr1=192.168.20.213,master_clock_rate=122.88e6"
```

It's important that I use one RX and one TX channel each on those USRPs. But it seems like I can't do that with my flowgraph:

```
[INFO] [UHD] linux; GNU C++ version 9.3.0; Boost_107100; UHD_3.15.0.0-62-g7a3f1516
[INFO] [MPMD] Initializing 2 device(s) in parallel with args: mgmt_addr0=192.168.20.213,type0=n3xx,product0=n310,serial0=319841B,claimed0=False,mgmt_addr1=192.168.21.218,type1=n3xx,product1=n310,serial1=3180AF3,claimed1=False,addr0=192.168.20.213,addr1=192.168.21.218,master_clock_rate=122.88e6,clock_source=external,time_source=external
[INFO] [MPM.PeriphManager] init() called with device args `time_source=external,clock_source=external,master_clock_rate=122.88e6,product=n310,mgmt_addr=192.168.20.213'.
[INFO] [0/Replay_0] Initializing block control (NOC ID: 0x4E91A00000000004)
[INFO] [MPM.PeriphManager] init() called with device args `time_source=external,product=n310,master_clock_rate=122.88e6,clock_source=external,mgmt_addr=192.168.21.218'.
[...]
[INFO] [MULTI_USRP] 1) catch time transition at pps edge
[INFO] [MULTI_USRP] 2) set times next pps (synchronously)
[INFO] [MPMD] Initializing 1 device(s) in parallel with args: mgmt_addr=192.168.21.218,type=n3xx,product=n310,serial=3180AF3,claimed=True,addr=192.168.21.218,master_clock_rate=122.88e6,clock_source=external,time_source=external
[ERROR] [RPC] Someone tried to claim this device again (From: 192.168.21.34)
[WARNING] [MPM.RPCServer] Someone tried to claim this device again (From: 192.168.21.34)
Traceback (most recent call last):
  File "gr-tacmac/examples/usrp_multi_test.py", line 360, in <module>
    main()
  File "gr-tacmac/examples/usrp_multi_test.py", line 338, in main
    tb = top_block_cls()
  File "gr-tacmac/examples/usrp_multi_test.py", line 133, in __init__
    self.uhd_usrp_sink_0 = uhd.usrp_sink(
RuntimeError: RuntimeError: Error during RPC call to `claim'. Error message: Someone tried to claim this device again (From: 192.168.21.34)
```

I can share the example flowgraph. It's just a USRP source connected to a Qt time sink (and a Qt frequency sink), plus a USRP sink fed by two signal sources. It works if I only have either a USRP sink or a USRP source, but if I try to use both, the configuration breaks with the above error. How do I fix this?

Cheers
Johannes
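A quick note on the channel numbering used in the `benchmark_rate` call: with two devices in one session, channels appear to be numbered globally across the devices in args order (assuming 4 channels per N310, which matches `--rx_channels "0,4"` spanning both units). A minimal sketch of that mapping, with made-up helper and constant names:

```python
# Sketch (assumption: channels are numbered globally across devices in
# the order they appear in the device args, 4 channels per N310).
# Maps a global channel index to (device index, channel on device).

CHANNELS_PER_N310 = 4

def global_to_local(chan):
    return divmod(chan, CHANNELS_PER_N310)

# The indices from the benchmark_rate call: RX "0,4", TX "2,6".
for c in (0, 4, 2, 6):
    dev, local = global_to_local(c)
    print(f"global channel {c} -> device {dev}, channel {local}")
```

Under this assumption, RX channels 0 and 4 and TX channels 2 and 6 indeed give one RX and one TX channel on each of the two USRPs.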
Attachment: multi_usrp_test.grc (Text document)