Re: UDR56k-4 Sensitivity & BER

"Michael E. Fox - N6MEF" <n6mef@...>



It’s even worse than that, especially in the real world.  A 20% loss rate will cause enough retransmits that a cascade failure occurs.  In other words, of the 20% that must be retransmitted, 20% of those will need to be retransmitted again, and so on.  If the link carries more than a single user occasionally checking his BBS for short messages, it becomes worthless pretty quickly.  For example, with 3-4 systems on the frequency, even at only 128-byte packets, the channel becomes hopelessly clogged in very short order.
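The compounding described above is geometric: if each attempt fails independently with probability p, the expected number of transmissions per delivered frame is 1/(1-p). A minimal sketch (the loss figures are illustrative, not measurements from this thread):

```python
# Expected transmissions per delivered frame when each attempt
# fails independently with probability p (geometric distribution).
def expected_tx(p: float) -> float:
    return 1.0 / (1.0 - p)

# At 20% loss each frame costs 1.25 transmissions on average; on a
# shared frequency, every extra transmission is airtime stolen from
# the other stations, which raises collisions and loss further.
for p in (0.10, 0.20, 0.50):
    print(f"loss {p:.0%}: {expected_tx(p):.2f} tx/frame")
```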


Even your 10^-4 example shows 10% loss for just 128-byte packets.  Again, way too high for anything more than a single user talking to a single station.  So, 20% loss is NOT manageable for AX.25.  Not even 10%.


When we first deployed our BBS network, we performed real testing with 9600 baud TNCs (no FEC).  We were getting somewhere in the range of 5% to 10% packet loss at 9600 baud and 0% packet loss at 1200 baud.  (This was not a lab test.  This was between real sites using real antennas.)  Two different TNC brands were used and, yes, deviation was verified to be per manufacturer’s specs.  As a result, even with the higher baud rate, the effective throughput was lower on 9600 than on 1200 baud. 
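One way the slower link can come out ahead: a lost frame costs not just a retransmission but an idle retry timeout (the AX.25 FRACK timer) before the TNC tries again. A rough stop-and-wait goodput model, under assumed numbers (10% loss at 9600, 0% at 1200, a hypothetical 10 s retry timeout, 128-byte frames; none of these values are from the original measurements):

```python
def goodput(baud: float, frame_bytes: int, loss: float, timeout_s: float) -> float:
    """Approximate useful bit/s for stop-and-wait with a fixed retry timeout."""
    bits = frame_bytes * 8
    t_frame = bits / baud                  # airtime of one frame
    attempts = 1.0 / (1.0 - loss)          # expected transmissions per frame
    failures = attempts - 1.0              # each failure burns a full timeout
    return bits / (attempts * t_frame + failures * timeout_s)

print(goodput(9600, 128, 0.10, 10.0))  # ~833 bit/s despite the faster modem
print(goodput(1200, 128, 0.00, 10.0))  # 1200 bit/s with clean copy
```

Under these assumptions the lossy 9600 baud link delivers less than the clean 1200 baud link, matching the observation above; the exact crossover depends on the timer settings and window size.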


To think that one could go higher in speed and not make things even worse is just not facing reality.  In *real* environments, with more than just one user talking to one other station at a time, the BER must be much better (10^-5 to 10^-6) in order for the channel to not rapidly degrade due to cascading retransmits.  For any *real* environment, FEC is going to be essential.
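Using the same independent-bit-error model as the Octave calcs quoted below, the 10^-5 to 10^-6 range does keep frame loss down in the low single digits even for 256-byte frames:

```python
def ploss(ber: float, frame_bytes: int) -> float:
    # Probability that at least one bit in the frame is in error.
    return 1.0 - (1.0 - ber) ** (frame_bytes * 8)

for ber in (1e-5, 1e-6):
    print(f"BER {ber:g}: 256-byte frame loss {ploss(ber, 256):.3%}")
# roughly 2% at 10^-5 and 0.2% at 10^-6
```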


The standard level for measuring receiver sensitivity in digital radios is 10^-5 BER.  Anyone familiar with P25 or DMR testing will know the relationship between modulation fidelity and BER.  Those systems have error correction: modulation fidelity (a measure of how accurately the symbols are being received) can degrade quite severely while the radio still maintains a 0% residual BER.  Without forward error correction, these systems would be unusable in any real environment.
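Why FEC changes the picture so dramatically: even the crudest possible code trades a little bandwidth for orders of magnitude in residual error rate. A toy 3x repetition code with majority voting (purely illustrative; P25 and DMR use far stronger trellis/BCH-style codes):

```python
def residual_ber(p: float) -> float:
    # Majority vote over 3 copies of each bit fails only when
    # 2 or 3 of the copies are flipped.
    return 3 * p**2 * (1 - p) + p**3

p = 1e-3
print(residual_ber(p))  # ~3e-6, nearly three orders of magnitude better
```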


I’d love to deploy the UDR56 on our BBSs that share a common forwarding frequency.  But without FEC, there’s just no way.  The result would be predictably terrible.







From: UniversalDigitalRadio@... [mailto:UniversalDigitalRadio@...] On Behalf Of Darren Long
Sent: Tuesday, August 06, 2013 6:12 AM
To: UniversalDigitalRadio@...
Subject: [UniversalDigitalRadio] UDR56k-4 Sensitivity & BER



Hi all.

I was just reading the datasheet for the UDR56k-4 and noticed the receiver sensitivity figures:

Receiver Sensitivity, BER 10^-3

•    4k8: -113 dBm

•    9k6: -110 dBm

•    56k: -100 dBm

That BER of 10^-3 is pretty severe. According to my calcs in Octave:

octave:10> ber = 10^-3
ber =  0.0010000
octave:11> ploss = (1-(1-ber)^(128*8))
ploss =  0.64103
octave:12> ploss = (1-(1-ber)^(256*8))
ploss =  0.87114
octave:13> ploss = (1-(1-ber)^(1500*8))
ploss =  0.99999

That's a 64% packet loss rate for 128 byte frames, 87% for 256 byte frames and almost 100% for a typical TCP/IP MTU.   Not a particularly good reference for planning link budgets, I wouldn't have thought.

A BER of 10^-4 looks to be a more useful baseline for a sensitivity figure:

octave:19> ber =  0.00010000
ber =  1.0000e-04
octave:20> ploss = (1-(1-ber)^(128*8))
ploss =  0.097336
octave:21> ploss = (1-(1-ber)^(256*8))
ploss =  0.18520
octave:22> ploss = (1-(1-ber)^(1500*8))
ploss =  0.69882

A ~20% loss rate would be manageable for, say, a reliable AX.25 connected-mode link, but it would kill TCP performance over an unreliable link.  NORM would cope OK; I've regularly seen it do well with >50% loss rates.

I suppose the BER/(Eb/N0) curves for the modems in question would show how much margin one needs to get down to a BER of 10^-4.  Would it be more useful to have the sensitivity figures quoted at a better BER?
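For a rough feel of that margin, noncoherent binary FSK has a closed-form curve, Pb = 0.5*exp(-Eb/2N0). (Assuming FSK here is purely illustrative; the UDR56k-4 datasheet does not state the modulation this applies to.) Inverting it for the two BER targets:

```python
import math

def ebn0_db(ber: float) -> float:
    # Invert Pb = 0.5 * exp(-EbN0 / 2) for noncoherent binary FSK,
    # returning the required Eb/N0 in dB.
    ebn0_linear = -2.0 * math.log(2.0 * ber)
    return 10.0 * math.log10(ebn0_linear)

margin = ebn0_db(1e-4) - ebn0_db(1e-3)
print(f"margin from 10^-3 to 10^-4: {margin:.2f} dB")  # roughly 1.4 dB
```

So for this idealized modem the extra sensitivity needed to reach 10^-4 is small; real modems with implementation loss will need somewhat more.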

What do you think?


Darren Long, G0HWW
