Hi All,
In my network server logs (mLinux, v1.0.32) I can see packet error rate (PER) values ranging from under 1% to over 40%:
8:37:30:678|INFO| ED:55-66-77-88-00-00-00-04|PER|43.856331%
...
12:13:1:294|INFO| ED:55-66-77-88-00-00-00-03|PER|9.523809%
12:14:53:348|INFO| ED:55-66-77-88-00-00-00-05|PER|0.875274%
My nodes are on a test bench in the lab, just a few meters from the gateway, and they all run the very same firmware.
Two questions:
- How is PER calculated exactly?
- Any ideas what might cause such a big difference in PER when the nodes are identical in every respect?
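For context, my working assumption is that the server estimates PER from gaps in the uplink frame counter: any skipped counter value is treated as a lost packet. A minimal sketch of that idea (the function name and logic are my guess, not the actual mLinux implementation):

```python
# Assumption: PER is derived from missing uplink frame counter (FCnt) values.
# This is an illustrative sketch, not the real network-server code.

def per_from_fcnt(fcnts):
    """Estimate packet error rate (%) from received uplink frame counters.

    Assumes counters are monotonically increasing with no rollover.
    expected = counters the node should have used in this window;
    received = packets the server actually logged.
    """
    if len(fcnts) < 2:
        return 0.0
    expected = fcnts[-1] - fcnts[0] + 1
    received = len(fcnts)
    return 100.0 * (expected - received) / expected

# No gaps -> 0% PER; gaps at 12, 15-17 -> nonzero PER
print(per_from_fcnt([10, 11, 12, 13]))
print(per_from_fcnt([10, 11, 13, 14, 18, 19, 20, 21]))
```

If the server computes PER over a short sliding window like this, a handful of lost packets on one node would already swing its reported PER by tens of percent, which might explain part of the spread I see.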
Thanks,
-Tamás