I am simulating a simple GMSK modulation/demodulation chain, without any channel model, in order to evaluate the BER both under normal conditions and with added noise. I would like your opinion on the way I calculate the BER: is it correct? Is there a better way to do it?
Here is a brief description of the setup:
I created the input binary sequence 1110011100 as a .bin file and loaded it into a File Source block. I fed this sequence to a GMSK Mod block, added noise to the modulated signal, and sent the result to a GMSK Demod block. Finally, I saved the output sequence to a .bin file by means of a File Sink block.
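For reference, this is roughly how I generate the input file (the filename and the one-bit-per-byte convention are my own choices; depending on the modulator's configuration the File Source bytes may instead be interpreted as packed bits, 8 per byte, so the file format may need to match that):

```python
# Write the 10-bit test pattern 1110011100 as unpacked bits:
# one byte per bit, each byte holding 0x00 or 0x01.
bits = [1, 1, 1, 0, 0, 1, 1, 1, 0, 0]
with open("input.bin", "wb") as f:
    f.write(bytes(bits))
```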
Apparently, the GMSK Demod block emits one full byte for each demodulated bit, placing the bit value in the LSB position. Because of this, instead of calculating a BER I am effectively calculating a PER (Packet Error Rate), comparing each output byte against 00000000 for a 0 and 00000001 for a 1. When I add noise to the model, the output sequence is modified, as expected.
With noise present I keep the same methodology for the PER calculation: whenever a non-zero bit appears in positions 1 through 7 of an output byte (e.g. 01000000), I assume it was caused by noise and count that byte as a wrong packet.
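The byte-wise comparison I describe can be sketched in plain Python (function name and test data are my own; the idea is that masking each output byte with `& 1` recovers the demodulated bit and gives a true BER, while requiring the whole byte to be exactly 0x00 or 0x01 gives the stricter "packet" criterion I am using):

```python
def error_rates(tx_bits, rx_bytes):
    """Compare transmitted bits against demodulated output bytes.

    tx_bits  : list of 0/1 ints (the original bit sequence)
    rx_bytes : bytes read from the File Sink, one demodulated
               bit per byte, carried in the LSB
    """
    n = min(len(tx_bits), len(rx_bytes))
    # BER: only the LSB of each output byte matters.
    bit_errors = sum((rx_bytes[i] & 1) != tx_bits[i] for i in range(n))
    # PER in the sense described above: the whole byte must be exactly
    # 0x00 or 0x01 AND its LSB must match the transmitted bit.
    pkt_errors = sum(
        rx_bytes[i] not in (0, 1) or (rx_bytes[i] & 1) != tx_bits[i]
        for i in range(n)
    )
    return bit_errors / n, pkt_errors / n

tx = [1, 1, 1, 0, 0, 1, 1, 1, 0, 0]
# Hypothetical received stream: byte 2 has a flipped LSB (bit error),
# byte 6 is 0x41, whose LSB is still correct but whose upper bits are
# corrupted (counted as a packet error only).
rx = bytes([1, 1, 0, 0, 0, 1, 0x41, 1, 0, 0])
ber, per = error_rates(tx, rx)  # ber = 0.1, per = 0.2
```

Note that in a real run the demodulator introduces some latency, so the output stream is usually shifted relative to the input; the two sequences may need to be aligned (e.g. by correlation) before counting errors.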
Does my analysis make sense? Is there a more practical or more correct way to evaluate the BER/PER?