I'm trying to send a request to a TCP/IP terminal. My LRC (longitudinal redundancy check) function is below:
public byte GetLRC(byte[] bArr)
{
    // XOR all bytes together; the accumulated value is the LRC.
    byte LRC = 0x00;
    foreach (byte b in bArr)
    {
        LRC ^= b;
    }
    return LRC;
}
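For context, this is roughly how I apply it (BuildRequest is a stand-in name for my own framing code; the real point is that I XOR everything before the LRC slot, length header included):

byte[] message = BuildRequest();                 // hypothetical helper: returns the full 68-byte buffer shown below
byte[] body = new byte[message.Length - 1];
Array.Copy(message, body, body.Length);          // everything except the trailing LRC slot
message[message.Length - 1] = GetLRC(body);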
But the problem is that the LRC I calculate is completely different from theirs. How do they calculate it? My request matches their sample byte for byte except for the final LRC (see the probe sketch after the dumps below).
Sample data:
Hex: 00 66 30 30 30 30 30 30 30 30 32 34 31 30 33 30 30 31 30 1C 54 32 00 02 30 31 1C 34 33 00 01 30 1C 34 30 00 12 30 30 30 30 30 30 30 30 30 30 30 31 1C 34 32 00 12 30 30 30 30 30 30 30 30 30 30 30 30 1C 79
Bytes (decimal, debugger view):
{byte[68]}: 0, 102, 48, 48, 48, 48, 48, 48, 48, 48, 50, 52, 49, 48, 51, 48, 48, 49, 48, 28, 84, 50, 0, 2, 48, 49, 28, 52, 51, 0, 1, 48, 28, 52, 48, 0, 18, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 49, 28, 52, 50, 0, 18, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 48, 28, 121
My data:
Hex: 00 66 30 30 30 30 30 30 30 30 32 34 31 30 33 30 30 31 30 1C 54 32 00 02 30 31 1C 34 33 00 01 30 1C 34 30 00 12 30 30 30 30 30 30 30 30 30 30 30 31 1C 34 32 00 12 30 30 30 30 30 30 30 30 30 30 30 30 1C 1F
The byte array is identical to the sample above except for the last element: mine ends with [67]: 31 (0x1F) where theirs ends with [67]: 121 (0x79).
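To narrow down which bytes go into their LRC, I put together a small probe (this is my own scratch code, not anything from the terminal's documentation). Assuming the LRC covers a contiguous run of bytes ending at the final 0x1C, it XORs every candidate slice of their sample message and prints each start offset that produces their 0x79:

using System;

class LrcProbe
{
    // Their sample message; the last byte (0x79) is the LRC they expect.
    static readonly byte[] Sample =
    {
        0x00, 0x66, 0x30, 0x30, 0x30, 0x30, 0x30, 0x30, 0x30, 0x30,
        0x32, 0x34, 0x31, 0x30, 0x33, 0x30, 0x30, 0x31, 0x30, 0x1C,
        0x54, 0x32, 0x00, 0x02, 0x30, 0x31, 0x1C, 0x34, 0x33, 0x00,
        0x01, 0x30, 0x1C, 0x34, 0x30, 0x00, 0x12, 0x30, 0x30, 0x30,
        0x30, 0x30, 0x30, 0x30, 0x30, 0x30, 0x30, 0x30, 0x31, 0x1C,
        0x34, 0x32, 0x00, 0x12, 0x30, 0x30, 0x30, 0x30, 0x30, 0x30,
        0x30, 0x30, 0x30, 0x30, 0x30, 0x30, 0x1C, 0x79
    };

    static void Main()
    {
        byte expected = Sample[Sample.Length - 1];   // 0x79
        int end = Sample.Length - 1;                 // exclusive; stop just before the LRC byte
        for (int start = 0; start < end; start++)
        {
            byte lrc = 0x00;
            for (int i = start; i < end; i++)
                lrc ^= Sample[i];                    // same XOR as my GetLRC, over Sample[start..end-1]
            if (lrc == expected)
                Console.WriteLine($"XOR of bytes [{start}..{end - 1}] = 0x{lrc:X2} (match)");
        }
    }
}

Start offset 2 shows up as a match (offsets 4, 6, 8, and 10 match too, but only because the leading 0x30 bytes pair off and cancel). In other words, 0x1F XOR 0x00 XOR 0x66 = 0x79: their LRC is exactly my result with the two-byte length header excluded. So it looks like they compute the LRC over the message body only, skipping the length prefix, but I can't find that documented. Is excluding the length bytes the usual rule for this kind of terminal?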