I'm working on decoding the protocol used by a certain GPS device. Right now I'm analyzing the first data packet it sends. I can read it, but I suspect I'm not reading it properly.
Here is what I have so far:
public static String toHex(byte[] bytes) {
    BigInteger bi = new BigInteger(1, bytes);
    return String.format("%0" + (bytes.length << 1) + "X", bi);
}

private void ProcessInitialPacket() {
    int port = 1954;
    System.out.println("Listening on port :" + port);
    byte[] data = new byte[17];
    byte[] ackPacket = new byte[2];
    byte[] dataPacket = new byte[15];
    try {
        ServerSocket sSocket = new ServerSocket(port);
        Socket cSocket = sSocket.accept();
        DataInputStream dataIN = new DataInputStream(cSocket.getInputStream());
        int packetSize = dataIN.read(data, 0, data.length);
        System.arraycopy(data, 0, ackPacket, 0, 2);
        System.arraycopy(data, 2, dataPacket, 0, 15);
        System.out.println("Total packet size: " + packetSize);
        System.out.println("ACK PACKET : " + toHex(ackPacket));
        System.out.println("DATA PACKET: " + toHex(dataPacket));
        System.out.println("FULL PACKET: " + toHex(data));
    } catch (IOException e) {
        e.printStackTrace();
    }
}
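A side note on the `read` call above: `InputStream.read(byte[], int, int)` is not guaranteed to fill the buffer in one call; it may return after fewer than 17 bytes, depending on how the TCP stream is segmented. Since a `DataInputStream` is already in use, `readFully` can block until the whole fixed-size packet has arrived. This is only a sketch against a simulated in-memory stream, not the live socket:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class ReadFullyDemo {
    public static void main(String[] args) throws IOException {
        // Simulated 17-byte packet (hypothetical values) standing in for the socket stream
        byte[] simulatedPacket = new byte[17];
        DataInputStream dataIN = new DataInputStream(new ByteArrayInputStream(simulatedPacket));

        byte[] data = new byte[17];
        // Blocks until all 17 bytes arrive, or throws EOFException if the stream ends early
        dataIN.readFully(data);
        System.out.println("Read " + data.length + " bytes");
    }
}
```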
the output:
-PARSER()--
-INITSESSION-- Listening on port :1954
Total packet size: 17
ACK PACKET : 000F
DATA PACKET: 333532383438303236323631393534
FULL PACKET: 000F333532383438303236323631393534
------CLOSESESSION------------
Now, my problem:
The device sends a [0x00][0x0F]xxxxxxxxxxxxxxx packet, where the xxxxxxx is its IMEI (the data packet). My problem is that there are far too many 3s in the data packet; the real valid output is
352848026261954
which you obtain by removing the 3s. My question: can this behavior come from my code, or is it part of the protocol? I can correct it programmatically, but I want to know whether there is a way my code could cause these extra 3s.
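For reference, if the payload were ASCII-encoded digits (bytes 0x30 through 0x39, i.e. every digit prefixed with a 3 nibble in the hex dump), decoding it as text rather than hex would look like this sketch, using the byte values from the captured DATA PACKET:

```java
import java.nio.charset.StandardCharsets;

public class ImeiDecode {
    public static void main(String[] args) {
        // Byte values copied from the captured DATA PACKET hex dump
        byte[] dataPacket = {
            0x33, 0x35, 0x32, 0x38, 0x34, 0x38, 0x30, 0x32,
            0x36, 0x32, 0x36, 0x31, 0x39, 0x35, 0x34
        };
        // Interpret each byte as an ASCII character instead of printing raw hex
        String imei = new String(dataPacket, StandardCharsets.US_ASCII);
        System.out.println(imei); // prints 352848026261954
    }
}
```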