I'm trying to encode a TCP header myself, but I can't work out the right order of bits and octets in it. This is what RFC 793 says:
    0                   1                   2                   3
    0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
   +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
   |          Source Port          |       Destination Port        |
   +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
   |                        Sequence Number                        |
   +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
   ...
This tells me that Source Port occupies the first two octets, and I read the diagram as saying the lowest-order bits go into the first octet. To me, that means that in order to encode source port 180 (0x00B4) I should start my TCP header with these two bytes:
B4 00 ...
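Concretely, here is a small C sketch of what I thought I should do, just for the first two octets of the header (this assumes an ordinary little-endian x86 machine; the variable names are mine):

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        uint16_t source_port = 180;   /* 0x00B4 */
        uint8_t header[2];

        /* Copy the value exactly as it sits in memory; on a little-endian
           host this puts the low-order octet first. */
        memcpy(header, &source_port, sizeof(source_port));

        printf("%02X %02X\n", header[0], header[1]);  /* prints "B4 00" here */
        return 0;
    }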
However, all examples I can find tell me to do it the other way around:
00 B4 ...
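For comparison, the examples seem to do something like this instead (my reconstruction, not code from any particular source), putting the high-order octet first:

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint16_t source_port = 180;   /* 0x00B4 */
        uint8_t header[2];

        /* High-order octet first, i.e. what htons() would produce. */
        header[0] = (uint8_t)(source_port >> 8);    /* 0x00 */
        header[1] = (uint8_t)(source_port & 0xFF);  /* 0xB4 */

        printf("%02X %02X\n", header[0], header[1]);  /* prints "00 B4" */
        return 0;
    }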
Why?