I am trying to learn about integer representations in C, and I am having a lot of difficulty with the fact that some operations have undefined or implementation-defined behavior, meaning they are not consistent among systems. I find that many sources claiming to teach how these things work slip in details that are specific to the architecture they are coding for.
When I code, I intend never to rely on undefined behavior that happens to work a certain way on most processors.
What is the definitive truth about what happens in C (across all systems) when operations like truncation, extension, and comparison are performed, and when signed and unsigned types are mixed in casts and arithmetic? Which of these operations have behavior that is defined across all systems?
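To make this concrete, here is a small sketch of the kinds of operations I am asking about (the specific values and variable names are just for illustration):

```c
#include <stdio.h>

int main(void)
{
    /* Truncation: a wider value assigned to a narrower signed type.
       If the value does not fit, is the result guaranteed by the
       standard, implementation-defined, or undefined? */
    long big = 100000L;
    short truncated = (short)big;

    /* Extension: widening a negative signed char to int.
       Is sign extension guaranteed on every system? */
    signed char sc = -5;
    int widened = sc;

    /* Mixed signed/unsigned comparison: does s get converted to
       unsigned here, so that -1 compares greater than 1, and is
       that conversion mandated everywhere? */
    unsigned int u = 1;
    int s = -1;
    if (s > u)
        printf("-1 compared greater than 1u\n");

    printf("truncated=%d widened=%d\n", truncated, widened);
    return 0;
}
```

For each of the commented lines, is the outcome fully specified by the C standard, implementation-defined, or undefined?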