signed char c1 = 128;    //case 1: leads to IB
signed char c2 = 128.0f; //case 2: leads to UB

Questions:

  1. Why is there no consistency in behavior? The scenarios are the same except for the source type (int vs. float).
  2. What is the rationale for UB here? Why not IB, as in case 1?
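
For reference, below is a minimal sketch, assuming C11; the helper name `to_schar_checked` is invented for illustration and is not part of the original question. It shows how case 2 can be range-checked so the out-of-range float-to-integer conversion (undefined per 6.3.1.4p1) is never performed, while case 1 remains implementation-defined per 6.3.1.3p3.

#include <limits.h>
#include <stdio.h>

// Hypothetical helper (not from the question): convert float to signed char
// only when the truncated value is representable, so the undefined behavior
// of an out-of-range float-to-integer conversion (C11 6.3.1.4p1) is avoided.
// Returns 0 on success, -1 if the value is out of range (or NaN).
static int to_schar_checked(float f, signed char *out)
{
    // trunc(f) fits in signed char iff SCHAR_MIN - 1 < f < SCHAR_MAX + 1;
    // both comparisons are false for NaN, so NaN is rejected as well.
    if (f > (float)SCHAR_MIN - 1.0f && f < (float)SCHAR_MAX + 1.0f) {
        *out = (signed char)f; // in range: conversion is well-defined
        return 0;
    }
    return -1;                 // would have been UB if converted directly
}

int main(void)
{
    // Case 1: out-of-range int to signed char is implementation-defined
    // (or raises an implementation-defined signal) per C11 6.3.1.3p3.
    signed char c1 = (signed char)128;
    printf("c1 = %d\n", c1);   // commonly -128 on two's-complement targets

    // Case 2: guarded, so the UB of `signed char c2 = 128.0f;` is never reached.
    signed char c2;
    if (to_schar_checked(128.0f, &c2) != 0)
        puts("128.0f does not fit in signed char");

    return 0;
}

The bounds SCHAR_MIN - 1 and SCHAR_MAX + 1 with strict comparisons are used because the float-to-integer conversion truncates toward zero, so any value strictly between them has a representable integral part.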

Note: this is a follow-up question to:

  1. Type casting: double to char: multiple questions
  2. Assigning an unsigned value to a signed char
pmor
  • Is this the [same question](https://stackoverflow.com/questions/66589100/why-converting-out-of-range-integer-to-integer-leads-to-ib-but-converting-ou) previously asked? – Weather Vane Mar 14 '21 at 11:17
  • @WeatherVane Even if it is, I think that question was closed incorrectly. – HolyBlackCat Mar 14 '21 at 11:30
  • Your question can be generalised as "why are some things implementation-defined while others are undefined?" - I doubt it's reasonably answerable. It's mainly because of history, and based on the implementations, hardware behaviours, etc. when C was standardized. So "consistency" wasn't possible/practical (it's not as if the committee arbitrarily decided to classify certain behaviours as IB, UB, or unspecified). – P.P Mar 14 '21 at 11:42
  • @HolyBlackCat Yes, the [question](https://stackoverflow.com/questions/66589100/why-converting-out-of-range-integer-to-integer-leads-to-ib-but-converting-ou) was closed; however, it was _not_ about the definition of IB/UB. It was about the rationale for the absence of consistency. I had to ask a new question with a code sample, hoping that this one will not be closed. – pmor Mar 14 '21 at 12:09
  • @P.P Thanks, interesting. Yes, it seems to be _because of history_. However, I thought that ensuring (providing) consistency in behavior is one of the most important things when designing a language, meaning that as soon as you've memorized the behavior for some scenario involving one source type (e.g. `int`), you can be sure that the same behavior applies to the same scenario involving another source type (e.g. `float`). Otherwise (without consistency), you have to remember the exact behavior for each source type, which is somewhat tiresome. – pmor Mar 14 '21 at 12:15
  • @pmor the point is that C evolved, and at first there was no standard. Like life, technology is not pure, simple or always rational. – Weather Vane Mar 14 '21 at 14:08
  • From the point of view of the Standard, the only difference between Implementation-Defined Behavior and Undefined Behavior is whether implementations would be required to document the behavior *even in cases where assuring any kind of remotely-predictable behavior would be expensive and useless*. Some compiler writers behave as though there's a huge difference between the two constructs, but the Standard was never intended to discourage compiler writers from documenting and upholding useful behavioral guarantees when practical. – supercat Mar 14 '21 at 22:01

0 Answers