No. There is the possibility of undefined behaviour.
Here is a counter-example that produces UB when assigning a negated unsigned int to an int:
#include <iostream>
#include <limits>

int main() {
    unsigned u = (unsigned)std::numeric_limits<int>::max() - 1;
    std::cout << "max int: " << std::numeric_limits<int>::max() << '\n';
    std::cout << "as unsigned - 1: " << u << '\n';
    std::cout << "negated: " << -u << '\n';  // unsigned negation is well defined: it wraps modulo 2^N
    std::cout << std::boolalpha << ( std::numeric_limits<int>::max() < -u ) << '\n';  // max is converted to unsigned for this comparison
    int s = -u;  // -u holds a value too large for an int
    std::cout << s << '\n';
}
On my machine, int's max value is 2'147'483'647, but the negated unsigned int has a value of 2'147'483'650; that value is greater than the maximum value that can be represented by an int. Signed overflow is undefined behaviour, so the algorithm is not safe for all of its possible values.
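To see where 2'147'483'650 comes from: unary minus on an unsigned operand is well defined and wraps modulo 2^N. A minimal sketch, assuming a 32-bit unsigned int (common, but not guaranteed by the Standard):
// Assumption: unsigned int is 32 bits wide, so -u is computed as 2^32 - u.
static_assert(-2'147'483'646u == 2'147'483'650u,
              "negating 2'147'483'646 as a 32-bit unsigned wraps to 2'147'483'650");
static_assert(2'147'483'650u > 2'147'483'647u,
              "which exceeds the usual int maximum");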
The Standard's (2016-07-12: N4604) wording:
If during the evaluation of an expression, the result is not
mathematically defined or not in the range of representable values for
its type, the behavior is undefined. [ Note: Treatment of division by
zero, forming a remainder using a zero divisor, and all floating point
exceptions vary among machines, and is sometimes adjustable by a
library function. — end note ]
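A concrete instance of that rule is signed integer overflow: if the mathematical result of a signed operation is not representable in its type, evaluating the expression is undefined behaviour. A minimal sketch (the function name is just an illustration):
#include <limits>

int increment(int x) {
    // If x == std::numeric_limits<int>::max(), the result is not representable
    // in int, so evaluating this expression is undefined behaviour.
    return x + 1;
}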
In the future, you can use the {}-style initialization to prevent such issues:
#include <iostream>

int main() {
    unsigned a = 5;
    std::cout << -a << '\n';  // prints the wrapped unsigned value, not -5
    int b{ -a };              // compiler detects narrowing conversions, warning/error
    std::cout << b << '\n';
    return 0;
}
Note that even though you know the intended value of -a can be represented by an int, your compiler still warns you, because the conversion from unsigned to int is narrowing.
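If you need to perform such a conversion at run time, one approach is to check the range before converting. A minimal sketch, assuming C++17 for std::optional; the helper name to_int_checked is hypothetical, not a standard facility:
#include <limits>
#include <optional>

// Hypothetical helper: converts only when the value fits in an int.
std::optional<int> to_int_checked(unsigned v) {
    if (v <= static_cast<unsigned>(std::numeric_limits<int>::max()))
        return static_cast<int>(v);
    return std::nullopt;  // out of range; the caller decides how to handle it
}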
On signed overflow:
Is signed integer overflow still undefined behavior in C++?
On well defined unsigned overflow in both C and C++:
Why is unsigned integer overflow defined behavior but signed integer overflow isn't?
On implicit conversions:
http://en.cppreference.com/w/cpp/language/implicit_conversion