It seems like a bit of overkill to allocate a separate level just for a warning as opposed to an error. If a value is wrong, it's wrong; if it's acceptable, it's not. Personally I think these sorts of fuzzy decisions make code difficult to understand, because no one is ever quite sure what constitutes acceptable valid input. If you don't accept crap and throw exceptions instead, your code will probably be the better for it, removing the need for a warning level altogether.
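To illustrate what I mean, here is a minimal sketch of that fail-fast style (the Thermostat class and its range are made up for the example):

```java
// Hypothetical example: reject invalid input outright instead of
// logging a warning and limping along with a questionable value.
public final class Thermostat {

    private final double targetCelsius;

    public Thermostat(double targetCelsius) {
        // Fail fast: an out-of-range setting is simply an error.
        // There is no "warn and clamp" middle ground to reason about later.
        if (targetCelsius < 5.0 || targetCelsius > 30.0) {
            throw new IllegalArgumentException(
                "targetCelsius must be in [5.0, 30.0], got " + targetCelsius);
        }
        this.targetCelsius = targetCelsius;
    }
}
```

The caller either gets a valid object or a stack trace - there is nothing in between to warn about.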
There are plenty of other categories that see just as much logging and are probably more deserving of their own level - stuff like "config". Yet in the end most frameworks seem to have settled on error, warning, info, debug and trace, with variations thereof.
So how did warning survive when the other levels did not?