I have some validation code that checks a string against a regular expression.
// Requires: using System.Text.RegularExpressions;
// RegexPattern, value, and errorMessage come from the surrounding validation method.
Regex regex = new Regex(RegexPattern);
if (!regex.IsMatch(value))
{
    errorMessage = "The data is not in the correct format.";
    return false;
}
If I set the regular expression pattern to ^[0-9]*.[0-9]*.[0-9]*.[0-9]*$, it correctly accepts 1.0.0.0; however, it also accepts 1.0.0. (with a trailing dot).

How can I modify the pattern so that 1.0.0.0 is accepted but 1.0.0. is rejected?
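
For reference, here is a minimal, self-contained sketch that reproduces what I am seeing (the RegexRepro class name and the console output are just for illustration; my real code only calls IsMatch as shown above):

using System;
using System.Text.RegularExpressions;

class RegexRepro
{
    static void Main()
    {
        // The pattern currently used in my validation code.
        string pattern = "^[0-9]*.[0-9]*.[0-9]*.[0-9]*$";
        Regex regex = new Regex(pattern);

        Console.WriteLine(regex.IsMatch("1.0.0.0")); // True, as desired
        Console.WriteLine(regex.IsMatch("1.0.0."));  // Also True, which I want to reject
    }
}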