The following JavaScript produces the results shown in the comments:
var dat0 = new Date(2012, 9, 31);      // Oct 31
dat0.setMonth(dat0.getMonth() + 1);    // plus one month is Dec 1
var dat1 = new Date(2011, 0, 31);      // Jan 31 (non-leap year)
dat1.setMonth(dat1.getMonth() + 1);    // plus one month is Mar 3
var dat2 = new Date(2012, 0, 31);      // Jan 31 (leap year)
dat2.setMonth(dat2.getMonth() + 1);    // plus one month is Mar 2
var dat3 = new Date(2011, 11, 31);     // Dec 31
dat3.setMonth(dat3.getMonth() - 1);    // minus one month is Dec 1
var dat4 = new Date(2011, 10, 30);     // Nov 30
dat4.setMonth(dat4.getMonth() - 1);    // minus one month is Oct 30
alert(dat0 + "\n" + dat1 + "\n" + dat2 + "\n" + dat3 + "\n" + dat4);
I've tested this in IE 9 and Chrome and the results are the same. Can I therefore assume that all JavaScript implementations will produce the same results? In other words, are these results dictated by an agreed-upon standard?
(I'm not surprised that adding one month to Oct 31 results in Dec 1. But what I do find surprising is that adding one month to Jan 31 results in two different dates in March, with neither date being March 1st.)
See this jsFiddle.
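My guess is that setMonth simply keeps the day-of-month at 31, so the nonexistent "Feb 31" rolls forward by however many days February is short, the same way the Date constructor normalizes out-of-range values. Here is a quick check of that assumption (it confirms the two paths agree, not that the behavior is required, which is what I'm asking about):

// Assuming setMonth keeps the day-of-month and lets it overflow,
// these two should land on the same date:
var viaSetMonth = new Date(2011, 0, 31);    // Jan 31, 2011
viaSetMonth.setMonth(1);                    // "Feb 31" rolls forward
var viaConstructor = new Date(2011, 1, 31); // constructor normalizes "Feb 31" the same way
alert(viaSetMonth.toDateString() + "\n" + viaConstructor.toDateString()); // both show Mar 3, 2011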
Related Note: For those interested, I'm writing a closure designed to handle additional date math functionality beyond what the Date object provides (add/subtract months, months between two dates, is last day of month, is leap year, etc.). I know I could extend the Date object via prototypes, but there are reasons for not doing that. In any event, there is an addMonths function that, when adding one month to Jan 31, always returns the last day of February; adding one month to Oct 31 returns Nov 30.
You may try the code using this jsFiddle.
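For anyone curious, the clamping logic is roughly along these lines (a simplified sketch, not the actual fiddle code; the daysInMonth helper exists only for this illustration):

function daysInMonth(year, month) {
    // Helper for this sketch only: day 0 of the following month
    // is the last day of the requested month.
    return new Date(year, month + 1, 0).getDate();
}

function addMonths(date, months) {
    var result = new Date(date.getTime());
    var day = result.getDate();
    result.setDate(1);                           // avoid day-of-month overflow while shifting
    result.setMonth(result.getMonth() + months);
    // Clamp to the last day of the target month (e.g. Jan 31 + 1 month = Feb 28/29).
    result.setDate(Math.min(day, daysInMonth(result.getFullYear(), result.getMonth())));
    return result;
}

alert(addMonths(new Date(2012, 0, 31), 1) + "\n" +  // Feb 29, 2012
      addMonths(new Date(2012, 9, 31), 1));         // Nov 30, 2012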
Comments, questions, or suggestions are welcome.