I have written code to calculate the number of days between two dates in the following way:
SimpleDateFormat date = new SimpleDateFormat("yyyy-MM-dd");
Date date1 = date.parse(startDate);
Date date2 = date.parse(endDate);
long difference = (date2.getTime() - date1.getTime()) / (24 * 60 * 60 * 1000);
My job is to find out whether the term between startDate and endDate is exactly one year.
After calculating the difference between the dates, I first check the year type (leap year or normal year):
if it is a normal year, I check whether the difference is 365;
if it is a leap year, I check whether the difference is 366.
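For reference, here is a minimal, self-contained sketch of the check I described (class and method names here are mine, not from my actual code; the expected count is taken as 366 when the end date's year is a leap year and its month falls after February):

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;
import java.util.GregorianCalendar;

public class OneYearCheck {

    // Expect 366 days when the end date's year is a leap year and its month
    // is after February, otherwise expect 365 days.
    static boolean isExactlyOneYear(String startDate, String endDate) throws ParseException {
        SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd");
        Date date1 = format.parse(startDate);
        Date date2 = format.parse(endDate);

        // Same day calculation as above: integer division of milliseconds.
        long difference = (date2.getTime() - date1.getTime()) / (24L * 60 * 60 * 1000);

        Calendar end = Calendar.getInstance();
        end.setTime(date2);
        boolean leapEndYear = new GregorianCalendar().isLeapYear(end.get(Calendar.YEAR));
        boolean afterFebruary = end.get(Calendar.MONTH) > Calendar.FEBRUARY;

        long expected = (leapEndYear && afterFebruary) ? 366 : 365;
        return difference == expected;
    }
}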
But for a few dates in November, say
startDate = 11/04/2015 (MM/dd/yyyy) and endDate = 11/04/2016 (MM/dd/yyyy),
the difference is calculated as 365 where the code expects 366, since the endDate year 2016 is a leap year and the end date's month is after February.
What is really happening is that the raw, untruncated difference we get is about 365.9, not 366.0, so the integer division brings it down to 365 (see the small demo after the list of dates below).
As far as I have observed, this happens only for a few dates in November:
11/02/2015 to 11/02/2016, 11/03/2015 to 11/03/2016, 11/04/2015 to 11/04/2016, 11/05/2015 to 11/05/2016, 11/06/2015 to 11/06/2016.
For the remaining dates I see a difference of 366.0.
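Here is a minimal demo isolating the calculation for one of these date pairs (class name is mine; I have not hard-coded a timezone, so it uses the JVM default, and the exact output apparently depends on that setting):

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DiffDemo {
    public static void main(String[] args) throws ParseException {
        SimpleDateFormat format = new SimpleDateFormat("MM/dd/yyyy");
        Date date1 = format.parse("11/04/2015");
        Date date2 = format.parse("11/04/2016");

        // Raw difference as a fractional number of days (no integer truncation).
        double days = (date2.getTime() - date1.getTime()) / (24.0 * 60 * 60 * 1000);
        System.out.println(days);       // prints roughly 365.9 for me, not 366.0

        // The value my original code produces after integer division.
        long truncated = (date2.getTime() - date1.getTime()) / (24L * 60 * 60 * 1000);
        System.out.println(truncated);  // prints 365
    }
}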
My question is: why are we seeing this peculiar behavior for only these few dates? What is the problem with date.getTime() when it always returns the milliseconds elapsed since January 1, 1970, 00:00:00 GMT?