I've written JavaScript code that loops through an array containing information about certain restaurants. When applicable, the loop checks the distance between the user's device and the restaurant, and modifies the restaurant's object in the array to include the computed distance.
In the code below, we loop through the array, and if we determine that the restaurant is a chain (i.e. it has no coordinates available) we skip it. Otherwise, we determine the distance between the user's zip code and the restaurant's known coordinates using the Google Maps JavaScript API.
for (var i = 0; i < restaurantMetadata.length; i++) {
    console.log(i);
    var rst = restaurantMetadata[i];
    if (!rst["chain"]) { // chains have no coordinates, so skip them
        console.log("getting distance");
        var zip = window.localStorage.getItem("zipcode");
        // rstLat and rstLon hold this restaurant's coordinates (set elsewhere)
        var coords = new google.maps.LatLng(rstLat, rstLon);
        var distanceService = new google.maps.DistanceMatrixService();
        distanceService.getDistanceMatrix({
            origins: [zip],
            destinations: [coords],
            travelMode: "DRIVING"
        }, function(result) {
            // distance is returned in metres; convert to km, rounded to one decimal
            restaurantMetadata[i]["distance"] = Math.round((result.rows[0].elements[0].distance.value / 1000) * 10) / 10;
        });
    }
}
The distance calculation seems to be working fine, but I get an error after the result is received:
TypeError: restaurantMetadata[i] is undefined
Logging i to the console, as seen on line 2 above, shows 0 1 2 3 as it should. Later, though, when I log it inside the result callback, it gets as high as 4, twice in a row. Is there a reason for this behavior? At first I thought i was somehow being changed, but I can't find any reason for that to happen.
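For what it's worth, the same behavior shows up without the Maps API at all. Here is a minimal sketch, using setTimeout as a stand-in for the asynchronous getDistanceMatrix callback (the delay of 0 is arbitrary):

for (var i = 0; i < 4; i++) {
    setTimeout(function() {
        // by the time this runs, the loop has already finished
        console.log(i); // logs 4 four times, not 0 1 2 3
    }, 0);
}

Inside the callback, i is never the value it had when the callback was queued.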