The following code snippet (the full code) has already been through a few memory-related optimizations: the initial "bad" approach used plain recursion and caused a typical stack overflow, which in turn made the Docker container where the code runs fail with out-of-memory errors.
The next step was to change the recursion into a simple while (true) loop, which did work but wasn't as elegant as I wanted, so I finally switched back to the recursive approach, keeping the invocation of the recursed transfer function as a return statement on the last line. A work colleague told me that Node.js implements an optimization for such scenarios which discards all other memory usage (I don't remember the exact name of the feature; tail call optimization, perhaps?).
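For reference, the loop-based variant looked roughly like this (a simplified sketch rather than the exact code I had; it relies on the same constants and helpers as the full snippet below):

async function transfer () {
  let backoff = DEFAULT_BACKOFF
  while (true) {
    if (backoff > MAX_BACKOFF) {
      // reset backoff
      backoff = DEFAULT_BACKOFF
    }
    // ... same fetch / response handling as in the recursive
    // version below, adjusting backoff accordingly ...
    await sleep(backoff)
  }
}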
The problem with the current recursive approach is that, when debugging/profiling with Chrome's built-in DevTools, I still see the memory ramping up gradually, slowly but definitely going up, so I'm not sure what the next step on this path would be. All suggestions are appreciated.
async function transfer (backoff) {
  if (backoff > MAX_BACKOFF) {
    // reset backoff
    backoff = DEFAULT_BACKOFF
  }
  const response = await fetch(`${INBOX_URL}/next`, {
    method: HTTP_POST_METHOD,
    headers: {
      'Content-type': JSON_CONTENT_TYPE,
      Accept: JSON_CONTENT_TYPE
    },
    body: JSON.stringify({
      uid: AGNI_AGENT
    })
  })
  if (response.status === 502) {
    // A 502 (Bad Gateway) here usually means the connection was
    // pending for too long and the remote server or a proxy
    // closed it, so let's just reconnect
    backoff = DEFAULT_BACKOFF
  } else if (response.status !== 200) {
    // An unexpected error - let's show it
    console.log(`Non-200 response from AGNI: ${response.status} ${response.statusText}`)
    backoff *= getFactor()
  } else {
    // Get and publish the message
    const message = await response.text()
    if (message && message !== 'null') {
      publish(message)
      backoff = DEFAULT_BACKOFF
    } else {
      backoff *= getFactor()
    }
  }
  await sleep(backoff)
  return transfer(backoff)
}
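For completeness, sleep is the usual promise-based delay, and transfer is kicked off once at startup, roughly as follows (the initial backoff value and the error handling shown here are illustrative, not the exact production code):

// Promise-based delay used to wait between polls
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms))

// Start the long-polling loop once at startup (illustrative)
transfer(DEFAULT_BACKOFF).catch((err) => {
  console.error('transfer loop stopped with an error:', err)
})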