This article explains how you can execute a pipeline written in Java via a Cloud Function. However, I'm trying to accomplish the same thing with a pipeline written in Python.
I'm able to do this successfully when executing the Cloud Function locally, using a virtualenv environment for Python. This is before the Function is packaged up as a zip:
exports.foo = function(event, callback) {
  var spawn = require('child_process').spawn;

  // Launch the pipeline with the Python interpreter bundled in the virtualenv.
  var child = spawn(
    'ENV/bin/python',
    ["pipeline.py",
     "--project $PROJECT_ID",
     "--temp_location gs://$BUCKET/temp",
     "--staging_location gs://$BUCKET/staging",
     "--runner DataflowRunner"],
    {cwd: __dirname}
  );

  child.stdout.on('data', (data) => {
    console.log(`stdout: ${data}`);
  });
  child.stderr.on('data', (data) => {
    console.log(`stderr: ${data}`);
  });
  child.on('close', (code) => {
    console.log(`child process exited with code ${code}`);
    callback();
  });
};
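One detail worth noting about the snippet above: `child_process.spawn` does not invoke a shell, so placeholders like `$PROJECT_ID` inside an argument string are passed to the child verbatim rather than expanded. A sketch of building the argument list from `process.env` instead (the `PROJECT_ID` and `BUCKET` variable names are my assumption; use whatever your Function's environment actually defines):

```javascript
// spawn() passes arguments to the child as-is, with no shell expansion,
// so "$PROJECT_ID" would reach pipeline.py literally. Interpolate the
// values from the Node process environment instead.
function dataflowArgs(env) {
  return [
    'pipeline.py',
    `--project=${env.PROJECT_ID}`,
    `--temp_location=gs://${env.BUCKET}/temp`,
    `--staging_location=gs://${env.BUCKET}/staging`,
    '--runner=DataflowRunner',
  ];
}

// Usage: spawn('ENV/bin/python', dataflowArgs(process.env), {cwd: __dirname})
const args = dataflowArgs(process.env);
```

Using the `--flag=value` form also keeps each option in a single argv entry, which Python's argument parsing handles cleanly.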
However, when I actually deploy the Function to GCP and run it there, the pipeline never executes.
Any insight on this would be appreciated.
Below is from the logs when running a deployed Function:
D foo vxvt93uc415v 2017-03-05 00:56:43.639 Function execution started
D foo vxvt93uc415v 2017-03-05 00:56:57.945 Function execution took 14308 ms, finished with status: 'ok'
UPDATE:
There was an error that I wasn't logging correctly:
ENV/bin/python is not a supported ELF or interpreter script
I've reached out to the Cloud Functions team who then filed a bug report.