I am working on an online compiler. I have a pod that receives POST requests containing code; this pod then creates a Kubernetes Job, passing the code along so it can be compiled, and processes the output.
So far I have managed to create the Job via kubernetes-client for JavaScript, since my app is Node.js based. This is what I am doing right now, more as a mockup.
This is the file that receives the request and initiates the job:
import express, { Request, Response } from 'express';
import { SubmissionCreatedPublisher } from '../events/publishers/submission-created-publisher';
import { natsWrapper } from '../nats-wrapper';
import { createCompilerJob } from '../jobs/create';

const router = express.Router();

router.post('/api/submission', async (req: Request, res: Response) => {
  const { code } = req.body;
  res.status(200).send(code);

  await new SubmissionCreatedPublisher(natsWrapper.client).publish({
    code: code,
  });

  const compilerJob = new createCompilerJob();
  compilerJob.create(code);
});

export { router as createRouter };
This is the file that creates the Job. For now the Docker image only contains a file.js with a single console.log('this is a line') line, since I am testing this out. I am aware I need to pass the code variable to the container's command line so it can write the code to the .js file and execute it.
import * as k8s from '@kubernetes/client-node';

export class createCompilerJob {
  protected kubeConfig: k8s.KubeConfig;

  constructor() {
    this.kubeConfig = new k8s.KubeConfig();
  }

  async create(input: string) {
    this.kubeConfig.loadFromCluster();
    const batchV1Api = this.kubeConfig.makeApiClient(k8s.BatchV1Api);

    try {
      // Awaiting here means the catch block below actually sees API errors;
      // a synchronous try/catch around a floating promise would not.
      await batchV1Api.createNamespacedJob('default', {
        apiVersion: 'batch/v1',
        kind: 'Job',
        metadata: {
          name: 'compiler-job',
        },
        spec: {
          ttlSecondsAfterFinished: 20,
          template: {
            metadata: {
              name: 'compiler-job',
            },
            spec: {
              containers: [{
                image: 'woofboy/compiler',
                name: 'compiler',
                command: ['node', 'file.js'],
              }],
              restartPolicy: 'OnFailure',
            },
          },
        },
      });
    } catch (error) {
      console.log(error);
    }
  }
}
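Since the submitted code eventually has to reach the container anyway, one option (a sketch, not the only way) is to skip the baked-in file.js entirely and pass the submission through the container command with `node -e`, which executes its string argument as a script. The helper name `buildJobManifest` below is mine, not part of the @kubernetes/client-node API:

```typescript
// Hypothetical helper: builds a Job manifest that runs the submitted
// code directly via `node -e`, so no file needs to be baked into the image.
export function buildJobManifest(code: string, jobName = 'compiler-job') {
  return {
    apiVersion: 'batch/v1',
    kind: 'Job',
    metadata: { name: jobName },
    spec: {
      ttlSecondsAfterFinished: 20,
      backoffLimit: 0, // do not retry user code on failure
      template: {
        metadata: { name: jobName },
        spec: {
          containers: [{
            image: 'woofboy/compiler',
            name: 'compiler',
            // `node -e` runs the string as a script, so the submission
            // travels inside the Job spec itself.
            command: ['node', '-e', code],
          }],
          restartPolicy: 'Never',
        },
      },
    },
  };
}
```

`create(input)` could then call `batchV1Api.createNamespacedJob('default', buildJobManifest(input))`. Note this puts untrusted user code straight into the manifest, so you would still want resource limits and sandboxing on the Job's pod.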
This is the Dockerfile used for the image.
FROM node:alpine
WORKDIR /app
RUN echo "console.log('this is a line')" > file.js
I am out of ideas on how to pass the output of the Job back to my Node.js app pod. What I am thinking of is using a PV that both the Job and the pod can access, and letting the Job write its output to a file on that shared PV. The pod would then check the Job's completion status and read the output from the file on the PV.
Is this the correct approach, or is there a better way? As far as I am aware, we can only get the logs of the Job, which I don't think is the right way to filter and find the correct output / stdout / stderr.
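For what it is worth, the log route may be more workable than it sounds: every pod a Job creates carries the label `job-name=<job name>`, so the app pod can poll the Job's status and then fetch logs for exactly that pod via `CoreV1Api.readNamespacedPodLog`, with no filtering guesswork. A minimal, hypothetical completion check (the helper name and the trimmed-down status types are mine, mirroring only the fields of `V1JobStatus` that matter here):

```typescript
// Minimal shape of the fields we need from the Job's status block.
interface JobCondition { type: string; status: string; }
interface JobStatus { conditions?: JobCondition[]; }

// A Job signals success through a condition of type 'Complete'
// whose status is 'True'; anything else means still running or failed.
export function isJobComplete(status?: JobStatus): boolean {
  return (status?.conditions ?? []).some(
    (c) => c.type === 'Complete' && c.status === 'True'
  );
}
```

The polling loop would read the Job with `batchV1Api.readNamespacedJobStatus('compiler-job', 'default')`, call `isJobComplete` on the result, and only then list pods with the label selector `job-name=compiler-job` and read their logs (or, in the PV variant, read the shared file instead). Since the compiler container only prints the submission's stdout/stderr, its log is the output.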