
I would like to use the "input step" of Jenkins to upload a binary file to the current workspace.

However, the code below seems to upload the file to the Jenkins master, not to the workspace of the current job on the slave where the job is running. Is there any way to fix that?

Preferably without having to add an executor on the master or clutter the master disk with files.

def inFile = input id: 'file1', message: 'Upload a file', parameters: [file(name: 'data.tmp', description: 'Choose a file')]
Timmy Brolin

1 Answer


It seems Jenkins doesn't officially support uploading a binary file yet, as you can see in JENKINS-27413. You can still make use of the input step to get a binary file into your workspace. We will be using a method to get this working, but we will not call it directly inside the Jenkinsfile, otherwise we will run into errors related to In-process Script Approval. Instead, we will put it in a Global Shared Library, which is considered one of Jenkins' best practices.

Please follow these steps:

1) Create a shared library

  • Create a repository test-shared-library
  • Create a directory named vars in the above repository. Inside the vars directory, create a file copy_bin_to_wksp.groovy with the following content:

def inputGetFile(String savedfile = null) {
    def filedata = null
    def filename = null
    // Get file using input step, will put it in build directory
    // the filename will not be included in the upload data, so optionally allow it to be specified

    if (savedfile == null) {
        def inputFile = input message: 'Upload file', parameters: [file(name: 'library_data_upload'), string(name: 'filename', defaultValue: 'demo-backend-1.0-SNAPSHOT.jar')]
        filedata = inputFile['library_data_upload']
        filename = inputFile['filename']
    } else {
        def inputFile = input message: 'Upload file', parameters: [file(name: 'library_data_upload')]
        filedata = inputFile
        filename = savedfile
    }

    // Read contents and write to workspace
    writeFile(file: filename, encoding: 'Base64', text: filedata.read().getBytes().encodeBase64().toString())
    // Remove the file from the master to avoid stuff like secret leakage
    filedata.delete()
    return filename
}
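
The upload created by the input step lands in the build directory on the Jenkins master under the fixed parameter name library_data_upload. The comments at the end of this answer mention a possible race condition when several runs upload at the same time; if that is a concern in your setup, a hedged variant along the following lines could generate a unique parameter name per call (the helper name inputGetFileUnique and the UUID suffix are illustrative, not part of the original answer; it would live next to inputGetFile in vars/copy_bin_to_wksp.groovy):

// Sketch only: same technique as inputGetFile above, but with a per-call
// unique upload parameter name to reduce the chance of two concurrent
// uploads colliding on the master.
def inputGetFileUnique(String savedfile) {
    // Unique parameter name -> unique temporary file name on the master
    def paramName = "upload_${UUID.randomUUID().toString().replace('-', '')}"
    def filedata = input message: 'Upload file', parameters: [file(name: paramName)]

    // Copy the upload into the workspace under the requested name
    writeFile(file: savedfile, encoding: 'Base64',
              text: filedata.read().getBytes().encodeBase64().toString())
    // Remove the temporary copy from the master
    filedata.delete()
    return savedfile
}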

2) Configure Jenkins for accessing Shared Library in any pipeline job

  • Go to Manage Jenkins » Configure System » Global Pipeline Libraries section
  • Name the library whatever you want (in my case, my-shared-library)
  • Keep the Default version as master (this is the branch where I pushed my code)
  • No need to check/uncheck the checkboxes unless you know what you're doing (if you prefer to configure this from code instead of the UI, see the sketch after this list)
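
If you would rather set this up from code than through the UI, an init script dropped into $JENKINS_HOME/init.groovy.d/ should achieve the same result. This is a hedged sketch: it assumes a reasonably recent Git plugin (which provides the single-argument GitSCMSource constructor) and the Pipeline Shared Groovy Libraries plugin, and the repository URL is a placeholder you would replace with your own:

// Sketch of configuring the Global Pipeline Library from an init script
// instead of the Manage Jenkins UI.
import jenkins.plugins.git.GitSCMSource
import org.jenkinsci.plugins.workflow.libs.GlobalLibraries
import org.jenkinsci.plugins.workflow.libs.LibraryConfiguration
import org.jenkinsci.plugins.workflow.libs.SCMSourceRetriever

// Placeholder URL: point it at your test-shared-library repository
def scm = new GitSCMSource('https://github.com/your-org/test-shared-library.git')
def library = new LibraryConfiguration('my-shared-library', new SCMSourceRetriever(scm))
library.defaultVersion = 'master'   // branch where the shared-library code lives

GlobalLibraries.get().libraries = [library]
GlobalLibraries.get().save()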


3) Access shared library in your job

  • In your Jenkinsfile, add the following code:

@Library('my-shared-library@master') _

node {
   // Use any file name in place of demo-backend-1.0-SNAPSHOT.jar that I have used below
   def file_in_workspace = copy_bin_to_wksp.inputGetFile('demo-backend-1.0-SNAPSHOT.jar')
   sh "ls -ltR"
}
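
The snippet above is a scripted pipeline. If your Jenkinsfile is declarative, the def assignment has to sit inside a script block, otherwise the pipeline fails with an "Expected a step" error (see the last comment below). A minimal hedged sketch, with the stage and agent choices being illustrative:

@Library('my-shared-library@master') _

pipeline {
    agent any
    stages {
        stage('Fetch upload') {
            steps {
                // Arbitrary Groovy like 'def' is only allowed inside a script
                // block in declarative pipelines
                script {
                    def file_in_workspace = copy_bin_to_wksp.inputGetFile('demo-backend-1.0-SNAPSHOT.jar')
                    sh "ls -l ${file_in_workspace}"
                }
            }
        }
    }
}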


You're all set to run the job. :)


Note:

  • Make sure Script Security plugin is always up-to-date
  • How are Shared Libraries affected by Script Security?
  • Global Shared Libraries always run outside the sandbox. These libraries are considered "trusted:" they can run any methods in Java, Groovy, Jenkins internal APIs, Jenkins plugins, or third-party libraries. This allows you to define libraries which encapsulate individually unsafe APIs in a higher-level wrapper safe for use from any Pipeline. Beware that anyone able to push commits to this SCM repository could obtain unlimited access to Jenkins.
  • Folder-level Shared Libraries always run inside the sandbox. Folder-based libraries are not considered "trusted:" they run in the Groovy sandbox just like typical Pipelines.

Code Reference: James Hogarth's comment

Technext
  • Your answer seems to be about how to pass a binary file as a parameter when starting a job. That is unfortunately not what I asked about. I need to pass a binary file to a running job in the "input step". See: https://jenkins.io/doc/pipeline/steps/pipeline-input-step/ – Timmy Brolin Aug 19 '19 at 07:19
  • @TimmyBrolin: Try the updated answer and see if it works for you. – Technext Aug 19 '19 at 09:44
  • Thank you. It works. But I think I can see two small issues. 1: There is a race condition for the file on the master disk, if multiple jobs run at the same time. This can probably be fixed by randomizing the filename on the master. 2: Some cluttering with temporary files on the master disk, and potential security concerns since the scripts now has permission to access the master disk. Perhaps there is no way around this? – Timmy Brolin Aug 20 '19 at 08:28
  • 1) Yes, you will have to randomize file names to avoid race-condition conflicts. 2) If you're referring to temporary files created due to step 1, then that can be taken care of within the script itself, I guess. For the potential risk issue, as mentioned in [this](https://stackoverflow.com/questions/38276341/jenkins-ci-pipeline-scripts-not-permitted-to-use-method-groovy-lang-groovyobject#comment85789440_39412951) comment and even as a best practice, a `shared library` can be used. Please see my updated approach. – Technext Aug 20 '19 at 12:06
  • Hi, I'm not sure where to put the `file_in_workspace` as I need to access it in the shell script afterwards, however, I got error like this: ```WorkflowScript: 10: Expected a step @ line 10, column 21. def file_in_workspace = copy_bin_to_wksp.inputGetFile('demo-backend-1.0-SNAPSHOT.jar')``` – Zennichimaro Feb 08 '21 at 09:49