
I deploy Azure Functions using YAML pipelines.

For various reasons, I do not put all of my package requirements in requirements.txt, as that file is used by other processes. Yet, when deploying my web app through a YAML pipeline, I want to install additional Python packages. However, the resulting app crashes with errors saying those packages are not installed.

My pipeline:

# Python to Linux Web App on Azure
# Build your Python project and deploy it to Azure as a Linux Web App.
# Change python version to one thats appropriate for your application.
# https://learn.microsoft.com/azure/devops/pipelines/languages/python

trigger:
- master

variables:
  # Azure Resource Manager connection created during pipeline creation
  azureServiceConnectionId: .........................

  # Web app name
  webAppName: .........................

  # Agent VM image name
  vmImageName: 'ubuntu-latest'

  # Environment name
  environmentName: .........................

  # Project root folder. Point to the folder containing manage.py file.
  projectRoot: $(System.DefaultWorkingDirectory)

  # Python version: 3.8
  pythonVersion: '3.8'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: BuildJob
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '$(pythonVersion)'
      displayName: 'Use Python $(pythonVersion)'

    - script: |
        python -m venv antenv
        source antenv/bin/activate
        python -m pip install --upgrade pip
        pip install setup
        pip install -r requirements.txt
        pip install pyodbc
      workingDirectory: $(projectRoot)
      displayName: "Install requirements"

    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(projectRoot)'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true

    - upload: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      displayName: 'Upload package'
      artifact: drop

- stage: Deploy
  displayName: 'Deploy Web App'
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: DeploymentJob
    pool:
      vmImage: $(vmImageName)
    environment: $(environmentName)
    strategy:
      runOnce:
        deploy:
          steps:

          - task: UsePythonVersion@0
            inputs:
              versionSpec: '$(pythonVersion)'
            displayName: 'Use Python version'

          - task: AzureWebApp@1
            displayName: 'Deploy Azure Web App : .........................'
            inputs:
              azureSubscription: $(azureServiceConnectionId)
              appName: $(webAppName)
              package: $(Pipeline.Workspace)/drop/$(Build.BuildId).zip
              startUpCommand: 'startup-nocontainer.sh'

Here, pyodbc is installed separately in the pipeline and is not present in requirements.txt. Checking the pipeline logs, I can see that the installation completed successfully. Yet when the app starts, it crashes at the first import of pyodbc, as if the deployed app only relied on requirements.txt.
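For reference, a diagnostic step like the one below (hypothetical, not part of my actual pipeline) is what I would use to confirm that pyodbc really lands in antenv during the build:

    - script: |
        source antenv/bin/activate
        pip show pyodbc
        pip freeze
      workingDirectory: $(projectRoot)
      displayName: "List packages installed in antenv (diagnostic)"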

Any idea?

scd75

1 Answer


I think pyodbc is not being installed in the venv you create. The venv contains only the packages you specify in requirements.txt.

Check the solutions proposed in this old post.
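As a sketch only (assuming the deployed environment really is rebuilt from requirements.txt), you could also have the pipeline append the extra package to requirements.txt before the archive step, so that whatever requirements.txt-based install runs on the App Service side picks it up; the step name below is illustrative:

    - script: |
        echo "pyodbc" >> requirements.txt
      workingDirectory: $(projectRoot)
      displayName: 'Append extra packages to requirements.txt (workaround sketch)'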

  • Well, here both the packages referenced in requirements.txt and pyodbc are installed in the very same way, as you can see in the YAML, and the venv has been created and activated: python -m venv antenv; source antenv/bin/activate; python -m pip install --upgrade pip; pip install setup; pip install -r requirements.txt; pip install pyodbc – scd75 Jul 27 '21 at 15:43