
I am trying to deploy Python code to Azure Functions. The function downloads some files from an FTP server, saves them in the function app directory on the host machine, and uploads them to an Azure SQL database.

The first part works correctly; the files are saved in the same directory as the main python file.

When the SQLAlchemy methods are invoked (specifically create_engine()), a ModuleNotFoundError: No module named 'pyodbc' is thrown (the failing call inside SQLAlchemy is return __import__('pyodbc')).

When I import pyodbc directly, the same ModuleNotFoundError is thrown.

The code works fine locally on my machine.

I appended the path to the modules in the code, which is why SQLAlchemy loads (and other modules too, like feedparser).
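To narrow down where the import fails, something like the following diagnostic (not part of the function itself, just a debugging sketch) can be logged from inside the function app; the site-packages path matches the one appended in the code below:

import importlib.util
import os
import sys

# Debugging sketch: report which interpreter the Functions host actually
# runs and whether pyodbc can be located once the virtualenv's
# site-packages folder is on sys.path.
site_packages = os.path.abspath(
    os.path.join(os.path.dirname(__file__), 'env/Lib/site-packages'))
if site_packages not in sys.path:
    sys.path.append(site_packages)

print("interpreter:", sys.executable)
print("version:", sys.version)
print("pyodbc spec:", importlib.util.find_spec("pyodbc"))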

I installed the modules through the Kudu console following this tutorial (tl;dr: create a venv and install with pip):

https://github.com/yokawasa/azure-functions-python-samples/blob/master/docs/install-python-modules.md

OS of host machine: Microsoft Windows NT 10.0.14393.0 (64-bit)

# -*- coding: utf-8 -*-
import sys
import os

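# make the packages installed in the function's virtualenv importable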
sys.path.append(os.path.abspath(os.path.join(os.path.dirname( __file__ ), 'env/Lib/site-packages')))

from ftplib import FTP
from ast import literal_eval
import csv
import json
from sqlalchemy import create_engine, MetaData, Table
import pyodbc


def post_processing(input, output, csv_counter, header):
    '''Removes unnecessary headers from the csv files and writes the data
    into a csv file'''
    # code...
    return csv_counter, header


def download():
    # first gets all the dates as the base for the full paths
    # afterwards build each possible path to a csv file
    # structure or dimensions of folders has to be known in advance
    # advantage: specific folders are not hardcoded
    # code...

    return response


def upload():
    postreqdata = json.loads(open(os.environ['req']).read())
    response = open(os.environ['res'], 'w')
    response.write("Finito" + postreqdata["name"])
    response.close()

    try:
        # dev settings
        f = open("connection\\con_info_sink.txt", "r")
        connection_info = literal_eval(f.read())
        f.close()
    except FileNotFoundError:
        # prod settings
        connection_info = {
            "db_user": os.environ["DB_USER"],
            "db_password": os.environ["DB_PASSWORD"],
            "db_name": os.environ["DB_NAME"],
            "db_host": os.environ["DB_HOST"],
        }

    engine = create_engine(
        "mssql+pyodbc://{db_user}:{db_password}@{db_host}/{db_name}?driver=ODBC+Driver+13+for+SQL+Server".format(**connection_info),
        convert_unicode=True)

    # orm mapping
    metadata = MetaData(engine)
    pcards = Table("pcardlog", metadata, autoload=True, schema='staging')

    col_names_pcard = pcards.columns.keys()
    fakedata_pcard = []

    with open('source.csv', newline='') as csvfile:
        reader = csv.reader(csvfile, delimiter=';')
        for row in reader:
            fakedata_pcard.append(
                {col_names_pcard[0]: row[0],
                    col_names_pcard[1]: row[1],
                    col_names_pcard[2]: row[2],
                    col_names_pcard[3]: row[3],
                    col_names_pcard[4]: row[4],
                    col_names_pcard[5]: row[5],
                    col_names_pcard[6]: row[6],
                    })

    con = engine.connect()
    con.execute(pcards.insert(), fakedata_pcard)

    return response


if __name__ == "__main__":
    response = download()
    response = upload()
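Side note: because the engine URL above embeds the driver name directly, a variant that builds the URL from a raw ODBC connection string via odbc_connect is easier to adapt when the driver name changes between hosts and avoids manual URL escaping of the password. This is only a sketch with a hypothetical make_engine helper, reusing the connection_info keys from upload(); the driver name is just the one from my current Windows setup:

from urllib.parse import quote_plus

from sqlalchemy import create_engine


def make_engine(connection_info, driver="ODBC Driver 13 for SQL Server"):
    # Build a raw ODBC connection string and pass it through odbc_connect,
    # so the driver name and password need no manual URL escaping.
    odbc_str = (
        "DRIVER={{{driver}}};SERVER={db_host};DATABASE={db_name};"
        "UID={db_user};PWD={db_password}"
    ).format(driver=driver, **connection_info)
    return create_engine("mssql+pyodbc:///?odbc_connect=" + quote_plus(odbc_str))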

I expected pyodbc to load without problems, as I have already deployed a web app on Azure once; there I installed pyodbc with wheelhouse directly from the binaries.

I tried installing pyodbc from wheels for this project too, but that changed nothing.

Update 1: I found this post on Stack Overflow stating that installing custom drivers is not possible on Azure Function Apps.

ODBC or OleDB database drivers under Azure App Service

Update 2: I stumbled upon a GitHub issue for the Azure Functions CLI

https://github.com/Azure/azure-functions-python-worker/issues/249#issuecomment-452478272

and tried deploying the function in a Linux environment with Docker; pyodbc finally loaded. However, the driver specification for SQLAlchemy is now wrong.
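The Windows driver name in the connection URL does not match anything on the Linux image. Assuming the Microsoft ODBC driver package msodbcsql17 is installed in the Docker image (so that it is registered as "ODBC Driver 17 for SQL Server" in odbcinst.ini), presumably only the driver parameter in the URL needs to change, e.g.:

# Assumption: msodbcsql17 is installed in the Docker image; create_engine
# and connection_info are the same as in upload() above.
engine = create_engine(
    ("mssql+pyodbc://{db_user}:{db_password}@{db_host}/{db_name}"
     "?driver=ODBC+Driver+17+for+SQL+Server").format(**connection_info))

I have not verified this on the Docker deployment yet.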

I will deploy the function as an Azure Web App instead, as I know it will work there. Azure Functions with Python is in preview, so the functionality might change in the future.

  • Does the module show up when you try `python -m pip list`? – mugabits Jan 08 '19 at 21:31
  • @jmugz3 Yes it does. In fact, SQLAlchemy does too and is imported without issues. `py -V` reports Python 3.6.6, and `py -m pip list` shows: feedparser 5.2.1, pip 18.1, pymssql 2.1.4, pyodbc 4.0.25, setuptools 39.0.1, SQLAlchemy 1.2.15 – Alexander Dadiani Jan 08 '19 at 22:47
