I'm trying to write my own log files to Azure Data Lake Gen 2 from a Python notebook in Databricks, using the Python logging module.
Unfortunately I can't get it working. No errors are raised, the folders are created, but no file with logging content is created. Even when the file already exists, nothing is written to it.
A local Python script works just fine, but I can't get it working in Databricks.
Here is my code:
# mount
if not any(mount.mountPoint == '/mnt/log' for mount in dbutils.fs.mounts()):
    dbutils.fs.mount(
        source = "abfss://log@datalake.dfs.core.windows.net/",
        mount_point = "/mnt/log",
        extra_configs = configs)
# vars
folder_log = '/mnt/log/test/2019'
file_log = '201904.log'
# add folder if not existent
dbutils.fs.mkdirs(folder_log)
# setup logging
import logging
logging.basicConfig(
    filename=folder_log+'/'+file_log,
    format='%(asctime)s | %(name)s | %(levelname)s | %(message)s',
    datefmt='%Y-%m-%d %H:%M:%S UTC (%z)',
    level=logging.NOTSET
)
# test
logging.info('Hello World.')
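For reference, that basicConfig call should be roughly equivalent to wiring up a FileHandler on the root logger by hand. A minimal sketch of that explicit setup (the handler/root variable names are mine, and the extra flush() is only there to rule out buffering):

import logging

handler = logging.FileHandler(folder_log + '/' + file_log)
handler.setFormatter(logging.Formatter(
    fmt='%(asctime)s | %(name)s | %(levelname)s | %(message)s',
    datefmt='%Y-%m-%d %H:%M:%S UTC (%z)'))

root = logging.getLogger()
root.setLevel(logging.NOTSET)
root.addHandler(handler)

root.info('Hello World.')
handler.flush()  # flush explicitly, just in case buffering is the problem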
Mounting seems to be ok.
Adding and writing files with dbutils works fine:
dbutils.fs.put(folder_log+'/'+file_log, 'Hello World.')
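(The content written that way can be read back, e.g. with dbutils.fs.head(folder_log + '/' + file_log).)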
Writing to the file directly like this works fine too:
f = open('/dbfs/mnt/log/test/2019/201904.log', 'w+')
f.write("This is line %d\r\n" % 1)
f.close()
I also tried prepending "/dbfs" to the path:
filename='/dbfs'+folder_log+'/'+file_log,
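i.e. the same basicConfig call as above, just pointed at the FUSE path:

logging.basicConfig(
    filename='/dbfs' + folder_log + '/' + file_log,
    format='%(asctime)s | %(name)s | %(levelname)s | %(message)s',
    datefmt='%Y-%m-%d %H:%M:%S UTC (%z)',
    level=logging.NOTSET
)
logging.info('Hello World.')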
Any ideas?