Questions tagged [databricks-dbx]
28 questions
3 votes · 0 answers
Databricks DBX Artifact Location
We are using Databricks DBX in the following way:
dbx execute for development in the IDE.
Upload the resulting package as a Python wheel to a GCS bucket using dbx deploy --assets-only (we don't create a permanent job in Databricks Workflows).
Execute…

gamezone25
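The assets-only deploy described above uploads the wheel to the artifact location configured in .dbx/project.json. A minimal sketch of that file, assuming hypothetical profile and bucket names (the field layout follows dbx's documented project.json format):

```json
{
  "environments": {
    "default": {
      "profile": "my-profile",
      "storage_type": "mlflow",
      "properties": {
        "workspace_directory": "/Shared/dbx/my-project",
        "artifact_location": "gs://my-dbx-artifacts/my-project"
      }
    }
  }
}
```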
2 votes · 1 answer
How to fix an error with a Python Spark UDF that runs OK on Databricks but not locally on DBX
I am getting an error when trying to use a Python Spark UDF. It works on Databricks, but not in my local DBX environment. It seems to occur when I use an external library; other UDFs work fine. Is there something I perhaps need to do to make the…

Stephen Bowser
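A common cause of this local-versus-cluster discrepancy is that the UDF's external dependency is installed on the cluster but not in the environment dbx runs locally. One usual workaround is importing the dependency inside the UDF body, so it is resolved at call time on whichever worker runs it rather than when the function is pickled. A minimal sketch, using a stdlib module as a stand-in for the external library:

```python
# Sketch of the import-inside-the-UDF pattern: `unicodedata` (stdlib) stands
# in for the third-party library. The real library must still be installed in
# the environment that `dbx execute` uses locally.
def normalize_text(s: str) -> str:
    import unicodedata  # resolved when the UDF runs, not when it is pickled
    return unicodedata.normalize("NFC", s)


print(normalize_text("e\u0301"))  # combining accent folded into one code point
```

Wrapping `normalize_text` with `pyspark.sql.functions.udf` would then behave the same locally and on the cluster, provided the dependency exists in both environments.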
2 votes · 1 answer
Databricks DBX named parameters through job
I am trying to implement the approach mentioned here (where I don't have variables in the conf file but pass them as named arguments).
When running in local mode with a Python debugger, I can easily pass this as:
Fundingsobj =…

Saugat Mukherjee
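For a spark_python_task, dbx forwards the workflow's parameters list to the script as ordinary command-line arguments, so named arguments can be read with argparse. A sketch with hypothetical parameter names (not the ones from the question):

```python
import argparse


def parse_args(argv=None):
    # Illustrative flags; dbx passes the deployment file's "parameters"
    # entries through to the script as plain argv values.
    parser = argparse.ArgumentParser()
    parser.add_argument("--conf-file", required=True)
    parser.add_argument("--env", default="default")
    return parser.parse_args(argv)


# Locally (or in a debugger), the same list dbx would pass can be fed directly:
args = parse_args(["--conf-file", "conf/tasks/fundings.yml", "--env", "test"])
print(args.env)
```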
1 vote · 0 answers
Need to restart Databricks 13.0 cluster to iterate on development
I want to iterate on development against a Databricks 13 cluster without having to restart it to pick up code changes in my Python package.
dbx execute does the job on Databricks 12.1, but when I try to run it on Databricks 13, it…

André Salvati
1 vote · 0 answers
DBX Databricks - installing private GitHub repositories on clusters in a workspace
I'm running code on Databricks clusters remotely using DBX, so my current directory is built into a wheel and then installed on the remote Databricks cluster. I'm having an issue where a private GitHub repo that I installed locally via Poetry is…

flbzer
1 vote · 1 answer
Job Sensors in Databricks Workflows
At the moment we schedule our Databricks notebooks using Airflow. Due to dependencies between projects, there are dependencies between DAGs. Some DAGs wait until a task in a previous DAG is finished before starting (by using sensors).
We are now…

gamezone25
1 vote · 2 answers
Clear Databricks Artifact Location
I am using the dbx CLI to deploy my workflow to Databricks. I have .dbx/project.json configured as below:
{
  "environments": {
    "default": {
      "profile": "test",
      "storage_type": "mlflow",
      "properties": {
…

jlim
1 vote · 1 answer
How to use databricks dbx with an Azure VPN?
I am using dbx to deploy and launch jobs on ephemeral clusters on Databricks.
I have initialized the cicd-sample-project and connected it to a fresh, empty Databricks free-trial environment, and everything works (meaning that I can successfully…

Enrico Mosca
1 vote · 1 answer
How to create and read createOrReplaceGlobalTempView when using static clusters
In my deployment.yaml file I have defined a static cluster as follows:
custom:
  basic-cluster-props: &basic-cluster-props
    spark_version: "11.2.x-scala2.12"
  basic-static-cluster: &basic-static-cluster
    new_cluster:
      <<:…

Average_guy
1 vote · 1 answer
Differences between the dbx execute and launch commands
I have a project for which I want to be able to run some entry points on Databricks. I used dbx for that, with the following deployment.yaml file:
build:
  python: "poetry"
environments:
  default:
    workflows:
      - name: "test"
…

Robin
1 vote · 2 answers
Running local python code with arguments in Databricks via dbx utility
I am trying to execute a local PySpark script on a Databricks cluster via the dbx utility, to test how passing arguments to Python works in Databricks when developing locally. However, the test arguments I am passing are not being read for some reason.…

bda
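For a spark_python_task, arguments supplied via dbx (whether from the deployment file's parameters list or from the command line) arrive as ordinary sys.argv entries, so a useful first debugging step is printing them before any parsing. A minimal sketch:

```python
import sys


def main(argv):
    # Log exactly what the task received; if this prints an empty list,
    # the parameters never reached the script.
    print(f"raw arguments: {argv}")
    return argv


if __name__ == "__main__":
    main(sys.argv[1:])
```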
1 vote · 2 answers
dbx databricks deploy named properties
Can anyone provide a link to an example of using named properties in dbx? The documentation shows an example .json file
(https://dbx.readthedocs.io/en/latest/named_properties.html),
but it does not mention how to invoke this file with…

Srinivas
1 vote · 1 answer
How can I pass and then get the passed arguments in a Databricks job
I'm trying to pass and get arguments in my Databricks job. It's a spark_python_task type, NOT a notebook. I deployed my job with dbx from PyCharm. I have a deployment.json file where I configure the deployment.

Borislav Blagoev
0 votes · 1 answer
Databricks dbx deploy error with authentication token
I'm trying to deploy a new workflow using the dbx CLI from Databricks. When I run:
dbx deploy new_workflow
I'm receiving the following error:
Exception: Provided configuration is not based on token authentication. Please
switch to token-based…

cilopez
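dbx authenticates through a Databricks CLI profile, so this error usually means the profile the project references is not configured with a personal access token. A sketch of a token-based profile in ~/.databrickscfg (host and token values are placeholders; the profile name must match the one in .dbx/project.json):

```ini
[test]
host  = https://<workspace>.cloud.databricks.com
token = <personal-access-token>
```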
0 votes · 2 answers
Setup E-mail notification on failure run with DBX deployment
I am deploying workflows to Databricks using DBX. I want to add a step that sends an e-mail to email123@email.com whenever the workflow fails. The outline of my deployment.yml file is below:
deployments:
  - name: my_workflow
…

andKaae
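dbx deployment files mirror the Databricks Jobs API, where failure e-mails are configured per job with an email_notifications block. A sketch following the structure quoted in the question (the field names come from the Jobs API; exact nesting can vary by dbx version):

```yaml
deployments:
  - name: my_workflow
    email_notifications:
      on_failure:
        - email123@email.com
```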