
I am using the command below in Azure Databricks to try to copy the file test.csv from the local C: drive to the DBFS location shown.

dbutils.fs.cp("C:/BoltQA/test.csv", "dbfs:/tmp/test_files/test.csv")

I am getting this error:

java.io.IOException: No FileSystem for scheme: C
---------------------------------------------------------------------------
ExecutionError                            Traceback (most recent call last)
<command-3936625823332356> in <module>
----> 1 dbutils.fs.cp("C:/test.csv", "dbfs:/tmp/test_files/test.csv")
      2 

/local_disk0/tmp/1605164901540-0/dbutils.py in f_with_exception_handling(*args, **kwargs)
    312                     exc.__context__ = None
    313                     exc.__cause__ = None
--> 314                     raise exc
    315             return f_with_exception_handling
    316 

Help please.

ibexy

1 Answer

Local files can be referenced with the file:// scheme, so change the command to something like:

dbutils.fs.cp("file://c:/user/file.txt",<container path>)
shiva
    No, it won't work because in this case local means "local to the driver node", not to your local computer. – Alex Ott Jan 30 '23 at 12:56
  • I am a little late to the party here... but it is possible to upload to Databricks from a file on your local computer using the Databricks CLI. You have to run it as a subprocess terminal command if you want to automate it. Check out: https://docs.databricks.com/en/archive/dev-tools/cli/index.html – bwan1011 Aug 18 '23 at 16:41
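Building on the CLI suggestion in the comment above, here is a minimal sketch of automating the upload from a local machine with `subprocess`. It assumes the legacy `databricks` CLI is installed and already configured with a host and token; the helper name and paths are illustrative, not part of any official API:

```python
import shutil
import subprocess

def upload_to_dbfs(local_path: str, dbfs_path: str) -> list:
    """Build a `databricks fs cp` command copying a local file to DBFS.

    Runs the command only when the CLI is actually installed, so the
    helper can also be inspected safely on machines without it.
    """
    cmd = ["databricks", "fs", "cp", "--overwrite", local_path, dbfs_path]
    if shutil.which("databricks"):  # CLI present: perform the copy
        subprocess.run(cmd, check=True)
    return cmd

# Hypothetical paths matching the question:
print(upload_to_dbfs("C:/BoltQA/test.csv", "dbfs:/tmp/test_files/test.csv"))
```

Unlike `dbutils.fs.cp("file:/...", ...)`, which runs on the driver node, the CLI runs on your own computer, so `C:/...` paths resolve against your local disk.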