Important
This documentation has been retired and might not be updated.
This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use the newer Databricks CLI version 0.205 or above instead. See What is the Databricks CLI?. To find your version of the Databricks CLI, run databricks -v.
To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration.
You run Databricks workspace CLI subcommands by appending them to databricks workspace. These subcommands call the Workspace API.
databricks workspace -h
Usage: databricks workspace [OPTIONS] COMMAND [ARGS]...
Utility to interact with the Databricks workspace. Workspace paths must be
absolute and be prefixed with `/`.
Common Options:
-v, --version [VERSION]
-h, --help Show this message and exit.
Commands:
delete Deletes objects from the Databricks workspace. rm and delete are synonyms.
Options:
-r, --recursive
export Exports a file from the Databricks workspace.
Options:
-f, --format FORMAT SOURCE, HTML, JUPYTER, or DBC. Set to SOURCE by default.
-o, --overwrite Overwrites file with the same name as a workspace file.
export_dir Recursively exports a directory from the Databricks workspace.
Options:
-o, --overwrite Overwrites local files with the same names as workspace files.
import Imports a file from local to the Databricks workspace.
Options:
-l, --language LANGUAGE SCALA, PYTHON, SQL, R [required]
-f, --format FORMAT SOURCE, HTML, JUPYTER, or DBC. Set to SOURCE by default.
-o, --overwrite Overwrites workspace files with the same names as local files.
import_dir Recursively imports a directory to the Databricks workspace.
Only directories and files with the extensions .scala, .py, .sql, .r, .R,
.ipynb are imported. When imported, these extensions are stripped off
the name of the notebook.
Options:
-o, --overwrite Overwrites workspace files with the same names as local files.
-e, --exclude-hidden-files
list Lists objects in the Databricks workspace. ls and list are synonyms.
Options:
--absolute Displays absolute paths.
-l Displays full information including ObjectType, Path, Language
ls Lists objects in the Databricks workspace. ls and list are synonyms.
Options:
--absolute Displays absolute paths.
-l Displays full information including ObjectType, Path, Language
mkdirs Makes directories in the Databricks workspace.
rm Deletes objects from the Databricks workspace. rm and delete are synonyms.
Options:
-r, --recursive
Delete an object from a workspace
To display usage documentation, run databricks workspace delete --help or databricks workspace rm --help.
databricks workspace delete --recursive "/Users/[email protected]/My Folder"
Or:
databricks workspace rm --recursive "/Users/[email protected]/My Folder"
If successful, no output is displayed.
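Based on the help output above, the --recursive flag applies to folders; a single object can be deleted without it. A minimal sketch (the path is a hypothetical example):

```shell
# Delete a single notebook; --recursive (-r) is only needed when
# deleting a folder and its contents. Path is hypothetical.
databricks workspace rm "/Users/[email protected]/My Notebook"
```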
Export a file from a workspace to your local filesystem
To display usage documentation, run databricks workspace export --help.
databricks workspace export --overwrite --format JUPYTER "/Users/[email protected]/My Python Notebook" /Users/me/Downloads
You can also use this command to export notebooks from a Databricks Git folder:
databricks workspace export "/Repos/[email protected]/MyRepoNotebook" /Users/me/Downloads
If successful, no output is displayed.
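The --format flag shown in the help output also accepts SOURCE (the default), which exports the notebook as a plain source file. A sketch with hypothetical paths:

```shell
# Export a Python notebook as source code to a specific local file.
# Workspace and local paths are hypothetical examples.
databricks workspace export --format SOURCE "/Users/[email protected]/My Python Notebook" /Users/me/Downloads/my-notebook.py
```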
Export a directory from a workspace to your local filesystem
To display usage documentation, run databricks workspace export_dir --help.
databricks workspace export_dir --overwrite /Users/[email protected]/my-folder /Users/me/Downloads/my-folder
/Users/[email protected]/my-folder/My Python Notebook -> /Users/me/Downloads/my-folder/My Python Notebook.py
/Users/[email protected]/my-folder/My Scala Notebook -> /Users/me/Downloads/my-folder/My Scala Notebook.scala
/Users/[email protected]/my-folder/My R Notebook -> /Users/me/Downloads/my-folder/My R Notebook.r
/Users/[email protected]/my-folder/My SQL Notebook -> /Users/me/Downloads/my-folder/My SQL Notebook.sql
Import a file from your local filesystem into a workspace
To display usage documentation, run databricks workspace import --help.
Only files with the extensions .scala, .py, .sql, .r, .R can be imported. When imported, these extensions are stripped from the notebook name.
databricks workspace import ./a.py /Users/[email protected]/example
./a.py -> /Users/[email protected]/example/a
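Because the help output marks -l, --language as required, you can also state the language and format explicitly. A sketch using the flags from the help text (paths are hypothetical):

```shell
# Import a local Python file as a SOURCE-format notebook,
# passing the language explicitly. Paths are hypothetical examples.
databricks workspace import --language PYTHON --format SOURCE --overwrite ./a.py /Users/[email protected]/example/a
```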
Import a directory from your local filesystem into a workspace
To display usage documentation, run databricks workspace import_dir --help.
This command recursively imports a directory from the local filesystem into the workspace. Only directories and files with the extensions .scala, .py, .sql, .r, .R are imported. When imported, these extensions are stripped from the notebook name.
To overwrite existing notebooks at the target path, add the flag --overwrite or -o.
tree
.
├── a.py
├── b.scala
├── c.sql
├── d.R
└── e
databricks workspace import_dir . /Users/[email protected]/example
./a.py -> /Users/[email protected]/example/a
./b.scala -> /Users/[email protected]/example/b
./c.sql -> /Users/[email protected]/example/c
./d.R -> /Users/[email protected]/example/d
databricks workspace ls /Users/[email protected]/example -l
NOTEBOOK a PYTHON
NOTEBOOK b SCALA
NOTEBOOK c SQL
NOTEBOOK d R
DIRECTORY e
List objects in a workspace
To display usage documentation, run databricks workspace list --help or databricks workspace ls --help.
databricks workspace list --absolute --long --id /Users/[email protected]
Or:
databricks workspace ls --absolute --long --id /Users/[email protected]
NOTEBOOK /Users/[email protected]/My Python Notebook PYTHON 1234567898012345
NOTEBOOK /Users/[email protected]/My Scala Notebook SCALA 2345678980123456
NOTEBOOK /Users/[email protected]/My R Notebook R 3456789801234567
DIRECTORY /Users/[email protected]/My Directory 4567898012345678
MLFLOW_EXPERIMENT /Users/[email protected]/My_Experiment 5678980123456789
Create a directory in a workspace
To display usage documentation, run databricks workspace mkdirs --help.
databricks workspace mkdirs "/Users/[email protected]/My New Folder"
If successful, no output is displayed.
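The underlying Workspace API creates any missing parent directories along the path, similar to mkdir -p on a local filesystem. A sketch with a hypothetical path:

```shell
# mkdirs also creates missing intermediate directories
# (here Projects and 2023), similar to mkdir -p.
# The path is a hypothetical example.
databricks workspace mkdirs "/Users/[email protected]/Projects/2023/Q1"
```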