Fix loading a spark model on databricks (#5299)
Signed-off-by: Arjun DCunha <arjun.dcunha@databricks.com>
arjundc-db committed Jan 24, 2022
1 parent a113cce commit 051e0cf
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion mlflow/spark.py
@@ -614,7 +614,7 @@ def _load_model_databricks(model_uri, dfs_tmpdir):
     # Copy the model to a temp DFS location first. We cannot delete this file, as
     # Spark may read from it at any point.
     fuse_dfs_tmpdir = dbfs_hdfs_uri_to_fuse_path(dfs_tmpdir)
-    os.mkdir(fuse_dfs_tmpdir)
+    os.makedirs(fuse_dfs_tmpdir)
     # Workaround for inability to use shutil.copytree with DBFS FUSE due to permission-denied
     # errors on passthrough-enabled clusters when attempting to copy permission bits for directories
     _shutil_copytree_without_file_permissions(src_dir=local_model_path, dst_dir=fuse_dfs_tmpdir)
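
For context, a minimal standalone sketch (not MLflow's code; the paths below are made up for illustration) of why the one-line change matters: the FUSE path derived from dfs_tmpdir can include parent directories that do not yet exist, so os.mkdir raises FileNotFoundError while os.makedirs creates the whole chain.

import os
import tempfile
import uuid

# Hypothetical path with two levels that do not exist yet, mimicking the
# DFS temp location used when loading a Spark model on Databricks.
base = tempfile.mkdtemp()
fuse_dfs_tmpdir = os.path.join(base, uuid.uuid4().hex, "model")

try:
    os.mkdir(fuse_dfs_tmpdir)       # raises FileNotFoundError: parent dir is missing
except FileNotFoundError:
    os.makedirs(fuse_dfs_tmpdir)    # succeeds by creating all intermediate directories

print(os.path.isdir(fuse_dfs_tmpdir))  # True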
