These features and Azure Databricks platform improvements were released in November 2019.
Releases are staged. Your Azure Databricks account may not be updated until a week or more after the initial release date.
Databricks Runtime 6.2 ML Beta
November 15, 2019
Databricks Runtime 6.2 ML Beta brings many library upgrades, including:
- TensorFlow and TensorBoard: 1.14.0 to 1.15.0.
- PyTorch: 1.2.0 to 1.3.0.
- tensorboardX: 1.8 to 1.9.
- MLflow: 1.3.0 to 1.4.0.
- Hyperopt: 0.2-db1 with Azure Databricks MLflow integrations.
- MLeap: mleap-databricks-runtime upgraded to 0.15.0, now including mleap-xgboost-runtime.
For more information, see the complete Databricks Runtime 6.2 ML release notes.
Databricks Runtime 6.2 Beta
November 15, 2019
Databricks Runtime 6.2 Beta brings new features, improvements, and many bug fixes, including:
- Delta Lake insert-only merge optimized
For more information, see the complete Databricks Runtime 6.2 release notes.
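The insert-only merge optimization applies to MERGE statements that contain only a WHEN NOT MATCHED ... INSERT clause, so Delta Lake can append the new rows rather than rewrite existing data files. A minimal sketch of such a statement, using hypothetical table and column names:

```sql
-- Insert-only merge: no WHEN MATCHED clause, so only new rows are written.
-- Table names (events, updates) and the join column (eventId) are illustrative.
MERGE INTO events AS target
USING updates AS source
ON target.eventId = source.eventId
WHEN NOT MATCHED
  THEN INSERT *
```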
Configure clusters with your own container image using Databricks Container Services
The GA release of Databricks Container Services for Azure Databricks, which was previously announced, is now targeted for the beginning of 2020. If you are interested in trying a preview version of this service that lets you configure a cluster with your own container image, contact your Azure Databricks representative.
Cluster detail now shows only cluster ID in the HTTP path
November 19 - December 3, 2019: Version 3.6
When you connect a BI tool to Azure Databricks using JDBC/ODBC, you should use the cluster ID variant of the HTTP path, because it is unique. The cluster name option no longer appears on the JDBC/ODBC tab on the cluster detail page.
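For illustration, the cluster ID variant of the HTTP path, and a JDBC URL that embeds it, look roughly like the following; all angle-bracketed values are placeholders, and the exact connection parameters depend on your JDBC/ODBC driver version:

```
HTTPPath=sql/protocolv1/o/<workspace-id>/<cluster-id>

jdbc:spark://<server-hostname>:443/default;transportMode=http;ssl=1;AuthMech=3;UID=token;PWD=<personal-access-token>;HTTPPath=sql/protocolv1/o/<workspace-id>/<cluster-id>
```

Copy the path from the JDBC/ODBC tab of the cluster detail page rather than constructing it by hand.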
Secrets referenced by Spark configuration properties and environment variables (Public Preview)
November 7, 2019
Available in Databricks Runtime 6.1 and above.
As of the November 7 maintenance update of Databricks Runtime 6.1, the ability to reference a secret in a Spark configuration property or environment variable is in Public Preview. For details, see Secret paths in Spark configuration properties and environment variables.
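A secret reference uses the `{{secrets/<scope-name>/<secret-name>}}` syntax; the scope and secret names below are placeholders. Sketched as a Spark configuration property and an environment variable set in the cluster configuration:

```
spark.password {{secrets/my-scope/my-password}}

SPARKPASSWORD={{secrets/my-scope/my-password}}
```

The secret value is resolved at cluster launch rather than stored in plain text in the cluster configuration.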