Work with online stores

Databricks Feature Store supports publishing features to online feature stores for real-time serving and automated feature lookup.

Databricks Feature Store supports these online stores:

Online store provider                                                  | Publish | Feature lookup in model serving
Azure Cosmos DB (Databricks Runtime 11.1 ML and above) (Experimental)  |    X    |               X
Azure MySQL (Single Server)                                            |    X    |               X
Azure SQL Server                                                       |    X    |

Note

To use Azure Cosmos DB, the account must be created with the Core (SQL) API and the networking connectivity method must be set to All networks. The appropriate Azure Cosmos DB Spark 3 OLTP Connector for SQL API must be installed on an Azure Databricks cluster. Databricks recommends that you install the latest connector version for Spark 3.2 until a connector for Spark 3.3 is released.

Publish features to an online feature store

To publish features to an online feature store for real-time serving, use publish_table.

Authentication for publishing feature tables to online stores

To publish feature tables to an online store, you must provide write authentication to the online store.

Databricks recommends that you store credentials in Databricks secrets, and then refer to them using a write_secret_prefix when publishing.

Provide write credentials using Databricks secrets

See Provide online store credentials using Databricks secrets.

Authentication for looking up features from online stores with served MLflow models

To enable Databricks-hosted MLflow models to connect to online stores and look up feature values, you must provide read authentication.

These credentials must be kept in Databricks secrets, and you must pass a read_secret_prefix when publishing.

Provide online store credentials using Databricks secrets

  1. Create two secret scopes that contain credentials for the online store: one for read-only access (shown here as <read_scope>) and one for read-write access (shown here as <write_scope>). Alternatively, you can reuse existing secret scopes.

  2. Pick a unique name for the target online store, shown here as <prefix>.

    For Cosmos DB (requires Databricks Runtime 11.1 ML or above), create the following secrets:

    • Authorization key (account primary or secondary key) with read-only access to the target online store: databricks secrets put --scope <read_scope> --key <prefix>-authorization-key
    • Authorization key (account primary or secondary key) with read-write access to the target online store: databricks secrets put --scope <write_scope> --key <prefix>-authorization-key

    For SQL stores, create the following secrets:

    • User with read-only access to the target online store: databricks secrets put --scope <read_scope> --key <prefix>-user
    • Password for user with read-only access to the target online store: databricks secrets put --scope <read_scope> --key <prefix>-password
    • User with read-write access to the target online store: databricks secrets put --scope <write_scope> --key <prefix>-user
    • Password for user with read-write access to the target online store: databricks secrets put --scope <write_scope> --key <prefix>-password
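The secret-key naming convention above can be sketched as a small helper. This is illustrative only: the Feature Store client derives these key names itself from the read_secret_prefix and write_secret_prefix you pass to the online store spec; the helper just makes the "<prefix>-suffix" mapping explicit.

```python
# Illustrative sketch of the secret naming convention described above.
# The Feature Store client resolves these names on its own from the
# read_secret_prefix / write_secret_prefix values; this only documents
# how a given <prefix> maps to the expected secret key names.

def secret_keys(prefix: str, store_type: str) -> list:
    """Return the secret key names expected for a given online store prefix."""
    if store_type == "cosmosdb":
        # Cosmos DB uses a single authorization key per scope.
        return [f"{prefix}-authorization-key"]
    if store_type == "sql":
        # SQL stores use a user/password pair per scope.
        return [f"{prefix}-user", f"{prefix}-password"]
    raise ValueError(f"unknown store type: {store_type}")
```

For example, secret_keys("my-store", "sql") returns ["my-store-user", "my-store-password"], which are the key names you would create in both the read-only and read-write scopes.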

Note

There is a limit on the number of secret scopes per workspace. To avoid hitting this limit, you can define and share a single secret scope for accessing all online stores.

Publish batch-computed features to an online store

You can create and schedule an Azure Databricks job to regularly publish updated features. This job can also include the code to calculate the updated features, or you can create and run separate jobs to calculate and publish feature updates.

For SQL stores, the following code assumes that an online database named “recommender_system” already exists in the online store and matches the name of the offline database. If there is no table named “customer_features” in the database, this code creates one. It also assumes that features are computed each day and stored in a partition column _dt.

The following code assumes that you have created secrets to access this online store.
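One detail worth noting about the filter string in the examples that follow: datetime.date renders as ISO 8601 (YYYY-MM-DD) when formatted into a string, so this pattern assumes the _dt partition column stores dates in that format.

```python
import datetime
import re

# datetime.date.today() formats as ISO 8601 (YYYY-MM-DD), so this filter
# assumes the _dt partition column stores dates in that same format.
filter_condition = f"_dt = '{datetime.date.today()}'"
assert re.fullmatch(r"_dt = '\d{4}-\d{2}-\d{2}'", filter_condition)
```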

Cosmos DB

Cosmos DB support requires Databricks Runtime 11.1 ML or above.

import datetime
from databricks.feature_store import FeatureStoreClient
from databricks.feature_store.online_store_spec import AzureCosmosDBSpec

fs = FeatureStoreClient()

online_store = AzureCosmosDBSpec(
  account_uri='<account_uri>',
  read_secret_prefix='<read_scope>/<prefix>',
  write_secret_prefix='<write_scope>/<prefix>'
)

fs.publish_table(
  name='recommender_system.customer_features',
  online_store=online_store,
  filter_condition=f"_dt = '{str(datetime.date.today())}'",
  mode='merge'
)

SQL stores

import datetime
from databricks.feature_store import FeatureStoreClient
from databricks.feature_store.online_store_spec import AzureMySqlSpec

fs = FeatureStoreClient()

online_store = AzureMySqlSpec(
  hostname='<hostname>',
  port='<port>',
  read_secret_prefix='<read_scope>/<prefix>',
  write_secret_prefix='<write_scope>/<prefix>'
)

fs.publish_table(
  name='recommender_system.customer_features',
  online_store=online_store,
  filter_condition=f"_dt = '{str(datetime.date.today())}'",
  mode='merge'
)

Publish streaming features to an online store

To continuously stream features to the online store, set streaming=True.

fs.publish_table(
  name='recommender_system.customer_features',
  online_store=online_store,
  streaming=True
)

Publish selected features to an online store

To publish only selected features to the online store, use the features argument to specify the feature name(s) to publish. Primary keys and timestamp keys are always published. If you do not specify the features argument, or if its value is None, all features from the offline feature table are published.

fs.publish_table(
  name='recommender_system.customer_features',
  online_store=online_store,
  features=["total_purchases_30d"]
)

Publish a feature table to a specific database

In the online store spec, specify the database name (database_name) and the table name (table_name). If you do not specify these parameters, the offline database name and feature table name are used. database_name must already exist in the online store.

online_store = AzureMySqlSpec(
  hostname='<hostname>',
  port='<port>',
  database_name='<database_name>',
  table_name='<table_name>',
  read_secret_prefix='<read_scope>/<prefix>',
  write_secret_prefix='<write_scope>/<prefix>'
)

Overwrite an existing online feature table or specific rows

Use mode='overwrite' in the publish_table call. The online table is completely overwritten by the data in the offline table.

Note

Azure Cosmos DB does not support overwrite mode.

fs.publish_table(
  name='recommender_system.customer_features',
  online_store=online_store,
  mode='overwrite'
)

To overwrite only certain rows, use mode='merge' together with the filter_condition argument:

fs.publish_table(
  name='recommender_system.customer_features',
  online_store=online_store,
  filter_condition=f"_dt = '{str(datetime.date.today())}'",
  mode='merge'
)

Enable Databricks-hosted MLflow models to look up features from online stores

Important

This feature is in Public Preview.

Classic MLflow Model Serving on Databricks can automatically look up feature values from published online stores.

Requirements

Note

You can publish the feature table at any time prior to model deployment, including after model training.

Automatic feature lookup

Databricks Model Serving supports automatic feature lookup from these online stores:

  • Azure Cosmos DB (Databricks Runtime 11.1 ML and above)
  • Azure Database for MySQL

Automatic feature lookup is supported for the following data types:

  • IntegerType
  • FloatType
  • StringType
  • DoubleType
  • LongType
  • TimestampType
  • DateType
  • ShortType
  • DecimalType
  • ArrayType
  • MapType

Override feature values in online model scoring

All features required by the model (logged with FeatureStoreClient.log_model) are automatically looked up from online stores for model scoring. To override feature values when scoring a model using a REST API with Classic MLflow Model Serving, include the feature values as a part of the API payload.

Note

The new feature values must conform to the feature’s data type as expected by the underlying model.
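As a sketch of what an override might look like in practice (the key and feature names here are hypothetical, and the exact payload envelope depends on your MLflow version, so check your model's signature and serving docs for the precise schema), you could include the overriding value alongside the primary key in the scoring request body:

```python
import json

# Hypothetical scoring payload for a served model backed by the
# recommender_system.customer_features table. "customer_id" stands in for
# the table's primary key; "total_purchases_30d" would normally be looked
# up from the online store, but supplying it in the payload overrides the
# looked-up value. Its type must match the model's expected data type.
payload = {
    "dataframe_records": [
        {"customer_id": 42, "total_purchases_30d": 7}
    ]
}
body = json.dumps(payload)
```

Send body as the JSON request body to the model's invocation endpoint; rows that omit an overridable feature still have it looked up automatically.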