Import models into your application

This article discusses the import and scoring parts of a Databricks ML Model Export workflow; see Export Apache Spark ML models and pipelines for the export part of the workflow.

To use models exported via Databricks ML Model Export, you call APIs in the dbml-local library. This library provides Scala and Java APIs for importing models and performing low-latency scoring (prediction or inference).

Using saved models in Java applications

Suppose that you have exported a logistic regression pipeline and saved it under my_models/lr_pipeline. You can use ModelFactory to create a LocalModel from the saved model's directory and perform scoring on new data.

// Load exported model
String modelPath = "my_models/lr_pipeline";
LocalModel model = ModelFactory.loadModel(modelPath);

// The model input is a standard JSON string.
// The input schema here is: [origLabel: Double, features: Vector].
String input =
  "{\"origLabel\":-1.0," +
  "\"features\":{\"type\":0,\"size\":13," +
  "\"indices\":[0,2,3,4,6,7,8,9,10,11,12]," +
  "\"values\":[74.0,2.0,120.0,269.0,2.0,121.0,1.0,0.2,1.0,1.0,3.0]}" +
  "}";

// The model output is also a standard JSON string, with the expected output fields.
String output = model.transform(input);

The input accepts the same JSON format produced by Apache Spark Datasets and DataFrames via the Dataset.toJSON method (see the Dataset API docs). See the dbml-local API docs for more details.
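To make the input format concrete, here is a small stdlib-only Java sketch that builds rows in the same JSON shape as the example above. The VectorJsonExamples class and its helper methods are illustrative, not part of dbml-local; the "type" tag follows Spark's ml Vector JSON encoding, where 0 denotes a sparse vector and 1 a dense vector.

```java
// Illustrative helpers (not part of dbml-local): build scoring-input JSON
// rows in the same shape the LocalModel example above consumes.
import java.util.StringJoiner;

public class VectorJsonExamples {

    // Sparse row: {"origLabel":L,"features":{"type":0,"size":N,"indices":[...],"values":[...]}}
    public static String sparseRow(double label, int size, int[] indices, double[] values) {
        StringJoiner idx = new StringJoiner(",", "[", "]");
        for (int i : indices) idx.add(Integer.toString(i));
        StringJoiner vals = new StringJoiner(",", "[", "]");
        for (double v : values) vals.add(Double.toString(v));
        return "{\"origLabel\":" + label
             + ",\"features\":{\"type\":0,\"size\":" + size
             + ",\"indices\":" + idx + ",\"values\":" + vals + "}}";
    }

    // Dense row: {"origLabel":L,"features":{"type":1,"values":[...]}}
    public static String denseRow(double label, double[] values) {
        StringJoiner vals = new StringJoiner(",", "[", "]");
        for (double v : values) vals.add(Double.toString(v));
        return "{\"origLabel\":" + label
             + ",\"features\":{\"type\":1,\"values\":" + vals + "}}";
    }

    public static void main(String[] args) {
        // A sparse vector of size 4 with nonzeros at positions 0 and 2.
        System.out.println(sparseRow(-1.0, 4, new int[]{0, 2}, new double[]{1.5, 2.0}));
        // The equivalent dense encoding of a 2-element vector.
        System.out.println(denseRow(1.0, new double[]{0.0, 3.5}));
    }
}
```

Each returned string can be passed directly to model.transform as in the example above, one row at a time.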

Specifying the dbml-local library dependency in Maven

You specify the dbml-local library dependency to an application just like any other dependency, using a Maven coordinate. The following snippet shows an example of including dbml-local in a Maven project's pom.xml build file.

<!-- Add repository for dbml-local dependency -->
<repositories>
  <repository>
    <snapshots>
      <enabled>false</enabled>
    </snapshots>
    <id>bintray-databricks-maven</id>
    <name>bintray</name>
    <url>https://dl.bintray.com/databricks/maven</url>
  </repository>
</repositories>

<dependencies>
  <!-- Main dependency for Model Scoring -->
  <dependency>
    <groupId>com.databricks</groupId>
    <artifactId>dbml-local</artifactId>
    <version>0.2.2-spark2.2</version>
  </dependency>
</dependencies>
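If your project builds with Gradle instead of Maven, the same repository and coordinate can be declared as follows. This is a sketch assuming a Groovy-DSL build.gradle; it mirrors the Maven example above rather than coming from the original documentation.

```groovy
// Equivalent Gradle (Groovy DSL) configuration -- an illustrative sketch
// using the same Bintray repository and dbml-local coordinate as the
// Maven pom.xml example above.
repositories {
    maven { url "https://dl.bintray.com/databricks/maven" }
}

dependencies {
    implementation "com.databricks:dbml-local:0.2.2-spark2.2"
}
```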

Downloading dbml-local JARs

The dbml-local JARs are available from Bintray.

dbml-local license

The dbml-local library is published under the MIT license.

Example application

You can view a simple example application that shows how to use the Databricks ML Model Export companion library dbml-local in the databricks-ml-examples GitHub repository. This demo includes Databricks notebooks for training and exporting MLlib models. These notebooks are paired with simple Java applications that show how to import models and make predictions.