

If you want to run batch inference, you don't need to convert the pipeline to a batch inference pipeline. You can publish it directly by following this document: https://docs.microsoft.com/en-us/azure/machine-learning/how-to-retrain-designer. If you need to run a real-time inference pipeline, the workaround is to clone the pipeline first and then submit the run. After that, you should be able to deploy a real-time endpoint.

Yes. The Web Service Input/Output modules are only applicable to real-time endpoints. Could you please reach out to me via luzhan@microsoft.com? We can have a quick call to discuss potential solutions to your problem, as well as the documentation improvements we can make.

@SriramNarayanan-6939, it seems Customer Insights only supports batch inference pipelines for now. Please refer to this document: https://docs.microsoft.com/en-us/dynamics365/customer-insights/audience-insights/azure-machine-learning-experiments.

Sorry for the inconvenience caused. This is a known issue and we are actively working on a fix. We will keep you updated on the progress. :)