Question

ChowAndy-9377 asked:

AutoML error: Microsoft.DPrep.SharedLibrary.ErrorHandling.UnexpectedException: 'Access token authentication is not supported.'

I'm implementing a pipeline in which one step creates and submits an AutoML run. The failure occurs when the run logs "ActivityStarted: StreamingFit". I've pasted the error trace below.

My best guess is an authentication problem with our ADLS Gen2 account, which houses the files backing our Dataset. I've tried recreating the same Dataset on the workspace's default storage, and that works fine. Both the cluster and the service principal associated with the AutoML run have full blob storage access.
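
For reference, the Dataset is built roughly like this (the datastore name and path below are placeholders, not our real values):

    from azureml.core import Workspace, Datastore, Dataset

    ws = Workspace.from_config()

    # "adls_gen2_ds" is a placeholder for our registered ADLS Gen2 datastore.
    datastore = Datastore.get(ws, "adls_gen2_ds")

    # Build a TabularDataset from delimited files on that datastore.
    dataset = Dataset.Tabular.from_delimited_files(
        path=(datastore, "data/train/*.csv")  # placeholder path
    )
    dataset.register(ws, name="training-data", create_new_version=True)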

Appreciate any help on this.

Edit, for clarity: I have a PythonScriptStep that sets up and submits an AutoMLRun.
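
Roughly, the script the step runs looks like this (the dataset name, task, and label column are placeholders); the error trace follows after it:

    # train_automl.py, run by the PythonScriptStep: it builds an AutoMLConfig
    # and submits it as a child of the step's own run.
    from azureml.core import Dataset, Run
    from azureml.train.automl import AutoMLConfig

    run = Run.get_context()
    ws = run.experiment.workspace

    # "training-data" is a placeholder for the Dataset described above.
    training_data = Dataset.get_by_name(ws, "training-data")

    automl_config = AutoMLConfig(
        task="classification",       # placeholder task type
        training_data=training_data,
        label_column_name="target",  # placeholder label column
    )

    # Submit the AutoML run as a child of this step's run and block on it;
    # the StreamingFit failure surfaces during this fit.
    automl_run = run.submit_child(automl_config)
    automl_run.wait_for_completion(show_output=True)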


 2021-02-18 14:52:12.367 - INFO - ActivityStarted: StreamingFit
 Error: *** Microsoft.DPrep.SharedLibrary.ErrorHandling.UnexpectedException: 'Access token authentication is not supported.' StackTrace: Elapsed time: 00:00:06.7332685
 2021-02-18 14:52:19.361 - CRITICAL - Type: AutoMLInternal
 Class: FitException
 Message: FitException:
  Message: Error: *** Microsoft.DPrep.SharedLibrary.ErrorHandling.UnexpectedException: 'Access token authentication is not supported.' 
  InnerException: BridgeRuntimeError: Error: *** Microsoft.DPrep.SharedLibrary.ErrorHandling.UnexpectedException: 'Access token authentication is not supported.' 
  ErrorResponse 
 {
     "error": {
         "code": "SystemError",
         "message": "Encountered an internal AutoML error. Error Message/Code: FitException. Additional Info: FitException:\n\tMessage: Error: *** Microsoft.DPrep.SharedLibrary.ErrorHandling.UnexpectedException: 'Access token authentication is not supported.' \n\tInnerException: None\n\tErrorResponse \n{\n    \"error\": {\n        \"message\": \"Error: *** Microsoft.DPrep.SharedLibrary.ErrorHandling.UnexpectedException: 'Access token authentication is not supported.' \",\n        \"target\": \"NimbusML\",\n        \"reference_code\": \"NimbusML\"\n    }\n}",
         "details_uri": "https://docs.microsoft.com/azure/machine-learning/resource-known-issues#automated-machine-learning",
         "target": "NimbusML",
         "inner_error": {
             "code": "ClientError",
             "inner_error": {
                 "code": "AutoMLInternal"
             }
         },
         "reference_code": "NimbusML"
     }
 }
 Traceback:
   File "telemetry_activity_logger.py", line 57, in _log_activity
     yield
   File "streaming_featurizer.py", line 165, in learn_transformations
     estimator.fit(self._training_data)
   File "streaming_estimator.py", line 70, in fit
     "nimbus ml failed to fit during featurization at {0}".format(bre.callstack))
   File "streaming_estimator.py", line 66, in fit
     self._pipeline.fit(datastream_X)
   File "utils.py", line 220, in wrapper
     params = func(*args, **kwargs)
   File "pipeline.py", line 1086, in fit
     raise e
   File "pipeline.py", line 1073, in fit
     **params)
   File "entrypoints.py", line 460, in run
     output_predictor_modelfilename)
   File "entrypoints.py", line 315, in _try_call_bridge
     model=output_modelfilename)



Tags: azure-machine-learning

1 Answer

ramr-msft answered:

@ChowAndy-9377 Thanks for the question. AML lets you register ADLS Gen2 as a datastore with the workspace:
https://docs.microsoft.com/en-us/azure/machine-learning/how-to-access-data#azure-data-lake-storage-generation-2

Currently you can also register an ADLS Gen2 container as a blob datastore using an account key or SAS token.
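
For example, a minimal sketch of that blob-style registration (all names and keys below are placeholders):

    from azureml.core import Workspace, Datastore

    ws = Workspace.from_config()

    # Register the ADLS Gen2 container as if it were a blob container,
    # authenticating with the storage account key (a SAS token also works).
    Datastore.register_azure_blob_container(
        workspace=ws,
        datastore_name="adls_as_blob",    # placeholder datastore name
        container_name="my-filesystem",   # the ADLS Gen2 filesystem (placeholder)
        account_name="mystorageaccount",  # placeholder account name
        account_key="<account-key>",      # or sas_token="<sas-token>"
    )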


ChowAndy-9377 commented:

@ramr-msft Would you suspect the error is caused by our registering the datastore as type "Azure Data Lake Storage Gen2"? I believe in the past we registered the same data lake as a blob datastore and it worked with no issues.

We're trying to move away from registering it as a blob and instead register it as a data lake, so we can use service principal authentication rather than an account key.
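
For context, our registration looks roughly like this (all identifiers below are placeholders):

    from azureml.core import Workspace, Datastore

    ws = Workspace.from_config()

    # Register as a native ADLS Gen2 datastore with service principal auth.
    Datastore.register_azure_data_lake_gen2(
        workspace=ws,
        datastore_name="adls_gen2_ds",    # placeholder datastore name
        filesystem="my-filesystem",       # placeholder filesystem name
        account_name="mystorageaccount",  # placeholder account name
        tenant_id="<tenant-id>",          # service principal tenant
        client_id="<client-id>",          # service principal application id
        client_secret="<client-secret>",  # service principal secret
    )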

Thanks for the response!
