Loading pickle object in entry script in Azure ML
2JK · 241 Reputation points
I have an entry script that loads a pickled TensorFlow tokenizer object along with the model itself. When I try to deploy, locally or otherwise, I get an error saying something broke in the init() function of the score.py script. Commenting out the tokenizer load makes the deployment work, so I'm sure the tokenizer is the cause. This is how I define the function:
def init():
    global tokenizer, model
    tokenizer_path = os.path.join('./objs', 'tokenizer.pkl')  # tried absolute path as well, didn't work
    tokenizer = pickle.load(tokenizer_path)
    # tokenizer = pickle.load(open(tokenizer_path, 'rb'))  # also tried this, didn't work
    model = tf.keras.models.load_model(os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'model.h5'))
Is that the correct way to load a pickle object in the entry script? Any tips would be appreciated.
2 answers
romungi-MSFT · 42,286 Reputation points · Microsoft Employee
2021-10-04T13:34:19.55+00:00

@2JK I think this should help:
    import joblib  # sklearn.externals.joblib was removed in scikit-learn 0.23; use the standalone joblib package

    tokenizer_path = os.path.join('./objs', 'tokenizer.pkl')  # tried absolute path as well, didn't work
    tokenizer = joblib.load(tokenizer_path)  # joblib.load accepts a path string directly
Did you also try the absolute path in your tokenizer_path?