I have an entry script that loads a pickled tokenizer object from TensorFlow along with the model itself. When I try to deploy, locally or otherwise, I get an error saying something broke in the `init` function of the score.py script. Commenting out the tokenizer load makes the deployment work, so I'm sure that's the cause. This is how I define the function:
```python
import os
import pickle

import tensorflow as tf


def init():
    global tokenizer, model
    tokenizer_path = os.path.join('./objs', 'tokenizer.pkl')  # tried an absolute path as well, didn't work
    tokenizer = pickle.load(tokenizer_path)
    # tokenizer = pickle.load(open(tokenizer_path, 'rb'))  # also tried this, didn't work
    model = tf.keras.models.load_model(os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'model.h5'))
```
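For reference, here is a minimal sketch of the standard pickle round-trip outside of any deployment, using a plain dict as a stand-in for the tokenizer object (the path and object are made up for illustration). Note that `pickle.load()` expects an open binary file handle, not a path string:

```python
import os
import pickle
import tempfile

# Stand-in for the pickled tokenizer object.
obj = {"word_index": {"hello": 1, "world": 2}}

tmp_dir = tempfile.mkdtemp()
path = os.path.join(tmp_dir, "tokenizer.pkl")

# Serialize to disk in binary mode.
with open(path, "wb") as f:
    pickle.dump(obj, f)

# pickle.load() takes an open binary file object, not a path string.
with open(path, "rb") as f:
    loaded = pickle.load(f)

print(loaded == obj)  # True
```

This works locally for me, which is why I'm unsure what differs inside the deployed container.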
Is that the correct way to load a pickled object in an entry script? Any tips would be appreciated.