
I have a pickle file parameters.pkl containing some parameters of a model and their values. The pickle file was created through the following process:

import pickle

# collect the fitted scaler, the feature tags and the reconstruction loss
# (use a name other than the built-in dict)
params = {'scaler': scaler,
          'features': z_tags,
          'Z_reconstruction_loss': Z_reconstruction_loss}

with open('parameters.pkl', 'wb') as f:
    pickle.dump(params, f)

The trained model itself is saved separately as model_V2.hdf5.

I am new to Azure Machine Learning Studio. It would be helpful to know how the pickle file and the hdf5 file can be stored in Azure Machine Learning Studio, and how an API endpoint can be created so that the pickle file can be accessed through the API. The objective is to access the pickle file and its contents through an API. Thanks.
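
One possible way to get the files into the workspace is to register them, sketched here with the v1 azureml-core SDK; only the file names come from the question, while the workspace config.json and the registration names are assumptions:

# Register the two artifact files in an Azure ML workspace (SDK v1)
from azureml.core import Workspace, Model

ws = Workspace.from_config()  # assumes a config.json downloaded from the workspace

# Model registration is essentially versioned file storage, so a pickle of
# parameters and an hdf5 file can both be registered, not only "real" models
params_model = Model.register(workspace=ws,
                              model_path="parameters.pkl",
                              model_name="parameters")

keras_model = Model.register(workspace=ws,
                             model_path="model_V2.hdf5",
                             model_name="model_V2")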

I have uploaded a snapshot; the objective is to access the content of the files. All the examples I have seen show deployment of models where scoring scripts are used. Here I expect the API endpoint to return the parameter values instead.
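
A deployed endpoint in Azure ML still needs an entry script, but nothing requires that script to run predictions; its run() can simply load parameters.pkl and return its contents. Below is a minimal sketch assuming the v1 azureml-core SDK and the registered model name "parameters" from the registration sketch above; the script name and the field handling are illustrative, not a definitive implementation:

# score.py -- entry script whose run() returns the pickle contents as JSON
# instead of predictions (Azure ML SDK v1 init/run convention)
import json
import pickle

from azureml.core.model import Model

params = None

def init():
    global params
    # resolve the path of the registered "parameters" model inside the service container
    params_path = Model.get_model_path("parameters")
    with open(params_path, "rb") as f:
        params = pickle.load(f)

def run(raw_data):
    # return only JSON-serialisable parts: the fitted scaler is reported by its
    # class name, and the loss is assumed here to be a scalar
    return json.dumps({
        "features": list(params["features"]),
        "Z_reconstruction_loss": float(params["Z_reconstruction_loss"]),
        "scaler": type(params["scaler"]).__name__,
    })

The script would then be passed to an InferenceConfig and deployed with Model.deploy; once deployment finishes, service.scoring_uri is the API endpoint that returns these values.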

  • I have tried https://medium.com/featurepreneur/pickle-to-api-azure-48497986868, but here we want to access the parameters within the pickle file. The link is about a scoring script, whereas we only want to get the contents of the pickle and hdf5 files. – shan Aug 12 '22 at 07:23
  • Does this answer your question? [How to save and access pickle/hdf5 files in azure machine learning studio](https://stackoverflow.com/questions/73334181/how-to-save-and-access-pickle-hdf5-files-in-azure-machine-learning-studio) – SwethaKandikonda Sep 03 '22 at 12:17

0 Answers