
I want to be able to load data from within the model, but without having to ship the data files inside the container image the model is built on. I used this command to upload the model to the platform:

gcloud beta ai models upload \
  --display-name=MODEL_NAME \
  --region=us-central1 \
  --container-image-uri=us-central1-docker.pkg.dev/MODEL_URI \
  --container-predict-route=/predict \
  --container-health-route=/health \
  --container-ports=80  \
  --artifact-uri=gs://BUCKET_NAME/DIR_WHERE_DATA_IS

So, if a file named foo.bar exists in gs://BUCKET_NAME/DIR_WHERE_DATA_IS, I expect to be able to read it from within the code running in the container as if it were local. Is that correct? When I try to deploy the model, I get an error saying that the files (e.g. foo.bar, which is in gs://BUCKET_NAME/DIR_WHERE_DATA_IS) can't be found.

Tsvi Sabo

1 Answer


Specifying --artifact-uri=gs://BUCKET_NAME/DIR_WHERE_DATA_IS only makes the artifacts available, with the correct credentials, to the container running on the platform; it does not mount them into the container's local filesystem. Instead, read the environment variable AIP_STORAGE_URI to get the GCS path where a copy of the artifacts is located, and download the data your model needs from that URI: https://cloud.google.com/ai-platform-unified/docs/predictions/custom-container-requirements#artifacts
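A minimal sketch of that download step, assuming the serving code is Python and the google-cloud-storage client library is installed in the container (the function names and the /tmp/model_data destination are my own choices, not part of the platform):

```python
import os
from urllib.parse import urlparse


def parse_gcs_uri(uri):
    """Split a gs://bucket/prefix URI into (bucket, prefix)."""
    parsed = urlparse(uri)
    return parsed.netloc, parsed.path.lstrip("/")


def download_artifacts(dest_dir="/tmp/model_data"):
    """Download every object under AIP_STORAGE_URI into dest_dir.

    Assumes the google-cloud-storage client library is available in
    the container; it is imported lazily so the parsing helper above
    works without it.
    """
    from google.cloud import storage  # third-party client

    bucket_name, prefix = parse_gcs_uri(os.environ["AIP_STORAGE_URI"])
    client = storage.Client()
    for blob in client.list_blobs(bucket_name, prefix=prefix):
        if blob.name.endswith("/"):  # skip "directory" placeholder objects
            continue
        local_path = os.path.join(dest_dir, os.path.relpath(blob.name, prefix))
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        blob.download_to_filename(local_path)
```

With that in place, call download_artifacts() at container startup (e.g. before the prediction server binds /predict), and then open foo.bar from /tmp/model_data as a local file.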

Bear in mind that you will also need to authenticate your account from within the container; see: How to authenticate google cloud SDK on a docker Ubuntu image?
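One common way to do that, as in the linked answer, is to mount a service-account key file into the container and point Application Default Credentials at it via GOOGLE_APPLICATION_CREDENTIALS; the Google client libraries then pick it up automatically. A small sketch of the check (the helper name is mine):

```python
import os


def adc_key_path():
    """Return the service-account key file path that Application Default
    Credentials will use, or None if GOOGLE_APPLICATION_CREDENTIALS is
    not set (assumption: the key file was mounted into the container)."""
    return os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
```

Note that when the container actually runs on the platform, credentials for the deployed service account are typically injected for you, so an explicit key file is mainly needed for local testing of the image.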
