
Adding a private GCP repo breaks normal pip behaviour

When using Google Cloud Platform's Artifact Registry, you have to alter your .pypirc file for any uploads (twine) and your pip.conf for any downloads (pip).
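For the upload (twine) side, the .pypirc entry ends up looking roughly like this; the repository name my-private-repo is just a placeholder of mine, and the URL placeholders match the pip.conf snippet below:

[distutils]
index-servers = my-private-repo

# "my-private-repo" is only a placeholder name; note the upload URL has no
# trailing /simple/, unlike the pip index URL below.
[my-private-repo]
repository = https://<YOUR-LOCATION>-python.pkg.dev/<YOUR-PROJECT>/<YOUR-REPO-NAME>/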

For the downloads specifically, you have to add something like:

[global]
extra-index-url = https://<YOUR-LOCATION>-python.pkg.dev/<YOUR-PROJECT>/<YOUR-REPO-NAME>/simple/

However, once this is in place, anything that calls pip will also check this extra repository, and when it does, it will ask for a username and password. That means any tool that invokes pip behind the scenes (poetry, pdm, pipx, or pip itself) will prompt for these credentials, often in a way that is not exposed to the user, so everything just stalls.
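A quick way to confirm that the extra index is what triggers the prompt is to run pip in isolated mode, which ignores the user-level pip.conf (requests is only an example package here):

# --isolated tells pip to ignore environment variables and user configuration,
# so the extra-index-url (and its credential prompt) is skipped for this one run.
pip install --isolated requests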

Non-ideal, but working, solution:

I ran across this "solution", which does indeed work, but which the author himself says is not the right way to do things, because it compromises security and brings us back to the days of infinitely-lived keys stored on a laptop.
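I would rather not lean on that pattern, but for context it boils down to storing a long-lived service-account key as an ordinary password, something along these lines (this is my reading of the general approach, not necessarily exactly what that answer does; KEY stands for a base64-encoded service-account key file):

[global]
# A base64-encoded service-account key sits directly in the URL as a password,
# i.e. a long-lived credential kept in plain text on the laptop.
extra-index-url = https://_json_key_base64:KEY@<YOUR-LOCATION>-python.pkg.dev/<YOUR-PROJECT>/<YOUR-REPO-NAME>/simple/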

More secure solution??

But what is the right solution? I want the following:

  1. To be able to run things like pip, pdm, etc. on my local machine and not have them stall, waiting for a username and password that I cannot fill out.
    • This applies both to packages that are in fact in my private repository and to packages living on normal PyPI or wherever else I look.
  2. To keep the security in place, so that I am recognized as "ok to do this" because I have authorized myself and my computer via gcloud auth login or something similar (gcloud auth login does nothing to assist with this repo issue, at least not with any flags I tried); see the sketch after this list for the kind of flow I mean.
  3. And still be able to perform twine actions (upload to registry) without problems.
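To make point 2 concrete, the kind of flow I am after looks roughly like the one below. The keyrings.google-artifactregistry-auth backend name comes from Google's Artifact Registry docs; I am treating it as an assumption rather than a confirmed fix, because I have not gotten the end-to-end flow working:

# Install a keyring backend that can hand pip/twine short-lived Google credentials
# (package name per GCP docs; assumed here, not verified end to end).
pip install keyring keyrings.google-artifactregistry-auth

# Authenticate the machine once; credentials then come from Application Default
# Credentials rather than a long-lived key stored in pip.conf or .pypirc.
gcloud auth application-default login

# Confirm the backend is visible to keyring.
keyring --list-backends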
Mike Williamson
    Have you checked these threads: [thread1](https://stackoverflow.com/a/72280347/18265570) & [thread2](https://stackoverflow.com/a/68998902/18265570)? – Roopa M Jun 20 '23 at 12:18

0 Answers