
After upgrading a GCP Cloud Function to Python 3.8, I started getting this error:

OpenBLAS WARNING - could not determine the L2 cache size on this system

It appears in the Stackdriver logs sometime after the function is invoked. No other errors are raised, and the function executes normally.

Cloud Functions don't have 'cache' settings, only memory limits, and those are fine.

syldman
  • Does this answer your question? [AppEngine warning - OpenBLAS WARNING - could not determine the L2 cache size on this system](https://stackoverflow.com/questions/55016899/appengine-warning-openblas-warning-could-not-determine-the-l2-cache-size-on) – DazWilkin Aug 20 '21 at 15:30
  • Saw that one, but it's not that relevant for Cloud Functions, as I can't set the instance type or cache size on Cloud Functions. – syldman Aug 21 '21 at 15:07

1 Answer


Serverless environments such as App Engine, Cloud Functions, and Cloud Run execute in a sandbox similar to gVisor. This sandbox protects the system from malicious calls and blocks some low-level instructions. This warning, raised when OpenBLAS tries to probe the CPU's capabilities, can be safely ignored.

I got the same warning when I ran TensorFlow Serving on Cloud Run.
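If the warning clutters the logs, one possible workaround is to skip OpenBLAS's runtime CPU probe entirely by pinning a core type through environment variables before the first `numpy` import. This is a sketch, not a confirmed fix: `OPENBLAS_CORETYPE` and `OPENBLAS_NUM_THREADS` are real OpenBLAS environment variables, but whether the OpenBLAS build bundled with your wheels honors them in the Cloud Functions sandbox, and the choice of `HASWELL` as the target microarchitecture, are assumptions.

```python
import os

# Set BEFORE importing numpy, which loads OpenBLAS at import time.
# HASWELL is an assumed microarchitecture; pinning it skips CPU detection.
os.environ.setdefault("OPENBLAS_CORETYPE", "HASWELL")
os.environ.setdefault("OPENBLAS_NUM_THREADS", "1")

import numpy as np  # noqa: E402  (import deliberately after env setup)


def handler(request):
    """Hypothetical GCF entry point; BLAS-backed ops work despite the warning."""
    a = np.ones((4, 4))
    return str(a @ a)  # matrix multiply goes through OpenBLAS
```

Even without this, the warning is cosmetic: OpenBLAS simply falls back to conservative defaults when the probe fails.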

guillaume blaquiere
  • It's weird, as neither Cloud Run nor Cloud Functions lets you select an instance type or cache size. I hope I won't get a surprise bill at the end. – syldman Aug 21 '21 at 15:13
  • It's just that your app won't be optimized to leverage all the CPU's capabilities and will use a generic/low-profile optimization. It will take a few more CPU cycles and thus cost more, but not by much in the end. – guillaume blaquiere Aug 21 '21 at 18:06