
When trying to import Python libraries at the Spark pool level by applying an uploaded requirements.txt file and custom packages, I get the following error with no other details:

CreateOrUpdateSparkComputeFailed Error occured while processing the request

It was working fine a few days ago; the last successful upload was on 12/3/2021.

Also, the SystemReservedJob-LibraryManagement application job is not getting triggered.

Environment Details:

  • Azure Synapse Analytics
  • Apache Spark pool - 3.1

We have tried the following:

  1. Increased the vCore size up to 200.
  2. Uploaded the same packages to a resource in a different subscription, where they install fine.
  3. Increased the Spark pool size.

Please suggest a solution.

Thank you


1 Answer


Make sure you have all the required packages in your requirements.txt.

Before that, check which packages are installed and which are not. You can list every installed package by running the lines of code below, then compare the output against your requirements to see which packages are missing:

import pkg_resources

# Print every package visible in the current environment
for d in pkg_resources.working_set:
    print(d)

Install the missing libraries via requirements.txt.
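
To make the comparison concrete, here is a minimal sketch that diffs a requirements file against the installed set. The file name requirements.txt and the assumption that each line is a simple pkg==version pin are mine, so adapt them to your actual file:

import pkg_resources

# Hypothetical path for illustration; point this at your uploaded file
REQUIREMENTS_PATH = "requirements.txt"

with open(REQUIREMENTS_PATH) as f:
    # Keep only the package name from simple "pkg==version" pins,
    # skipping blank lines and comments (an assumption about the file format)
    required = {
        line.split("==")[0].strip().lower()
        for line in f
        if line.strip() and not line.lstrip().startswith("#")
    }

installed = {d.project_name.lower() for d in pkg_resources.working_set}

missing = sorted(required - installed)
print("Missing packages:", missing if missing else "none")

As far as I know, Synapse expects the requirements file in pip freeze output format (one package==version pin per line), so also check that your uploaded file follows that convention.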

I faced a similar use case and found good information and a step-by-step procedure in the MS Docs; have a look at it to learn how to manage workspace libraries.
