
Good evening,

I will have to use Spark over S3, using Parquet as the file format and Delta Lake for data management. The link between Spark and S3 has been solved, but when I try to use Delta Lake with Spark (using Python), I get this error:


> ---------------------------------------------------------------------------
>     ::::::::::::::::::::::::::::::::::::::::::::::
>     ::          UNRESOLVED DEPENDENCIES         ::
>     ::::::::::::::::::::::::::::::::::::::::::::::
>     :: io.delta#delta-core_2.12;2.2.0: not found
>     ::::::::::::::::::::::::::::::::::::::::::::::
>
>     :::: ERRORS
>     Server access error at url https://repo1.maven.org/maven2/io/delta/delta-core_2.12/2.2.0/delta-core_2.12-2.2.0.pom (javax.net.ssl.SSLException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty)
>
>     Server access error at url https://repo1.maven.org/maven2/io/delta/delta-core_2.12/2.2.0/delta-core_2.12-2.2.0.jar (javax.net.ssl.SSLException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty)
>
>     Server access error at url https://repos.spark-packages.org/io/delta/delta-core_2.12/2.2.0/delta-core_2.12-2.2.0.pom (javax.net.ssl.SSLException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty)
>
>     Server access error at url https://repos.spark-packages.org/io/delta/delta-core_2.12/2.2.0/delta-core_2.12-2.2.0.jar (javax.net.ssl.SSLException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty)
> ---------------------------------------------------------------------------
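For context, the failure happens as soon as Spark's Ivy resolver tries to download the package. I launch PySpark roughly like this (a sketch using the standard Delta Lake quickstart options; my S3 settings are omitted):

    pyspark --packages io.delta:delta-core_2.12:2.2.0 \
      --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
      --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"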

I know that this error is linked to HTTPS and SSL certificates (when trying to connect to the Maven or Spark repositories), and to the certificates stored on the server (/etc/ssl/certs/java/cacerts). I have already reinstalled OpenJDK 11, updated the certificates, and re-run the ca-certificates-java post-install step, but the error is still there. I would like to know how to get more information about this error: is Java looking in the right directory for the certificates, and is the certificate itself valid? The error log is not very explicit, and I have had a deep look on Stack Overflow for solutions, but nothing solved the problem.
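For example, these are the checks I know of so far (standard keytool usage and the stock JSSE debug switch; changeit is the default store password), but I am not sure how to interpret the output:

    # is the default truststore empty, and does the password work?
    keytool -list -keystore /etc/ssl/certs/java/cacerts -storepass changeit | head -n 5

    # turn on verbose TLS debugging for the next Spark run
    export JAVA_TOOL_OPTIONS="-Djavax.net.debug=ssl,handshake"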

Thanks for your help

Have a nice evening

B.

What I have already tried: reinstalling openjdk-11, and updating management.properties and the java.security configuration (to inform Java about the JKS format and the cacerts location).
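One thing I am still unsure about is whether the JVM even reads the Debian keystore. The standard JSSE system properties below are one way to force it explicitly (a sketch; changeit is the Debian default password, and JAVA_TOOL_OPTIONS is picked up by every JVM that gets launched, including the one started by spark-submit):

    # force every JVM, including the one spark-submit starts, to use the Debian keystore
    export JAVA_TOOL_OPTIONS="-Djavax.net.ssl.trustStore=/etc/ssl/certs/java/cacerts \
      -Djavax.net.ssl.trustStoreType=JKS \
      -Djavax.net.ssl.trustStorePassword=changeit"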


2 Answers


See: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty on Linux, or why is the default truststore empty

Are you sure it is not a Java pathing issue? Spark is based on Scala, which runs on a Java virtual machine. Just a thought.
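For instance, a quick way to check which JVM is actually being picked up (standard Debian tooling; adjust if you manage JVMs differently):

    # every JVM registered with Debian's alternatives system
    update-alternatives --list java

    # the binary actually resolved on the PATH
    readlink -f "$(which java)"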

– CRAFTY DBA

I have found the solution. Here is the full explanation:

  1. I use a Debian 11 distribution, and as I was not really confident in my modifications to the JVM, I removed the openjdk-11-jdk package.

  2. I renamed the file cacerts, located in /etc/ssl/certs/java, to OLD_cacerts.

  3. I reinstalled the openjdk-11-jdk package.

  4. I went to the directory /usr/lib/jvm/java-11-openjdk-amd64/lib/security and installed the needed certificate (the one for the Maven repository):

     # keytool prompts for the keystore password (changeit by default)
     # and asks you to confirm that the certificate should be trusted
     sudo keytool -import -file /your-path-to-the-file/upload/repo1-maven-org.pem -alias Maven -keystore cacerts

  5. There are no more errors now when Spark tries to resolve package dependencies. (A quick way to verify the import is shown after this list.)
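To double-check that the import worked before re-running Spark, the certificate can be listed by its alias (standard keytool usage; changeit is the default store password):

    # confirm the Maven certificate is present in the freshly rebuilt keystore
    keytool -list \
        -keystore /usr/lib/jvm/java-11-openjdk-amd64/lib/security/cacerts \
        -storepass changeit -alias Maven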

– BFR92