
I have an application that requires accessing DynamoDB tables. Each worker establishes a connection with the database on its own.

I have added both AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY to the spark-env.sh file on both the master and the workers. I have also run the file with sh to make sure the variables are exported.

When the code runs, I always get the error:

Caused by: com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain
    at com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:131)
    at com.amazonaws.http.AmazonHttpClient.getCredentialsFromContext(AmazonHttpClient.java:774)
    at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:800)
    at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:695)
    at com.amazonaws.http.AmazonHttpClient.doExecute(AmazonHttpClient.java:447)
    at com.amazonaws.http.AmazonHttpClient.executeWithTimer(AmazonHttpClient.java:409)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:358)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.doInvoke(AmazonDynamoDBClient.java:2051)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.invoke(AmazonDynamoDBClient.java:2021)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.describeTable(AmazonDynamoDBClient.java:1299)
    at com.amazon.titan.diskstorage.dynamodb.DynamoDBDelegate.describeTable(DynamoDBDelegate.java:635)
    ... 27 more

It seems the AWS SDK fails to load the credentials even though they're exported. What should I try?

Mohamed Taher Alrefaie

1 Answer


You can use the setExecutorEnv method on the SparkConf, e.g.:

  /**
   * Set an environment variable to be used when launching executors for this application.
   * These variables are stored as properties of the form spark.executorEnv.VAR_NAME
   * (for example spark.executorEnv.PATH) but this method makes them easier to set.
   */
  def setExecutorEnv(variable: String, value: String): SparkConf = {
    set("spark.executorEnv." + variable, value)
  }

There is also an overload that sets multiple variables at once:

  /**
   * Set multiple environment variables to be used when launching executors.
   * These variables are stored as properties of the form spark.executorEnv.VAR_NAME
   * (for example spark.executorEnv.PATH) but this method makes them easier to set.
   */
  def setExecutorEnv(variables: Seq[(String, String)]): SparkConf = {
    for ((k, v) <- variables) {
      setExecutorEnv(k, v)
    }
    this
  }
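
For instance, a minimal sketch of using it from the driver (the app name is hypothetical, and it assumes the credentials are present in the driver's environment):

  import org.apache.spark.SparkConf

  // Forward the AWS credentials from the driver's environment to every executor.
  // Assumes AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are set where the driver runs.
  val conf = new SparkConf()
    .setAppName("dynamodb-app") // hypothetical app name
    .setExecutorEnv("AWS_ACCESS_KEY_ID", sys.env("AWS_ACCESS_KEY_ID"))
    .setExecutorEnv("AWS_SECRET_ACCESS_KEY", sys.env("AWS_SECRET_ACCESS_KEY"))

  // Equivalent form using the Seq overload above:
  // val conf = new SparkConf().setExecutorEnv(Seq(
  //   "AWS_ACCESS_KEY_ID" -> sys.env("AWS_ACCESS_KEY_ID"),
  //   "AWS_SECRET_ACCESS_KEY" -> sys.env("AWS_SECRET_ACCESS_KEY")))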

You might also consider setting Java system properties: a SparkConf will automatically pick up any system property whose name starts with spark.
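
A rough sketch of that approach, using property names that follow the spark.executorEnv.* convention shown in the doc comments above:

  // Set the properties before constructing the SparkConf; a SparkConf created
  // with loadDefaults = true (the default) copies every system property whose
  // name starts with "spark." into the configuration.
  System.setProperty("spark.executorEnv.AWS_ACCESS_KEY_ID", sys.env("AWS_ACCESS_KEY_ID"))
  System.setProperty("spark.executorEnv.AWS_SECRET_ACCESS_KEY", sys.env("AWS_SECRET_ACCESS_KEY"))
  val conf = new SparkConf()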

WestCoastProjects
  • and read that value in the executors like this: val property_value = System.getenv("property_key") – Suresh Sep 21 '16 at 21:24