
I have enabled AlwaysOn SQL through the dse.yaml file. When I run `dse client-tool alwayson-sql start`, the service fails to start.
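For reference, the AlwaysOn SQL section of my dse.yaml looks roughly like this (a sketch: the exact key names and defaults vary by DSE version, so treat the values as assumptions rather than my literal settings):

```yaml
# dse.yaml -- AlwaysOn SQL settings (sketch)
alwayson_sql_options:
    enabled: true        # turn the AlwaysOn SQL service on
    thrift_port: 10000   # assumed default JDBC/Thrift port
    web_ui_port: 9077    # assumed default web UI port
```

The errors from system.log: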

WARN  2019-12-26 12:22:33,606 org.apache.spark.util.Utils: Your hostname, ubuntu1 resolves to a loopback address: 127.0.1.1; using 192.168.93.124 instead (on interface enp0s3)
WARN  2019-12-26 12:22:33,608 org.apache.spark.util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
WARN  2019-12-26 12:22:37,415 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 2 seconds...
WARN  2019-12-26 12:22:40,649 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 4 seconds...
ERROR 2019-12-26 12:22:44,919 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application because of java.io.IOException: Failed to fetch dynamic configuration from DSE
java.io.IOException: Failed to fetch dynamic configuration from DSE
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:85)
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:83)
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:83)
    at org.apache.spark.deploy.SparkNodeConfiguration$.apply(SparkNodeConfiguration.scala:45)
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$dynamicConfiguration$2.apply(SparkConfigurator.scala:100)
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$dynamicConfiguration$2.apply(SparkConfigurator.scala:99)
    at scala.util.Try$.apply(Try.scala:192)
    at com.datastax.bdp.util.Lazy.internal$lzycompute(Lazy.scala:26)
    at com.datastax.bdp.util.Lazy.internal(Lazy.scala:25)
    at com.datastax.bdp.util.Lazy.get(Lazy.scala:31)
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps$lzycompute(SparkConfigurator.scala:176)
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps(SparkConfigurator.scala:175)
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries$lzycompute(SparkConfigurator.scala:147)
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries(SparkConfigurator.scala:147)
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs$lzycompute(DseSparkArgsPreprocessor.scala:86)
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs(DseSparkArgsPreprocessor.scala:75)
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:93)
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala)
Caused by: java.io.IOException: Failed to open native connection to Cassandra at {192.168.14.2, 192.168.14.3}:9042
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:184)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$10.apply(CassandraConnector.scala:167)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$10.apply(CassandraConnector.scala:167)
    at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32)
    at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69)
    at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57)
    at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79)
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:114)
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:53)
    ... 17 common frames omitted
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /192.168.14.2:9042 (com.datastax.driver.core.exceptions.TransportException: [/192.168.14.2:9042] Cannot connect), /192.168.14.3:9042 (com.datastax.driver.core.exceptions.TransportException: [/192.168.14.3:9042] Cannot connect))
    at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:259)
    at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:98)
    at com.datastax.driver.core.Cluster$Manager.negotiateProtocolVersionAndConnect(Cluster.java:1687)
    at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1606)
    at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:453)
    at com.datastax.driver.core.DelegatingCluster.getMetadata(DelegatingCluster.java:89)
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:174)
    ... 25 common frames omitted
ERROR 2019-12-26 12:22:45,022 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to cancel delegation token
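
The root cause at the bottom of the trace is the NoHostAvailableException: the Spark launcher cannot open a CQL connection to 192.168.14.2 or 192.168.14.3 on port 9042. A quick way to confirm whether the native transport is actually reachable on those addresses (standard Cassandra/DSE tooling; the IPs are taken from the trace above):

```sh
# on each node: is the native transport listening on the expected address?
ss -tlnp | grep 9042          # or: netstat -tlnp | grep 9042

# from the machine running dse client-tool: can the port be reached?
nc -vz 192.168.14.2 9042
cqlsh 192.168.14.2 9042       # should open a CQL prompt if the transport is up

# what address does each node report for itself?
nodetool status
```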
Siguy12
  • Does this answer your question? [Can't connect to cassandra - NoHostAvailableException](https://stackoverflow.com/questions/18724334/cant-connect-to-cassandra-nohostavailableexception) – Alessandro Da Rugna Dec 26 '19 at 19:16
  • No, unfortunately that didn’t help. I have my native_transport_address set to localhost, and there is no rpc_address setting in my cassandra.yaml file – Siguy12 Dec 27 '19 at 00:02
  • you have to add the rpc_address setting if it isn't there, then do a rolling restart of the cluster (a sketch of this follows the comments) – LHWizard Jan 03 '20 at 18:41
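
A minimal sketch of that last suggestion, assuming DSE 6.x (where cassandra.yaml names the setting native_transport_address; pre-6.0 versions call it rpc_address) and reusing the node IPs from the trace. Note that native_transport_address set to localhost would also explain the TransportException above, since each node would then accept CQL connections only from itself:

```yaml
# cassandra.yaml on each node, set to that node's own reachable IP
native_transport_address: 192.168.14.2    # rpc_address on pre-6.0 versions
# or bind every interface and advertise a single address:
# native_transport_address: 0.0.0.0
# native_transport_broadcast_address: 192.168.14.2
```

Then restart one node at a time so the cluster stays available (assuming a package install of DSE):

```sh
nodetool drain             # flush memtables and stop accepting connections
sudo service dse restart
nodetool status            # wait for UN (Up/Normal) before the next node
```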

0 Answers