
Since my project is on AWS and I have to connect with other AWS services (API Gateway, for example), I am trying to use a Lambda function to connect to an RDS instance (which I know for a fact is up and running). I am working in Eclipse with the AWS plugin, on Java 8. The idea is that I code in my local Eclipse, upload the function, and run it; it should then execute the code in a serverless manner.

No matter what combination of code I try, or what advice I follow, I never get a successful connection, so I can't query the database.

With the current iteration of the code, I get:

com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure. The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.

I can't for the life of me figure out why I cannot connect from the Eclipse/Java Lambda function to the MySQL AWS instance. If I access the database from a terminal with the same credentials, I am let in. I have read and followed the official documentation 4-5 times already (originally I was following this; by now it is slightly tweaked). I am unsure whether that is enough for a simple connection (I have been told, and have read, that it is!), but I wonder if there is some other way of doing it.
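That "0 milliseconds ago" message suggests the TCP connection never opens at all, which would point at VPC/security-group routing rather than credentials. A bare socket check like the following sketch (endpoint and port copied from my JDBC URL; the class name and 5-second timeout are my own choices) would separate a network problem from a MySQL-level one:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class ReachabilityCheck {
    public static void main(String[] args) {
        // Attempts a raw TCP connection to the RDS endpoint, bypassing JDBC
        // entirely. If this times out, the problem is network routing
        // (VPC/subnet/security group), not the driver or the credentials.
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(
                    "cmf6lelghjzq.eu-west-1.rds.amazonaws.com", 3306), 5000);
            System.out.println("TCP connect OK");
        } catch (IOException e) {
            System.out.println("TCP connect failed: " + e);
        }
    }
}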

This is what I execute. Since the plugin makes me supply "a" JSON even though I do not need one, I give it this: {"key1" : "value1"}

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

import org.apache.log4j.Logger;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

public class Adapt_LambdaConnection implements RequestHandler<Object, String> {

    private static final Logger log = Logger.getLogger(Adapt_LambdaConnection.class.getName());

    @Override
    public String handleRequest(Object input, Context context) {
        context.getLogger().log("Input: " + input);
        Connection con = null;
        try {
            // Plain JDBC connection to the RDS endpoint, using the same
            // credentials that work from a terminal.
            con = DriverManager.getConnection(
                    "jdbc:mysql://cmf6lelghjzq.eu-west-1.rds.amazonaws.com:3306/botdb",
                    "bot", "PassWordRandom");
        } catch (SQLException e) {
            log.warn(e.toString());
            System.out.println(e + "\nSQLException");
        } catch (Exception e) {
            System.out.println(e + "\ngenericException");
        }
        String status;
        if (con != null) {
            status = "connection established";
            System.out.println(status);
        } else {
            status = "connection failed";
        }
        return status;
    }
}
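For completeness, pinning explicit timeouts on the JDBC URL makes the failure surface within a few seconds instead of hanging until the Lambda itself times out. A sketch of the same getConnection call (connectTimeout and socketTimeout are Connector/J URL properties, in milliseconds; the wrapper class name is my own):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class TimeoutExample {
    static Connection open() throws SQLException {
        // Same endpoint and credentials as in the handler above; the added
        // URL properties make an unreachable host fail in ~5s rather than
        // blocking for the default TCP timeout.
        return DriverManager.getConnection(
                "jdbc:mysql://cmf6lelghjzq.eu-west-1.rds.amazonaws.com:3306/botdb"
                + "?connectTimeout=5000&socketTimeout=5000",
                "bot", "PassWordRandom");
    }
}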

And this is the POM (I was using a much lighter POM, but since nothing was working, I am now using one I know for sure is right, as it is cloned from an unrelated project we run that has no issues).

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.spa</groupId>
  <artifactId>demo</artifactId>
  <version>1.0.0</version>
  <packaging>jar</packaging>
  <properties>
        <jdk.version>1.8</jdk.version>
        <encoding>UTF-8</encoding>      
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <hadoop.version>2.7.1</hadoop.version>          
        <scala.version>2.11.8</scala.version>
        <scala.tools.version>2.11</scala.tools.version> 
        <spark.version>2.2.1</spark.version>
        <aws.version>1.11.191</aws.version>
        <spark-csv.version>1.5.0</spark-csv.version>
        <commons-codec.version>1.10</commons-codec.version>
        <log4j.version>1.2.17</log4j.version>
        <json.version>20160212</json.version>
        <slack.version>1.3.0</slack.version>    
    </properties>
    <repositories>
        <repository>
            <id>SparkPackagesRepo</id>
            <url>http://dl.bintray.com/spark-packages/maven</url>
        </repository>
    </repositories>
    <build>
        <finalName>TypeformSurveysTransformer</finalName>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.5</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.4.3</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <filters>
                        <filter>
                            <artifact>*:*</artifact>
                            <excludes>
                                <exclude>META-INF/*.SF</exclude>
                                <exclude>META-INF/*.DSA</exclude>
                                <exclude>META-INF/*.RSA</exclude>
                            </excludes>
                        </filter>
                    </filters>
                    <createDependencyReducedPom>false</createDependencyReducedPom>                  
                </configuration>                
            </plugin>
        </plugins>
    </build>    

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>com.amazonaws</groupId>
                <artifactId>aws-java-sdk-bom</artifactId>
                <version>1.11.283</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <dependencies>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.45</version>
        </dependency>
        <dependency>
          <groupId>org.scala-lang</groupId>
          <artifactId>scala-library</artifactId>
          <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-reflect</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_${scala.tools.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_${scala.tools.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_${scala.tools.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>com.databricks</groupId>
            <artifactId>spark-csv_2.10</artifactId>
            <version>${spark-csv.version}</version>
        </dependency>       
        <dependency>
            <groupId>commons-codec</groupId>
            <artifactId>commons-codec</artifactId>
            <version>${commons-codec.version}</version>
        </dependency>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>${log4j.version}</version>
        </dependency>
        <dependency>
            <groupId>org.json</groupId>
            <artifactId>json</artifactId>
            <version>${json.version}</version>
        </dependency>   
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-lambda-java-events</artifactId>
            <version>1.3.0</version>
        </dependency>
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-lambda-java-core</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>net.gpedro.integrations.slack</groupId>
            <artifactId>slack-webhook</artifactId>
            <version>${slack.version}</version>
        </dependency>
    </dependencies>
</project>
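Since the jar is shaded, one thing worth verifying from inside the deployed function is that Connector/J actually made it into the fat jar and registered itself. A small sketch (class name is my own) that logs every JDBC driver the runtime can see; com.mysql.jdbc.Driver should be among them:

import java.sql.Driver;
import java.sql.DriverManager;
import java.util.Enumeration;

public class DriverCheck {
    public static void main(String[] args) {
        // Lists every JDBC driver registered with DriverManager. If
        // com.mysql.jdbc.Driver is missing, the shaded jar does not
        // contain the connector and getConnection can never succeed.
        Enumeration<Driver> drivers = DriverManager.getDrivers();
        while (drivers.hasMoreElements()) {
            System.out.println(drivers.nextElement().getClass().getName());
        }
    }
}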
  • The reason I use mysql and not mariadb in the code is that I have been told it makes no difference, AND the AWS documentation makes no reference to mariadb, so I am treating them as the same. Trying to connect to mariadb instead makes the code break outright, rather than simply not connecting or whatever it is that is going on. – monkey intern Mar 01 '18 at 08:10
  • Have you checked whether the lambda has access to the subnet in which the database resides? – Federico klez Culloca Mar 01 '18 at 08:17
  • I believe I have, @FedericoklezCulloca. When I created the function it was not configured properly, but I have since edited the Lambda function to add the right VPC, which I believe is the way to do it (a sketch for reading the deployed VPC settings back follows these comments). If there is more to it I would love to learn it; I am very new to this and very stuck right now. – monkey intern Mar 01 '18 at 08:19
  • Related (similar/same? problem): https://stackoverflow.com/questions/49012188/connection-does-not-throw-error-but-does-not-succeed-either – Mark Rotteveel Mar 01 '18 at 09:55
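Following up on the VPC discussion in the comments: the deployed function's actual VPC settings can be read back with the AWS SDK. A minimal sketch, assuming the aws-java-sdk-lambda artifact is on the classpath and using "Adapt_LambdaConnection" as a placeholder for the deployed function name:

import com.amazonaws.services.lambda.AWSLambda;
import com.amazonaws.services.lambda.AWSLambdaClientBuilder;
import com.amazonaws.services.lambda.model.GetFunctionConfigurationRequest;
import com.amazonaws.services.lambda.model.GetFunctionConfigurationResult;

public class VpcConfigCheck {
    public static void main(String[] args) {
        AWSLambda lambda = AWSLambdaClientBuilder.defaultClient();
        GetFunctionConfigurationResult cfg = lambda.getFunctionConfiguration(
                new GetFunctionConfigurationRequest()
                        .withFunctionName("Adapt_LambdaConnection"));
        // The printed VpcConfig should list subnets that can route to the
        // RDS instance and a security group that the RDS security group
        // allows inbound on port 3306.
        System.out.println(cfg.getVpcConfig());
    }
}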

0 Answers