I have more than 2 GB of data in a table and need to read more than 1 GB of it from that single table. I know there are various options on the database side to achieve this, but I am looking for a better approach in the Java code. Can anyone show me example Java code, for instance using parallel processing with multithreading?
Example code:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class SelectRowsExample {
    public static void main(String[] args) {
        String serverName = "localhost";
        String schema = "test";
        String url = "jdbc:mysql://" + serverName + "/" + schema;
        String username = "username";
        String password = "password";

        // Since JDBC 4 the driver is loaded automatically, so the old
        // Class.forName("com.mysql.jdbc.Driver") call is no longer needed.
        // try-with-resources closes the connection, statement and result set
        // even if an exception is thrown, and avoids the NullPointerException
        // the old code would hit if the connection failed.
        try (Connection connection = DriverManager.getConnection(url, username, password);
             Statement statement = connection.createStatement();
             ResultSet results = statement.executeQuery(
                     "SELECT * FROM employee ORDER BY dept")) {
            System.out.println("Successfully connected to the database!");
            while (results.next()) {
                // Both columns are fetched by column name
                String empname = results.getString("name");
                System.out.println("Row " + results.getRow() + " name: " + empname);
                String department = results.getString("department");
                System.out.println("Row " + results.getRow() + " department: " + department);
            }
        } catch (SQLException e) {
            System.out.println("Could not retrieve data from the database: " + e.getMessage());
        }
    }
}
Here my query returns name and department details, and more than 1 GB of data comes back for each department. Reading it this way will surely slow down the application, which is why I thought of parallel processing with multithreading. Can anyone suggest how to read this huge amount of data quickly?
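The kind of partitioned, multithreaded read I have in mind looks roughly like this. It is only a sketch: it assumes the employee table has a numeric primary key id (so each thread can scan a disjoint id range), and the connection URL, credentials, and the 1..1,000,000 id span are placeholders. It also sets the fetch size to Integer.MIN_VALUE, which tells MySQL Connector/J to stream rows one at a time instead of buffering the whole result set in memory:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelReadSketch {
    // Placeholder connection details -- replace with your own
    static final String URL = "jdbc:mysql://localhost/test";
    static final String USER = "username";
    static final String PASS = "password";

    // Split the primary-key range [minId, maxId] into `parts` disjoint,
    // inclusive sub-ranges that together cover the whole range.
    static long[][] splitRange(long minId, long maxId, int parts) {
        long[][] ranges = new long[parts][2];
        long span = maxId - minId + 1;
        long chunk = span / parts;
        long rem = span % parts;
        long start = minId;
        for (int i = 0; i < parts; i++) {
            long size = chunk + (i < rem ? 1 : 0);
            ranges[i][0] = start;
            ranges[i][1] = start + size - 1;
            start += size;
        }
        return ranges;
    }

    // Each worker opens its own connection and streams one slice of the table.
    static long readPartition(long lowId, long highId) throws SQLException {
        String sql = "SELECT name, department FROM employee WHERE id BETWEEN ? AND ?";
        long rows = 0;
        try (Connection con = DriverManager.getConnection(URL, USER, PASS);
             PreparedStatement ps = con.prepareStatement(sql,
                     ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
            ps.setFetchSize(Integer.MIN_VALUE); // enable row streaming in Connector/J
            ps.setLong(1, lowId);
            ps.setLong(2, highId);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    String name = rs.getString("name");
                    String department = rs.getString("department");
                    // process the row here instead of printing every line
                    rows++;
                }
            }
        }
        return rows;
    }

    public static void main(String[] args) throws Exception {
        // Assumed: employee has a numeric primary key `id` spanning 1..1,000,000
        long[][] ranges = splitRange(1, 1_000_000, 4);
        ExecutorService pool = Executors.newFixedThreadPool(ranges.length);
        List<Future<Long>> results = new ArrayList<>();
        for (long[] r : ranges) {
            results.add(pool.submit(() -> readPartition(r[0], r[1])));
        }
        long total = 0;
        for (Future<Long> f : results) {
            total += f.get(); // waits for each worker to finish
        }
        pool.shutdown();
        System.out.println("Total rows read: " + total);
    }
}
```

Each worker gets its own Connection, since a JDBC connection is not safe to share across threads. Whether this actually beats a single streaming read depends on the server and the disks, so it would need benchmarking either way.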