I have an application that fetches a large dataset, roughly 160,000 rows, from a SQLite DB using Java JDBC. (Note that I'm running on a small-scale system: Atom processor, 1 GB RAM, SQLite DB.) Each row returned by the query holds a timestamp (INTEGER) and a VALUE (REAL).
I need to process this data on another thread, so I convert it into a List<HashMap<String, Object>> with the function below and pass that to the other thread. Building this list seems to take quite some time, roughly 20 seconds per query.
Is there a more efficient way to do this?
public List<HashMap<String, Object>> resultSetToArrayList(ResultSet rs) throws SQLException
{
    ResultSetMetaData md = rs.getMetaData();
    int columns = md.getColumnCount();
    List<HashMap<String, Object>> list = new ArrayList<HashMap<String, Object>>();
    while (rs.next())
    {
        HashMap<String, Object> row = new HashMap<String, Object>(columns);
        for (int i = 1; i <= columns; ++i)
        {
            row.put(md.getColumnName(i), rs.getObject(i));
        }
        list.add(row);
    }
    return list;
}
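As a side note on the function above, two per-row costs stand out: a HashMap is allocated for every row, and md.getColumnName(i) is re-resolved on every iteration even though the names never change. The sketch below (synthetic data, no JDBC; ColumnNameCache and toRows are made-up names for illustration) shows the names fetched once into an array and reused:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;

public class ColumnNameCache {
    // Column names are resolved once, up front, instead of once per row.
    // In the JDBC version, columnNames would be filled from
    // ResultSetMetaData before the while (rs.next()) loop.
    public static List<HashMap<String, Object>> toRows(Object[][] data, String[] columnNames) {
        List<HashMap<String, Object>> list = new ArrayList<>(data.length);
        for (Object[] raw : data) {
            HashMap<String, Object> row = new HashMap<>(columnNames.length);
            for (int i = 0; i < columnNames.length; i++) {
                row.put(columnNames[i], raw[i]); // no metadata lookup inside the loop
            }
            list.add(row);
        }
        return list;
    }

    public static void main(String[] args) {
        String[] names = {"timestamp", "value"}; // would come from ResultSetMetaData
        Object[][] data = {{1L, 1.5}, {2L, 2.5}};
        System.out.println(toRows(data, names).size());
    }
}
```

This only removes the metadata lookups; the HashMap-per-row allocation remains, which is why the typed-object version below helps more.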
After the comments below, I improved this to use a "Result" object. The improved function and the object definition follow. These changes cut the processing time by about 8 seconds.
public List<Result> resultSetToArrayList(ResultSet rs) throws SQLException
{
    List<Result> list = new ArrayList<Result>();
    while (rs.next())
    {
        list.add(new Result(rs.getLong(1), rs.getObject(2)));
    }
    return list;
}
public class Result
{
    long timestamp;
    Object value;

    public Result(long timestamp, Object value)
    {
        this.timestamp = timestamp;
        this.value = value;
    }

    public long getTimestamp()
    {
        return timestamp;
    }

    public void setTimestamp(long timestamp)
    {
        this.timestamp = timestamp;
    }

    public Object getValue()
    {
        return value;
    }

    public void setValue(Object value)
    {
        this.value = value;
    }
}
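Since the VALUE column is declared REAL, the value field could also be a primitive double read via rs.getDouble(2), which avoids keeping one boxed Object per row; over ~160,000 rows that is a meaningful heap saving. The variant below (PrimitiveResult is a hypothetical name) also drops the setters, making each row immutable and therefore safe to hand to the processing thread without synchronization:

```java
public final class PrimitiveResult {
    private final long timestamp;  // INTEGER column, read via rs.getLong(1)
    private final double value;    // REAL column, read via rs.getDouble(2) -- no boxing

    public PrimitiveResult(long timestamp, double value) {
        this.timestamp = timestamp;
        this.value = value;
    }

    public long getTimestamp() { return timestamp; }
    public double getValue() { return value; }

    public static void main(String[] args) {
        PrimitiveResult r = new PrimitiveResult(42L, 3.14);
        System.out.println(r.getTimestamp() + " " + r.getValue());
    }
}
```

If the VALUE column can be NULL, rs.getDouble returns 0.0 for NULL, so rs.wasNull() would need to be checked after the read; that is the one case where the Object field is simpler.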