
I am currently trying to benchmark MySQL queries using JMH. For this, I established a JDBC connection to a dockerized MySQL server.

My Code:

@Benchmark
public void testWithPreparedStatement(Blackhole bh) throws SQLException {
    try (PreparedStatement st = connection.prepareStatement(query)) {
        st.setString(1, "Parameter1");
        st.setString(2, "Parameter2");
        bh.consume(st.executeQuery());
        connection.commit();
    }
}
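Not shown above is the shared benchmark state; it is assumed to look roughly like the following sketch (connection URL, credentials, and the query text are placeholders, and the tear-down mirrors what I describe under "What I have tried" below):

```java
@State(Scope.Benchmark)
public class QueryBenchmarkState {

    Connection connection;
    String query = "SELECT ..."; // the multi-join query under test (placeholder)

    @Setup(Level.Trial)
    public void setUp() throws SQLException {
        // Hypothetical URL/credentials for the dockerized MySQL server
        connection = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/testdb", "user", "password");
        connection.setAutoCommit(false);
        connection.setTransactionIsolation(Connection.TRANSACTION_READ_COMMITTED);
    }

    @TearDown(Level.Iteration)
    public void tearDown() throws SQLException {
        try (Statement st = connection.createStatement()) {
            st.execute("RESET QUERY CACHE");
            st.execute("FLUSH TABLES");
        }
    }
}
```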

These are the results of the tests:

# Warmup Iteration   1: 2.318 ms/op
Iteration   1: 2.038 ms/op
Iteration   2: 1.972 ms/op
Iteration   3: 1.908 ms/op
Iteration   4: 2.000 ms/op
Iteration   5: 1.960 ms/op
Iteration   6: 1.939 ms/op
Iteration   7: 1.968 ms/op
Iteration   8: 1.959 ms/op

The query contains multiple joins and the database is populated with data, so the measured times should not be this low.

The same query in DataGrip results in:

0 rows retrieved in 59ms (execution: 32ms, fetching: 27ms)
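As a sanity check on the size of the gap (my own arithmetic, not output from the benchmark): averaging the eight measured iterations and dividing DataGrip's 59 ms by that mean shows roughly a 30x difference.

```java
// Back-of-the-envelope comparison of the JMH iterations above
// against DataGrip's reported 59 ms wall-clock time.
public class BenchmarkGap {
    public static void main(String[] args) {
        double[] itersMs = {2.038, 1.972, 1.908, 2.000, 1.960, 1.939, 1.968, 1.959};
        double sum = 0;
        for (double v : itersMs) sum += v;
        double meanMsPerOp = sum / itersMs.length;   // = 1.968 ms/op
        double ratio = 59.0 / meanMsPerOp;           // DataGrip time vs. JMH time
        System.out.printf("mean = %.3f ms/op, ratio = %.1fx%n", meanMsPerOp, ratio);
        // prints: mean = 1.968 ms/op, ratio = 30.0x
    }
}
```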

What I have tried:

  • Running RESET QUERY CACHE and FLUSH TABLES in the @TearDown method

  • Setting the @TearDown to Level.Iteration

  • Setting autoCommit to false and manually committing the query (see: code), plus setting the transaction isolation level to READ_COMMITTED

The results the query returns via JDBC are correct, though, so it is being executed properly.

Any ideas? All help is appreciated!

user3024750

1 Answer


It seems like the 'issue' was related to using a PreparedStatement: once I switched over to a plain Statement, I got the same time values that I get in DataGrip.

From this SO answer I found out that a PreparedStatement does "Precompilation and DB-side caching of the SQL statement", which probably led to the faster execution times.
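A related knob worth noting (an assumption on my side, based on Connector/J's documented connection properties, not something I verified in this benchmark): with MySQL Connector/J, server-side prepared statements and the prepared-statement cache are controlled by URL properties such as `useServerPrepStmts` and `cachePrepStmts`, so disabling them explicitly should make the PreparedStatement path behave more like a plain Statement. Host and database name below are illustrative:

```
jdbc:mysql://localhost:3306/testdb?useServerPrepStmts=false&cachePrepStmts=false
```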

So this is the slower code I am now using:

    @Benchmark
    public void testWithStatement(Blackhole bh) throws SQLException {
        try (Statement st = connection.createStatement();
             ResultSet rs = st.executeQuery(query)) {
            bh.consume(rs.next());
        }
    }
user3024750