8

I have an application running on a Postgres database. Sometimes, when about 8-10 people are working on the application, the CPU usage soars to 99-100%. The application was built on the CodeIgniter framework, which I believe closes connections to the database whenever they are not needed. What could be the solution to this problem? I would appreciate any suggestions. Thank you.

Basically, what the people do on the application is run insert queries, but at a very fast rate: a person could run between 70 and 90 insert queries in a minute.

Vini.g.fer
Ola Fashade Samson
    This isn't especially surprising if you're running lots of tiny inserts and you have fast storage so the bottleneck is memory/CPU. You're doing lots of work, so there's lots of load. What's the problem? – Craig Ringer Aug 24 '15 at 06:05

3 Answers

20

I came across a similar kind of issue. The reason was that some transactions were getting stuck and had been running for a long time, so CPU utilization rose to 100%. The following command helped to find the connections that had been running the longest:

SELECT max(now() - xact_start) FROM pg_stat_activity
                           WHERE state IN ('idle in transaction', 'active');

This command shows the amount of time a connection has been running. This time should not be greater than an hour. Killing the connections that had been running for a very long time, or were stuck at some point, worked for me. I followed this post for monitoring and solving my issue; the post includes lots of useful commands for monitoring this situation.
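For reference, a minimal sketch of how such connections could be found and terminated (the two-minute threshold and the pid are only examples):

-- List transactions that have been open longer than two minutes
SELECT pid, now() - xact_start AS xact_age, state, query
FROM pg_stat_activity
WHERE state IN ('idle in transaction', 'active')
  AND now() - xact_start > interval '2 minutes'
ORDER BY xact_age DESC;

-- Terminate one stuck backend by its pid (12345 is a placeholder)
SELECT pg_terminate_backend(12345);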

Ajeet Khan
  • Hi Ajeet, in my case the maximum transaction age is 10 minutes and the CPU consumption is around 95%. Any suggestions on how to analyze the problem? – Suresh Mar 10 '22 at 13:47
6

You need to find out what PostgreSQL is doing. Relevant resources:

Once you find what the slow or most common queries are, use EXPLAIN to make sure they are being executed efficiently.
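For example, a minimal sketch, assuming the pg_stat_statements extension is installed and using a hypothetical orders table:

-- Check how a suspect query is executed
EXPLAIN (ANALYZE, BUFFERS)
SELECT * FROM orders WHERE customer_id = 42;

-- Find the most frequently executed statements
-- (the time column is total_exec_time on PostgreSQL 13+, total_time on older versions)
SELECT query, calls, total_exec_time
FROM pg_stat_statements
ORDER BY calls DESC
LIMIT 10;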

Community
Samuli Pahaoja
0

Here are some cases we have encountered that cause high CPU usage in Postgres.

  • Missing or incorrect indexes for the query

    • Check the query plan - With EXPLAIN we can check the query plan; if an index is used by the query, an Index Scan appears in the plan output.
    • Solution: add the corresponding index for the query to reduce CPU usage (see the sketch after this list)
  • Queries with sort operations

    • Check EXPLAIN (ANALYZE, BUFFERS) - If memory is insufficient for the sort operation, temporary files are used for sorting, and CPU usage goes up.
      • Note: DO NOT run EXPLAIN (ANALYZE) on a busy production system, as it actually executes the query behind the scenes to provide more accurate planner information, and its impact is significant
    • Solution: tune up work_mem and the sorting operations
  • Long-running transactions

    • Find long-running transactions with
      SELECT pid, now() - pg_stat_activity.query_start AS duration, query, state
      FROM pg_stat_activity
      WHERE (now() - pg_stat_activity.query_start) > interval '2 minutes';

    • Solution:
      • Kill the long-running transaction with SELECT pg_terminate_backend(pid)
      • Optimize the transaction or query SQL with the corresponding indexes
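A minimal sketch of the first two fixes above, assuming a hypothetical orders table with customer_id and created_at columns (the index name and the work_mem value are only examples):

-- Check the plan first: a Seq Scan on a selective filter suggests a missing index
EXPLAIN (ANALYZE, BUFFERS)
SELECT * FROM orders WHERE customer_id = 42;

-- Add the index without blocking concurrent writes
CREATE INDEX CONCURRENTLY idx_orders_customer_id ON orders (customer_id);

-- Give sorts more memory for this session only, then re-check whether the
-- Sort node still spills to disk ("external merge") in the plan
SET work_mem = '64MB';
EXPLAIN (ANALYZE, BUFFERS)
SELECT * FROM orders ORDER BY created_at DESC LIMIT 100;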
zangw