
The problem

Every AJAX request that contains a DB query is taking much longer than normal.

I haven't updated the codebase in a week, but all of a sudden every DB query executed during an AJAX request takes a long time. One thing to notice: if the query is written in a page and the page is loaded normally, as if you were visiting www.example.com/mypage.php,

mypage.php:

<?php

   $query = $db_handler->prepare(
      "SELECT * FROM table_x LIMIT 5"
   );
   $query->execute();
   $fetch = $query->fetchAll(PDO::FETCH_ASSOC);

?>

The page loads up very quickly with all the results.

But if the same query is executed in an AJAX response file, it takes a long time (around 15 seconds) to load.

AJAX Code on client-side:

$.ajax
({
    url: 'server_files/ajaxtest.php',
    type: 'POST',
    dataType: 'JSON',
    data:
    {
        data: 'some data'
    },
    success: function(data)
    {
        if( data.success === true )
        {

        }
        else if( data.success === false )
        {

        }
    },
    error: function(e)
    {
        alert('Error');
    }
});

ajax_response.php:

<?php

   header('Content-Type: application/json');

   if( isset($_POST['data']) )
   {
       $query = $db_handler->prepare(
          "SELECT * FROM table_x LIMIT 5"
       );
       $query->execute();
       $fetch = $query->fetchAll(PDO::FETCH_ASSOC);

       echo json_encode([
           'success'  => true,
           'response' => $fetch
       ]);
    }
?>

^ takes 15 seconds to load. (A query with 5 rows (LIMIT 5) takes the same time as a query with 10 rows (LIMIT 10).)

If the same file contains only this:

<?php

   header('Content-Type: application/json');

   if( isset($_POST['data']) )
   {    
       echo json_encode([
           'success'  => true
       ]);
   }
?>

^ takes 300-400ms to load

Obviously a query will increase the response time a little (1-3 seconds), but 15 seconds is far too much.
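A quick way to narrow down where the time goes is to time each phase of the endpoint separately. The `timed()` helper below is illustrative, not part of the original code, and the `$dsn`/`$user`/`$pass` variables are assumptions standing in for the existing connection settings:

```php
<?php
// Illustrative helper (not from the original code): runs one phase,
// logs how long it took, and returns the phase's result unchanged.
function timed($label, callable $fn) {
    $t0 = microtime(true);
    $result = $fn();
    error_log(sprintf('%s: %.1f ms', $label, (microtime(true) - $t0) * 1000));
    return $result;
}

// In ajax_response.php one would wrap each phase, e.g.:
//
// $db_handler = timed('connect', function () use ($dsn, $user, $pass) {
//     return new PDO($dsn, $user, $pass);
// });
// $fetch = timed('query', function () use ($db_handler) {
//     $q = $db_handler->prepare("SELECT * FROM table_x LIMIT 5");
//     $q->execute();
//     return $q->fetchAll(PDO::FETCH_ASSOC);
// });
```

If `connect` dominates the log, the delay is in reaching the server (DNS, network, handshake) rather than in the query itself.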


What I've done

1) I've contacted my hosting provider, but that didn't help much.

2) I've also installed mysqltuner, which shows this:

-------- General Statistics --------------------------------------------------
[--] Skipped version check for MySQLTuner script
[OK] Currently running supported MySQL version 5.5.49-0ubuntu0.14.04.1-log
[OK] Operating on 64-bit architecture

-------- Storage Engine Statistics -------------------------------------------
[--] Status: +Archive -BDB -Federated +InnoDB -ISAM -NDBCluster
[--] Data in PERFORMANCE_SCHEMA tables: 0B (Tables: 17)
[--] Data in MyISAM tables: 27K (Tables: 13)
[--] Data in InnoDB tables: 6M (Tables: 21)
[!!] Total fragmented tables: 21

-------- Security Recommendations  -------------------------------------------
[!!] User 'rootAlbert@127.0.0.1' has no password set.
[!!] User 'rootAlbert@::1' has no password set.
[!!] User 'rootAlbert@lamp' has no password set.

-------- Performance Metrics -------------------------------------------------
[--] Up for: 11h 23m 42s (21K q [0.533 qps], 11K conn, TX: 6M, RX: 2M)
[--] Reads / Writes: 92% / 8%
[--] Total buffers: 432.0M global + 2.7M per thread (151 max threads)
[OK] Maximum possible memory usage: 837.8M (84% of installed RAM)
[OK] Slow queries: 2% (488/21K)
[OK] Highest usage of available connections: 3% (6/151)
[OK] Key buffer size / total MyISAM indexes: 16.0M/156.0K
[OK] Key buffer hit rate: 99.2% (133 cached / 1 reads)
[OK] Query cache efficiency: 61.9% (6K cached / 10K selects)
[OK] Query cache prunes per day: 0
[OK] Sorts requiring temporary tables: 0% (0 temp sorts / 113 sorts)
[!!] Temporary tables created on disk: 50% (421 on disk / 842 total)
[OK] Thread cache hit rate: 99% (6 created / 11K connections)
[OK] Table cache hit rate: 33% (75 open / 223 opened)
[OK] Open file limit used: 1% (76/6K)
[OK] Table locks acquired immediately: 100% (4K immediate / 4K locks)
[OK] InnoDB data size / buffer pool: 6.5M/128.0M

-------- Recommendations -----------------------------------------------------
General recommendations:
    Run OPTIMIZE TABLE to defragment tables for better performance
    MySQL started within last 24 hours - recommendations may be inaccurate
    When making adjustments, make tmp_table_size/max_heap_table_size equal
    Reduce your SELECT DISTINCT queries without LIMIT clauses
Variables to adjust:
    tmp_table_size (> 16M)
    max_heap_table_size (> 16M)
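Acting on those two "Variables to adjust" lines means raising both values together under `[mysqld]` in my.cnf, since mysqltuner recommends keeping them equal; the 64M below is only an example size, not a tuned value:

```ini
[mysqld]
tmp_table_size      = 64M
max_heap_table_size = 64M
```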

3) Searched a lot and updated my my.cnf file. This is my current my.cnf (it looked a bit different at the time the problem occurred):

[client]
port        = 3306
socket      = /var/run/mysqld/mysqld.sock

[mysqld_safe]
socket      = /var/run/mysqld/mysqld.sock
nice        = 0

[mysqld]
local-infile=0
log=/var/log/mysql-logfile
skip_name_resolve

user        = mysql
pid-file    = /var/run/mysqld/mysqld.pid
socket = /var/run/mysqld/mysqld.sock
port = 3306
basedir     = /usr
datadir = /var/lib/mysql
tmpdir      = /tmp
lc-messages-dir = /usr/share/mysql
skip-external-locking

slow-query-log = 1 
slow-query-log-file = /var/log/mysql-slow.log 
long_query_time = 2 
log-queries-not-using-indexes 

key_buffer      = 16M
max_allowed_packet = 32M
thread_stack        = 192K
thread_cache_size       = 8

myisam-recover         = BACKUP

query_cache_type=1
query_cache_limit=2M
query_cache_size=256M

tmp_table_size=16M
max_heap_table_size=16M
table_cache=3084

log_error = /var/log/mysql/error.log

expire_logs_days    = 10
max_binlog_size         = 100M
big-tables

[mysqldump]
quick
quote-names
max_allowed_packet  = 16M

[mysql]

[isamchk]
key_buffer      = 16M

!includedir /etc/mysql/conf.d/

4) Optimized all the tables in the DB

5) I had also upgraded my server from 1 GB memory, 1 CPU and 2 TB transfer to 2 GB memory, 2 CPUs and 3 TB transfer.

I'm still not getting why this is happening or how to solve it.

Devang Mistry
  • Post full code of `ajax_response.php` file and your ajax code too!! – Saty May 05 '16 at 12:58
  • are you using PHP's sessions? read [this](http://stackoverflow.com/questions/27101875/why-are-my-xhr-calls-waiting-for-each-other-to-return-a-response) – Saic Siquot May 05 '16 at 13:02
  • @Saty Posted both the codes – Devang Mistry May 05 '16 at 13:06
  • @LuisSiquot Yes I'm using PHP's sessions. – Devang Mistry May 05 '16 at 13:06
  • Sounds like session locking, and not related to mysql per se at all. http://stackoverflow.com/a/15693029/761202 . Please put the unmodified contents of `ajaxtest.php` in the question, without it you've shown a lot of info that's probably irrelevant. `^ takes 300-400ms to load` how are you measuring this? – AD7six May 05 '16 at 13:11
  • well, that may be the problem. Sessions (using files, the default method) lock. So one script waits for the others to release the exclusive lock on the session. – Saic Siquot May 05 '16 at 13:15
  • @AD7six session locking might happen if the session is at all started. In my whole application I'm using sessions, but the code blocks I've mentioned in the question are a test I've done just to check what is causing the delay, and they are not using sessions. `^ takes 300-400ms to load` is the time that the request is taking (shown in the inspector of the browser) – Devang Mistry May 05 '16 at 13:25
  • @wajeeh `echo json_encode($fetch);` only in ajax_response.php? If yes, how is that different from what I've done? – Devang Mistry May 09 '16 at 13:03
  • Good question description. – jekaby May 13 '16 at 13:18
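For reference, the session-locking scenario raised in the comments above can be ruled out (or fixed) by releasing the session lock before doing slow work: when a script calls `session_start()`, PHP's default file-based handler holds an exclusive lock on the session file until the script exits, unless the lock is released explicitly. A minimal sketch:

```php
<?php
// Sketch of the standard fix for PHP's file-based session locking:
// read what you need from $_SESSION, then release the lock so that
// concurrent AJAX requests from the same browser don't queue up.
session_start();
$user = isset($_SESSION['user']) ? $_SESSION['user'] : null;

session_write_close(); // releases the session file lock

// ...slow DB queries can now run without blocking other requests...
```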

3 Answers


The problem was in the connection string. I was using my domain name (example.com) to connect to the database. I changed it to my IP address and that resolved the problem.

Thanks everyone for your help.
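In PDO terms the fix amounts to changing the host in the DSN; the IP address, database name and credentials below are placeholders, not the real values:

```php
<?php
// Before: a host name forces a DNS lookup on every new connection.
// $db_handler = new PDO('mysql:host=example.com;dbname=mydb', $user, $pass);

// After: connecting by address (placeholder IP) skips the DNS lookup.
$db_handler = new PDO('mysql:host=203.0.113.10;dbname=mydb', $user, $pass);
```

As the comments below note, hardcoding the IP is a workaround; fixing the server's DNS resolution (for example with a local caching resolver or an /etc/hosts entry for the DB host) is the more robust solution.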

Devang Mistry
  • That's a bit insane: 15 secs penalty to do a DNS lookup? –  May 12 '16 at 22:21
  • Hardcoding an IP is never good; find out why your DNS lookup is this slow. – anwerj May 13 '16 at 10:14
  • @swa66 Yea it took! I don't know why. – Devang Mistry May 13 '16 at 10:42
  • @anwerjunaid why? what problems can it cause? – Devang Mistry May 13 '16 at 10:43
  • Your DNS setup of your server is not OK if it takes 15 secs to do a simple lookup. It should be cached as you do these lookups frequently. Whoever is managing the machine needs to fix this. Pinpointing the reason is not possible with the info we have here, but it could be the DNS server, firewall or network issues, or local daemons that are not performing their task properly. Either way, you're going to run into other problems down the road, as anything that does a DNS lookup risks running into the same issue. –  May 13 '16 at 12:43
  • I've experienced this as well on an Amazon EC2 Windows machine - even then it was a change from `localhost` to `127.0.0.1` that did the trick - so this is legit. – Amanda Callesen May 14 '16 at 12:00

I think you must first figure out whether it's the query that is taking the time.

  • Put the same query in phpMyAdmin, run it, and check the time. If the query takes time, try removing the LIMIT or add an index for it.
  • If the query does not take time, check for an error or network issue while fetching the data. Remove the query from the AJAX response for the time being and send static data, then check how long it takes.

Hope this helps.

Purushottam zende
  • better, but I think you have simply answered too soon. `Remove the query from the ajax for time being and send static data` is in the question. I would advise deleting the answer until there is sufficient info in the question to write something specific. – AD7six May 05 '16 at 13:18
  • @AD7six, it's one of the options I suggested. You can take these steps just to be sure. At least I would take these steps first, before going to other big options. – Purushottam zende May 05 '16 at 13:22
  • Removing the query and sending static data seems to be a good idea for testing – Ricardo Vigatti May 05 '16 at 13:27
  • Good idea, but shifting everything there is going to take a lot of time. Also features like pagination and on-scroll load won't work without AJAX – Devang Mistry May 05 '16 at 13:29
  • @Devang, you can check in the console for the time being, and errors won't matter if it's not live, like 5 mins? – Purushottam zende May 05 '16 at 13:34
  • @Purushottamzende The problem is in the production environment and any testing there could be risky. Instead a test can be run on a duplicate page, but I've also tried that and it works normally. – Devang Mistry May 05 '16 at 13:37
  • @Devang, by normal you mean? After removing the query, does it still take time, or does it load quickly? – Purushottam zende May 05 '16 at 13:39
  • @Purushottamzende No, I'm saying that, as you suggested moving from AJAX requests to directly loading data on page load, I've done that and it works fine and doesn't take time. – Devang Mistry May 05 '16 at 13:44
  • @Devang, I am asking just to confirm: does executing only the query take time or not? In phpMyAdmin or wherever you are accessing the DB – Purushottam zende May 05 '16 at 13:53
  • @Purushottamzende Executing the query takes time only in an AJAX request. Not in phpMyAdmin nor in a page load, `www.example.com/mypage.php` – Devang Mistry May 05 '16 at 14:03

You should check MySQL's max_connections limit, and make sure each connection is closed or freed properly after every query. The sequence should be: open connection > execute query > close connection. Leaving connections open is one possible cause of slow AJAX queries: if connections stay open without being closed, repeated AJAX requests can exceed the connection limit.

example:

$db = new mysqli("localhost", "root", "", "test"); // open the connection

$result = $db->query("[Sql commands]"); // execute the query
// ... some PHP that uses $result ...

$db->close(); // close the connection

The MySQL system variable max_connections determines how many connections MySQL/MariaDB will accept. The default value is 151, which allows 150 normal connections plus one connection reserved for the SUPER account.

The first thing to decide is what new maximum value you want to set for max_connections. There are several considerations to take into account when increasing the number of MySQL/MariaDB connections. The maximum number which can be supported by the system will depend on:

  1. The amount of available RAM.
  2. How much RAM each connection takes (simple queries will require less RAM than more labor-intensive connections).
  3. The acceptable response time.

According to the MySQL documentation, most Linux systems should be able to support 500-1000 connections without difficulty.

To see the current max_connections value, log in to the MySQL/MariaDB command-line client and run:

mysql> show variables like 'max_connections';

Set the new max_connections value with the command:

mysql> SET GLOBAL max_connections=[desired new maximum number];
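Note that SET GLOBAL only lasts until the server restarts; to make the change permanent, the value also needs to be set under `[mysqld]` in my.cnf (the number below is only an example):

```ini
[mysqld]
max_connections = 300
```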
  • Please add some further explanation to your answer such that others can learn from it. As you can see in the accepted answer, given by the OP itself, this was a DNS problem – Nico Haase Nov 04 '20 at 09:17