
In my database I have up to 1000 entries in a table called product.

On the "show all products" page I have <ul class="produlctlist"></ul>

And there is this script:

jQuery(document).ready(function( $ ) {
    $.ajax({
        method: "POST",
        url: "http://example.com/filter.php"
    }).done(function( msg ) {
        $( ".produlctlist" ).html( msg );
    });
});

In filter.php:

$query = "SELECT * FROM `product`";
$result = $conn->query($query);

 while($row = $result->fetch_assoc()) { 
  echo "<li><img src='".$row['img']."' />".$row['name']."</li>";

 }

Everything here works perfectly. My question is: do I need to limit the query, e.g. SELECT * FROM `product` LIMIT 45, and implement pagination or a "load more" button for better page speed? Or is there no need to limit the query, since all the details are loaded through jQuery?

Is there an alternative solution that avoids pagination or a "load more" button?

Fetching 1000 rows from MySQL takes more time than fetching 45, so is there any way to configure jQuery to fetch the first 50 rows, then the next 50, and so on?

Abilash Erikson

2 Answers


JavaScript: you need to maintain a global page variable. If you want it to survive a page refresh, store it in a cookie on every change and load the cookie value on the first line (a sketch of that follows the code below):

var page = 0;
jQuery(document).ready(function( $ ) {
    $.ajax({
        method: "POST",
        url: "http://example.com/filter.php?page=" + page
    }).done(function( msg ) {
        $( ".produlctlist" ).html( msg );
    });
});
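If you do want the page index to survive a refresh, a minimal cookie sketch could look like the following (the cookie name productPage and the helper names are illustrative, not part of the original answer):

function savePage(p) {
    // persist the current page index; session cookie, valid site-wide
    document.cookie = "productPage=" + p + "; path=/";
}

function loadPage() {
    // read the index back, defaulting to 0 if the cookie is not set
    var match = document.cookie.match(/(?:^|;\s*)productPage=(\d+)/);
    return match ? parseInt(match[1], 10) : 0;
}

var page = loadPage(); // instead of the plain var page = 0;

Call savePage(page) wherever the page counter changes.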

JavaScript, regarding your edit (start with the first 45, then load the next 45 on every subsequent load):

var page = 0;
jQuery(document).ready(function( $ ) {
    $.ajax({
        method: "POST",
        url: "http://example.com/filter.php?page=" + page
    }).done(function( msg ) {
        $( ".produlctlist" ).html( msg );
        page++;
    });
});
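Note that as written this fires only once, on page load; to fetch the next 45 you have to re-trigger the request. A minimal "load more" sketch, assuming a <button id="loadmore">Load more</button> exists in the markup (the button and its id are hypothetical, not from the question):

var page = 0;
jQuery(document).ready(function( $ ) {
    function loadProducts() {
        $.ajax({
            method: "POST",
            url: "http://example.com/filter.php?page=" + page
        }).done(function( msg ) {
            // append, not html(), so earlier batches stay visible
            $( ".produlctlist" ).append( msg );
            page++;
        });
    }

    loadProducts();                               // first batch on page load
    $( "#loadmore" ).on( "click", loadProducts ); // next batch on each click
});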

PHP:

$query = "SELECT * FROM `product` LIMIT ".(intval($_GET['page'])*45).", 45";
$result = $conn->query($query);

while($row = $result->fetch_assoc()) { 
   echo "<li><img src='".$row['img']."' />".$row['name']."</li>";
}
Eduard Void

I see two things that you could improve:

  • retrieve only the columns that you need (do not use *; instead, specify which columns you want)
  • second, you are fetching all products on every request; use pagination to get only the rows you need each time.

You can use LIMIT to get the number of rows you want by specifying the offset and the row count.

The LIMIT clause can be used to constrain the number of rows returned by the SELECT statement. LIMIT takes one or two numeric arguments, which must both be nonnegative integer constants (except when using prepared statements).

With two arguments, the first argument specifies the offset of the first row to return, and the second specifies the maximum number of rows to return. The offset of the initial row is 0 (not 1).

Additionally, SELECT only the columns you need (read this post for more details):

SELECT img, name FROM `product` LIMIT 0, 44; -- Returns rows 0-45
lloiacono
  • :D small bug: the second parameter of LIMIT is the number of rows to return starting from the offset in the first parameter, so your query will return rows 0-43, i.e. 44 rows; the bug is in the comment, since row 0 is also counted. – Eduard Void May 24 '17 at 06:51
  • It's not about the LIMIT query; it's about whether I need to use it or not. Please read the question once again. – Abilash Erikson May 24 '17 at 06:55
  • @abilasher After reading your question again, I still think you need to use a LIMIT, since you mentioned that you have 1000 entries in your table. If you don't want to show all 1000 entries at once, use LIMIT and handle the pagination with JavaScript. Also consider selecting only img and name; this could also improve performance. – lloiacono May 24 '17 at 07:01