I'm building a simple album-and-track web app, but I'm getting into a pickle with retrieving the correct information efficiently. I've got two tables: `album`, with about 500 rows, and `track`, with about 5,000.

So I start off with the normal while loop:

$albumQuery = "SELECT * FROM `album` ORDER BY album ASC";
$albumDatabase = mysql_query($albumQuery, $admin) or die(mysql_error()); 

while($albumRow = mysql_fetch_array($albumDatabase)){
    $list = "";
    $list.="build some html for showing later";

This is where I get into an efficiency problem:

    $trackQuery = "SELECT track, album FROM `track` WHERE album = '{$albumRow['album']}' ORDER BY filename ASC";
    $trackDatabase = mysql_query($trackQuery, $admin) or die(mysql_error());

    while($trackRow = mysql_fetch_array($trackDatabase)){
        $list.="track html info";
    }

    $listarray[] = $list;
}

natcasesort($listarray);

I then run a foreach later in the body

foreach ($listarray as $v){
    $first_letter = strtoupper(mb_substr($v,9,1));
    if($tmp!==$first_letter){
        $tmp = $first_letter;
        echo '<div class="alphaHolder">'.$tmp.'</div>';
    }
    echo $v;
}

This works perfectly, apart from probably being massively inefficient.

I can get it to regurgitate all the info in the right way and order.

As an experiment I moved the second while loop outside the parent loop and stuck its results into their own array, then ran this:

foreach($trackArray as $k => $v){
    if(in_array($v['album'], $albumRow)){
        $list.="track html info";
    }
}

But then I realized that every time it runs through the parent loop it reruns the foreach over the 5,000 items, doing the in_array check 500 times in total. This obviously takes ages and usually crashes the browser, which leaves the original nested while loop slightly more efficient, if convoluted.

I have started thinking that array_intersect might be the solution but I don't want to bark up the wrong tree.

There are probably too many loops and what have you going on, but I need the natural sort and the alpha/numerical heading regurgitation.

Screenie: http://ink361.com/#/photos/133115598501394804_9688917

3 Answers

I think you should rethink the "tracks" query. You could JOIN the `album` table so that each track row carries its album's data, and then make only one loop over the resulting rows to generate the needed result.
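As a rough sketch of that idea, using the old `mysql_*` functions and the column names (`album`, `track`, `filename`) shown in the question — adjust both to the real schema:

```php
// One query instead of 501: join tracks to their albums,
// then loop once, flushing each album's HTML when the album changes.
$query = "SELECT a.album, t.track
          FROM `album` AS a
          JOIN `track` AS t ON t.album = a.album
          ORDER BY a.album ASC, t.filename ASC";
$result = mysql_query($query, $admin) or die(mysql_error());

$listarray = array();
$current   = null;
$list      = '';
while ($row = mysql_fetch_assoc($result)) {
    if ($row['album'] !== $current) {   // a new album has started
        if ($current !== null) {
            $listarray[] = $list;       // flush the previous album's HTML
        }
        $current = $row['album'];
        $list = "album html for {$row['album']}";
    }
    $list .= "track html for {$row['track']}";
}
if ($current !== null) {
    $listarray[] = $list;               // flush the last album
}
```

Because the query is already ordered by album, the "flush on change" pattern rebuilds the same per-album strings as the nested loops, with a single round trip to the database.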

Either way, you could just do all the foreach ($listarray as $v){ stuff right before $listarray[] = $list;.

And why do you use sorting in PHP when you have already done it in SQL? Do you really need that? Additionally, do you really need all 5,000 items accessed at the same time? Not everybody would want that sorting; my cat would like it sorted by song length, or genre. Have you done enough thinking about that?

For dealing with efficiency more efficiently, it helps if sample data is available, together with standalone-working "case code". Your code seems broken at the moment; maybe you should work on your PHP coding practices.

Deele
  • Ok. That was the next step: joining the queries. The way I want the app to work really needs the natural-sorting side that SQL doesn't have. Yes, I realize that giving more of the code would help, but I don't have it all to hand; what I have given is the gist of what is happening, from memory. The sorting will be controlled in other ways later in the project; this is how 'I' want it to work at the moment. – MarkBeharrell Feb 28 '12 at 10:11
  • Efficiency is tricky stuff. The main topic these days is not machine power but the needs of the client/user. If your inefficient code "does the job" and you can find a server that handles it with ease, there is no need to enhance the code unless you experience a lack of resources. – Deele Feb 28 '12 at 10:30
  • From what I found it isn't case-insensitive enough, but now that I think of it, that might be a hangover from an earlier bit of code, so I can probably tweak the query now. – MarkBeharrell Feb 28 '12 at 10:37
Like in many other questions, no direct approach will help.
An HTML page consisting of 5,500 rows will ALWAYS be inefficient and slow. Incredibly inefficient and dramatically slow.

So you need not some cunning algorithm but a completely different architecture.

Pagination is a thing you need.

Split your list into pages.
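A minimal sketch of what that could look like, reusing the question's album query; the page size of 25 and the `page` query-string parameter are assumptions:

```php
// Fetch one page of albums at a time instead of all 500.
$perPage = 25;
$page    = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;

// Casting to int above keeps the interpolated values safe to embed.
$albumQuery = sprintf(
    "SELECT * FROM `album` ORDER BY album ASC LIMIT %d OFFSET %d",
    $perPage,
    $offset
);
$albumDatabase = mysql_query($albumQuery, $admin) or die(mysql_error());
```

A `SELECT COUNT(*) FROM album` then gives the total needed to render the page links.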

Your Common Sense
  • Yes, this was always going to be a solution, but it wasn't my main aim. Having it all come at you in one hit was what I wanted... http://ink361.com/#/photos/133115598501394804_9688917 – MarkBeharrell Feb 28 '12 at 10:40
  • @Mark If that is what you wish, use AJAX. First show all the albums; when the user clicks on one, a request to the server brings the tracks and info for that album. – Deele Feb 28 '12 at 11:05
  • Ah yes this might be a nice solution. As the initial album view is literally a few seconds. *opens AJAX tome* – MarkBeharrell Feb 28 '12 at 11:27
  • The sorting was solved by adding upper(album) to the main album query. Don't know why I didn't think of it beforehand... – MarkBeharrell Feb 28 '12 at 21:20
You can first flip the array and later check whether the key exists. This will be much faster, since it is an indexed lookup rather than the previous iterative search.

$flipped_albumRow = array_flip($albumRow);
foreach ($trackArray as $k => $v) {
    if (isset($flipped_albumRow[$v['album']])) { # <- line of interest.
        $list .= "track html info";
    }
}
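To illustrate why the flip helps, with made-up album names: `in_array()` scans the whole array on every call, while after `array_flip()` the values become keys, so `isset()` is a hash lookup whose cost does not grow with the array size.

```php
// in_array() is a linear scan (O(n)) on each call;
// isset() on the flipped array is a hash lookup (O(1)).
$albums  = array('Abbey Road', 'Kid A', 'Loveless');
$flipped = array_flip($albums);   // array('Abbey Road' => 0, 'Kid A' => 1, ...)

var_dump(in_array('Kid A', $albums));     // bool(true), linear scan
var_dump(isset($flipped['Kid A']));       // bool(true), hash lookup
var_dump(isset($flipped['Nonexistent'])); // bool(false)
```

Over 500 albums checked 5,000 times, that difference is what turns a browser-crashing loop into a fast one. (One caveat: `array_flip()` requires the values to be strings or integers.)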
Shiplu Mokaddim
  • Ahhh, a more constructive bit of help, not judgement on code. Thank you! – MarkBeharrell Feb 28 '12 at 10:21
  • You wouldn't believe how much quicker this made the whole process. I'm going to combine this with a JOIN (it'll probably mean this becomes redundant, but it's all useful learning!) – MarkBeharrell Feb 28 '12 at 21:19
  • @Mark The ground information is based on this post. http://stackoverflow.com/questions/8826908/what-is-the-best-way-to-find-difference-between-2-large-arrays – Shiplu Mokaddim Feb 28 '12 at 21:22