
I'm initializing a huge array of 49,605 key => value pairs (the array will never be changed again):

    $boardkey_to_values = Array(97031=>0, 97531=>1, 409531=>2, 410031=>3, 410131=>4, 472031=>5, 472531=>6, 472631=>7, 472651=>7, 484531=>8, 485031=>9, 485151=>10, 485131=>10, ...)

The thing is, this takes the PHP compiler a lot of time (40 ms on average).

I wondered if there could be a faster solution.

I'm using a big subset of the keys in my program (15-35k). I was using MySQL before with WHERE IN, but it was even slower (6 s on average). I was given the advice to hardcode it, and indeed it is much faster, but I wanted to optimize it even more. See the original post: String to Value compare Optimizing MySQL Query.
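
To show how the table is used afterwards, here is a minimal sketch of the subset lookup; $keys_in_use and $values are illustrative names, not taken from my real code:

    // Sketch of the subset lookup against the hardcoded table above.
    // $keys_in_use holds the 15-35k keys actually needed on this run.
    $values = array();
    foreach ($keys_in_use as $key) {
        if (isset($boardkey_to_values[$key])) {
            $values[$key] = $boardkey_to_values[$key];
        }
    }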

edi9999
  • It would probably be faster to store the data in a DB, and only load the specific entries you need when you need them – Mark Baker Feb 25 '13 at 09:40
  • Where did you get the original array from, a database or a file? – Baba Feb 25 '13 at 09:41
  • Depends on what you intend to do with the array. Could you give some context? – Fabian Schmengler Feb 25 '13 at 09:46
  • Indeed, context is missing here; could you explain a bit more about the overall problem this solution is for? I'm guessing you could cut that array into smaller "sub arrays" and only work with the one(s) you actually need instead of the entire 50K items. – smassey Feb 25 '13 at 10:08
  • Check out this related question: http://stackoverflow.com/questions/2120401/php-parse-ini-file-performance – sectus Feb 25 '13 at 10:18

1 Answer


40 ms isn't terribly slow for such a large array. But if this is on the web and multiple people are calling the PHP page, that can slow the server down. You have several options:

  • Use multiple Ajax calls to populate your array after the page has rendered, e.g. sets of 10,000 every few seconds (this way you can do other stuff on the page and let the array populate in its own time).

  • Use a database, as it will be faster to search and update than storing everything in an array.

  • Change the program logic to only work with a few values at a time, instead of all 49K of them (kind of like pagination, where only a subset of the data is shown per page); see the sketch below.
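
As a rough illustration of the pagination idea (a sketch only; the table and column names below are assumptions, not taken from the question), you could keep the pairs in MySQL and pull one chunk at a time with LIMIT:

    // Sketch: assumes a MySQL table boardkeys(boardkey INT PRIMARY KEY, value INT)
    // accessed via PDO; connection details are placeholders.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

    $perPage = 10000;   // size of each chunk ("page")
    $page    = 0;       // which chunk to load

    $stmt = $pdo->prepare(
        'SELECT boardkey, value FROM boardkeys ORDER BY boardkey LIMIT :offset, :limit'
    );
    $stmt->bindValue(':offset', $page * $perPage, PDO::PARAM_INT);
    $stmt->bindValue(':limit',  $perPage,         PDO::PARAM_INT);
    $stmt->execute();

    // boardkey => value pairs for this chunk only, instead of all 49K at once.
    $chunk = $stmt->fetchAll(PDO::FETCH_KEY_PAIR);

Whether this beats the hardcoded array depends on how many chunks each request actually needs, so it is worth benchmarking against the 40 ms baseline.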

Husman
  • I think the best idea is pagination, as I could probably find a way to paginate it – edi9999 Feb 25 '13 at 10:35
  • Yep, that's where SQL views and limits come into play, for maximum performance. It's not uncommon to see websites show 10, 20, 50, or 100 records per page, and that's where you get the best benefits (low bandwidth, low SQL search times, low processing utilised, low server loads) and it forces the user to filter more. – Husman Feb 25 '13 at 10:41