I have a database table with 10,000,000+ rows that I process to create a new table. What is the best type of storage engine for reading?
Currently I use:
$sql = "UPDATE `packages_unsorted` SET `acv_return_viadetails` = '0';";
Then to select the records:
$sql = "SELECT * FROM `packages_unsorted` WHERE `acv_return_viadetails` = '0' LIMIT 1000;";
I collect the primary key id of each row into an array called $ids, process the records, and write the data to a CSV file for import at the end of each batch of 1,000 rows.
I then update the acv_return_viadetails field with:
$sql = "UPDATE `packages_unsorted` SET `acv_return_viadetails` = '1' WHERE `id` IN ('".implode("','",$ids)."') LIMIT 1000;";
The above is the only write to the table, and I select all fields every time. Should I be using a MEMORY table or some other type of storage engine to make things faster? Or are there any other optimizations I could introduce that would speed things up?
Also, what would be a reasonable my.cnf configuration for such an operation?
Thanks in advance :)