I found a similar question here: Performance: condition testing vs assignment
This question is not about optimization. It's about coding preferences.
Here is an example:
I have data that I have no control over: rows from a database table, produced by a third-party MSSQL stored procedure. The result set is bloated, so I'd like to reduce its size before transmitting the data over the wire as JSON. I can make it about 80% smaller, as most of the data is repetitive.
So I do something like this:
$processed = array();

foreach ($result as $row)
{
    $id = $row['id'];

    /* repeated per-id columns: written on every row, same value each time */
    $processed[$id]['title'] = $row['title'];
    $processed[$id]['data']  = $row['data'];
    $processed[$id]['stuff'] = $row['stuff'];
    /* many more assignments with different keys */

    /* per-row columns: kept under a per-date sub-array */
    $unique = array();
    $unique['cost'] = $row['cost'];
    /* a few more assignments with different keys */

    $processed[$id]['prices'][$row['date']] = $unique;
}
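For concreteness, here is the same grouping idea run on a couple of invented rows (the column names are the ones from the snippet above; the values and the id are made up):

```php
<?php
// Two rows for the same id: title/data/stuff repeat, cost/date differ.
$result = array(
    array('id' => 7, 'title' => 'Widget', 'data' => 'x', 'stuff' => 'y',
          'cost' => 1.50, 'date' => '2013-01-01'),
    array('id' => 7, 'title' => 'Widget', 'data' => 'x', 'stuff' => 'y',
          'cost' => 1.75, 'date' => '2013-01-02'),
);

$processed = array();
foreach ($result as $row)
{
    $id = $row['id'];

    $processed[$id]['title'] = $row['title'];
    $processed[$id]['data']  = $row['data'];
    $processed[$id]['stuff'] = $row['stuff'];

    $unique = array('cost' => $row['cost']);
    $processed[$id]['prices'][$row['date']] = $unique;
}

// The repeated title/data/stuff now appear once per id,
// while each date keeps its own cost.
echo json_encode($processed), "\n";
```

With more rows per id and more repeated columns, this is where the roughly 80% size reduction comes from.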
I thought this might be quicker, but it looks slower (I timed it):
$processed = array();
$id = null;

foreach ($result as $row)
{
    /* only write the repeated columns when the id changes */
    if ($id != $row['id'])
    {
        $id = $row['id'];

        $processed[$id]['title'] = $row['title'];
        $processed[$id]['data']  = $row['data'];
        $processed[$id]['stuff'] = $row['stuff'];
        /* many more similar lines */
    }

    $unique = array();
    $unique['cost'] = $row['cost'];
    /* a few more similar lines */

    $processed[$id]['prices'][$row['date']] = $unique;
}
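As an aside, the same skip-on-repeat idea can be expressed with an `isset()` guard on the output array instead of tracking the previous id in a variable. This is only a sketch (the sample rows are invented, and whether it is faster than either version above is an assumption that would need timing, not a claim):

```php
<?php
// Invented sample rows for the sketch.
$result = array(
    array('id' => 7, 'title' => 'Widget', 'data' => 'x', 'stuff' => 'y',
          'cost' => 1.50, 'date' => '2013-01-01'),
    array('id' => 7, 'title' => 'Widget', 'data' => 'x', 'stuff' => 'y',
          'cost' => 1.75, 'date' => '2013-01-02'),
);

$processed = array();
foreach ($result as $row)
{
    $id = $row['id'];

    // Write the repeated columns only the first time this id is seen.
    if (!isset($processed[$id]))
    {
        $processed[$id] = array(
            'title' => $row['title'],
            'data'  => $row['data'],
            'stuff' => $row['stuff'],
            /* many more keys */
        );
    }

    $processed[$id]['prices'][$row['date']] = array('cost' => $row['cost']);
}
```

Unlike the `$id != $row['id']` version, this also works when rows for the same id are not contiguous in the result set.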
Can anyone confirm whether, in PHP, "if"s (conditionals) are indeed more compute-intensive than assignments? Thanks.
[My answer as an edit]
I did some stand-alone tests (without any real data or other code overhead) on FastCGI PHP running with IIS:
function testif()
{
    $i = 0;
    while ($i < 100000000)
    {
        if (1 != 0) /* do nothing */;
        $i++;
    }
    return "done";
}
1st run: 20.7496500015256748 sec.
2nd run: 20.8813898563381191 sec.
function testassign()
{
    $i = 0;
    while ($i < 100000000)
    {
        $x = "a 26 character long string";
        $i++;
    }
    return "done";
}
1st run: 21.0238358974455215 sec.
2nd run: 20.7978239059451699 sec.
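For anyone who wants to reproduce this, a minimal timing harness along these lines will do. `microtime(true)` is standard PHP; the iteration count here is deliberately reduced from the 100,000,000 used above so the sketch finishes quickly:

```php
<?php
// Same two micro-benchmarks as above, with a smaller loop count.
function testif()
{
    $i = 0;
    while ($i < 1000000)
    {
        if (1 != 0) /* do nothing */;
        $i++;
    }
    return "done";
}

function testassign()
{
    $i = 0;
    while ($i < 1000000)
    {
        $x = "a 26 character long string";
        $i++;
    }
    return "done";
}

// Time each function with the microsecond wall clock.
foreach (array('testif', 'testassign') as $fn)
{
    $t0 = microtime(true);
    $fn();
    printf("%s: %.4f sec\n", $fn, microtime(true) - $t0);
}
```

Note that both loops also pay for the `$i < 100000000` comparison and the `$i++` increment on every iteration, so the measured difference between the two bodies is small relative to the shared loop overhead.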