Suppose I have a very large array of information for a user:
$user = array(
    "name" => "john",
    "ip" => "xx.xx.xx.xx",
    "email" => "john@something.com",
    //lots more values
);
Let's also suppose that this information needs to go into more than one table. For instance, a username needs to go into the users table, an address needs to go into a details table, etc.
Now, I use a certain self-made function to insert into my tables; it matches array keys to column names and array values to the values being inserted. Something similar to this:
function insert_sql($table, array $values){
    global $dbc;
    //Build "INSERT INTO `table` (col1, col2, ...) VALUES (?, ?, ...)" and bind the values
    $columns = implode(", ", array_keys($values));
    $placeholders = implode(", ", array_fill(0, count($values), "?"));
    $sql = "INSERT INTO `$table` ($columns) VALUES ($placeholders)";
    $stmt = $dbc->prepare($sql);
    $stmt->execute(array_values($values));
    return $dbc->lastInsertId();
}
//I don't actually use this function, just trying to show you what is being accomplished.
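For context, a call would look something like this (just a sketch; it assumes $dbc is a PDO connection and a users table whose columns match the array keys):

$user_id = insert_sql('users', $user);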
The problem is that my function uses all of the keys and all of the values, so when I only need certain parts of the array to go into each of several tables, it doesn't work.
The question is:
How do I make an INSERT statement ignore a column that doesn't exist? So if I insert name, email, and address into the users table, but this table doesn't have an address column, I need it to insert the row with the name and email but simply ignore the fact that the address column is not there.
EDIT: The other option is to make an array with the columns of a table and use it to filter the values array, although I am not really sure how to set this up.
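Something along these lines is roughly what I have in mind, though it is untested; the table_columns() and insert_filtered() helper names are made up, and it assumes $dbc is a PDO connection to MySQL:

//Rough sketch of the filtering idea (untested)
function table_columns($table){
    global $dbc;
    //Ask MySQL for the column names of the table
    $stmt = $dbc->query("SHOW COLUMNS FROM `$table`");
    return $stmt->fetchAll(PDO::FETCH_COLUMN); //first column of SHOW COLUMNS is the field name
}

function insert_filtered($table, array $values){
    //Keep only the array keys that are actual columns of the table
    $columns = array_flip(table_columns($table));
    return insert_sql($table, array_intersect_key($values, $columns));
}

//e.g. insert_filtered('users', $user); insert_filtered('details', $user);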