Judging by the comment section, what you're really asking is how to make your query process more efficient: right now you are running two separate queries to populate your two tables, and you're wondering whether that can be done better.
First things first: a single INSERT statement can only target one table, so you cannot populate TWO different tables with ONE query.
What you can do, however, is run both inserts inside a single transaction.
The rest of this answer assumes you are using PHP as your backend scripting language (as your question's tags suggest).
Also, it isn't clear whether you are using prepared statements for your queries. If you aren't, I would highly recommend doing so; otherwise you are leaving yourself open to SQL injection (SQLi) attacks.
I will proceed by using mysqli prepared statements in this answer.
<?php
// Your input post variables
$name = $_POST['name'];
$foodArray = $_POST['foodArray'];
/*
I'm using a function to handle my queries,
simply because it makes large piles of code easier to read.
I now know that every time the function:
createUserAndFood($name, $foodArray);
is called, that it will populate my user and food table.
That way I don't have to worry about writing all the code multiple times.
*/
function createUserAndFood($name, $foodArray){
    // food array values
    $foodValues = array_values($foodArray);

    // DB variables
    $servername = "localhost";
    $username = "username";
    $password = "password";
    $dbname = "myDB";

    // Create connection
    $conn = new mysqli($servername, $username, $password, $dbname);

    // Check connection
    if($conn->connect_error){
        die("Connection failed: " . $conn->connect_error);
    }

    /*
    Stops the queries from auto committing,
    I'll explain later, you can "maybe" disregard this.
    */
    $conn->autocommit(FALSE);

    // Declare the query
    $sql = "INSERT INTO userTable(name) VALUES(?)";

    // Prepare and bind
    $stmt = $conn->prepare($sql);
    $stmt->bind_param("s", $name);

    // Execute the query
    $stmt->execute();

    // Fetch the last inserted id
    $lastID = $conn->insert_id;

    $sql = "INSERT INTO foodTable(userId, food) VALUES(?, ?)";
    $stmt = $conn->prepare($sql);

    // Bind once; bind_param binds by reference,
    // so updating $food before each execute() is enough
    $stmt->bind_param("is", $lastID, $food);
    for($i = 0; $i < count($foodValues); $i++){
        $food = $foodValues[$i];
        $stmt->execute();
    }

    // Commits the query / queries
    $conn->commit();

    // Close connection
    $stmt->close();
    $conn->close();
}
?>
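With the function defined, the actual request handling boils down to a single call. A minimal sketch (the field names name and foodArray are simply the ones assumed above; adjust them to match your actual form):

<?php
// Hand the submitted values to the function; it opens the connection,
// runs both inserts inside one transaction, and commits
createUserAndFood($_POST['name'], $_POST['foodArray']);
?>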
Since you wanted to optimize your queries, the general idea here is to grab MySQL's LAST_INSERT_ID() via PHP ($conn->insert_id) and store it in a variable.
Now, this is mainly relevant if you are using auto-incremented ids. If you are not, you can disregard this specific logic and use something else. But if you are, keep reading.
The reason we store the last id in a variable is that we need to use it multiple times (the new user might have more than one favorite food, after all). If you did not store it, LAST_INSERT_ID() would start returning the auto-incremented value of the second table as soon as you insert into it, which means that from your third insert statement onward you would be working with the wrong id.
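To see the pitfall concretely, here is a small sketch (it assumes foodTable has an auto-incremented id column of its own and that $conn is an open mysqli connection; literal values are used only to keep it short):

<?php
// Insert the user and save the generated id straight away
$conn->query("INSERT INTO userTable(name) VALUES('Alice')");
$userId = $conn->insert_id;   // id of the new user row

// First food row: $userId is still correct
$conn->query("INSERT INTO foodTable(userId, food) VALUES($userId, 'Pizza')");

// $conn->insert_id has now moved on to that foodTable row,
// so every further insert must keep using the saved $userId
$conn->query("INSERT INTO foodTable(userId, food) VALUES($userId, 'Sushi')");
?>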
Now, as promised, the reason I'm using $conn->autocommit(FALSE);
and $conn->commit();
is that you probably don't want incomplete data sets in your database. Imagine a user submission is being processed and your database crashes halfway through: you would end up with an incomplete data set. If that is not a concern for you, you can disregard this part.
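If incomplete data does worry you, you can also roll the transaction back explicitly when one of the inserts fails, instead of committing half of it. A minimal sketch, using the same $conn and $stmt as in the function above:

<?php
// Replace the bare $stmt->execute() calls with a checked version:
if(!$stmt->execute()){
    $conn->rollback();   // undo everything done since autocommit(FALSE)
    $stmt->close();
    $conn->close();
    die("Insert failed: " . $stmt->error);
}
// ...and only commit once every execute() has succeeded:
$conn->commit();
?>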
To simplify what's going on at the MySQL side of things, think of it like this:
BEGIN;
INSERT INTO userTable SET name = '$name';
SET @lastID = LAST_INSERT_ID();
INSERT INTO foodTable SET userId = @lastID, food = '$food';
COMMIT;