I'm thinking about switching to PostgreSQL from MySQL. I have no experience with PostgreSQL.
I have a batch update process that naively updates several tables from an Excel workbook using MySQL's INSERT ... ON DUPLICATE KEY UPDATE.
I've been reading about doing the same thing in PostgreSQL, and I've found that writable Common Table Expressions (wCTEs) are supposed to be the most efficient and most "idiomatic" way to get the same functionality.
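For reference, the upsert pattern I've seen described looks roughly like this (just a sketch, ignoring concurrency concerns; accessory_types and its columns are stand-ins for my real tables):

-- Update existing rows first, remembering which codes were touched...
WITH new_rows ("Code", "Name") AS (
    VALUES ('A1', 'Widget'),
           ('B2', 'Gadget')
),
updated AS (
    UPDATE accessory_types t
       SET "Name" = n."Name"
      FROM new_rows n
     WHERE t."Code" = n."Code"
 RETURNING t."Code"
)
-- ...then insert only the rows that weren't updated
INSERT INTO accessory_types ("Code", "Name")
SELECT n."Code", n."Name"
  FROM new_rows n
 WHERE n."Code" NOT IN (SELECT "Code" FROM updated);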
I make heavy use of foreign key constraints, and the bulk upload also updates junction tables. I can't guarantee that the data is relationally sound before it's uploaded; that is, I don't first check the parent table for the keys that rows in a child table reference as foreign keys.
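To illustrate the kind of schema involved (all names here are invented for this question):

CREATE TABLE products (
    "Sku" varchar PRIMARY KEY
);

CREATE TABLE accessory_types (
    "Code" varchar PRIMARY KEY,
    "Name" varchar
);

-- Junction table updated during the bulk upload; the spreadsheet may
-- reference a "Sku" or "Code" that doesn't exist yet, which is what
-- trips the foreign key constraints.
CREATE TABLE product_accessory_types (
    "Sku"  varchar NOT NULL REFERENCES products ("Sku"),
    "Code" varchar NOT NULL REFERENCES accessory_types ("Code"),
    PRIMARY KEY ("Sku", "Code")
);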
Is there a way to capture individual foreign key failures when using a wCTE for upserting many rows into a PostgreSQL database?
To provide a little more information, I'm currently using PHP and looping over a data set, executing a SQL statement for every row in the array:
for ($i = 1; $i < $length; $i++)
{
    $values = array(
        $accessoryTypes[$i][0],                                        // Code
        isset($accessoryTypes[$i][1]) ? $accessoryTypes[$i][1] : null, // Name (may be absent)
    );

    $query = "INSERT INTO {$table} (`Code`, `Name`) VALUES (?, ?) ON DUPLICATE KEY UPDATE `Name` = VALUES(`Name`)";

    try
    {
        DB::statement($query, $values);
    }
    catch (Exception $e)
    {
        // Save the error message so the failing constraint can be reported later
        $this->sqlErrors['accessoryTypeErrors'][] = $e->getMessage();
    }
}
I catch the exception when the upsert fails, then later parse the error message to work out which foreign key constraint failed, so I can display what the problem was.
I'm wondering whether this kind of per-row error capture is possible using writable CTEs in PostgreSQL. If not, do I have to keep checking the data manually and performing a separate query for each row when trying to insert it?