One way to do this is to encode the entire data set as JSON and send it in a single query. There are a couple of different ways to achieve this. The Microsoft docs discuss one here. They insert JSON objects in the example (so if you wanted to do it in bulk, you would use an array of objects).
I will instead use an array of arrays. This makes for slightly less readable SQL, but decreases overhead. Also, I am accessing MSSQL via PDO, but you could just as easily use sqlsrv.
$stmt = $db->prepare(<<<EndSQL
CREATE TABLE #exampleTable (
    A VARCHAR(200),
    B VARCHAR(200)
);
INSERT INTO #exampleTable
SELECT * FROM OPENJSON(?) WITH (
    A VARCHAR(200) '$[0]',
    B VARCHAR(200) '$[1]'
);
DROP TABLE #exampleTable;
EndSQL);
$data = [];
for ($i = 0; $i < 100000; $i++) {
    // Positional arrays, matching the '$[0]' / '$[1]' paths in the WITH clause
    $data[] = ['A-val-' . $i, 'B-val-' . $i];
}
$stmt->execute([json_encode($data)]);
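With this layout, json_encode($data) produces a payload shaped like [["A-val-0","B-val-0"],["A-val-1","B-val-1"], ...], which is exactly what the '$[0]' and '$[1]' paths expect.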
This is similar to the strategy where you stick a bunch of inserts in the same statement, but it allows you to use a static SQL statement regardless of how many inserts you have.
INSERT INTO #exampleTable VALUES
(?, ?),
(?, ?),
(?, ?),
(?, ?),
(?, ?),
(?, ?),
(?, ?),
...
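For comparison, here is a minimal sketch of that multi-row VALUES approach (assuming the same $db connection and #exampleTable). It shows why the statement text can't stay static: the placeholder list has to be rebuilt whenever the row count changes, and the total parameter count per request is also capped (around 2,100 on SQL Server).
// Hypothetical example rows; in practice these would come from your data source.
$rows = [
    ['A-val-0', 'B-val-0'],
    ['A-val-1', 'B-val-1'],
];
// Build one "(?, ?)" group per row, so the SQL changes with the row count.
$placeholders = implode(', ', array_fill(0, count($rows), '(?, ?)'));
$stmt = $db->prepare("INSERT INTO #exampleTable VALUES $placeholders");
// Flatten the rows into a single positional parameter list.
$stmt->execute(array_merge(...$rows));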
If you do use this, the next limit you might run into is memory: this approach loads everything into PHP's memory before sending it to SQL Server. There are two ways around that: insert in batches of ~10,000 rows, or stream the JSON to SQL Server as you generate it.
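Here is a minimal sketch of the batching option, assuming the CREATE/DROP of #exampleTable has been moved out of the prepared statement so that each execute() only runs the INSERT:
$stmt = $db->prepare(<<<EndSQL
INSERT INTO #exampleTable
SELECT * FROM OPENJSON(?) WITH (
    A VARCHAR(200) '$[0]',
    B VARCHAR(200) '$[1]'
);
EndSQL);

$batchSize = 10000;
$batch = [];
for ($i = 0; $i < 100000; $i++) {
    $batch[] = ['A-val-' . $i, 'B-val-' . $i];
    if (count($batch) === $batchSize) {
        $stmt->execute([json_encode($batch)]); // send this batch, then release the memory
        $batch = [];
    }
}
if ($batch) {
    $stmt->execute([json_encode($batch)]); // flush the final partial batch
}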