I'm trying to optimize (and safeguard against crashes/errors) my application which will need to execute hundreds or thousands of SQL queries daily, and I'm wondering what the best design/setup is for mssql.
I know it probably wouldn't be efficient to call `await sql.connect()` and open a new connection for each query (in my experience this seems to be slower as well).
What I've done before is the following:
sql.js

```javascript
const sql = require('mssql');
const config = require('./util/db-config.js');

async function connect () {
  await sql.connect(config);
}

async function someDatabaseFunction () {
  const result = await sql.query('sql statement');
  // do stuff with result.recordset here
}

module.exports = { connect, someDatabaseFunction };
```
app.js

```javascript
const express = require('express');
const { connect, someDatabaseFunction } = require('./sql.js');

const app = express();
connect();

app.post('/someroute', async (req, res) => {
  await someDatabaseFunction();
  res.sendStatus(200);
});

app.listen(3000, () => console.log('Running on port 3000!'));
```
So basically I do `const sql = require('mssql')` at the top of `sql.js`, in a global scope (is that what it's called?), run the `connect()` function on startup of the app, and then any functions inside `sql.js` use the global connection pool to execute queries.
This has worked for me before, but as I develop for larger scale and more critical operations, I worry about what might happen if the connection to the database breaks or fails. Is there a "best practice" for this kind of setup?
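To make the failure case concrete, here's the direction I'm leaning: lazily create the pool, cache the connection promise, and throw the cache away whenever connecting fails so the next query triggers a fresh attempt. This is just a sketch — `makeLazyPool` is a name I made up, and the `mssql` wiring in the trailing comment is my assumption about its `ConnectionPool` API:

```javascript
// Lazily create and cache a connection, and drop the cached promise on
// failure so the next caller triggers a fresh connection attempt.
function makeLazyPool (connectFn) {
  let poolPromise = null;
  return function getPool () {
    if (!poolPromise) {
      poolPromise = Promise.resolve()
        .then(connectFn)
        .catch((err) => {
          poolPromise = null; // connection failed: allow a retry on the next call
          throw err;
        });
    }
    return poolPromise;
  };
}

// With mssql I'd expect to wire it up roughly like this (my assumption):
//   const getPool = makeLazyPool(() => new sql.ConnectionPool(config).connect());
//   const pool = await getPool();
//   const result = await pool.request().query('SELECT 1 AS n');
```

Because the connection promise (not the pool object) is cached, concurrent callers during startup all wait on the same `connect()` attempt instead of each opening their own pool. But is this the right approach, or is there a more standard pattern?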