I'm calling a stored procedure through the Enterprise Library DAL Application Block, passing a DataTable that is in turn received as a custom table type (@names as NameTable) in my procedure. The procedure is very slow from the second call onwards, and I'm looking for a different way to implement it so performance is greatly increased.
The Names/HistoricalNames tables are huge (around 100 million records each), and the data passed to these tables (through the table-valued parameter) is around 4 million records.
Basically what it does (needs to do) is the following:
- Import @names (which is the DataTable/table-valued parameter);
- Check if either the Names or the HistoricalNames table contains any of the names in @names; if so, skip the entire import and return 2;
- Otherwise insert all records from @names into Names and return 1.
The tables look like this:
create table Names
(
    id int IDENTITY(1,1) NOT NULL,
    name nvarchar(20),
    otherId uniqueidentifier
)
create table HistoricalNames
(
    id int IDENTITY(1,1) NOT NULL,
    name nvarchar(20),
    otherId uniqueidentifier
)
The table-valued parameter type (@names) is declared like this (a table type, since plain tables can't be used as parameter types):
CREATE TYPE NameTable AS TABLE
(
    name nvarchar(20),
    otherId uniqueidentifier
)
This is the procedure:
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[sp_ImportNames]
    @names NameTable READONLY
AS
BEGIN
    IF ((SELECT COUNT(cd.name) FROM Names AS cd WHERE cd.name IN (SELECT c.name FROM @names AS c)) > 0)
    BEGIN
        SELECT 2;
    END
    ELSE IF ((SELECT COUNT(cd.name) FROM HistoricalNames AS cd WHERE cd.name IN (SELECT c.name FROM @names AS c)) > 0)
    BEGIN
        SELECT 2;
    END
    ELSE
    BEGIN
        INSERT INTO Names (name, otherId) SELECT c.name, c.otherId FROM @names AS c;
        SELECT 1;
    END
END
GO
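For reference, one direction I've been considering (a sketch, not benchmarked against the real tables) is replacing the COUNT(...) > 0 checks with EXISTS, which lets SQL Server stop at the first matching row instead of counting every duplicate, and assumes supporting nonclustered indexes on the name columns exist (the index names below are made up):

```sql
-- Sketch: EXISTS short-circuits on the first match, unlike COUNT(...) > 0,
-- which forces all matching rows to be found before the comparison runs.
ALTER PROCEDURE [dbo].[sp_ImportNames]
    @names NameTable READONLY
AS
BEGIN
    IF EXISTS (SELECT 1 FROM Names AS cd
               INNER JOIN @names AS c ON cd.name = c.name)
       OR EXISTS (SELECT 1 FROM HistoricalNames AS cd
                  INNER JOIN @names AS c ON cd.name = c.name)
    BEGIN
        SELECT 2;
    END
    ELSE
    BEGIN
        INSERT INTO Names (name, otherId)
        SELECT c.name, c.otherId FROM @names AS c;
        SELECT 1;
    END
END
GO

-- Assumed supporting indexes (hypothetical names); without them, name
-- lookups against 100 million rows scan the whole table:
CREATE NONCLUSTERED INDEX IX_Names_Name ON Names (name);
CREATE NONCLUSTERED INDEX IX_HistoricalNames_Name ON HistoricalNames (name);
```

I'm not sure whether the join against the table-valued parameter is the best approach at 4 million input rows, so I'd welcome corrections on that part too.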
Can this be easily tuned for performance? Any help would be greatly appreciated!