I see some very strange behaviour when I pass an integer from .NET code to SQL Server.
.NET
int facilityID = 0;
//...
var param = new SqlParameter("@facid", facilityID);

// db call
SqlDataAdapter adapter;
DataSet data;
using (SqlConnection connectionToSQL = new SqlConnection(connectionString))
{
    using (SqlCommand command = new SqlCommand(storedProcedure, connectionToSQL))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.CommandTimeout = commandTimeout;
        command.Parameters.Add(param);

        adapter = new SqlDataAdapter(command);
        connectionToSQL.Open();

        data = new DataSet();
        adapter.Fill(data);
    }
}
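For reference, here is a minimal self-contained version of the call; the connection string and procedure name are placeholders, and the Console lines are only a diagnostic sketch showing what the parameter holds right before the call:

using System;
using System.Data;
using System.Data.SqlClient;

class FacidRepro
{
    static void Main()
    {
        // Placeholders -- not the real values from the application.
        string connectionString = "...";
        string storedProcedure = "MySP";
        int facilityID = 0;

        var param = new SqlParameter("@facid", facilityID);

        // Diagnostic: what does the parameter hold before it is sent?
        Console.WriteLine(param.SqlDbType);           // I expect Int here
        Console.WriteLine(param.Value ?? "(null)");   // I expect 0 here

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(storedProcedure, connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.Add(param);

            var data = new DataSet();
            using (var adapter = new SqlDataAdapter(command))
            {
                // Fill opens and closes the connection itself if it is not already open.
                adapter.Fill(data);
            }

            Console.WriteLine(data.Tables[0].Rows.Count);
        }
    }
}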
SQL
CREATE PROCEDURE MySP
    @facid int = NULL,
    -- ...
SELECT
    -- ...
WHERE r.entity = ISNULL(@facid, r.entity)
-- r.entity is defined as [entity] [numeric](18, 0) NULL, but it contains only values > 0 (no NULLs)
So this SELECT should return data only if @facid is NULL or is a number > 0 that matches one of the r.entity values. However, when I run the .NET code above with facilityID = 0, it does return results. Can someone please explain how this is possible? What exactly turns @facid into NULL, and why, even though I pass 0 from the .NET code?
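As far as I can tell, the documented quirk with a literal zero does not apply here, because I pass a variable: a literal 0 would bind to the SqlParameter(string, SqlDbType) constructor overload (an int constant converts implicitly to the enum), leaving Value unset, whereas a variable binds to the (string, object) overload. A quick sketch of the difference (class and variable names are just for illustration):

using System;
using System.Data.SqlClient;

class OverloadCheck
{
    static void Main()
    {
        int facilityID = 0;

        // Variable: binds to SqlParameter(string, object), so Value is a boxed 0.
        var byVariable = new SqlParameter("@facid", facilityID);
        Console.WriteLine(byVariable.Value);            // prints 0

        // Literal zero: binds to SqlParameter(string, SqlDbType) instead,
        // because the constant 0 converts implicitly to the SqlDbType enum,
        // so Value is never set.
        var byLiteral = new SqlParameter("@facid", 0);
        Console.WriteLine(byLiteral.Value == null);     // prints True
    }
}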
.NET 4.5
System.Data version 4.0.0.0
Microsoft SQL Server 2008 R2 (SP3-GDR) (KB4057113) - 10.50.6560.0 (X64) Dec 28 2017 15:03:48 Copyright (c) Microsoft Corporation Developer Edition (64-bit) on Windows NT 6.1 (Build 7601: Service Pack 1) (Hypervisor)