For 14 years, I have lived by the rule "Never use a UDF when more than one or two rows are affected."
I was about to show my team just how bad UDFs are, using a very basic example:
/*
CREATE FUNCTION dbo.Test (@Numerator Float, @Denominator Float)
RETURNS Float AS
BEGIN
    DECLARE @Return Float;
    SET @Return = @Numerator / NullIf(@Denominator, 0);
    RETURN @Return;
END
GO
*/
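-- Quick sanity check (optional; assumes dbo.Test already exists, since the
-- CREATE above is commented out). The NULLIF guard turns a zero denominator
-- into NULL, so dividing by zero returns NULL instead of raising an error:
--SELECT dbo.Test(10, 0);   -- NULL
--SELECT dbo.Test(10, 4);   -- 2.5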
----------------------------------------------------------------------
-- Block 1: the division done inline
SELECT rn / NullIf(3.00, 0)
FROM
(
    SELECT TOP 1000000 rn = Convert(Float, ROW_NUMBER() OVER (ORDER BY s1.[object_id]))
    FROM sys.all_objects AS s1 CROSS JOIN sys.all_objects AS s2
    ORDER BY s1.[object_id]
) tbl;
----------------------------------------------------------------------
-- Block 2: the same division through the scalar UDF
SELECT dbo.Test(rn, 3.00)
FROM
(
    SELECT TOP 1000000 rn = Convert(Float, ROW_NUMBER() OVER (ORDER BY s1.[object_id]))
    FROM sys.all_objects AS s1 CROSS JOIN sys.all_objects AS s2
    ORDER BY s1.[object_id]
) tbl;
----------------------------------------------------------------------
--DROP FUNCTION dbo.Test;
GO
However, when I was ready to do the show-and-tell, the first block and the second block took almost exactly the same time to run, about 8 seconds each. I remember running a very similar experiment about 6 years ago and seeing a dramatic difference in execution time.
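For reference, per-statement CPU and elapsed times can be captured by wrapping the two blocks above with SET STATISTICS TIME (just a sketch of how to reproduce the comparison; the numbers will of course vary by machine):

SET STATISTICS TIME ON;
GO
-- run Block 1 and Block 2 from the script above here, then compare the
-- "SQL Server Execution Times" output reported for each statement
SET STATISTICS TIME OFF;
GO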
Have UDFs become safe to use now? What's changed?