
As the title says, is it possible to execute SQL that is stored in a table column?

I have a large record set with a bunch of arbitrary decimal values that need to be run against a small number of calculations that are stored as SQL in another table.

For example, I have 3000 rows where the values range from 10 to 100. Let's say each set of 1000 is Kelvin, Celsius and Fahrenheit respectively. The corresponding inline SQL table contains the formula that will convert them all to something else.

Is it possible to run this dynamic SQL against each of the values from the result set on-the-fly?
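For illustration, the setup looks roughly like this (all table and column names here are invented; the question doesn't give the real schema):

```sql
-- Hypothetical schema, for illustration only
CREATE TABLE Readings (
    ReadingId int PRIMARY KEY,
    Scale     varchar(20),    -- 'Kelvin', 'Celsius' or 'Fahrenheit'
    Value     decimal(10, 2)  -- the arbitrary decimal values, 10 to 100
);

CREATE TABLE Formulas (
    Scale   varchar(20) PRIMARY KEY,
    Formula varchar(200)      -- e.g. 'SELECT x/20', where x is a placeholder
);
```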

I know I can do this via a CURSOR or similar loop, but this will take quite a performance hit.

Is there a way to do this without incurring too many performance problems?

ChrisC
  • To run what? Is the data in the column valid T-SQL statement and you need to execute all of the statements at the same time? – gotqn May 10 '17 at 11:03
  • I don't think this is possible. You could tackle this problem in much the way you suggest in the question, by using dynamic SQL. There is no T-SQL or compiled query type you could use to short-circuit the SQL compilation part of the process. If there was, you'd have to maintain some sort of schema binding to get any benefit. An approach where you store SQL in a database sounds like an anti-pattern; SQL Server is already doing this behind the scenes in an optimised way, using the system tables. This would be creating an inefficient layer of indirection. – Jodrell May 10 '17 at 11:03
  • "if it's possible to run the stored SQL on-the-fly" Yes. "without taking too much of a performance hit" Heck no. Inline functions that are *not* inline table-valued functions have rotten performance anyway, so dynamic SQL would be your least problem. I'm politely not even questioning the fact that there's arbitrary SQL in a column, although that obviously has huge security implications. If you have to have this, finding some way of building a dynamic query that doesn't have to work row by row is your best bet. That or having a client build the queries. What's your actual use case? – Jeroen Mostert May 10 '17 at 11:04
  • @JeroenMostert You can't talk about the performance without knowing the `T-SQL` statements he is going to execute. He may need to run just 10 `SELECT 1` statements - what's the performance hit of concatenating these statements and executing them at the same time? Nothing. – gotqn May 10 '17 at 11:07
  • Unfortunately, it is arbitrary SQL in the column... and no, this wasn't my idea :-) Luckily it's not visible to the outside world, but the results from running the SQL is visible and it's already quite performance intensive. I was hoping that I was missing some hidden SQL feature as I've been scouring the web for over an hour now and couldn't find anything conclusive, apart from running dynamic SQL. – ChrisC May 10 '17 at 11:09
  • I'll clarify a bit more because @gotqn is right, it's tiny select statements that run a calculation. The calculation contains an 'x' that requires replacing, so it will look something like this: 'SELECT x/20' – ChrisC May 10 '17 at 11:10
  • 1
    @ChrisC, unless you can change the way this works, I can only suggest you do your best to minimise the number of rows that you must perform the "interpretation process" upon. Do all the independent filtering first. – Jodrell May 10 '17 at 11:12
  • @Jodrell under the circumstances that makes the most sense, as ultimately the records only require a small number of calculations. Can you put that in as an answer and I'll mark it as such. Thanks! – ChrisC May 10 '17 at 11:14
  • 2
    If you edit your question to make this clear (as in, an example of the input table and the output, how many rows we're talking about, what kind of expressions are in there, if the `SELECT` statements have a `FROM` and suchlike) you may get good answers on how to do this at least reasonably well. For example, if you can interpret all the SQL as a domain-specific language with simple expressions, folding them into a query by parsing them is doable, or a dynamic query can be built that uses `CROSS APPLY` or `UNION` on the expressions minus `SELECT`. – Jeroen Mostert May 10 '17 at 11:17
  • @JeroenMostert OK I've edited the question to provide some more context along with an example – ChrisC May 10 '17 at 11:26
  • From the way you describe it, building a dynamic query that simply includes the calculation as a column should be fine, since the expression isn't unique per row. Any good solution would take care not to work row by row (as you would with a cursor), not avoid dynamic SQL per se. – Jeroen Mostert May 10 '17 at 11:32

1 Answer


I don't think this is possible. You could tackle this problem in much the way you suggest in the question, by using dynamic SQL. There is no T-SQL or compiled query type you could use to short-circuit the SQL compilation part of the process. If there was, you'd have to maintain some sort of schema binding to get any benefit.

Unless you can change the way this works, I can only suggest you do your best to minimise the number of rows that you must perform the "interpretation process" upon. Do all the independent filtering first.


Now that you've expanded your question ...

I'd suggest you write some code that will construct one statement that incorporates all the inline SQL operations you want to perform and targets your table of decimal values.

This way you can perform the operation as a set-based operation rather than some sort of RBAR ("row by agonizing row") approach.
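A rough sketch of that idea, using hypothetical `Readings(Scale, Value)` and `Formulas(Scale, Formula)` tables (all names invented, and assuming every formula is a simple `SELECT <expression of x>`): build one dynamic statement that folds every stored formula into a `CASE` expression, then execute it once against the whole set.

```sql
-- Sketch only: assumes Formula values look like 'SELECT x/20'
DECLARE @sql nvarchar(max);

-- Turn each stored formula into a CASE branch, dropping the leading
-- 'SELECT ' and substituting the column reference for the 'x' placeholder.
-- (A naive REPLACE on 'x' assumes 'x' appears nowhere else in the formula.)
SELECT @sql = N'SELECT r.Scale, r.Value, CASE r.Scale '
    + (SELECT STRING_AGG(
           N'WHEN ''' + f.Scale + N''' THEN '
               + REPLACE(REPLACE(f.Formula, 'SELECT ', ''), 'x', 'r.Value'),
           N' ')
       FROM Formulas AS f)
    + N' END AS Converted FROM Readings AS r;';

EXEC sys.sp_executesql @sql;
```

Note that `STRING_AGG` needs SQL Server 2017 or later; on older versions the same concatenation can be done with the `FOR XML PATH` trick. The point is that the dynamic SQL is compiled and executed once for all 3000 rows, instead of once per row as a cursor would do.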

Jodrell
  • thanks for that and also the link. I didn't think of using a set based operation and I think that may just solve this problem for me. – ChrisC May 10 '17 at 11:41