To do it in a single statement, without a special procedure, you could convert each row to XML and then run an XQuery against that XML to look for any value in the row that matches. For example:
declare @SearchValue as varchar(20)
set @SearchValue = 'MyValue'
select *
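-- optional: uncomment either of the next two expressions to also return
-- the whole row as XML (AllColumns) or just the matching column(s) (FoundColumns)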
--,(select MyTable.* for XML PATH (''),TYPE) AllColumns
--,convert(nvarchar(255),(select MyTable.* for XML PATH (''),TYPE).query('for $item in * where $item=sql:variable("@SearchValue") return $item')) FoundColumns
from MyTable
where convert(nvarchar(255),(select MyTable.* for XML PATH (''),TYPE).query('for $item in * where $item=sql:variable("@SearchValue") return $item'))<>''
A procedure specifically designed for this task could probably do it more efficiently and could take advantage of indexes, etc. Honestly, I would not put this into a production database solution without quite a bit of consideration, but as a thrown-together search tool it's not bad. I ran a search on a 700,000-row table in 40 seconds; however, if I filter on each column individually it runs nearly instantly (a per-column sketch follows the caveats below). A few more caveats:
- The table columns can't have spaces or other characters that aren't valid in an XML tag name. I couldn't figure out how to get column names with spaces to work; maybe there's a way.
- The filter has to be written in XQuery, which is not exactly like SQL, but you can use =, <, and >, and there's even pattern matching (see the contains() sketch after this list).
- The parameter to the query() function must be a string literal, so you can't build the expression dynamically. That's why I used a variable for the search value, but you could also use sql:column("ColName") if needed.
- If you're searching for types other than strings, the search string has to match exactly how the value is serialized as an XML value (a quick way to check this is shown below).
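To see exactly how each value is serialized, and therefore what the search string for a non-string column has to look like, you can run the XML expression on its own; this is just the commented-out AllColumns expression from the query above pulled into a quick check (a datetime, for instance, comes out in ISO 8601 form, something like 2017-03-01T00:00:00, possibly with fractional seconds):

-- inspect how FOR XML serializes each column's value
select top 10
    (select MyTable.* for XML PATH (''),TYPE) AllColumns
from MyTable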
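And here's a rough sketch of the pattern-matching variation mentioned above, swapping the equality test for XQuery's contains() function; I've wrapped both operands in string() to keep the XQuery static typing happy, but treat it as an untested variation on the query above:

declare @SearchValue as varchar(20)
set @SearchValue = 'Value'

-- return rows where any column contains the search string as a substring
select *
from MyTable
where convert(nvarchar(255),(select MyTable.* for XML PATH (''),TYPE).query('for $item in * where contains(string($item), string(sql:variable("@SearchValue"))) return $item'))<>''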
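Finally, for comparison, the per-column filtering that runs nearly instantly is just ordinary predicates; the column names below are hypothetical, but plain comparisons like these let the optimizer use whatever indexes exist on those columns:

declare @SearchValue as varchar(20)
set @SearchValue = 'MyValue'

-- filter each (hypothetical) column directly instead of going through XML
select *
from MyTable
where Col1 = @SearchValue
   or Col2 = @SearchValue
   or Col3 = @SearchValue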