Execute Table-Valued Function on Multiple Rows

Execute table-valued function on multiple rows?

Use OUTER APPLY, which invokes the function once per row and keeps source rows even when the function returns nothing:

SELECT Stuff.id
     , Results.pn
     , Results.s
FROM stackoverflow_454945 AS Stuff
OUTER APPLY dbo.Split(',', Stuff.myColumn) AS Results
WHERE ISNULL(Results.s, '') <> '';
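Note that dbo.Split is a user-supplied splitter, not a built-in function. If you don't already have one, a minimal stand-in with the same delimiter-first signature and pn / s output columns could look like this hypothetical sketch (the ordinal argument to STRING_SPLIT requires SQL Server 2022+):

-- Hypothetical stand-in for dbo.Split (SQL Server 2022+ only):
CREATE FUNCTION dbo.Split (@delimiter CHAR(1), @input VARCHAR(8000))
RETURNS TABLE
AS
RETURN
    SELECT ordinal AS pn,   -- position of the element within the string
           value   AS s     -- the element itself
    FROM STRING_SPLIT(@input, @delimiter, 1);  -- 1 = include the ordinal column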

Execute table valued function from row values

I believe this should do what you need: dynamic SQL builds a single statement that produces your results, and EXEC runs it to put them into your table. The FOR XML trick is a common way to concatenate VARCHAR values from multiple rows; the AS [text()] alias is required for it to work.
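Here is the trick in isolation, as a minimal sketch independent of the tables below:

-- Concatenates the three rows into a single string: 'a, b, c, '
SELECT (SELECT t.name + ', ' AS [text()]
        FROM (VALUES ('a'), ('b'), ('c')) t(name)
        FOR XML PATH(''));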

--=========================================================
-- Set up
--=========================================================
CREATE TABLE dbo.TestTableFunctions (function_name VARCHAR(50) NOT NULL, parameter VARCHAR(20) NOT NULL);
INSERT INTO dbo.TestTableFunctions (function_name, parameter)
VALUES ('fn_one', '1001'), ('fn_two', '1001'), ('fn_one', '1002'), ('fn_two', '1002');

CREATE TABLE dbo.TestTableFunctionsResults (function_name VARCHAR(50) NOT NULL, parameter VARCHAR(20) NOT NULL, result VARCHAR(200) NOT NULL);
GO
CREATE FUNCTION dbo.fn_one
(
    @parameter VARCHAR(20)
)
RETURNS TABLE
AS
RETURN
    SELECT 'fn_one_' + @parameter AS result;
GO
CREATE FUNCTION dbo.fn_two
(
    @parameter VARCHAR(20)
)
RETURNS TABLE
AS
RETURN
    SELECT 'fn_two_' + @parameter AS result;
GO

--=========================================================
-- The important stuff
--=========================================================

DECLARE @sql VARCHAR(MAX);

SELECT @sql =
(
    SELECT 'SELECT ''' + T1.function_name + ''', ''' + T1.parameter + ''', F.result FROM ' + T1.function_name + '(' + T1.parameter + ') F UNION ALL ' AS [text()]
    FROM TestTableFunctions T1
    FOR XML PATH ('')
);

-- Strip the trailing ' UNION ALL' (LEN ignores the final trailing space)
SELECT @sql = SUBSTRING(@sql, 1, LEN(@sql) - 10);

INSERT INTO dbo.TestTableFunctionsResults
EXEC (@sql);

SELECT * FROM dbo.TestTableFunctionsResults;

--=========================================================
-- Clean up
--=========================================================
DROP TABLE dbo.TestTableFunctions;
DROP TABLE dbo.TestTableFunctionsResults;
DROP FUNCTION dbo.fn_one;
DROP FUNCTION dbo.fn_two;
GO

The first SELECT statement (ignoring the setup) builds a string containing the syntax to run all of the functions in your table, with the results UNIONed together. That makes it possible to run the string with EXEC, which means you can then INSERT those results into your table.
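With the sample rows above, the generated @sql (after the trailing UNION ALL is trimmed) would look like the following. It is one long line in reality, broken up here for readability, and the row order isn't guaranteed:

SELECT 'fn_one', '1001', F.result FROM fn_one(1001) F UNION ALL
SELECT 'fn_two', '1001', F.result FROM fn_two(1001) F UNION ALL
SELECT 'fn_one', '1002', F.result FROM fn_one(1002) F UNION ALL
SELECT 'fn_two', '1002', F.result FROM fn_two(1002) F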

A couple of quick notes, though. First, the functions must all return identical result set structures: the same number of columns with the same data types (technically they could differ if SQL Server can always implicitly convert between them, but it's really not worth the risk). Second, if someone were able to update your functions table, they could use SQL injection to wreak havoc on your system. Access to that table needs to be tightly controlled, and I wouldn't let users enter function names directly.
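A hedged sketch of one way to tighten that up (assuming the parameters are meant to be integers): join the stored names to sys.objects so only real TVFs in dbo get executed, QUOTENAME the name, and validate the parameter with TRY_CONVERT (SQL Server 2012+). A NULL from TRY_CONVERT nulls out that row's generated string, so rows with invalid parameters simply drop out:

SELECT @sql =
(
    SELECT 'SELECT ''' + REPLACE(T1.function_name, '''', '''''') + ''', '''
         + REPLACE(T1.parameter, '''', '''''')
         + ''', F.result FROM dbo.' + QUOTENAME(o.[name])
         + '(' + CONVERT(VARCHAR(20), TRY_CONVERT(INT, T1.parameter))
         + ') F UNION ALL ' AS [text()]
    FROM TestTableFunctions T1
    INNER JOIN sys.objects o
        ON  o.[name] = T1.function_name
        AND o.[type] IN ('IF', 'TF')        -- inline / multi-statement TVFs only
        AND o.[schema_id] = SCHEMA_ID('dbo')
    FOR XML PATH ('')
);

This limits, though does not fully eliminate, the injection surface; the parameter column still deserves validation appropriate to its real data type.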

Executing table valued function in select statement

Use CROSS APPLY to execute your function against each row in the table, passing the row's column as the parameter and joining the function's output rows back to the source row in the result set.

You didn't give a concrete example of your query, so I can't give a more specific example, but this is what you should look for: use CROSS APPLY just like you would an INNER JOIN, only there's no ON clause.

Okay, from the updated query, yours would look something like this:

SELECT Fname, Lname, x.ColumnName
FROM Employees
CROSS APPLY mytabfunct(Fname) x

If you just run SELECT * FROM mytabfunct('Joseph'), you'll get a result set with some column names. You'll need to change "ColumnName" in the statement above to one of the column names returned by your table-valued function. You didn't show what the function returns, so I can't get more specific than that. But try this and see if it gets you closer.
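Purely for illustration, if mytabfunct were defined like this hypothetical sketch, "ColumnName" above would be Greeting:

-- Hypothetical definition, just to make the example concrete:
CREATE FUNCTION dbo.mytabfunct (@name VARCHAR(50))
RETURNS TABLE
AS
RETURN
    SELECT 'Hello, ' + @name AS Greeting;
GO

-- Then:
-- SELECT Fname, Lname, x.Greeting
-- FROM Employees
-- CROSS APPLY dbo.mytabfunct(Fname) x;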

Use a table-valued function in a SELECT column list?

You can CROSS APPLY a TVF to a regular table, e.g.

SELECT
    x.a
    , x.b
    , x.d
    , y.*
FROM x
CROSS APPLY FN(x.c) y;

How to use a table valued function in select clause?

You would use a CROSS APPLY:

SELECT a.rawData, b.*
FROM TableA a
CROSS APPLY FunctionA(a.rawdata) b

Table Value Function, Select Multiple Rows

declare @FirstId int, @SecondId int;

select
      @FirstId  = convert(int, min(case when ValueIndex = 1 then Value end))
    , @SecondId = convert(int, min(case when ValueIndex = 2 then Value end))
from dbo.SplitString('1234-5678', '-');

select
      FirstId  = @FirstId
    , SecondId = @SecondId;

rextester demo: http://rextester.com/TRTDI68038

returns:

+---------+----------+
| FirstId | SecondId |
+---------+----------+
|    1234 |     5678 |
+---------+----------+

Demo was done using a CSV Splitter table valued function by Jeff Moden with the function name and output columns renamed.
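Jeff Moden's splitter isn't reproduced here. Purely as a hypothetical stand-in with the same output shape (ValueIndex, Value), on SQL Server 2022+ you could sketch it over STRING_SPLIT's ordinal support:

-- Hypothetical stand-in for dbo.SplitString (SQL Server 2022+ only):
CREATE FUNCTION dbo.SplitString (@input VARCHAR(8000), @delimiter CHAR(1))
RETURNS TABLE
AS
RETURN
    SELECT ordinal AS ValueIndex,
           value   AS Value
    FROM STRING_SPLIT(@input, @delimiter, 1);  -- 1 = include the ordinal column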


Splitting strings reference:

  • Tally OH! An Improved SQL 8K “CSV Splitter” Function - Jeff Moden
  • Splitting Strings : A Follow-Up - Aaron Bertrand
  • Split strings the right way – or the next best way - Aaron Bertrand
  • string_split() in SQL Server 2016 : Follow-Up #1 - Aaron Bertrand
  • Ordinal workaround for string_split() - Solomon Rutzky

Most efficient way to call same Table-Valued Function on multiple columns in a Query

FIRST: it should be mentioned that the absolutely fastest method of getting the desired results is to do the following:

  1. Migrate the data into either new columns or even a new table (see the sketch after this list):

    1. New column approach:

      1. Add new columns as {name}_new to the table with the DECIMAL(18, 3) datatype
      2. Do a one-time migration of the data from the old VARCHAR columns to the DECIMAL columns
      3. Rename the old columns to {name}_old
      4. Rename the new columns to be just {name}
    2. New table approach:

      1. Create a new table as {table_name}_new using the DECIMAL(18, 3) datatype
      2. Do a one-time migration of the data from the current table to the new DECIMAL-based table
      3. Rename the old table to {table_name}_old
      4. Remove _new from the new table's name
  2. Update the app, etc. to never insert data encoded in this manner
  3. After one release cycle, if there are no problems, drop the old columns or table
  4. Drop the TVFs and UDF
  5. Never speak of this again!
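A minimal sketch of the new-column approach, using hypothetical table and column names (dbo.SomeTable, Amount) and the TVF defined later in this answer:

ALTER TABLE dbo.SomeTable ADD [Amount_new] DECIMAL(18, 3) NULL;
GO

-- One-time migration from the old VARCHAR column:
UPDATE st
SET    st.[Amount_new] = cnvrt.[AmountVerified]
FROM   dbo.SomeTable st
CROSS APPLY dbo.ConvertAmountVerified_Solomon(st.[Amount]) cnvrt;
GO

EXEC sp_rename 'dbo.SomeTable.Amount', 'Amount_old', 'COLUMN';
EXEC sp_rename 'dbo.SomeTable.Amount_new', 'Amount', 'COLUMN';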

THAT BEING SAID:
You can get rid of a lot of that code, as it is largely unnecessary duplication. Also, there are at least two bugs that cause the output to sometimes be incorrect or to throw an error. Those bugs were carried over into Joe's code, which produces the same results (including the error) as the O.P.'s code. For example:

  • These values produce a correct result:

    00062929x
    00021577E
    00000509H
  • These values produce an incorrect result:

    00002020Q
    00016723L
    00009431O
    00017221R
  • These values produce an error:

    00062145}
    (anything ending with "}")

Comparing all 3 versions against 448,740 rows using SET STATISTICS TIME ON;, they all ran in just over 5000 ms of elapsed time. But for CPU time, the results were:

  • O.P.'s TVF: 7031 ms
  • Joe's TVF: 3734 ms
  • Solomon's TVF: 1407 ms

SETUP: DATA

The following creates a table and populates it. This should create the same data set across all systems running SQL Server 2017, since they will have the same rows in spt_values. Using a deterministic data set provides a basis of comparison for other people testing on their systems; randomly generated data would factor into timing differences across systems, or even between tests on the same system if the sample data were regenerated. I started with the same 3-column table as Joe did, but used the sample values from the question as a template to come up with a variety of numeric values appended with each of the possible trailing-character options (including no trailing character). This is also why I forced the Collation on the columns: I didn't want the fact that I am using a binary-Collation instance to unfairly negate the effect of using the COLLATE keyword to force a different Collation in the TVF.

The only difference between runs of this setup is the ordering of the rows in the table (due to the ORDER BY NEWID()).

USE [tempdb];
SET NOCOUNT ON;

CREATE TABLE dbo.TestVals
(
    [TestValsID] INT IDENTITY(1, 1) NOT NULL PRIMARY KEY,
    [Col1] VARCHAR(50) COLLATE Latin1_General_100_CI_AS NOT NULL,
    [Col2] VARCHAR(50) COLLATE Latin1_General_100_CI_AS NOT NULL,
    [Col3] VARCHAR(50) COLLATE Latin1_General_100_CI_AS NOT NULL
);

;WITH cte AS
(
    SELECT (val.[number] + tmp.[blah]) AS [num]
    FROM [master].[dbo].[spt_values] val
    CROSS JOIN (VALUES (1), (7845), (0), (237), (61063), (999)) tmp(blah)
    WHERE val.[number] BETWEEN 0 AND 1000000
)
INSERT INTO dbo.TestVals ([Col1], [Col2], [Col3])
SELECT FORMATMESSAGE('%08d%s', cte.[num], tab.[col]) AS [Col1],
       FORMATMESSAGE('%08d%s', ((cte.[num] + 2) * 2), tab.[col]) AS [Col2],
       FORMATMESSAGE('%08d%s', ((cte.[num] + 1) * 3), tab.[col]) AS [Col3]
FROM cte
CROSS JOIN (VALUES (''), ('{'), ('A'), ('B'), ('C'), ('D'), ('E'), ('F'),
                   ('G'), ('H'), ('I'), ('}'), ('J'), ('K'), ('L'), ('M'), ('N'),
                   ('O'), ('P'), ('Q'), ('R'), ('p'), ('q'), ('r'), ('s'), ('t'),
                   ('u'), ('v'), ('w'), ('x'), ('y')) tab(col)
ORDER BY NEWID();
-- 463698 rows

SETUP: TVF

GO
CREATE OR ALTER FUNCTION dbo.ConvertAmountVerified_Solomon
(
    @amt VARCHAR(50)
)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN

    WITH ctePosition AS
    (
        SELECT CHARINDEX(RIGHT(RTRIM(@amt), 1) COLLATE Latin1_General_100_BIN2,
                         '{ABCDEFGHI}JKLMNOPQRpqrstuvwxy') AS [Value]
    ),
    cteAppend AS
    (
        SELECT pos.[Value] AS [Position],
               IIF(pos.[Value] > 0,
                   CHAR(48 + ((pos.[Value] - 1) % 10)),
                   '') AS [Value]
        FROM ctePosition pos
    )
    SELECT (CONVERT(DECIMAL(18, 3),
                    IIF(app.[Position] > 0,
                        SUBSTRING(RTRIM(@amt), 1, LEN(@amt) - 1) + app.[Value],
                        @amt))
            / 100.)
           * IIF(app.[Position] > 10, -1., 1.) AS [AmountVerified]
    FROM cteAppend app;
GO

Please note:

  1. I used a binary (i.e. _BIN2) Collation, which is faster than a case-sensitive Collation as it does not need to account for any linguistic rules.
  2. The only thing that really matters is the location (i.e. the "index") of the right-most character within the list of alpha characters plus the two curly brackets. Everything done operationally is derived from that position more so than from the value of the character itself.
  3. I kept the input parameter and return value datatypes as indicated in the original UDF that was rewritten by the O.P. Unless there was good reason to go from VARCHAR(50) to VARCHAR(60), and from NUMERIC(18, 3) to NUMERIC(18, 2) (a good reason being "they were wrong"), I would stick with the original signature / types.
  4. I added a period / decimal point to the end of the 3 numeric literals / constants: 100., -1., and 1.. This was not in my original version of this TVF (see the history of this answer), but I noticed some CONVERT_IMPLICIT calls in the XML execution plan (since 100 is an INT but the operation needs to be NUMERIC / DECIMAL), so I just took care of that ahead of time.
  5. I create the replacement character using the CHAR() function rather than passing a string version of a number (e.g. '2') into a CONVERT function (which is what I was originally doing, again per the history). This appears to be ever so slightly faster. Only a few milliseconds, but still.
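To see the position-based mapping from note 2 in isolation:

-- 'A' is at position 2  -> digit '1' (48 + (2-1)%10),  positive (position <= 10)
-- '}' is at position 11 -> digit '0' (48 + (11-1)%10), negative (position > 10)
-- 'J' is at position 12 -> digit '1' (48 + (12-1)%10), negative (position > 10)
SELECT CHARINDEX('A' COLLATE Latin1_General_100_BIN2, '{ABCDEFGHI}JKLMNOPQRpqrstuvwxy') AS [A],
       CHARINDEX('}' COLLATE Latin1_General_100_BIN2, '{ABCDEFGHI}JKLMNOPQRpqrstuvwxy') AS [CloseBrace],
       CHARINDEX('J' COLLATE Latin1_General_100_BIN2, '{ABCDEFGHI}JKLMNOPQRpqrstuvwxy') AS [J];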

TEST

Please note that I had to filter out rows ending with } since those caused the O.P.'s and Joe's TVFs to error. While my code handles the } correctly, I wanted to be consistent about which rows were tested across the 3 versions. This is why the number of rows generated by the setup query is slightly higher than the row count I noted above the test results.

SET STATISTICS TIME ON;

DECLARE @Dummy DECIMAL(18, 3);
SELECT --@Dummy = -- commented out = results to client; uncomment to not return results
       cnvrtS.[AmountVerified]
FROM dbo.TestVals vals
CROSS APPLY dbo.ConvertAmountVerified_Solomon(vals.[Col1]) cnvrtS
WHERE RIGHT(vals.[Col1], 1) <> '}'; -- filter out rows that cause error in O.P.'s code

SET STATISTICS TIME OFF;
GO

CPU time drops when uncommenting the --@Dummy = (i.e. when not returning the result set to the client), but interestingly enough, the rankings among the 3 TVFs also change a little:

  • Joe's TVF: 3295 ms
  • O.P.'s TVF: 2240 ms
  • Solomon's TVF: 1203 ms

Not sure why the O.P.'s code would perform so much better in this scenario (whereas my and Joe's code only improved marginally), but it does seem consistent across many tests. And no, I did not look at execution plan differences as I don't have time to investigate that.

EVEN FASTERER

I have completed testing of the alternate approach, and it does provide a slight but definite improvement over what is shown above. The new approach uses SQLCLR, and it appears to scale better. I found that when adding the second column to the query, the T-SQL approach doubled in time, but when adding additional columns using the SQLCLR scalar UDF, the time went up by less than the single-column timing each time. Maybe there is some initial overhead in invoking the SQLCLR method (not associated with the overhead of initially loading the App Domain and the Assembly into it), because the timings were (elapsed time, not CPU time):

  • 1 column: 1018 ms
  • 2 columns: 1750 - 1800 ms
  • 3 columns: 2500 - 2600 ms

So it's possible that the timing (when dumping to a variable, not returning the result set) includes 200 ms - 250 ms of fixed overhead plus 750 ms - 800 ms per instance of the UDF. CPU timings were: 950 ms, 1750 ms, and 2400 ms for 1, 2, and 3 instances of the UDF, respectively.

C# CODE

using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public class Transformations
{
    private const string _CHARLIST_ = "{ABCDEFGHI}JKLMNOPQRpqrstuvwxy";

    [SqlFunction(IsDeterministic = true, IsPrecise = true,
        DataAccess = DataAccessKind.None, SystemDataAccess = SystemDataAccessKind.None)]
    public static SqlDouble ConvertAmountVerified_SQLCLR(
        [SqlFacet(MaxSize = 50)] SqlString Amt)
    {
        string _Amount = Amt.Value.TrimEnd();

        int _LastCharIndex = (_Amount.Length - 1);
        int _Position = _CHARLIST_.IndexOf(_Amount[_LastCharIndex]);

        if (_Position >= 0)
        {
            // Replace the trailing character with its digit equivalent
            char[] _TempAmount = _Amount.ToCharArray();
            _TempAmount[_LastCharIndex] = char.ConvertFromUtf32(48 + (_Position % 10))[0];
            _Amount = new string(_TempAmount);
        }

        decimal _Return = decimal.Parse(_Amount) / 100M;

        // Zero-based index 10 ('}') and above indicates a negative amount
        if (_Position > 9)
        {
            _Return *= -1M;
        }

        return new SqlDouble((double)_Return);
    }
}

I originally used SqlDecimal as the return type, but there is a performance penalty for using it as opposed to SqlDouble / FLOAT. FLOAT can sometimes have issues (due to being an imprecise type), but I verified against the T-SQL TVF via the following query, and no differences were detected:

SELECT cnvrtS.[AmountVerified],
       dbo.ConvertAmountVerified_SQLCLR(vals.[Col1])
FROM dbo.TestVals vals
CROSS APPLY dbo.ConvertAmountVerified_Solomon(vals.[Col1]) cnvrtS
WHERE cnvrtS.[AmountVerified] <> dbo.ConvertAmountVerified_SQLCLR(vals.[Col1]);

TEST

SET STATISTICS TIME ON;

DECLARE @Dummy DECIMAL(18, 3), @Dummy2 DECIMAL(18, 3), @Dummy3 DECIMAL(18, 3);
SELECT @Dummy  = dbo.ConvertAmountVerified_SQLCLR(vals.[Col1]),
       @Dummy2 = dbo.ConvertAmountVerified_SQLCLR(vals.[Col2]),
       @Dummy3 = dbo.ConvertAmountVerified_SQLCLR(vals.[Col3])
FROM dbo.TestVals vals
WHERE RIGHT(vals.[Col1], 1) <> '}';

SET STATISTICS TIME OFF;

Using a Table-Valued Function to Turn a Single Row into Many Within Select

You need to use an APPLY as your join.

SELECT
    a.val AS [Owner],
    pa.[RefNumber],
    pa.[OSProjectCode]
FROM dbo.ProjectsActions pa
CROSS APPLY dbo.Split(pa.[Owner], '/') a

The CROSS APPLY acts like an INNER JOIN, passing the row-level value into your table-valued function. If your split function returns NULL (or no rows) when it can't split the value (NULL, empty string, etc.), you can use OUTER APPLY so that row isn't dropped from your result set. You can also add a COALESCE to fall back to the original [Owner] value:

SELECT
    COALESCE(a.val, pa.[Owner]) AS [Owner],
    pa.[RefNumber],
    pa.[OSProjectCode]
FROM dbo.ProjectsActions pa
OUTER APPLY dbo.Split(pa.[Owner], '/') a

How to execute table valued function in SQL Server

You can't pass a whole table the way you are trying now. Instead, use CROSS APPLY to run the function against every row:

select *
from dbo.LINK r
cross apply dbo.fn_security(r.rk)

Pass multiple values of same input type to BigQuery Table Valued Function

But I'd like to return multiple rows, one for each value of input_val.

Consider below:

Step 1 - create table function

CREATE TABLE FUNCTION project.dataset.Job(input_vals array<INT64>) AS (
    SELECT
        val,
        row2
    FROM TableOfInterest
    WHERE val IN UNNEST(input_vals)
);

Step 2 - use it :o)

SELECT
    val,
    row2
FROM project.dataset.Job(
    ARRAY(
        SELECT val
        FROM ValTable
        LIMIT 100
    ))

