How to pass two sql tables as input parameter for r codes in SQL Server
You can use R's built-in serialization to accomplish this in SQL Server 2016. It's definitely verbose, but it does work.
For example, set up two separate input temp tables (unfortunately, they cannot be table variables):
CREATE TABLE #Table1
(
[StrCol] NVARCHAR( 50 ),
[IntCol] INT
);
INSERT INTO
#Table1
(
[StrCol],
[IntCol]
)
VALUES
( N'testing data 1', 1 ),
( N'testing data 2', 2 )
;
CREATE TABLE #Table2
(
[StrCol] NVARCHAR( 50 ),
[IntCol] INT
);
INSERT INTO
#Table2
(
[StrCol],
[IntCol]
)
VALUES
( N'more testing data 1', 5 ),
( N'more testing data 2', 6 )
;
From here, you can declare VARBINARY(MAX) variables to hold the results serialized by R.
DECLARE
@Table1_Input NVARCHAR( MAX ) = 'SELECT * FROM #Table1;',
@Table2_Input NVARCHAR( MAX ) = 'SELECT * FROM #Table2;',
@Table1_Data VARBINARY( MAX ),
@Table2_Data VARBINARY( MAX );
EXECUTE sp_execute_external_script
@language = N'R',
@script = N'
if( nrow(InputDataSet) == 0 )
stop("Invalid data passed in")
# Read in the sql table, serialize it to an output string
Output <- serialize(InputDataSet, NULL)
',
@input_data_1 = @Table1_Input,
@params = N'@Output VARBINARY( MAX ) OUTPUT',
@Output = @Table1_Data OUTPUT;
EXECUTE sp_execute_external_script
@language = N'R',
@script = N'
if( nrow(InputDataSet) == 0 )
stop("Invalid data passed in")
# Read in the sql table, serialize it to an output string
Output <- serialize(InputDataSet, NULL)
',
@input_data_1 = @Table2_Input,
@params = N'@Output VARBINARY( MAX ) OUTPUT',
@Output = @Table2_Data OUTPUT;
Finally, pass both serialized blobs into a single script, unserialize them, and combine:
EXECUTE sp_execute_external_script
@language = N'R',
@script = N'
table1 <- unserialize(Table1_Data)
table2 <- unserialize(Table2_Data)
OutputDataSet <- rbind(table1, table2)
',
@params = N'@Table1_Data VARBINARY(MAX), @Table2_Data VARBINARY(MAX)',
@Table1_Data = @Table1_Data,
@Table2_Data = @Table2_Data
WITH RESULT SETS (( [Col1] NVARCHAR( 50 ), [Col2] INT ));
Results:
Col1                   Col2
testing data 1         1
testing data 2         2
more testing data 1    5
more testing data 2    6
Passing More than One Parameter to [sp_execute_external_script] for Executing R code
Declare each parameter in the @params list, then bind each one to a value in the call:
ALTER procedure [dbo].[spGetAllEmployees]
@empid2 int,
@Name1 varchar(50),
@DOB1 date
as
Begin
EXECUTE sp_execute_external_script
@language = N'R'
,@script = N'df <- as.data.frame(InputDataSet); df[,1] <- df[,1] + 1; OutputDataSet <- df'
,@input_data_1 = N'select * from tblEmployee where EmployeeId = @empid AND Name = @NewName AND DOB = @NewDOB'
,@params = N'@empid int, @NewName varchar(50), @NewDOB varchar(50)'
,@empid = @empid2
,@NewName = @Name1
,@NewDOB = @DOB1
WITH RESULT SETS (([Col1] Int, [Col2] varchar(50), [Col3] varchar(50), Col4 varchar(50),Col5 varchar(50),Col6 varchar(50)));
End
-- EXEC dbo.spGetAllEmployees @empid2 ='1', @Name1='Nabi',@DOB1='2017-04-01';
Stored procedure parameters to R
Use @params to map SQL Server parameters to R variables, as shown in the MS docs. Always return a data frame object, and reserve @input_data_1 for a SQL Server query whose results should arrive in the R script as the InputDataSet data frame (which is why it is blank here).
CREATE PROCEDURE myProc (@a int, @b int)
AS
EXEC sp_execute_external_script
@language = N'R'
, @script = N' OutputDataSet <- data.frame(SumRes = sum(a_r, b_r));'
, @input_data_1 = N' ;'
, @params = N' @a_r int, @b_r int'
, @a_r = @a
, @b_r = @b
WITH RESULT SETS (([SumResult] int NOT NULL));
And then call procedure:
EXEC dbo.myProc @a = 2, @b = 3
SQL query in R passing two dates as parameters
The paste
function inserts spaces between its arguments by default, turning your string into
Select * From tableA
WHERE DATE(Administered_Date) between ' 01-01-2011 ' AND ' 01-31-2011 '
Instead use paste0
. Or continue using paste
and set the sep
parameter to an empty string:
Data = dbGetQuery(conn, paste(
  "Select * From tableA
   WHERE DATE(Administered_Date) between '", begin.date, "' AND '", end.date, "'",
  sep = ""))
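For comparison, here is a rough Python sketch of the same pitfall (illustrative only, not part of the original R code): Python makes the separator explicit in str.join, which mirrors R's paste (sep = " ") versus paste0 (no separator).

```python
# R's paste() defaults to sep = " ", which behaves like joining the pieces
# with a space; paste0() behaves like joining with no separator at all.
pieces = ["between '", "01-01-2011", "' AND '", "01-31-2011", "'"]

like_paste = " ".join(pieces)    # spaces sneak inside the quoted dates
like_paste0 = "".join(pieces)    # quotes hug the dates, as intended

print(like_paste)    # between ' 01-01-2011 ' AND ' 01-31-2011 '
print(like_paste0)   # between '01-01-2011' AND '01-31-2011'
```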
SQL stored procedure - table as parameter
To do that, you will need to use dynamic SQL.
The basic approach is to build a string holding the statement you want to run, then execute it:
declare @sql nvarchar(1000)
declare @t sysname = N'MyTable'
-- QUOTENAME guards against SQL injection through the table name
set @sql = N'Select * from ' + QUOTENAME(@t)
exec sp_executesql @sql
Passing multiple values for a single parameter in Reporting Services
Although John Sansom's solution works, there's another way to do this without having to use a potentially inefficient scalar-valued UDF.
In the SSRS report, on the parameters tab of the query definition, set the parameter value to
=join(Parameters!<your param name>.Value,",")
In your query, you can then reference the value like so:
where yourColumn in (@<your param name>)
R in SQL Server: Output data frame into a table
As shown in this tutorial, you can use INSERT INTO ... EXEC
to load the script's data frame output into a previously created table with matching columns:
INSERT INTO Table1
execute sp_execute_external_script
@language=N'R',
@script=N'inp_dat <- InputDataSet
inp_dat$NewCol <- max(inp_dat$col1,inp_dat$col2)
new_dat <- inp_dat',
@input_data_1=N'SELECT * FROM IM_COMP_TEST_SQL2016.dbo.temp_table',
@output_data_1=N'new_dat';
However, a make-table query (SELECT ... INTO) requires OPENQUERY()
or OPENROWSET()
with an ad-hoc distributed query, as described in this SO post, to capture the stored procedure's output:
Stored Procedure
CREATE PROCEDURE dbo.R_DataFrame
AS
BEGIN
execute sp_execute_external_script
@language=N'R',
@script=N'inp_dat <- InputDataSet
inp_dat$NewCol <- max(inp_dat$col1,inp_dat$col2)
new_dat <- inp_dat',
@input_data_1=N'SELECT * FROM IM_COMP_TEST_SQL2016.dbo.temp_table',
@output_data_1=N'new_dat'
-- ADD ALL COLUMN TYPES
WITH RESULT SETS (( [col1] varchar(20), [col2] float, [col3] int ... ));
END
GO
Action Query
SELECT * INTO Table1
FROM OPENROWSET('SQLNCLI', 'Server=(local);Trusted_Connection=yes;',
'EXEC dbo.R_DataFrame')
Multiple input sources for MSSQL Server 2017 Python analytical services
As you've already figured out, sp_execute_external_script
only allows one result set to be passed in. :-(
You can certainly query from inside the script to fetch data as long as your script is okay with the fact that it's not executing under the current SQL session's user's permissions.
If pagination is important, one data set is significantly larger than the others, and you're using Enterprise Edition, you might consider passing the largest data set into the script in chunks using sp_execute_external_script's streaming feature.
If you'd like all of your data to be assembled in SQL Server (vs. fetched by queries in your script), you could try to serialize the result sets and then pass them in as parameters (link describes how to do this in R but something similar should be possible with Python).
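A minimal sketch of that serialization idea in Python, using the standard pickle module (the table contents here are illustrative; in SQL Server the pickled bytes would travel through VARBINARY(MAX) parameters, analogous to the R serialize/unserialize approach above):

```python
import pickle

# Illustrative stand-ins for two result sets (in SQL Server each would arrive
# as InputDataSet in a preliminary sp_execute_external_script call).
table1 = [("testing data 1", 1), ("testing data 2", 2)]
table2 = [("more testing data 1", 5), ("more testing data 2", 6)]

# Analogous to R's serialize(InputDataSet, NULL): each blob would be returned
# through a VARBINARY(MAX) OUTPUT parameter.
blob1 = pickle.dumps(table1)
blob2 = pickle.dumps(table2)

# Inside the final script, both blobs come back in as parameters;
# unpickle and combine them, analogous to unserialize() + rbind().
combined = pickle.loads(blob1) + pickle.loads(blob2)
print(combined)
```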