Is There Any Way to Execute a Query Inside a String Value (Like Eval) in PostgreSQL

Is there any way to execute a query stored inside a string value (like eval) in PostgreSQL?

If the statements you are trying to "eval" always return the same data type, you could write an eval() function that uses the EXECUTE mentioned by Grzegorz.

create or replace function eval(expression text) returns integer
as
$body$
declare
  result integer;
begin
  execute expression into result;
  return result;
end;
$body$
language plpgsql;

Then you could do something like

SELECT eval('select 41') + 1;

But this approach won't work if your dynamic statements return a different data type for each expression that you want to evaluate.

Also bear in mind that this opens a huge security risk by running arbitrary statements. Whether that is a problem depends on your environment. If it is only used in interactive SQL sessions, it isn't a problem.
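
If the result type varies, one workaround is a variant that returns text and lets the caller cast the result. A minimal sketch (the name eval_text is chosen here for illustration and is not part of the original answer):

create or replace function eval_text(expression text) returns text
as
$body$
declare
  result text;
begin
  -- works for any query returning a single row with a single column
  -- whose value can be cast to text
  execute expression into result;
  return result;
end;
$body$
language plpgsql;

The caller then casts as needed, e.g. SELECT eval_text('select now()')::timestamptz;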

Eval Calculation String: 2 * 3 + 4 * 5 in PostgreSQL

You need PL/pgSQL:

create or replace function f(_s text)
returns numeric as $$
declare
  i numeric;
begin
  execute format('select %s', _s) into i;
  return i;
end;
$$ language plpgsql;

select f('1 + 1');
f
---
2
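
The expression from the question title evaluates the same way:

select f('2 * 3 + 4 * 5');  -- returns 26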

How to evaluate expression in select statement in Postgres

You can write an SQL function that does this for you, for example using the eval() functions supplied with postgres-utils:

select
  c.name as cust_name,
  p.name as prod_name,
  p.cost as prod_cost,
  eval(
    'select '||c.price_expression||' from product where id=:pid',
    '{"{cost}",:pid}',
    array[ p.cost, p.id ]
  ) as cust_cost
from product p, customer c

But of course it may be slow and insecure; you could use materialized views to cache the results more easily, etc. See the documentation there.

Python's eval() in Amazon Redshift: evaluating strings as expressions

You're in luck! Amazon Redshift recently introduced User Defined Functions that can call Python code.

Therefore, you can take advantage of the Python eval() command to evaluate a condition.

Here's some code that worked for me:

CREATE FUNCTION f_eval(condition TEXT)
RETURNS boolean
VOLATILE AS $$
return eval(condition)
$$ LANGUAGE plpythonu;

Then run it with:

SELECT f_eval('5 < 6 and 3 < 4');

This returns true in SQL.

PL/pgSQL perform vs execute

PERFORM is a PL/pgSQL command used for calling void functions. PL/pgSQL is careful about useless SELECT statements: a SELECT without an INTO clause is not allowed. But sometimes you need to call a function without storing its result (or the function has no result). In SQL, a function is called with a SELECT statement, but that is not possible in PL/pgSQL, so the PERFORM command was introduced.

CREATE OR REPLACE FUNCTION foo()
RETURNS void AS $$
BEGIN
  RAISE NOTICE 'Hello from void function';
END;
$$ LANGUAGE plpgsql;

-- direct call from SQL
SELECT foo();

-- in PLpgSQL
DO $$
BEGIN
  -- SELECT foo(); -- not allowed: a query needs a destination for its result
  PERFORM foo();   -- ok
END;
$$;

The PERFORM statement evaluates its expression and discards the result.

Your example perform 'create table foo as (select 1)';

is the same as SELECT 'create table foo as (select 1)'. It returns the string "create table foo as (select 1)", and this string is discarded; no table is created.

The EXECUTE statement first evaluates an expression to get a string. In the next step, this string is executed as an SQL command.

So EXECUTE 'create table ' || some_var || '(a int)'; has two steps:

  1. evaluate the expression 'create table ' || some_var || '(a int)'
  2. if some_var is mytab, for example, execute the command create table mytab(a int)

The PERFORM statement is used for function calls when the result is not assigned to anything. EXECUTE is used to evaluate dynamic SQL, when the exact form of the SQL command is only known at runtime, as illustrated below.
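
A minimal, self-contained illustration of those two steps (the table name mytab is hypothetical):

DO $$
DECLARE
  some_var text := 'mytab';  -- hypothetical table name
BEGIN
  -- step 1: the expression evaluates to the string 'create table mytab(a int)'
  -- step 2: that string is executed as an SQL command
  EXECUTE 'create table ' || some_var || '(a int)';
END;
$$;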

How to execute a string result of a stored procedure in postgres

Your first problem was solved by using dynamic SQL with EXECUTE like Craig advised.
But the rabbit hole goes deeper:

CREATE OR REPLACE FUNCTION myresult(mytable text, myprefix text)
  RETURNS SETOF RECORD AS
$func$
DECLARE
  smalltext  text;
  myoneliner text;
BEGIN
  SELECT INTO myoneliner
         'SELECT '
      || string_agg(quote_ident(column_name::text), ',' ORDER BY column_name)
      || ' FROM ' || quote_ident(mytable)
  FROM  information_schema.columns
  WHERE table_name = mytable
  AND   column_name LIKE myprefix||'%'
  AND   table_schema = 'public';  -- schema name; might be another param

  smalltext := lower(myoneliner);  -- nonsense
  RAISE NOTICE 'My additional text: %', myoneliner;

  RETURN QUERY EXECUTE myoneliner;
END
$func$ LANGUAGE plpgsql;

Major points

  • Don't cast the whole statement to lower case. Column names might be double-quoted with upper case letters, which are case-sensitive in this case (no pun intended).

  • You don't need DISTINCT in the query on information_schema.columns. Column names are unique per table.

  • You do need to specify the schema, though (or use another way to single out one schema), or you might be mixing column names from multiple tables of the same name in multiple schemas, resulting in nonsense.

  • You must sanitize all identifiers in dynamic code - including table names: quote_ident(mytable). Be aware that your text parameter to the function is case sensitive! The query on information_schema.columns requires that, too.

  • I untangled your whole construct to build the list of column names with string_agg() instead of the array constructor. Related answer:

    • Update multiple columns that start with a specific string
  • The assignment operator in plpgsql is :=.

  • Simplified syntax of RAISE NOTICE.

Core problem impossible to solve

All of this still doesn't solve your main problem: SQL demands a definition of the columns to be returned. You can circumvent this by returning anonymous records like you tried. But that's just postponing the inevitable. Now you have to provide a column definition list at call time, just like your error message tells you. But you just don't know which columns are going to be returned. Catch 22.

Your call would work like this:

SELECT *
FROM   myresult('dkj_p_k27ac','enri') AS f (
         enrich_d_dkj_p_k27ac  text  -- replace with actual column types
       , enrich_lr_dkj_p_k27ac text
       , enrich_r_dkj_p_k27ac  text);

But you don't know the number, names (optional), and data types of the returned columns, neither at creation time of the function nor at call time. It's impossible to do exactly that in a single call. You need two separate queries to the database, as sketched below.
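
A hedged sketch of that two-step approach, reusing the table and prefix from the example call above (column types are looked up rather than guessed):

-- Step 1: build the column definition list
SELECT string_agg(quote_ident(column_name) || ' ' || data_type
                , ', ' ORDER BY column_name) AS col_def_list
FROM   information_schema.columns
WHERE  table_schema = 'public'
AND    table_name   = 'dkj_p_k27ac'
AND    column_name  LIKE 'enri%';

-- Step 2: paste the result into the actual call, e.g.:
-- SELECT * FROM myresult('dkj_p_k27ac', 'enri') AS f (<col_def_list>);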

You could return all columns of any given table dynamically with a function using polymorphic types, because there is a well defined type for the whole table. Last chapter of this related answer:

  • Refactor a PL/pgSQL function to return the output of various SELECT queries

Execute statement provided as a value

You can do this dynamically in a plpgsql function:

create or replace function eval_bool(expr text)
returns boolean language plpgsql as $$
declare
  rslt boolean;
begin
  execute format('select %s', expr) into rslt;
  return rslt;
end $$;

select id, selector, eval_bool(selector)
from stmts;

 id | selector | eval_bool
----+----------+-----------
  1 | 5 > 3    | t
  2 | 5 < 3    | f
(2 rows)

How to execute a dynamic query into an int array in PostgreSQL

You need to aggregate the values into an array in order to store them in an array variable.

Additionally, you shouldn't pass parameters by concatenating them into the query string; pass them with the USING clause:

EXECUTE format(
   'SELECT array_agg(DISTINCT m.id)
    FROM Part p
    JOIN %1$s A ON A.part_id = p.id
    JOIN Model m ON m.%1$s_id = A.id
    WHERE p.id = $1',
   trim(NEW.part_type)
)
INTO model_ids
USING NEW.id;
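
For context, here is a minimal standalone sketch of the same pattern, EXECUTE ... INTO an array variable with a parameter bound via USING (the table name part and the filter are made up for illustration):

DO $$
DECLARE
  ids int[];            -- array variable receiving the aggregated result
  tbl text := 'part';   -- dynamic table name, assumed for illustration
BEGIN
  EXECUTE format('SELECT array_agg(id) FROM %I WHERE id > $1', tbl)
  INTO ids
  USING 0;              -- $1 is bound here, not concatenated into the string
  RAISE NOTICE 'ids = %', ids;
END;
$$;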

Execute a dynamic crosstab query

What you ask for is impossible. SQL is a strictly typed language. PostgreSQL functions need to declare a return type (RETURNS ..) at the time of creation.

A limited way around this is with polymorphic functions, if you can provide the return type at the time of the function call. But that's not evident from your question.

  • Refactor a PL/pgSQL function to return the output of various SELECT queries

You can return a completely dynamic result with anonymous records. But then you are required to provide a column definition list with every call. And how do you know about the returned columns? Catch 22.

There are various workarounds, depending on what you need or can work with. Since all your data columns seem to share the same data type, I suggest returning an array: text[]. Or you could return a document type like hstore or json; a sketch of the json variant follows after the links below. Related:

  • Dynamic alternative to pivot with CASE and GROUP BY

  • Dynamically convert hstore keys into columns for an unknown set of keys
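
A hedged sketch of the json variant (the table tbl and the columns row_name, cat, val are assumed for illustration):

-- one json document per row instead of a dynamic column list
SELECT row_name,
       json_object_agg(cat, val) AS pivoted  -- Postgres 9.4+
FROM   tbl
GROUP  BY row_name
ORDER  BY row_name;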

But it might be simpler to just use two calls: 1. let Postgres build the query, 2. execute it and retrieve the returned rows.

  • Selecting multiple max() values using a single SQL statement

I would not use the function from Eric Minikel as presented in your question at all. It is not safe against SQL injection by way of maliciously malformed identifiers. Use format() to build query strings unless you are running an outdated version older than Postgres 9.1.

A shorter and cleaner implementation could look like this:

CREATE OR REPLACE FUNCTION xtab(_tbl regclass, _row text, _cat text
                              , _expr text  -- still vulnerable to SQL injection!
                              , _type regtype)
  RETURNS text
  LANGUAGE plpgsql AS
$func$
DECLARE
   _cat_list text;
   _col_list text;
BEGIN

-- generate categories for xtab param and col definition list
EXECUTE format(
   $$SELECT string_agg(quote_literal(x.cat), '), (')
          , string_agg(quote_ident (x.cat), %L)
     FROM  (SELECT DISTINCT %I AS cat FROM %s ORDER BY 1) x$$
   , ' ' || _type || ', ', _cat, _tbl)
INTO _cat_list, _col_list;

-- generate query string
RETURN format(
   'SELECT * FROM crosstab(
      $q$SELECT %I, %I, %s
         FROM   %I
         GROUP  BY 1, 2  -- only works if the 3rd column is an aggregate expression
         ORDER  BY 1, 2$q$
    , $c$VALUES (%5$s)$c$
      ) ct(%1$I text, %6$s %7$s)'
   , _row, _cat, _expr  -- expr must be an aggregate expression!
   , _tbl, _cat_list, _col_list, _type);

END
$func$;

Same function call as your original version. The function crosstab() is provided by the additional module tablefunc which has to be installed. Basics:

  • PostgreSQL Crosstab Query
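
For illustration, a hedged sketch of the two-call pattern with this function (the table tbl and its columns row_name, cat, val are assumed, not taken from the original question):

-- Step 1: let Postgres build the crosstab query string.
SELECT xtab('tbl', 'row_name', 'cat', 'sum(val)', 'int') AS query;

-- Step 2: copy the returned statement and execute it in a second round trip.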

This handles column and table names safely. Note the use of object identifier types regclass and regtype. Also works for schema-qualified names.

  • Table name as a PostgreSQL function parameter

However, it is not completely safe as long as you pass a string to be executed as an expression (_expr, called cellc in your original query). This kind of input is inherently unsafe against SQL injection and should never be exposed to the general public.

  • SQL injection in Postgres functions vs prepared queries

The function scans the table only once for both lists of categories and should be a bit faster.

It still can't return completely dynamic row types, since that's simply not possible.


