MySQL: Selecting Multiple Fields into Multiple Variables in a Stored Procedure

MySQL: Selecting multiple fields into multiple variables in a stored procedure

Your syntax isn't quite right: you need to list the fields in order before the INTO, and the corresponding target variables after:

SELECT Id, dateCreated
INTO iId, dCreate
FROM products
WHERE pName = iName
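
For context, here is a minimal sketch of a complete procedure built around that statement; the table and columns (products, Id, dateCreated, pName) come from the snippet above, while the procedure name and the parameter types are assumptions:

DELIMITER //

-- Hypothetical wrapper procedure: looks up a product by name and returns
-- its id and creation date through OUT parameters.
CREATE PROCEDURE get_product_info(
    IN  iName   VARCHAR(100),  -- assumed type
    OUT iId     INT,           -- assumed type
    OUT dCreate DATETIME       -- assumed type
)
BEGIN
    SELECT Id, dateCreated
    INTO iId, dCreate
    FROM products
    WHERE pName = iName
    LIMIT 1;  -- guard against multi-row results
END //

DELIMITER ;

-- Usage:
-- CALL get_product_info('widget', @id, @created);
-- SELECT @id, @created;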

SQL Server: selecting multiple columns into multiple variables in a stored procedure

Not sure if I understand your issue. If you want to set multiple variables at once:

DECLARE @myInt INT;
DECLARE @myDate DATETIME;

SELECT @myInt = someInt, @myDate = someDate
FROM someTable
WHERE myName = someName

Also note that if the SELECT fetches multiple rows, the variables end up holding the last row's values: the assignment is performed once for every row fetched.

Also note that in SQL Server, INT does not take a length parameter in its declaration (there is no INT(20)). If you need that many digits, use DECIMAL(20) instead.
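
If more than one row can match and you want a specific one, constrain the query yourself; a sketch, assuming you want the most recent row by someDate (the TOP (1) and the ORDER BY column are illustrative assumptions):

DECLARE @myInt INT;
DECLARE @myDate DATETIME;

-- Pick one deterministic row instead of whichever row happens to be assigned last.
SELECT TOP (1) @myInt = someInt, @myDate = someDate
FROM someTable
WHERE myName = someName
ORDER BY someDate DESC;  -- assumed ordering; adjust as needed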

SELECT INTO multiple @variables MySQL

You can self-join the table and make sure that each join provides a new id, something like this (shown for two ids, but you get the idea):

SELECT a1.id, a2.id INTO @photo1, @photo2
FROM album a1
INNER JOIN album a2 ON a2.scene = a1.scene AND a2.upload = a1.upload AND a2.id > a1.id
WHERE a1.uploaded = @time AND a1.scene_id = NEW.id;

See SqlFiddle for the complete SQL and a test case.
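
The same pattern extends to more variables by chaining additional self-joins; a sketch for three ids, keeping the assumed columns and trigger context (NEW.id) from above:

SELECT a1.id, a2.id, a3.id INTO @photo1, @photo2, @photo3
FROM album a1
INNER JOIN album a2 ON a2.scene = a1.scene AND a2.upload = a1.upload AND a2.id > a1.id
INNER JOIN album a3 ON a3.scene = a1.scene AND a3.upload = a1.upload AND a3.id > a2.id
WHERE a1.uploaded = @time AND a1.scene_id = NEW.id;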

Is there a way to set multiple variables in one SELECT query in MySQL?

Without more context it's unclear precisely what you are asking, but if you need to temporarily override the exit handler, nest your query in a new scoping block with a CONTINUE handler that clears the variables. (They have to be cleared if this runs inside a loop; otherwise, when no row is found, they will still hold whatever was assigned on a previous iteration.)

-- inside existing procedure
BEGIN -- add this and the handler
    DECLARE CONTINUE HANDLER FOR NOT FOUND SET var_01 = NULL, var_02 = NULL;
    SELECT val_01, val_02 INTO var_01, var_02 FROM table_name WHERE col_name = condition_01;
END; -- add this
-- procedure continues here
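
For a fuller picture, here is a minimal sketch of how such a nested block can sit inside a cursor loop; the procedure name, cursor query, tables, and column types are all assumptions for illustration:

DELIMITER //

-- A minimal sketch, assuming a driver cursor over other_table and a lookup
-- against table_name; every name and type here is illustrative.
CREATE PROCEDURE demo_nested_handler()
BEGIN
    DECLARE done INT DEFAULT 0;
    DECLARE cur_key INT;
    DECLARE var_01 VARCHAR(50);
    DECLARE var_02 VARCHAR(50);
    DECLARE cur CURSOR FOR SELECT id FROM other_table;
    DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;  -- ends the loop

    OPEN cur;
    read_loop: LOOP
        FETCH cur INTO cur_key;
        IF done THEN
            LEAVE read_loop;
        END IF;

        BEGIN  -- nested scope: its handler shadows the outer one for this SELECT
            DECLARE CONTINUE HANDLER FOR NOT FOUND SET var_01 = NULL, var_02 = NULL;
            SELECT val_01, val_02 INTO var_01, var_02
            FROM table_name WHERE col_name = cur_key;
        END;

        -- ... use var_01 / var_02 here ...
    END LOOP;
    CLOSE cur;
END //

DELIMITER ;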

Select multiple columns into multiple variables

Your query should be:

SELECT T1.DATE1, T1.DATE2, T1.DATE3
INTO V_DATE1, V_DATE2, V_DATE3
FROM T1
WHERE ID='X';
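
If this is a MySQL stored procedure, the statement would sit inside a block with declared variables; a minimal sketch, where the procedure name, parameter, and DATE types are all assumptions:

DELIMITER //

-- Hypothetical wrapper; types and names are assumptions.
CREATE PROCEDURE get_dates(IN p_id VARCHAR(10))
BEGIN
    DECLARE V_DATE1 DATE;
    DECLARE V_DATE2 DATE;
    DECLARE V_DATE3 DATE;

    SELECT T1.DATE1, T1.DATE2, T1.DATE3
    INTO V_DATE1, V_DATE2, V_DATE3
    FROM T1
    WHERE ID = p_id;

    SELECT V_DATE1, V_DATE2, V_DATE3;  -- return the fetched values
END //

DELIMITER ;

-- Usage:
-- CALL get_dates('X');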

MySQL stored procedure: assign values to multiple variables from a SELECT statement

Don't use variables with the same names as your columns; inside the query, the column names take precedence over the variables.

A good practice is to give your variables a prefix:

BEGIN
    DECLARE p_PaidFee INT DEFAULT 0;
    DECLARE p_DueFee INT DEFAULT 0;
    DECLARE p_CourseFee INT DEFAULT 0;

    INSERT INTO `creditdirectory` (`TypeID`, `PersonName`, `CreditBy`, `PersonID`, `ModeOfPayment`, `Details`, `Amount`, `CompanyID`)
    VALUES (1, PersonName, CreditBy, AddmissionID, ModeOfPayment, 'Installment', PaidAmount, CompanyID);

    SELECT `CourseFee`, `PaidFee`, `DueFee` INTO p_CourseFee, p_PaidFee, p_DueFee
    FROM `studentcoursedetails` WHERE `ID` = CourseID;

    SET p_PaidFee = p_PaidFee + PaidAmount;
    SET p_DueFee = p_CourseFee - p_PaidFee;

    IF (NextDueDate != '') THEN
        UPDATE `studentcoursedetails` SET `PaidFee` = p_PaidFee, `DueFee` = p_DueFee, `DueDate` = NextDueDate WHERE `ID` = CourseID;
    ELSE
        UPDATE `studentcoursedetails` SET `PaidFee` = p_PaidFee, `DueFee` = p_DueFee, `DueDate` = NULL WHERE `ID` = CourseID;
    END IF;
END
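
The body above refers to routine parameters such as PersonName, AddmissionID, PaidAmount, CourseID and NextDueDate; a hedged sketch of a matching CREATE PROCEDURE header (the procedure name and all parameter types are assumptions, and the BEGIN ... END block above is what goes inside):

DELIMITER //

-- Hypothetical header; the BEGIN ... END block shown above forms the body.
CREATE PROCEDURE AddInstallment(
    IN PersonName    VARCHAR(100),
    IN CreditBy      VARCHAR(100),
    IN AddmissionID  INT,
    IN ModeOfPayment VARCHAR(50),
    IN PaidAmount    INT,
    IN CompanyID     INT,
    IN CourseID      INT,
    IN NextDueDate   VARCHAR(10)  -- compared against '' in the body, so assumed to be a string
)
BEGIN
    -- ... declarations and statements from the snippet above ...
END //

DELIMITER ;

-- Usage (illustrative values only):
-- CALL AddInstallment('John Doe', 'Admin', 42, 'Cash', 500, 1, 7, '2024-01-15');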

Comma separated values in MySQL IN clause

Building on the FIND_IN_SET() example from @Jeremy Smith, you can do it with a join so you don't have to run a subquery.

SELECT * FROM table t
JOIN locations l ON FIND_IN_SET(t.e_ID, l.city) > 0
WHERE l.e_ID = ?

This is known to perform very poorly, since it has to do table-scans, evaluating the FIND_IN_SET() function for every combination of rows in table and locations. It cannot make use of an index, and there's no way to improve it.

I know you said you are trying to make the best of a bad database design, but you must understand just how drastically bad this is.

Explanation: Suppose I were to ask you to look up everyone in a telephone book whose first, middle, or last initial is "J." There's no way the sorted order of the book helps in this case, since you have to scan every single page anyway.

The LIKE solution given by @fthiella has a similar problem with regard to performance: it cannot use an index either.

Also see my answer to Is storing a delimited list in a database column really that bad? for other pitfalls of this way of storing denormalized data.

If you can create a supplementary table to store an index, you can map the locations to each entry in the city list:

CREATE TABLE location2city (
location INT,
city INT,
PRIMARY KEY (location, city)
);

Assuming you have a lookup table of all possible cities (not just those mentioned in the table), you can bear the inefficiency once to produce the mapping:

INSERT INTO location2city (location, city)
SELECT l.e_ID, c.e_ID FROM cities c JOIN locations l
ON FIND_IN_SET(c.e_ID, l.city) > 0;

Now you can run a much more efficient query to find entries in your table:

SELECT * FROM location2city l
JOIN table t ON t.e_ID = l.city
WHERE l.location = ?;

This can make use of an index. Now you just need to take care that any INSERT/UPDATE/DELETE on rows in locations also maintains the corresponding mapping rows in location2city.
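
One way to keep the mapping in sync is with triggers on locations; a minimal sketch covering only the INSERT case, reusing the column names assumed above (UPDATE and DELETE would need similar triggers):

DELIMITER //

-- Hypothetical trigger: when a location row is inserted, expand its
-- comma-separated city list into one mapping row per city.
CREATE TRIGGER locations_after_insert AFTER INSERT ON locations
FOR EACH ROW
BEGIN
    INSERT INTO location2city (location, city)
    SELECT NEW.e_ID, c.e_ID
    FROM cities c
    WHERE FIND_IN_SET(c.e_ID, NEW.city) > 0;
END //

DELIMITER ;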
