Select Rows Until Condition Met

Select rows until condition met

Use a sub-query to find out at what point you should stop, then return all rows from your starting point to the calculated stop point.

SELECT
    *
FROM
    yourTable
WHERE
    id >= 4
    AND id <= (SELECT MIN(id) FROM yourTable WHERE b = 'F' AND id >= 4)

Note that this assumes the last record is always an 'F'. You can deal with the last record being a 'T' by using COALESCE.

SELECT
    *
FROM
    yourTable
WHERE
    id >= 4
    AND id <= COALESCE(
        (SELECT MIN(id) FROM yourTable WHERE b = 'F' AND id >= 4),
        (SELECT MAX(id) FROM yourTable)
    )
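
For illustration, a minimal sketch with hypothetical sample data (the table contents are invented; column b holds the 'T'/'F' flag as in the query above):

WITH yourTable AS (
    SELECT * FROM (VALUES
        (1, 'T'), (2, 'F'), (3, 'T'),
        (4, 'T'), (5, 'T'), (6, 'F'),
        (7, 'T'), (8, 'T')
    ) v(id, b)
)
SELECT *
FROM yourTable
WHERE id >= 4
  AND id <= COALESCE(
      (SELECT MIN(id) FROM yourTable WHERE b = 'F' AND id >= 4),
      (SELECT MAX(id) FROM yourTable)
  );
-- Returns ids 4, 5 and 6: the scan starts at id 4 and stops at the first 'F'.
-- If there were no 'F' at or after id 4, the COALESCE fallback would return
-- everything up to MAX(id) instead.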

How can I select rows until a specific condition is met

but now I want to select the minimum number of stations that makes the overall trip count reach over 320,000

You would get the list of stations using a cumulative sum and then filter:

SELECT s.*
FROM (SELECT start_station_id AS station_id,
             COUNT(*) AS cnt,
             SUM(COUNT(*)) OVER (ORDER BY COUNT(*) DESC) AS running_cnt
      FROM `bigquery-public-data.san_francisco_bikeshare.bikeshare_trips`
      WHERE EXTRACT(YEAR FROM start_date) = 2015
      GROUP BY 1
     ) s
WHERE running_cnt - cnt < 320000
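
The filter keeps every station whose cumulative count before it is still under the threshold, so the station that pushes the running total past 320,000 is included as well. A minimal sketch of the same pattern on hypothetical toy data (station names, counts and the threshold of 10 are invented for illustration):

WITH station_counts AS (
    SELECT 'A' AS station_id, 6 AS cnt UNION ALL
    SELECT 'B', 5 UNION ALL
    SELECT 'C', 3 UNION ALL
    SELECT 'D', 1
)
SELECT s.*
FROM (SELECT station_id,
             cnt,
             SUM(cnt) OVER (ORDER BY cnt DESC) AS running_cnt
      FROM station_counts
     ) s
WHERE running_cnt - cnt < 10
-- Keeps A (running total 6) and B (running total 11, the station that crosses 10);
-- C and D are dropped because the threshold was already reached before them.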

Select until condition in SQL

Find MIN(id) first and then return the rows having a lower or equal id:

SELECT *
FROM EVS1
WHERE id <= (SELECT MIN(id) FROM EVS1 WHERE evType = 200)

I assume that you define the ordering according to the id attribute.

If it is necessary to do it for each CreatedByUserId, then use a correlated subquery for the minimal id computation:

SELECT *
FROM EVS1 e1
WHERE id <= (
    SELECT MIN(id)
    FROM EVS1 e2
    WHERE e2.evType = 200
      AND e1.CreatedByUserId = e2.CreatedByUserId
)

DB Fiddle demo

I believe that this solution will be faster than a window function for large data if you have an index:

CREATE INDEX ix_evs1_evType_CreatedByUserId ON evs1(evType, CreatedByUserId) INCLUDE(id)
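
For illustration, a minimal hypothetical sketch of the per-user cutoff (the sample rows are invented; column names follow the query above):

WITH EVS1 AS (
    SELECT * FROM (VALUES
        (1, 'A', 100),
        (2, 'A', 200),   -- first evType = 200 for user A
        (3, 'A', 300),
        (4, 'B', 200),   -- first evType = 200 for user B
        (5, 'B', 100)
    ) v(id, CreatedByUserId, evType)
)
SELECT *
FROM EVS1 e1
WHERE id <= (
    SELECT MIN(id)
    FROM EVS1 e2
    WHERE e2.evType = 200
      AND e1.CreatedByUserId = e2.CreatedByUserId
);
-- Returns ids 1 and 2 for user A, and id 4 for user B: each user's rows are kept
-- up to and including that user's first evType = 200 row.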

Finding rows until a condition is met

You need a recursive CTE. Something like this:

WITH cte AS (
    SELECT t.PackingNr, t.SerienNr
    FROM YourTable t
    WHERE t.PackingNr = 'YourValueHere'

    UNION ALL

    SELECT t.PackingNr, t.SerienNr
    FROM YourTable t
    JOIN cte ON cte.SerienNr = t.PackingNr
)
SELECT TOP (1)
    *
FROM cte
WHERE cte.SerienNr LIKE 'R%';
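
For illustration, a minimal hypothetical chain (the table contents are invented and 'P1' stands in for 'YourValueHere'): the anchor row starts at the given PackingNr, and each recursive step follows SerienNr back into PackingNr until a serial number starting with 'R' is reached.

WITH YourTable AS (
    SELECT * FROM (VALUES
        ('P1', 'S1'),
        ('S1', 'S2'),
        ('S2', 'R9')
    ) v(PackingNr, SerienNr)
),
cte AS (
    SELECT t.PackingNr, t.SerienNr
    FROM YourTable t
    WHERE t.PackingNr = 'P1'

    UNION ALL

    SELECT t.PackingNr, t.SerienNr
    FROM YourTable t
    JOIN cte ON cte.SerienNr = t.PackingNr
)
SELECT TOP (1) *
FROM cte
WHERE cte.SerienNr LIKE 'R%';
-- Walks P1 -> S1 -> S2 -> R9 and returns the row ('S2', 'R9').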

Selecting all rows until first occurrence of given value

This returns every row whose date is later than that of the most recent row where check = 0:

SELECT * FROM mytable WHERE date > (
    SELECT MAX(date) FROM mytable WHERE check = 0
)

SQL select all rows per group after a condition is met

You might want to try window functions:

select category, timestamp, condition
from (
    select
        t.*,
        min(condition) over(partition by category order by timestamp desc) min_cond
    from mytable t
) t
where min_cond = 1

The window min() with the descending order by clause computes the minimum value of condition over the current row and all later rows (by timestamp) of the same category: we can use it as a filter to eliminate rows whose condition is 0 or that are followed by a more recent row with a 0.

Compared to the correlated subquery approach, the upside of using window functions is that it reduces the number of scans needed on the table. Of course this computation also has a cost, so you'll need to assess both solutions against your sample data.
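
For comparison, a sketch of the correlated-subquery alternative mentioned above (it assumes the same mytable columns and is equivalent to the min_cond = 1 filter):

select category, timestamp, condition
from mytable t
where not exists (
    select 1
    from mytable t2
    where t2.category = t.category
      and t2.timestamp >= t.timestamp
      and t2.condition = 0
)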

Counting rows until a condition is met in R - NAs before the condition is met

Here's a way with dplyr:

library(dplyr)

dummy_tb %>%
  # Replace NA with 0
  mutate(USAGE = replace(USAGE, is.na(USAGE), 0)) %>%
  # Group by USER_ID
  group_by(USER_ID) %>%
  # Create a new group which resets every time USAGE reaches or exceeds usage_limit
  group_by(temp = cumsum(USAGE >= usage_limit), add = TRUE) %>%
  # Create an index within each group
  mutate(out = row_number() - 1) %>%
  group_by(USER_ID) %>%
  # Replace with NA the values before the first time usage_limit is reached
  mutate(out = replace(out, row_number() < which.max(USAGE >= usage_limit), NA))

which returns:

#   USER_ID REFERENCE_DATE USAGE USAGE_35PCT_MTH temp out
#1   000001     31.01.2016  0.30              NA    0  NA
#2   000001     29.02.2016  0.35               0    1   0
#3   000001     31.03.2016  0.34               1    1   1
#4   000001     30.04.2016  0.38               0    2   0
#5   000001     31.05.2016  0.40               0    3   0
#6   000001     30.06.2016  0.70               0    4   0
#7   000001     31.07.2016  0.78               0    5   0
#8   000001     31.08.2016  0.95               0    6   0
#9   000001     30.09.2016  0.36               0    7   0
#10  000001     31.10.2016  0.22               1    7   1
#11  000001     30.11.2016  0.11               2    7   2
#12  000001     31.12.2016  0.01               3    7   3
#13  000001     31.01.2017  0.10               4    7   4
#14  000001     28.02.2017  0.10               5    7   5
#15  000001     31.03.2017  0.10               6    7   6
#16  200000     31.03.2014  0.00              NA    0  NA
#17  200000     30.04.2014  0.36               0    1   0
#18  200000     31.05.2014  0.20               1    1   1
#19  200000     30.06.2014  0.00               2    1   2
#20  200000     31.07.2014  0.20               3    1   3
#21  200000     31.08.2014  0.20               4    1   4
#22  200000     30.09.2014  0.00               5    1   5
#23  200000     31.10.2014  0.20               6    1   6

