Implement Incrementing Counter in SQL
If I understand your question correctly, the concern is that concurrent transactions could cause unintended results with the counter value. You know how to manage the counter values themselves; the question is whether another transaction could read or modify them at the same time. If that understanding is correct, that's what isolation levels are for. Depending on how the data is accessed and stored (e.g. can there be multiple records with the same <whatever id value>, or is it unique?) and on application requirements (e.g. is it acceptable for <whatever id value> not to be unique when a new record is inserted but not updated due to timing?), you should SET TRANSACTION ISOLATION LEVEL accordingly.
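To make the race concrete: the safest pattern, regardless of isolation level, is to increment in a single atomic UPDATE rather than reading the value and writing it back in separate statements. A minimal sketch of that idea, using Python's sqlite3 as a stand-in engine (the table and helper names here are my own, not from the question):

```python
# Sketch: a single-statement increment has no window between the read
# and the write, which is the gap isolation levels otherwise guard.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE counters (name TEXT PRIMARY KEY, value INTEGER)")
conn.execute("INSERT INTO counters VALUES ('orders', 0)")

def next_value(conn, name):
    # The UPDATE is atomic; the SELECT just reads the result back.
    conn.execute("UPDATE counters SET value = value + 1 WHERE name = ?", (name,))
    return conn.execute(
        "SELECT value FROM counters WHERE name = ?", (name,)
    ).fetchone()[0]

print(next_value(conn, "orders"))  # 1
print(next_value(conn, "orders"))  # 2
```

The same shape works in T-SQL (e.g. an `UPDATE ... OUTPUT` or `UPDATE` followed by a read inside one transaction); what matters is that no other transaction can slip in between reading the old value and writing the new one.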
Group records with incrementing count based on flag
So here's a way to do it. It's based on How do I calculate a running total in SQL without using a cursor?, which does have some flaws. I'm using a clustered index on the advice that it makes the ordering work out, despite the fact that the order of the update is not guaranteed.
It's also worth pointing you to Calculate running total / running balance for Aaron Bertrand's treatment.
The possibly clever bit here is the conversion of Y/N to 1/0 for use in calculating.
CREATE TABLE Orders (division CHAR(6),ID CHAR(6),dat DATETIME, flag CHAR(1))
INSERT INTO Orders VALUES
('ABC123','ZZZ123','01/17/2013','Y')
,('ABC123','ZZZ123','01/25/2013','N')
,('ABC123','ZZZ123','01/25/2013','N')
,('ABC123','ZZZ123','01/25/2013','N')
,('ABC123','ZZZ123','01/25/2013','N')
,('ABC123','ZZZ123','02/22/2013','Y')
,('ABC123','ZZZ123','02/26/2013','N')
,('ABC123','YYY222','03/20/2013','Y')
,('ABC123','YYY222','05/17/2013','N')
,('XYZ456','ZZZ999','01/15/2012','N')
,('XYZ456','ZZZ999','01/30/2012','N')
,('XYZ456','ZZZ123','02/09/2012','N')
,('XYZ456','ZZZ123','04/13/2012','Y')
,('XYZ456','ZZZ123','06/23/2012','N')
,('XYZ456','ZZZ123','10/05/2012','Y')
,('XYZ456','ZZZ123','11/18/2012','N')
CREATE TABLE #Orders (division CHAR(6), ID CHAR(6), dat DATETIME, flag CHAR(1),flag_int INTEGER, rn BIGINT, OrderGroup INT)
CREATE CLUSTERED INDEX IDX_C_Temp_Order ON #Orders(division, id,rn)
INSERT INTO #Orders (division, id,dat,flag,flag_int,rn,OrderGroup)
SELECT division
,ID
,dat
,flag
,CASE flag WHEN 'Y' THEN 1 ELSE 0 END flag_int
,ROW_NUMBER() OVER (PARTITION BY division, id ORDER BY dat) rn
,0 OrderGroup
FROM Orders
DECLARE @OrderGroup INT = 0
UPDATE #Orders
SET @OrderGroup = OrderGroup = CASE WHEN rn = 1 THEN 1 ELSE @OrderGroup + flag_int END
FROM #Orders
SELECT *
FROM #Orders
ORDER BY division
,ID
,rn
DROP TABLE #Orders
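On SQL Server 2012+ (or any engine with window functions) the quirky UPDATE above can be replaced by a windowed SUM: force the first row of each (division, id) group to contribute 1 regardless of its flag, then take a running sum of the Y/N flag converted to 1/0. A sketch of that equivalent, run through Python's sqlite3 on a subset of the sample data (my own translation, not part of the original answer):

```python
# Sketch: windowed-SUM equivalent of the quirky UPDATE running group.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Orders (division TEXT, ID TEXT, dat TEXT, flag TEXT);
INSERT INTO Orders VALUES
 ('ABC123','ZZZ123','2013-01-17','Y'),
 ('ABC123','ZZZ123','2013-01-25','N'),
 ('ABC123','ZZZ123','2013-02-22','Y'),
 ('ABC123','ZZZ123','2013-02-26','N');
""")
rows = conn.execute("""
WITH numbered AS (
    SELECT division, ID, dat, flag,
           CASE flag WHEN 'Y' THEN 1 ELSE 0 END AS flag_int,
           ROW_NUMBER() OVER (PARTITION BY division, ID ORDER BY dat) AS rn
    FROM numbered_src
), numbered_src AS (SELECT * FROM Orders)
SELECT dat, flag,
       -- Row 1 always starts group 1; later rows add 1 only on 'Y'.
       SUM(CASE WHEN rn = 1 THEN 1 ELSE flag_int END)
           OVER (PARTITION BY division, ID ORDER BY dat, rn) AS OrderGroup
FROM numbered
ORDER BY division, ID, rn
""").fetchall()
for dat, flag, grp in rows:
    print(dat, flag, grp)
```

This avoids relying on the (undocumented) update order entirely, since the window's ORDER BY is explicit.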
How to get count incremental by date
You can make use of COUNT() OVER () with an ORDER BY and no PARTITION BY. It will give you the cumulative count. Use DISTINCT to filter out the duplicate values.
SELECT DISTINCT CAST(create_date AS DATE) [Date],
COUNT(create_date) OVER (ORDER BY CAST(create_date AS DATE)) as [COUNT]
FROM [YourTable]
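A runnable sketch of the same pattern, using sqlite3 and sample timestamps of my own (the default RANGE frame means every row on the same date gets that date's full cumulative count, which is why DISTINCT collapses them cleanly):

```python
# Sketch: cumulative count per day via COUNT(*) OVER (ORDER BY day).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE t (create_date TEXT);
INSERT INTO t VALUES
 ('2020-01-01 09:00'), ('2020-01-01 10:00'),
 ('2020-01-02 08:30'),
 ('2020-01-03 11:00'), ('2020-01-03 12:00'), ('2020-01-03 13:00');
""")
rows = conn.execute("""
SELECT DISTINCT DATE(create_date) AS day,
       COUNT(*) OVER (ORDER BY DATE(create_date)) AS running_count
FROM t
ORDER BY day
""").fetchall()
for day, running in rows:
    print(day, running)
```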
Incrementing Count within a Group By
You simply need to use PARTITION BY:
SELECT
user_id,
id AS message_id,
sent_at,
ROW_NUMBER() OVER(PARTITION BY user_id ORDER BY user_id, sent_at) AS counter
FROM messages
ORDER BY
user_id,
sent_at;
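To see why PARTITION BY is the key part: ROW_NUMBER() restarts at 1 for each user_id, giving a per-user message counter. A small sketch with sqlite3 and invented sample rows (table and column names mirror the query above):

```python
# Sketch: ROW_NUMBER() with PARTITION BY restarts the count per user.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE messages (id INTEGER, user_id INTEGER, sent_at TEXT);
INSERT INTO messages VALUES
 (10, 1, '2021-05-01'), (11, 1, '2021-05-02'),
 (12, 2, '2021-05-01'), (13, 1, '2021-05-03'), (14, 2, '2021-05-04');
""")
rows = conn.execute("""
SELECT user_id, id AS message_id,
       ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY sent_at) AS counter
FROM messages
ORDER BY user_id, sent_at
""").fetchall()
for user_id, message_id, counter in rows:
    print(user_id, message_id, counter)
```

Without the PARTITION BY, the counter would run straight through all users instead of resetting for each one.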
Reset and increment count when column values change
Here's an option for you.
Basically, it's a running total per TRAN_ID driven by INTERACTION = 'Open', since you stated: "Each SUB_TASK will start at 1 for each new TRAN_ID and it remains the same until a new 'Open' interaction is detailed, where it will increment by 1"
The challenge is that windowed cumulative sums are not supported until SQL Server 2012, so we'll have to use an OUTER APPLY:
Sample temp table with data:
CREATE TABLE #TestData
(
[TRAN_ID] INT
, [TASK_ID] INT
, [INTERACTION] NVARCHAR(10)
, [INTERACTION_DATETIME] DATETIME
);
INSERT INTO #TestData (
[TRAN_ID]
, [TASK_ID]
, [INTERACTION]
, [INTERACTION_DATETIME]
)
VALUES ( 1234, 1, 'Open', '2018-01-04 18:02:18' )
, ( 1234, 1, 'Close', '2018-01-04 18:02:27' )
, ( 2234, 11, 'Open', '2018-01-03 09:04:33' )
, ( 2234, 11, 'Close', '2018-01-03 09:04:50' )
, ( 2234, 11, 'Open', '2018-01-04 09:05:29' )
, ( 2234, 11, 'Edit', '2018-01-04 09:06:42' )
, ( 2234, 11, 'Edit', '2018-01-04 09:07:33' )
, ( 2234, 11, 'Merge', '2018-01-04 09:09:21' )
, ( 2234, 11, 'Close', '2018-01-04 09:13:50' )
, ( 2234, 11, 'Open', '2018-01-05 11:14:34' )
, ( 2234, 11, 'Edit', '2018-01-05 11:16:49' )
, ( 2234, 11, 'Edit', '2018-01-05 11:21:21' )
, ( 2234, 11, 'Merge', '2018-01-05 11:55:33' )
, ( 2234, 11, 'Close', '2018-01-05 11:56:12' )
, ( 3242, 13, 'Open', '2018-01-03 15:47:22' )
, ( 3242, 13, 'Close', '2018-01-03 15:47:59' )
, ( 3242, 13, 'Open', '2018-01-19 09:38:09' )
, ( 3242, 13, 'Edit', '2018-01-19 09:39:10' )
, ( 3242, 13, 'Edit', '2018-01-19 09:42:12' )
, ( 3242, 13, 'Close', '2018-01-19 09:46:12' );
Then we can use an OUTER APPLY. We sum 1 for each row where INTERACTION equals 'Open', restricted to the same TRAN_ID and to rows whose INTERACTION_DATETIME is less than or equal to the current row's. This should work on 2008:
SELECT *
FROM [#TestData] [a]
OUTER APPLY (
SELECT SUM( CASE WHEN [b].[INTERACTION] = 'Open' THEN 1
ELSE 0
END
) AS [SUB_TASK_ID]
FROM [#TestData] [b]
WHERE [b].[TRAN_ID] = [a].[TRAN_ID]
AND [b].[INTERACTION_DATETIME] <= [a].[INTERACTION_DATETIME]
) [s]
ORDER BY [a].[TRAN_ID], [a].[INTERACTION_DATETIME]
Giving you your desired results:
TRAN_ID TASK_ID INTERACTION INTERACTION_DATETIME SUB_TASK_ID
----------- ----------- ----------- ----------------------- -----------
1234 1 Open 2018-01-04 18:02:18.000 1
1234 1 Close 2018-01-04 18:02:27.000 1
2234 11 Open 2018-01-03 09:04:33.000 1
2234 11 Close 2018-01-03 09:04:50.000 1
2234 11 Open 2018-01-04 09:05:29.000 2
2234 11 Edit 2018-01-04 09:06:42.000 2
2234 11 Edit 2018-01-04 09:07:33.000 2
2234 11 Merge 2018-01-04 09:09:21.000 2
2234 11 Close 2018-01-04 09:13:50.000 2
2234 11 Open 2018-01-05 11:14:34.000 3
2234 11 Edit 2018-01-05 11:16:49.000 3
2234 11 Edit 2018-01-05 11:21:21.000 3
2234 11 Merge 2018-01-05 11:55:33.000 3
2234 11 Close 2018-01-05 11:56:12.000 3
3242 13 Open 2018-01-03 15:47:22.000 1
3242 13 Close 2018-01-03 15:47:59.000 1
3242 13 Open 2018-01-19 09:38:09.000 2
3242 13 Edit 2018-01-19 09:39:10.000 2
3242 13 Edit 2018-01-19 09:42:12.000 2
3242 13 Close 2018-01-19 09:46:12.000 2
For 2012+ you can do all of that with a window function. Here's what that would look like, just so you have it; a little cleaner and easier:
SELECT *
, SUM( CASE WHEN [INTERACTION] = 'Open' THEN 1
ELSE 0
END
) OVER ( PARTITION BY [TRAN_ID]
ORDER BY [TRAN_ID]
, [INTERACTION_DATETIME]
) AS [SUB_TASK_ID]
FROM [#TestData]
ORDER BY [TRAN_ID]
, [TASK_ID]
, [INTERACTION_DATETIME];
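As a quick sanity check of the windowed version, here's the same SUM-over-partition pattern run through Python's sqlite3 (which also supports these 2012-style window functions) on a subset of the sample data above:

```python
# Sketch: running count of 'Open' rows, carried forward within TRAN_ID.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE TestData (TRAN_ID INT, INTERACTION TEXT, INTERACTION_DATETIME TEXT);
INSERT INTO TestData VALUES
 (2234, 'Open',  '2018-01-03 09:04:33'),
 (2234, 'Close', '2018-01-03 09:04:50'),
 (2234, 'Open',  '2018-01-04 09:05:29'),
 (2234, 'Edit',  '2018-01-04 09:06:42'),
 (2234, 'Close', '2018-01-04 09:13:50'),
 (3242, 'Open',  '2018-01-03 15:47:22'),
 (3242, 'Close', '2018-01-03 15:47:59');
""")
rows = conn.execute("""
SELECT TRAN_ID, INTERACTION,
       SUM(CASE WHEN INTERACTION = 'Open' THEN 1 ELSE 0 END)
           OVER (PARTITION BY TRAN_ID ORDER BY INTERACTION_DATETIME) AS SUB_TASK_ID
FROM TestData
ORDER BY TRAN_ID, INTERACTION_DATETIME
""").fetchall()
for tran_id, interaction, sub_task_id in rows:
    print(tran_id, interaction, sub_task_id)
```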
reset auto increment number each year
Set up a SQL agent job to run at midnight on December 31 and run the following snippet of code:
DBCC CHECKIDENT('table_name', RESEED, (YEAR(GETDATE()) % 100) * 10000)
Note that there are the following problems with this approach:
- If you put more than 10000 records in the table in a given year, the numbers will run past the end of the range and the year prefix will be wrong
- You have two different types of information in this column: the identity value and the year. It is usually better to separate information like that (i.e. just use an identity column and also store the date)
- You only have the year, not the month or day, which might be useful in the future even if it isn't right now
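The arithmetic behind the reseed, sketched with hypothetical helper names of my own: the two-digit year is packed into the high digits, so identities issued in 2024 start at 240001.

```python
# Sketch of the reseed arithmetic; helper names are illustrative only.
def reseed_value(year: int) -> int:
    # Mirrors (YEAR(GETDATE()) % 100) * 10000 from the DBCC snippet.
    return (year % 100) * 10000

def nth_id(year: int, n: int) -> int:
    # The n-th identity value handed out after the reseed (increment 1).
    return reseed_value(year) + n

print(nth_id(2024, 1))   # 240001
print(nth_id(2024, 37))  # 240037
```

Note that nth_id(2024, 10001) lands at 250001, colliding with 2025's range, which is the overflow problem from the first bullet above.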