SQL Delete Records Within a Specific Range

SQL Delete Records within a specific Range

If you use SQL Server:

delete from Table where id between 79 and 296

Note: BETWEEN is inclusive, so the rows with IDs 79 and 296 will also be deleted.

After your edit, you clarified that you want:

ID (> 79 AND < 296)

So use this:

delete from Table where id > 79 and id < 296

How to delete rows with IDs in a specific range?

DELETE FROM table_name where id >= 1043 and id < 1101; 

How to delete specific range of records from a GROUP BY query?

Here's what I came up with:

Query to check the results to be deleted (the "old" column numbers each MediaID's rows from oldest to newest, "recent" from newest to oldest; the filter picks everything except the two oldest and two newest rows, for only those MediaIDs with a count > 10):

select * from (

select
ROW_NUMBER() over (partition by v.mediaid order by v.adddate asc) old,
ROW_NUMBER() over (partition by v.mediaid order by v.adddate desc) recent,
v.*
from MediaTest_V v
inner join (
SELECT MediaID
FROM MediaTest_V
GROUP BY MediaID
HAVING COUNT(*) > 10
) vc ON vc.MediaID = v.MediaID

) seq
where
seq.old > 2 and seq.recent > 2

Query to perform delete:

delete del
from MediaTest_V del
inner join (

select ID from (

select
ROW_NUMBER() over (partition by v.mediaid order by v.adddate asc) old,
ROW_NUMBER() over (partition by v.mediaid order by v.adddate desc) recent,
v.*
from MediaTest_V v
inner join (
SELECT MediaID
FROM MediaTest_V
GROUP BY MediaID
HAVING COUNT(*) > 10
) vc ON vc.MediaID = v.MediaID

) seq
where
seq.old > 2 and seq.recent > 2

) seq on del.ID = seq.ID

How to delete a range of records at once on MySQL?

You can use the BETWEEN operator:

delete from exampleTable where id between 40000 and 50000

or:

delete from exampleTable where id >= 40000 and id <= 50000

Pretty simple, right?

SQL Server - Deleting rows between a date range using SQL. Date conversion fails

You wrote the 31st of February... that date doesn't exist. Use a valid date range instead:

DELETE FROM BIZ 
WHERE [Orgnl_Cmpltn_Date]
BETWEEN '2014-02-28' AND '2014-04-01'

For a general idea of how to convert dates:

DELETE FROM BIZ 
WHERE [Orgnl_Cmpltn_Date]
BETWEEN CONVERT(date,'2014.02.28',102) and CONVERT(date,'2014.04.01',102)

Here you can find the complete list of values for the third parameter of CONVERT:
https://msdn.microsoft.com/en-us/library/ms187928.aspx

SQL Delete Tables within a specific Range

  1. create a PHP script
  2. use information_schema.tables or SHOW TABLES to dynamically find your table names
  3. select a range (from the user or in some variables)
  4. loop and run DROP TABLE for each table in the range (see the SQL sketch below)
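
If you would rather build the list in SQL than in PHP, a minimal sketch (assuming a hypothetical MySQL schema named mydb and tables named like log_2016_01 through log_2016_06) could generate the DROP TABLE statements for you:

-- hypothetical schema and table naming scheme; adjust to your own
select concat('DROP TABLE `', table_name, '`;') as drop_stmt
from information_schema.tables
where table_schema = 'mydb'
  and table_name between 'log_2016_01' and 'log_2016_06';

DROP TABLE itself does not accept a range, so whichever route you take, the script has to generate one statement per table and execute them in a loop.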

Delete millions of records from a table within a date range

There are two basic approaches to do massive DELETE operations.

1) Create another table, drop the old one, rename the new one, and ANALYZE the new table at the end:

begin;
-- copy the rows we want to keep into a fresh table
create table camera_activities_new (like camera_activities including all);

insert into camera_activities_new
select * from camera_activities
where done_at >= '2016-01-01'::date;

-- re-attach the sequence to the new table's id column, then swap the tables
alter sequence camera_activities_id_seq owned by camera_activities_new.id;
drop table camera_activities;
alter table camera_activities_new rename to camera_activities;
alter index camera_activities_new_camera_id_done_at_idx rename to camera_activities_camera_id_done_at_idx;
commit;

analyze camera_activities;

This approach guarantees that the resulting table will be in the best shape (no bloat). But it can be less convenient if your system is heavily loaded and the table is involved in that load. In such cases, "smooth" deletion might look better.

2) "Smooth" deletion: delete only relatively small amount of rows each time, use more aggressive autovacuum settings and control bloating.

Example showing how to split the deletion into many independent transactions (in bash; relies on the $PGDATABASE, $PGHOST, $PGUSER, and $PGPASSWORD environment variables):

while true; do
  # delete at most 500 rows per transaction; psql prints "DELETE <n>", so grab <n>
  res=$(psql -c "delete from camera_activities where id in (select id from camera_activities where done_at < '2016-01-01'::date limit 500);" \
    | grep DELETE | awk '{print $2}')
  if [[ $res = '0' ]]; then break; fi;
  sleep 0.3; # control speed here; check bloating level
done

The loop stops automatically when no rows are left to delete.

Your index on (camera_id, done_at) should speed up the subselect via a Bitmap Index Scan – check with EXPLAIN. But it is probably worth having a separate index on done_at; it can be a btree or a BRIN index (lossy but much smaller) in this case:

create index i_camera_activities_done_at on camera_activities using brin(done_at);
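
To confirm which plan the subselect actually gets, a quick check might look like this (reusing the camera_activities table and the 2016-01-01 cutoff from the examples above):

-- look for a Bitmap Index Scan (or Index Scan) on a done_at index in the output
explain
select id from camera_activities
where done_at < '2016-01-01'::date
limit 500;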

Example of "more aggressive" (than default) autovacuum settings:

log_autovacuum_min_duration = 0
autovacuum_vacuum_scale_factor = 0.01
autovacuum_analyze_scale_factor = 0.05
autovacuum_naptime = 60
autovacuum_vacuum_cost_delay = 20

Different queries that help you see your table's bloat level:

  • https://wiki.postgresql.org/wiki/Show_database_bloat
  • http://blog.ioguix.net/postgresql/2014/09/10/Bloat-estimation-for-tables.html
  • https://github.com/ioguix/pgsql-bloat-estimation/blob/master/table/table_bloat-82-84.sql
  • https://github.com/dataegret/pg-utils/blob/master/sql/table_bloat.sql
  • https://github.com/dataegret/pg-utils/blob/master/sql/index_bloat.sql (for indexes; these two queries require the pgstattuple extension)

