Replace Identity Column from Int to Bigint

Well, there's no quick and easy way to do this, really....

My approach would be this (a T-SQL sketch follows the list):

  1. create a new table with identical structure - except for the ID column being BIGINT IDENTITY instead of INT IDENTITY

    ----[ put your server into exclusive single-user mode here; users cannot use your server from this point on ]----

  2. find and disable all foreign key constraints referencing your table

  3. turn SET IDENTITY_INSERT (your new table) ON

  4. insert the rows from your old table into the new table

  5. turn SET IDENTITY_INSERT (your new table) OFF

  6. drop your old table

  7. rename your new table to the old table name

  8. update all tables that have an FK reference to your table to use BIGINT instead of INT (that should be doable with a simple ALTER TABLE ..... ALTER COLUMN FKID BIGINT)

  9. re-create all foreign key relationships again

  10. now you can return your server to normal multi-user usage again
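
To make the steps a bit more concrete, here is a rough T-SQL sketch for a hypothetical dbo.Orders table referenced by a single dbo.OrderItems table; all table, column and constraint names are made up, and a real script would list every column of the original table. The sketch drops the foreign key instead of merely disabling it, since even a disabled constraint would still block the DROP TABLE in step 6.

-- 1. new table with identical structure, but BIGINT IDENTITY (columns shortened for the sketch)
CREATE TABLE dbo.Orders_new
(
    OrderID   BIGINT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    OrderDate DATETIME NOT NULL
    -- ... remaining columns identical to dbo.Orders
);

-- 2. drop (rather than just disable) the referencing foreign key(s)
ALTER TABLE dbo.OrderItems DROP CONSTRAINT FK_OrderItems_Orders;

-- 3.-5. copy the data, preserving the existing ID values
SET IDENTITY_INSERT dbo.Orders_new ON;

INSERT INTO dbo.Orders_new (OrderID, OrderDate)
SELECT OrderID, OrderDate FROM dbo.Orders;

SET IDENTITY_INSERT dbo.Orders_new OFF;

-- 6.-7. swap the tables
DROP TABLE dbo.Orders;
EXEC sp_rename 'dbo.Orders_new', 'Orders';

-- 8. widen the FK column(s) in the referencing table(s)
ALTER TABLE dbo.OrderItems ALTER COLUMN OrderID BIGINT NOT NULL;

-- 9. re-create the foreign key(s)
ALTER TABLE dbo.OrderItems ADD CONSTRAINT FK_OrderItems_Orders
    FOREIGN KEY (OrderID) REFERENCES dbo.Orders (OrderID);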

How to change a column type from integer to bigInteger while other tables have a foreign key relation in Laravel?

To change the category_id from int to bigint, you would need to:

  1. Drop the foreign key on the category_id column: $table->dropForeign('admin_campaigns_category_id_foreign');

  2. Change the category_id column to bigint: $table->bigInteger('category_id')->change();

  3. Before being able to re-establish the foreign key, you need to change the admin_campaign_categories.id column to bigint too:

      Schema::table('admin_campaign_categories', function (Blueprint $table) {
          $table->bigInteger('id')->change();
      });

Before you can do that, you obviously need to do steps 1 and 2 on all other columns in all other tables that have foreign keys on admin_campaign_categories.id.

So if the admin_campaigns table was the only table referencing admin_campaign_categories.id, you can do:

public function up()
{
    Schema::table('admin_campaigns', function (Blueprint $table) {
        $table->dropForeign('admin_campaigns_category_id_foreign');
    });

    Schema::table('admin_campaign_categories', function (Blueprint $table) {
        $table->bigInteger('id')->change();
    });

    Schema::table('admin_campaigns', function (Blueprint $table) {
        $table->bigInteger('category_id')->change();
        $table->foreign('category_id')
              ->references('id')->on('admin_campaign_categories')
              ->onDelete('cascade');
    });
}

public function down()
{
    Schema::table('admin_campaigns', function (Blueprint $table) {
        $table->dropForeign('admin_campaigns_category_id_foreign');
    });

    Schema::table('admin_campaign_categories', function (Blueprint $table) {
        $table->integer('id')->change();
    });

    Schema::table('admin_campaigns', function (Blueprint $table) {
        $table->integer('category_id')->change();
        $table->foreign('category_id')
              ->references('id')->on('admin_campaign_categories')
              ->onDelete('cascade');
    });
}


A Word on Primary Key Columns

I also want to emphasize that you should never use signed integer / signed big integer types for numeric primary key columns. Instead, you should use

$table->unsignedBigInteger('id')->change();

everywhere in the migration above.

The reason for this is that a numeric primary key will practically never hold a negative value, especially when it is used as an auto-increment column.

By setting the column to the integer or bigInteger type, you limit the maximum assignable value to 2,147,483,647 (2^31 - 1) or 9,223,372,036,854,775,807 (2^63 - 1), respectively, whereas you could store a value twice as high when using the unsignedInteger or unsignedBigInteger types, with the same amount of storage needed (4/8 bytes).

In most scenarios, both ranges are so high that you will probably never notice a difference, but it's basically a waste of resources. I have seen an integer id exceed its maximum range once, in a user table for a very busy application, and it's not pretty: in MySQL, the whole DB basically just stops working.
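
For reference, this is roughly what the two variants boil down to at the MySQL level; the table names are made up and this is raw DDL rather than a Laravel migration:

-- signed BIGINT: values up to 9,223,372,036,854,775,807
CREATE TABLE example_signed (
    id BIGINT NOT NULL AUTO_INCREMENT PRIMARY KEY
);

-- unsigned BIGINT: values up to 18,446,744,073,709,551,615, same 8 bytes of storage
CREATE TABLE example_unsigned (
    id BIGINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY
);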

PostgreSQL: concurrently change column type from int to bigint

If you don't have foreign keys pointing to your id column, you could add a new column, fill it, drop the old one, and rename the new one to the old name:

alter table my_table add column new_id bigint;

-- backfill the new column in batches to avoid one huge, long-running transaction
begin; update my_table set new_id = id where id between 0 and 100000; commit;
begin; update my_table set new_id = id where id between 100001 and 200000; commit;
begin; update my_table set new_id = id where id between 200001 and 300000; commit;
begin; update my_table set new_id = id where id between 300001 and 400000; commit;
...

-- build the index that will back the new primary key
create unique index my_table_pk_idx on my_table(new_id);

-- final swap in one short transaction
begin;
alter table my_table drop constraint my_table_pk;
alter table my_table alter column new_id set default nextval('my_table_id_seq'::regclass);
update my_table set new_id = id where new_id is null; -- catch rows inserted since the backfill
alter table my_table add constraint my_table_pk primary key using index my_table_pk_idx;
alter table my_table drop column id;
alter table my_table rename column new_id to id;
commit;

Duration of PostgreSQL ALTER COLUMN TYPE int to bigint

This command will have to rewrite the whole table, because bigint takes 8 bytes of storage rather than the 4 of an integer. The table is locked against concurrent access while this is taking place, so with a big table you should be prepared for extended downtime.
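
For reference, the statement in question is a plain column type change; my_table and id are placeholders:

-- rewrites the whole table and holds an exclusive lock on it for the duration
alter table my_table alter column id type bigint;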

If you expect that this could be necessary, perform the change as soon as possible, while the table is still small.

Changing INT to BigInt

If one or more of those columns have no constraints against them (like a foreign key, index, default, rule, etc.), you should be able to change each one quickly by doing

ALTER TABLE monster ALTER COLUMN MyIntCol1 bigint

Management Studio's generated change script is rarely the most efficient and tends to favour temporary tables for anything that modifies an existing column.
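
If one of the columns does have, say, a default constraint on it, you would typically drop the constraint first, alter the column, and then re-add the constraint; a rough sketch, where the constraint name and default value are hypothetical:

ALTER TABLE monster DROP CONSTRAINT DF_monster_MyIntCol1;  -- hypothetical constraint name
ALTER TABLE monster ALTER COLUMN MyIntCol1 bigint;
ALTER TABLE monster ADD CONSTRAINT DF_monster_MyIntCol1 DEFAULT (0) FOR MyIntCol1;  -- hypothetical default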


