PostgreSQL: Copy/Transfer Data from One Database to Another

Copy a table from one database to another in Postgres

Extract the table and pipe it directly to the target database:

pg_dump -t table_to_copy source_db | psql target_db

Note: if the target database already has the table set up, use the -a flag to import data only; otherwise you may see odd errors such as "Out of memory":

pg_dump -a -t table_to_copy source_db | psql target_db
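
Conversely, if the target database does not have the table at all, a schema-only dump can create it before the data is loaded. A minimal sketch, using pg_dump's -s (schema-only) flag with the same placeholder names:

pg_dump -s -t table_to_copy source_db | psql target_db   # create the table definition only
pg_dump -a -t table_to_copy source_db | psql target_db   # then load the data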

PostgreSQL copy/transfer data from one database to another

This is a really straightforward task. Just use dblink for this purpose:

INSERT INTO t(a, b, c)
SELECT a, b, c
FROM dblink('host=xxx user=xxx password=xxx dbname=xxx',
            'SELECT a, b, c FROM t')
     AS x(a integer, b integer, c integer);

If you need to fetch data from an external database on a regular basis, it is wise to define a foreign server and user mapping. Then you can use a shorter statement:

dblink('yourdbname', 'your query')
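
A sketch of that setup, assuming the dblink extension is installed; the server name, connection options and credentials below are placeholders:

CREATE EXTENSION IF NOT EXISTS dblink;

CREATE SERVER yourdbname FOREIGN DATA WRAPPER dblink_fdw
    OPTIONS (host 'xxx', dbname 'xxx', port '5432');

CREATE USER MAPPING FOR current_user SERVER yourdbname
    OPTIONS (user 'xxx', password 'xxx');

-- the connection string can now be replaced by the server name
INSERT INTO t(a, b, c)
SELECT a, b, c
FROM dblink('yourdbname', 'SELECT a, b, c FROM t')
     AS x(a integer, b integer, c integer);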

PostgreSQL: How to copy data from one database table to another database

If the two databases are on two different server instances, you could export to CSV from db1 and then import the data into db2:

COPY (SELECT * FROM t1) TO '/home/export.csv';

and then load it back into db2:

COPY t2 FROM '/home/export.csv';

Again, the two tables on the two different database instances must have the same structure.
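
Note that COPY ... TO/FROM with a file path reads and writes files on the database server itself and typically requires superuser rights; with an ordinary client connection, psql's \copy variant does the same transfer through the client instead. A sketch using the same placeholder path:

\copy (SELECT * FROM t1) TO '/home/export.csv'
\copy t2 FROM '/home/export.csv'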

Using the command-line tools pg_dump and psql, you could also do it this way:

pg_dump -U postgres -t t1 db1 | psql -U postgres -d db2

You can pass command-line arguments to both pg_dump and psql to specify the address and/or port of each server.
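
For example (the hostnames and ports below are placeholders):

pg_dump -U postgres -h source.example.com -p 5432 -t t1 db1 | psql -U postgres -h target.example.com -p 5432 -d db2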

Another option would be to use an external tool such as openDBcopy to perform the migration/copy of the table.

Copying PostgreSQL database to another server

You don't need to create an intermediate file. You can do

pg_dump -C -h localhost -U localuser dbname | psql -h remotehost -U remoteuser dbname

or

pg_dump -C -h remotehost -U remoteuser dbname | psql -h localhost -U localuser dbname

using either psql or pg_dump to connect to the remote host.

With a big database or a slow connection, dumping to a file and transferring the file compressed may be faster.
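
A sketch of that approach, using pg_dump's compressed custom format (the file name is a placeholder):

pg_dump -Fc dbname -f dbname.dump
# copy dbname.dump to the other server (e.g. with scp), then restore there:
pg_restore -d dbname dbname.dump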

As Kornel said, there is no need to dump to an intermediate file; if you want to work compressed, you can use a compressed tunnel:

pg_dump -C dbname | bzip2 | ssh remoteuser@remotehost "bunzip2 | psql dbname"

or

pg_dump -C dbname | ssh -C remoteuser@remotehost "psql dbname"

but this solution also requires a session on both ends.

Note: pg_dump is for backing up and psql is for restoring, so the first command in this answer copies from local to remote and the second one from remote to local. More: https://www.postgresql.org/docs/9.6/app-pgdump.html

Transferring data from one database to another (Postgres)

Yes, back up using the "PLAIN" format (SQL statements), then (while connected to the other DB) open the file and run it.

Or you could select the "COMPRESS" format in the backup dialogue and then use the restore dialogue.

Also there's an equivalent of phpMyAdmin for Postgres, called "phppgadmin". Select the table in question and then use the "Export" tab.
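
For reference, rough command-line equivalents of those two dialogue options (database and file names are placeholders):

pg_dump -Fp sourcedb -f backup.sql    # "plain" format; run it later with: psql -d targetdb -f backup.sql
pg_dump -Fc sourcedb -f backup.dump   # compressed custom format; restore with: pg_restore -d targetdb backup.dump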

Creating a copy of a database in PostgreSQL

Postgres allows the use of any existing database on the server as a template when creating a new database. I'm not sure whether pgAdmin gives you the option in the create-database dialog, but you should be able to execute the following in a query window if it doesn't:

CREATE DATABASE newdb WITH TEMPLATE originaldb OWNER dbuser;

Still, you may get:

ERROR:  source database "originaldb" is being accessed by other users

To disconnect all other users from the database, you can use this query:

SELECT pg_terminate_backend(pg_stat_activity.pid) FROM pg_stat_activity 
WHERE pg_stat_activity.datname = 'originaldb' AND pid <> pg_backend_pid();
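
If new connections keep appearing while you terminate the old ones, you can also block them temporarily (a sketch, available in PostgreSQL 9.5 and later; remember to re-enable connections afterwards):

ALTER DATABASE originaldb WITH ALLOW_CONNECTIONS false;
-- terminate the remaining sessions and create the copy, then:
ALTER DATABASE originaldb WITH ALLOW_CONNECTIONS true;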

Copy table data from one database to another

I'd use an ETL tool for this. There are free tools available, they can help you rename columns, and they are widely used and tested. Most tools allow external schedulers like the Windows Task Scheduler or cron to run transformations on whatever schedule you need.

I personally have used Pentaho PDI for similar tasks in the past, and it has always worked well for me. For your requirement I'd create a single transformation that first loads the table data from the source database, modifies the column names in a "Select Values" step, and then inserts the values into the target table using the "truncate" option to remove the existing rows from the target table. If your table is too big to be re-filled each time, you'd need to figure out a delta load procedure.


