Answer #1, added 2017-01-04 22:01
It is not a perfect solution, but it solves the main problem, the "... big temporary table, so lost CPU, disk and my time"...
This is the way we do it, a workaround with file_fdw:
1. Adopt conventions to avoid file-copy and file-permission confusion: use one standard, fixed file path for the CSV. Example: /tmp/pg_socKer_file.csv
2. Initialize your database or SQL script with the magic extension:
CREATE EXTENSION file_fdw;
CREATE SERVER files FOREIGN DATA WRAPPER file_fdw;
3. For each CSV file:
3.1. make a symbolic link (or scp remote copy) for your new file:
ln -sf $PWD/myNewData.csv /tmp/pg_socKer_file.csv
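The fixed-path link convention above can be sketched as a small helper (the function name and the FIXED variable are mine, not part of the answer; the default path is the one used throughout):

```shell
#!/bin/sh
# retarget_csv NEWFILE -- point the fixed path read by the FOREIGN
# TABLE at whichever CSV just arrived; the table definition itself
# never has to change.
FIXED=${FIXED:-/tmp/pg_socKer_file.csv}   # path hard-coded in the FOREIGN TABLE

retarget_csv() {
    # -f replaces an existing link; realpath avoids a relative target
    ln -sf "$(realpath "$1")" "$FIXED" &&
    readlink "$FIXED"    # echo the new target as confirmation
}
```

This way each new file is "published" by one `ln -sf`, with no copy and no churn in the SQL.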
3.2. configure the file_fdw for your new table (suppose a foreign table temp1):
CREATE FOREIGN TABLE temp1 (a int, b text, c text)
SERVER files OPTIONS (filename '/tmp/pg_socKer_file.csv', format 'csv');
PS: after running the SQL script with psql, if you hit a permission problem, change the owner of the link with sudo chown -h postgres:postgres /tmp/pg_socKer_file.csv.
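When the chown is not enough, the blocker is often a directory somewhere along the resolved path that the postgres user cannot traverse. A small diagnostic sketch (the function name is mine) that prints the permissions of every component from the file up to /:

```shell
#!/bin/sh
# trace_perms PATH -- resolve the symlink and list the permissions of
# the target and every parent directory, so you can spot which
# component denies the postgres user read/execute access.
trace_perms() {
    p=$(readlink -f "$1")          # resolve the link to its real target
    while [ "$p" != "/" ]; do
        ls -ld "$p"                # show mode, owner, group of this component
        p=$(dirname "$p")          # climb one level toward /
    done
    ls -ld /
}
```

Every directory in the listing needs at least `x` for postgres, and the file itself needs `r`.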
3.3. use the file_fdw table as a source (suppose populating mytable):
INSERT INTO mytable(info)
SELECT json_build_array(a,b,c) FROM temp1;
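Putting steps 2–3 together, the whole load might look like one SQL script (table and column names are the illustrative ones from above; the IF NOT EXISTS guard and the final DROP are my additions, assuming temp1 is throwaway):

```sql
-- one-time setup (step 2)
CREATE EXTENSION IF NOT EXISTS file_fdw;
CREATE SERVER files FOREIGN DATA WRAPPER file_fdw;

-- per-file (step 3.2): the foreign table always reads the fixed link path,
-- so only the symlink changes between loads
CREATE FOREIGN TABLE temp1 (a int, b text, c text)
  SERVER files
  OPTIONS (filename '/tmp/pg_socKer_file.csv', format 'csv');

-- step 3.3: load into the real table, then discard the foreign table
INSERT INTO mytable(info)
  SELECT json_build_array(a,b,c) FROM temp1;
DROP FOREIGN TABLE temp1;
```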
Thanks to @JosMac (and his tutorial)!
NOTE: if there were a STDIN way to do it (does one exist?), this would be easier, avoiding permission problems and the use of absolute paths. See this answer/discussion.