Efficiently Inserting Multiple Rows into PostgreSQL

Learn the most effective methods for inserting multiple records into a PostgreSQL table simultaneously, optimizing performance and reducing database load.
When working with PostgreSQL, inserting data is a fundamental operation. While inserting a single row is straightforward, inserting multiple rows efficiently is crucial for application performance and database health. This article explores various techniques for bulk inserts, highlighting their advantages and use cases. We'll cover the standard INSERT INTO ... VALUES syntax, the UNNEST function, and the COPY command, along with considerations for each.
Method 1: Single INSERT with Multiple VALUES Clauses
The most common and often sufficient way to insert multiple rows is to list multiple row tuples in the VALUES clause of a single INSERT statement. This approach is generally preferred over executing individual INSERT statements in a loop, as it reduces network round-trips and allows the database to optimize the transaction.
INSERT INTO products (name, price, stock_quantity)
VALUES
('Laptop', 1200.00, 50),
('Mouse', 25.50, 200),
('Keyboard', 75.00, 150);
Inserting multiple rows with a single INSERT statement and a multi-row VALUES list.
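In application code, the same multi-row statement is usually built with placeholders and bound parameters rather than string concatenation of values. A minimal sketch, assuming a psycopg2-style driver with `%s` placeholders (the `build_multirow_insert` helper is hypothetical):

```python
def build_multirow_insert(table, columns, rows):
    """Build one parameterized INSERT covering every row.

    Returns the SQL text with one (%s, %s, ...) group per row,
    plus the flattened parameter list for cursor.execute().
    Only the values are parameterized; table/column names are
    assumed to be trusted identifiers.
    """
    group = "(" + ", ".join(["%s"] * len(columns)) + ")"
    sql = (
        f"INSERT INTO {table} ({', '.join(columns)}) VALUES "
        + ", ".join([group] * len(rows))
    )
    params = [value for row in rows for value in row]
    return sql, params

rows = [
    ("Laptop", 1200.00, 50),
    ("Mouse", 25.50, 200),
    ("Keyboard", 75.00, 150),
]
sql, params = build_multirow_insert(
    "products", ["name", "price", "stock_quantity"], rows
)
# With a live connection this would be: cursor.execute(sql, params)
```

Binding values as parameters keeps the statement safe from SQL injection and lets the driver handle quoting and type conversion.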
For very large datasets, consider the COPY command for superior performance.
Method 2: Using UNNEST for Dynamic Data
For scenarios where your data is already structured as arrays in your application, or you need to construct the data dynamically, the UNNEST function combined with INSERT INTO ... SELECT can be a powerful approach. UNNEST expands an array into a set of rows, which can then be inserted into your table. This is particularly useful when dealing with data that arrives as a collection of arrays.
INSERT INTO products (name, price, stock_quantity)
SELECT * FROM UNNEST(
ARRAY['Monitor', 'Webcam', 'Headphones'],
ARRAY[300.00, 50.00, 100.00],
ARRAY[100, 300, 250]
);
Inserting multiple rows using UNNEST with arrays.
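This pattern fits drivers that adapt native lists to PostgreSQL arrays, as psycopg2 does: the whole batch is passed as one array parameter per column. A sketch that transposes row-oriented application data into the per-column lists UNNEST expects (the `columns_for_unnest` helper is hypothetical; the casts assume the column types shown above):

```python
def columns_for_unnest(rows):
    """Transpose row tuples into per-column lists.

    psycopg2 adapts Python lists to PostgreSQL arrays, so each
    returned list can be bound to one %s in the UNNEST call.
    """
    return [list(column) for column in zip(*rows)]

rows = [
    ("Monitor", 300.00, 100),
    ("Webcam", 50.00, 300),
    ("Headphones", 100.00, 250),
]
names, prices, stocks = columns_for_unnest(rows)

sql = (
    "INSERT INTO products (name, price, stock_quantity) "
    "SELECT * FROM UNNEST(%s::text[], %s::numeric[], %s::int[])"
)
# With a live connection: cursor.execute(sql, (names, prices, stocks))
```

Because the row count is carried by the array lengths, the SQL text stays constant no matter how many rows you send, which plays well with prepared statements.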
flowchart TD
    A[Application Data as Arrays] --> B{UNNEST Function}
    B --> C[Rows Generated by UNNEST]
    C --> D[INSERT INTO Target Table]
    D --> E[PostgreSQL Table]
Data flow when using UNNEST for bulk inserts.
Method 3: The COPY Command for High Performance
For inserting very large volumes of data (tens of thousands to millions of rows), the COPY command is by far the most efficient method. It allows you to load data directly from a file or from the standard input stream. This bypasses much of the SQL parsing and per-statement overhead, making it significantly faster than INSERT statements. The COPY command is typically executed from the psql client or through client libraries that support it.
-- Example: Copying from a CSV file
COPY products (name, price, stock_quantity)
FROM '/path/to/your/products.csv'
DELIMITER ','
CSV HEADER;
-- Example: Copying from standard input (e.g., from a script)
COPY products (name, price, stock_quantity) FROM STDIN DELIMITER ',' CSV;
Laptop,1200.00,50
Mouse,25.50,200
Keyboard,75.00,150
\.
Using the COPY command to insert data from a file or standard input.
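Client libraries can drive COPY ... FROM STDIN without any intermediate file. A sketch that serializes rows into an in-memory CSV buffer suitable for psycopg2's `copy_expert` (the `rows_to_csv_buffer` helper is hypothetical):

```python
import csv
import io

def rows_to_csv_buffer(rows):
    """Serialize rows into an in-memory CSV stream for COPY FROM STDIN."""
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerows(rows)
    buffer.seek(0)  # rewind so COPY reads from the start
    return buffer

rows = [
    ("Laptop", "1200.00", 50),
    ("Mouse", "25.50", 200),
]
buffer = rows_to_csv_buffer(rows)
# With a live psycopg2 connection:
# cursor.copy_expert(
#     "COPY products (name, price, stock_quantity) FROM STDIN WITH (FORMAT csv)",
#     buffer,
# )
```

Streaming from the client this way avoids the server-side file permission requirements of COPY ... FROM 'file'.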
The COPY command reads server-side files, so COPY ... FROM '/path/to/file' requires appropriate file system permissions and privileges on the database server; to load a file that lives on the client machine, use psql's \copy instead. When using FROM STDIN, the data is streamed directly from the client.
Choosing the right method depends on the volume of data, the source of the data, and performance requirements. For most application-driven bulk inserts, a single INSERT with a multi-row VALUES list is a good balance of simplicity and performance. For truly massive data loads, COPY is the undisputed champion.