Efficiently Inserting Multiple Values Using SELECT Statements in SQL

Learn how to leverage the results of a SELECT query to insert multiple rows into another table, optimizing your SQL data manipulation tasks.
In SQL, inserting data into a table is a fundamental operation. While single-row inserts are straightforward, you often need to insert multiple rows, especially when those rows are derived from existing data in other tables. This article explores how to efficiently insert multiple values into a table by using the results of a SELECT statement, a powerful technique that streamlines data migration, replication, and transformation tasks.
Understanding INSERT INTO ... SELECT
The INSERT INTO ... SELECT statement is a highly efficient way to copy data from one table to another, or even from a subset of a table back into itself. Instead of specifying individual VALUES for each column, you provide a SELECT statement that retrieves the data you wish to insert. This method is particularly useful when dealing with large datasets or when the data to be inserted requires complex filtering, joining, or aggregation from existing tables.
flowchart TD
    A["Source Table(s)"] --> B{"SELECT Query (Filter, Join, Transform)"}
    B --> C[Target Table]
    C -- INSERT INTO --> D[New Rows Added]
Conceptual flow of INSERT INTO ... SELECT operation
Basic Syntax and Usage
The basic syntax for inserting multiple rows using a SELECT statement is quite simple. You specify the target table and, optionally, the columns you want to populate, followed by your SELECT query. It's crucial that the number of columns and their data types in the SELECT statement match those of the target table (or the specified columns in the INSERT clause).
INSERT INTO TargetTable (Column1, Column2, Column3)
SELECT SourceColumn1, SourceColumn2, SourceColumn3
FROM SourceTable
WHERE Condition;
Basic syntax for INSERT INTO ... SELECT statement
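Note that the SELECT list is not limited to plain source columns; it can also contain expressions and literal values, which are then inserted into every row. A brief sketch (the table and column names here are illustrative, not from a real schema):

```sql
-- Mix source columns, an expression, and a constant in the SELECT list
INSERT INTO TargetTable (Column1, Column2, Column3)
SELECT SourceColumn1,
       UPPER(SourceColumn2),   -- transform the value on the way in
       'migrated'              -- same literal value for every inserted row
FROM SourceTable
WHERE Condition;
```
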
If you are inserting into all columns of the TargetTable and the order of columns in the SELECT statement matches the order of columns in TargetTable, you can omit the column list after TargetTable.
INSERT INTO TargetTable
SELECT SourceColumn1, SourceColumn2, SourceColumn3
FROM SourceTable
WHERE Condition;
Simplified syntax when inserting into all columns in order
Ensure that the data types of the source and target columns are compatible; where they differ, an explicit conversion (using CAST() or CONVERT()) is recommended for clarity and to prevent unexpected errors.
Practical Examples
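As a first example of explicit conversion, the sketch below casts a text-based amount column before inserting it. The table and column names are illustrative; CAST(... AS ...) is standard SQL:

```sql
-- Convert a text column to the target's DECIMAL type explicitly
INSERT INTO TargetTable (Column1, AmountColumn)
SELECT SourceColumn1,
       CAST(SourceAmountText AS DECIMAL(10, 2))  -- explicit, self-documenting conversion
FROM SourceTable;
```

Making the conversion explicit in the SELECT list documents your intent and avoids relying on implicit conversion rules, which vary between database systems.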
Let's consider a scenario where we have an Orders table and we want to archive old orders into an ArchivedOrders table. We can use INSERT INTO ... SELECT to move all orders older than a certain date.
-- Create sample tables
CREATE TABLE Orders (
OrderID INT PRIMARY KEY,
CustomerID INT,
OrderDate DATE,
TotalAmount DECIMAL(10, 2)
);
CREATE TABLE ArchivedOrders (
ArchiveID INT IDENTITY(1,1) PRIMARY KEY, -- For SQL Server, use SERIAL for PostgreSQL
OriginalOrderID INT,
CustomerID INT,
OrderDate DATE,
TotalAmount DECIMAL(10, 2),
ArchiveDate DATETIME DEFAULT GETDATE()
);
-- Insert some sample data into Orders
INSERT INTO Orders (OrderID, CustomerID, OrderDate, TotalAmount) VALUES
(1, 101, '2022-01-15', 150.00),
(2, 102, '2022-03-20', 250.50),
(3, 101, '2023-05-10', 75.25),
(4, 103, '2023-07-01', 300.00),
(5, 102, '2024-01-05', 120.00);
-- Archive orders older than '2023-01-01'
INSERT INTO ArchivedOrders (OriginalOrderID, CustomerID, OrderDate, TotalAmount)
SELECT OrderID, CustomerID, OrderDate, TotalAmount
FROM Orders
WHERE OrderDate < '2023-01-01';
-- Verify the archived orders
SELECT * FROM ArchivedOrders;
Archiving old orders using INSERT INTO ... SELECT
This example demonstrates how to select specific columns and apply a WHERE clause to filter the data before insertion. You can also include joins, aggregations, and subqueries within your SELECT statement to construct the data exactly as needed for the target table.
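For instance, an aggregated SELECT can populate a summary table in one statement. The sketch below assumes a hypothetical CustomerTotals table alongside the Orders table from the example above:

```sql
-- Hypothetical summary table for per-customer order statistics
CREATE TABLE CustomerTotals (
    CustomerID INT PRIMARY KEY,
    OrderCount INT,
    TotalSpent DECIMAL(12, 2)
);

-- Populate it from Orders using GROUP BY aggregation
INSERT INTO CustomerTotals (CustomerID, OrderCount, TotalSpent)
SELECT CustomerID, COUNT(*), SUM(TotalAmount)
FROM Orders
GROUP BY CustomerID;
```

Populating a summary table with an aggregated SELECT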
Be cautious when using INSERT INTO ... SELECT with large datasets. While efficient, it can still consume significant resources. Consider performing the operation during off-peak hours or in batches when dealing with extremely large tables to avoid impacting production performance.
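One common batching pattern is sketched below, using SQL Server syntax to match the earlier examples (TOP and @@ROWCOUNT; other databases use LIMIT or similar). The batch size and stop condition are illustrative:

```sql
-- Archive in batches of 10,000 rows to limit lock and transaction log pressure
WHILE 1 = 1
BEGIN
    INSERT INTO ArchivedOrders (OriginalOrderID, CustomerID, OrderDate, TotalAmount)
    SELECT TOP (10000) o.OrderID, o.CustomerID, o.OrderDate, o.TotalAmount
    FROM Orders o
    WHERE o.OrderDate < '2023-01-01'
      AND NOT EXISTS (SELECT 1
                      FROM ArchivedOrders a
                      WHERE a.OriginalOrderID = o.OrderID);  -- skip rows already archived

    IF @@ROWCOUNT = 0 BREAK;  -- stop when no unarchived rows remain
END;
```

Each iteration commits a small, bounded amount of work, so concurrent queries are blocked only briefly and the operation can be safely interrupted and resumed.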