Of course! The `executemany()` method is part of Python's DB-API 2.0 cursor interface (PEP 249) and is implemented by most database libraries, most notably the built-in `sqlite3` and `psycopg2` (for PostgreSQL).

It's designed to efficiently insert, update, or delete multiple rows in a single database operation, which is significantly faster than executing the same query in a loop.
What is executemany()?
executemany() takes two main arguments:
- `sql`: a parameterized SQL query string. The query must contain placeholders (`?` for SQLite, `%s` for PostgreSQL) for the values.
- `seq_of_parameters`: a sequence (like a list of tuples or a list of lists) where each element is a tuple containing the parameters for a single execution of the query.
The method then executes the query once for each set of parameters in the sequence.
Why Use executemany()? (Performance)
The key advantage is performance. When you insert rows one by one in a loop:

- Your Python code sends the query to the database.
- The database parses the query, creates a plan, and executes it.
- This network communication and overhead happen for every single row.
executemany() does the following:
- It sends the query to the database once, so it is parsed and planned a single time.
- It then supplies all the parameter sets against that prepared statement.
- Drivers can often optimize this further: `sqlite3` reuses the prepared statement in-process, and client-server drivers may batch the network traffic (psycopg2 also offers `psycopg2.extras.execute_values()` for a true multi-row insert).

This can lead to performance improvements of 10x or more, especially when dealing with thousands or millions of rows, though the exact speedup depends on the driver and the dataset.
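To get a feel for the difference on your own machine, here is a small benchmark sketch using the built-in `sqlite3` module. It runs in-process with no network round-trips, so the measured gap will understate what you would see against a client-server database; absolute timings will vary by system.

```python
import sqlite3
import time

conn = sqlite3.connect(':memory:')
cursor = conn.cursor()
cursor.execute("CREATE TABLE t (a INTEGER, b TEXT)")

rows = [(i, f"row_{i}") for i in range(10000)]

# One execute() call per row
start = time.perf_counter()
for row in rows:
    cursor.execute("INSERT INTO t VALUES (?, ?)", row)
conn.commit()
loop_time = time.perf_counter() - start

# A single executemany() call over the same data
cursor.execute("DELETE FROM t")
start = time.perf_counter()
cursor.executemany("INSERT INTO t VALUES (?, ?)", rows)
conn.commit()
batch_time = time.perf_counter() - start

print(f"loop: {loop_time:.4f}s  executemany: {batch_time:.4f}s")
conn.close()
```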
Code Examples
Let's look at examples using sqlite3, which is built into Python, so you can run this code without any external installations.
Example 1: Basic INSERT with sqlite3
This is the most common use case.

```python
import sqlite3

# --- Setup: create a database and table ---
conn = sqlite3.connect(':memory:')  # ':memory:' creates an in-memory database
cursor = conn.cursor()

# Create a table for users
cursor.execute('''
    CREATE TABLE users (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        email TEXT NOT NULL UNIQUE,
        age INTEGER
    )
''')

# --- Data to insert ---
# Each tuple represents one row. The order must match the placeholders.
users_to_insert = [
    ('Alice', 'alice@example.com', 30),
    ('Bob', 'bob@example.com', 25),
    ('Charlie', 'charlie@example.com', 35),
    ('Diana', 'diana@example.com', 28)
]

# --- Using executemany() ---
# '?' is the placeholder style for SQLite.
insert_query = "INSERT INTO users (name, email, age) VALUES (?, ?, ?)"

try:
    # Execute the query with all the data at once
    cursor.executemany(insert_query, users_to_insert)

    # Commit the transaction to save the changes
    conn.commit()
    print(f"Successfully inserted {cursor.rowcount} rows.")

    # --- Verify the data ---
    cursor.execute("SELECT * FROM users")
    print("\nAll users in the database:")
    for row in cursor.fetchall():
        print(row)
except sqlite3.Error as e:
    print(f"An error occurred: {e}")
finally:
    # Close the connection
    conn.close()
```
Example 2: Using Placeholders for Different Databases
The placeholder syntax changes depending on the database driver you are using.
| Database Driver | Placeholder Syntax | Example Query |
|---|---|---|
| `sqlite3` (built-in) | `?` | `INSERT INTO mytable (col1, col2) VALUES (?, ?)` |
| `psycopg2` (PostgreSQL) | `%s` | `INSERT INTO mytable (col1, col2) VALUES (%s, %s)` |
| `mysql-connector-python` (MySQL) | `%s` | `INSERT INTO mytable (col1, col2) VALUES (%s, %s)` |
| `cx_Oracle` (Oracle) | `:name` (named) or `:1` (positional) | `INSERT INTO mytable (col1, col2) VALUES (:val1, :val2)` |
Example with psycopg2 (PostgreSQL):
```python
# You need to install psycopg2 first: pip install psycopg2-binary
import psycopg2

# --- Setup: connect to a PostgreSQL database ---
# Replace with your actual connection details
conn = psycopg2.connect(
    dbname="your_db",
    user="your_user",
    password="your_password",
    host="localhost"
)
cursor = conn.cursor()

# Create a table (if it doesn't exist)
cursor.execute('''
    CREATE TABLE IF NOT EXISTS products (
        product_id SERIAL PRIMARY KEY,
        product_name VARCHAR(100) NOT NULL,
        price NUMERIC(10, 2) NOT NULL
    )
''')

# --- Data to insert ---
products_to_insert = [
    ('Laptop', 1200.50),
    ('Mouse', 25.00),
    ('Keyboard', 75.75),
    ('Monitor', 300.00)
]

# --- Using executemany() with psycopg2 ---
# Note the use of %s placeholders
insert_query = "INSERT INTO products (product_name, price) VALUES (%s, %s)"

try:
    cursor.executemany(insert_query, products_to_insert)
    conn.commit()
    print(f"Successfully inserted {cursor.rowcount} products.")

    # --- Verify the data ---
    cursor.execute("SELECT * FROM products;")
    print("\nAll products:")
    for row in cursor.fetchall():
        print(row)
except psycopg2.Error as e:
    print(f"An error occurred: {e}")
    conn.rollback()  # Roll back in case of error
finally:
    cursor.close()
    conn.close()
```
Important Considerations and Best Practices
Always Use Parameterized Queries
NEVER format your query with f-strings or .format() to insert values. This is a major security risk called SQL Injection.
❌ WRONG (Vulnerable to SQL Injection):
```python
name = "Alice'; DROP TABLE users; --"
query = f"INSERT INTO users (name) VALUES ('{name}')"
cursor.execute(query)
```
✅ CORRECT (Safe with executemany):
```python
name = "Alice'; DROP TABLE users; --"
# The database driver will safely escape the special characters
query = "INSERT INTO users (name) VALUES (?)"
cursor.execute(query, (name,))  # Note the trailing comma to make it a tuple
```
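As a sanity check, here is a small `sqlite3` sketch showing that a parameterized query stores the hostile string as plain data instead of executing it:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
cursor = conn.cursor()
cursor.execute("CREATE TABLE users (name TEXT)")

# A hostile input that would break a string-formatted query
name = "Alice'; DROP TABLE users; --"
cursor.execute("INSERT INTO users (name) VALUES (?)", (name,))
conn.commit()

# The string is stored literally; the users table still exists
stored = cursor.execute("SELECT name FROM users").fetchone()[0]
print(stored)  # Alice'; DROP TABLE users; --
conn.close()
```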
Handle Transactions
Database operations like INSERT are part of a transaction. You must explicitly commit() the transaction to make the changes permanent. If an error occurs, you should rollback() to undo any changes made in the current transaction.
```python
try:
    cursor.executemany(...)
    conn.commit()    # Make changes permanent
except Exception:
    conn.rollback()  # Undo changes if something went wrong
```
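With `sqlite3` specifically, the connection object can also serve as a context manager that commits on success and rolls back on an exception. A minimal sketch (note that `with conn:` manages the transaction but does not close the connection):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE users (name TEXT, email TEXT, age INTEGER)")

rows = [('Alice', 'alice@example.com', 30), ('Bob', 'bob@example.com', 25)]

# Commits if the block succeeds, rolls back if it raises.
with conn:
    conn.executemany("INSERT INTO users VALUES (?, ?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(count)  # 2
conn.close()
```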
Batch Size for Very Large Datasets
If you have millions of rows, loading them all into a single Python list can consume a lot of memory. A better approach is to process your data in smaller batches.
```python
import sqlite3

conn = sqlite3.connect('mydatabase.db')
cursor = conn.cursor()

# Make sure the target table exists
cursor.execute('''
    CREATE TABLE IF NOT EXISTS users (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        email TEXT NOT NULL,
        age INTEGER
    )
''')

# Assume you have a generator that yields data one row at a time
def data_generator():
    for i in range(100000):
        yield (f'User_{i}', f'user_{i}@example.com', i % 100)

# Process in batches of 1000
batch_size = 1000
insert_query = "INSERT INTO users (name, email, age) VALUES (?, ?, ?)"

batch = []
for row in data_generator():
    batch.append(row)
    if len(batch) == batch_size:
        cursor.executemany(insert_query, batch)
        conn.commit()
        print(f"Inserted a batch of {batch_size} rows.")
        batch = []  # Reset the batch

# Insert any remaining rows
if batch:
    cursor.executemany(insert_query, batch)
    conn.commit()
    print(f"Inserted final batch of {len(batch)} rows.")

conn.close()
```
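The same batching loop can be written more compactly with a small chunking helper built on `itertools.islice`. This is a sketch of the same idea (Python 3.12+ ships an equivalent `itertools.batched`):

```python
import sqlite3
from itertools import islice

def batched(iterable, size):
    """Yield lists of up to `size` items from any iterable."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

conn = sqlite3.connect(':memory:')
cursor = conn.cursor()
cursor.execute("CREATE TABLE users (name TEXT, email TEXT, age INTEGER)")

# A generator, so the full dataset is never held in memory at once
rows = ((f'User_{i}', f'user_{i}@example.com', i % 100) for i in range(10500))

total = 0
for chunk in batched(rows, 1000):
    cursor.executemany("INSERT INTO users (name, email, age) VALUES (?, ?, ?)", chunk)
    conn.commit()
    total += len(chunk)

print(total)  # 10500
conn.close()
```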
Check cursor.rowcount
After calling `executemany()`, the `cursor.rowcount` attribute tells you how many rows were affected by the operation. With `sqlite3` this is the total number of modified rows, which is useful for confirming success; be aware, though, that the DB-API allows drivers to return -1 when the count cannot be determined, so don't rely on it being portable.
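A quick `sqlite3` sketch of the attribute in action:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
cursor = conn.cursor()
cursor.execute("CREATE TABLE users (name TEXT)")

cursor.executemany("INSERT INTO users (name) VALUES (?)",
                   [('Alice',), ('Bob',), ('Charlie',)])
rc = cursor.rowcount
print(rc)  # 3 (sqlite3 reports the total; some drivers return -1)
conn.commit()
conn.close()
```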
