Using LocalStack with Python

What is LocalStack?
LocalStack is an open-source cloud service emulator that provides a fully functional local AWS environment. It lets you run AWS services, such as S3, SQS, DynamoDB, Lambda, and more, on your own machine.
Why use LocalStack?
- Fast Development & Testing: No more waiting for slow network calls to the real AWS cloud. Test your code instantly on your local machine.
- Cost-Effective: It's free to use for development. You don't incur any AWS charges while building and testing your application.
- Offline Development: Develop and test your application without an internet connection.
- CI/CD Friendly: Perfect for automated testing in your CI/CD pipeline (e.g., Jenkins, GitHub Actions) where you need a predictable environment.
- Isolated Environment: Test complex scenarios (like error handling, failed messages) without affecting your real AWS resources.
Getting Started: A Step-by-Step Python Guide
Let's build a simple but practical example: a Python application that uploads a file to an S3 bucket and then reads a list of objects from that bucket. We'll do this entirely against LocalStack.
Step 1: Install Prerequisites
- Python: Ensure you have Python 3.7+ installed.
- Docker: LocalStack runs inside a Docker container, so you need Docker installed and running on your machine.
Step 2: Install LocalStack
The easiest way to run LocalStack is with Docker.

```shell
# Pull the latest LocalStack image
docker pull localstack/localstack

# Run LocalStack
docker run -d --name localstack -p 4566:4566 localstack/localstack
```

- `-d`: Runs the container in detached mode (in the background).
- `--name localstack`: Gives the container a memorable name.
- `-p 4566:4566`: Maps port 4566 on your host to port 4566 in the container. This is the single "edge" port through which LocalStack serves all services.

Note: older tutorials also map per-service ports such as 4571. Since LocalStack 1.x, everything is routed through the single edge port 4566, so no other port mappings are needed.
Your local AWS environment is now running at http://localhost:4566.
Step 3: Set Up Your Python Project
Create a project directory and a virtual environment.
```shell
mkdir localstack-python-example
cd localstack-python-example
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```
Install the necessary Python libraries. We'll use boto3, the official AWS SDK for Python.
```shell
pip install boto3 pytest
```
- `boto3`: The AWS SDK for Python.
- `pytest`: A popular testing framework, which we'll use to demonstrate how to test code against LocalStack.
Step 4: Configure Boto3 to Use LocalStack
By default, boto3 points to the real AWS. We need to tell it to use our LocalStack endpoint. We can do this by setting environment variables or by configuring the boto3 client directly.

Method A: Environment Variables (Recommended for CI/CD)
Set these environment variables in your terminal before running your Python script.
```shell
export AWS_ACCESS_KEY_ID="test"          # Dummy key
export AWS_SECRET_ACCESS_KEY="test"      # Dummy secret
export AWS_ENDPOINT_URL="http://localhost:4566"
export AWS_DEFAULT_REGION="us-east-1"    # Or any region you prefer
```

Note that `boto3` only honors the `AWS_ENDPOINT_URL` environment variable in version 1.28 and later; with older versions, use the in-code configuration below.
Method B: In-Code Configuration
You can also pass the endpoint URL directly when creating a client or resource.
```python
import boto3

# Create an S3 client that points to LocalStack
s3_client = boto3.client(
    's3',
    endpoint_url='http://localhost:4566',
    aws_access_key_id='test',
    aws_secret_access_key='test',
    region_name='us-east-1',
)
```
We'll use the environment variable method as it's cleaner for scripts.
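If you mix both approaches across a codebase, a tiny helper keeps the LocalStack wiring in one place. This is a sketch (the `localstack_client_kwargs` function is hypothetical, not part of boto3): it reads the same environment variables as Method A and falls back to LocalStack defaults.

```python
import os


def localstack_client_kwargs(env=None):
    """Return boto3 client kwargs for LocalStack, honoring the usual
    AWS environment variables and falling back to LocalStack defaults."""
    env = os.environ if env is None else env
    return {
        "endpoint_url": env.get("AWS_ENDPOINT_URL", "http://localhost:4566"),
        "aws_access_key_id": env.get("AWS_ACCESS_KEY_ID", "test"),
        "aws_secret_access_key": env.get("AWS_SECRET_ACCESS_KEY", "test"),
        "region_name": env.get("AWS_DEFAULT_REGION", "us-east-1"),
    }


# Usage (assuming boto3 is installed):
# s3 = boto3.client("s3", **localstack_client_kwargs())
```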
Step 5: Write the Python Code
Create a file named s3_example.py.
```python
# s3_example.py
import boto3
from botocore.exceptions import ClientError

# Boto3 will automatically use the environment variables we set
s3_client = boto3.client('s3')

BUCKET_NAME = "my-test-bucket-localstack"


def create_bucket():
    """Creates an S3 bucket if it doesn't already exist."""
    try:
        s3_client.create_bucket(Bucket=BUCKET_NAME)
        print(f"Bucket '{BUCKET_NAME}' created successfully.")
    except ClientError as e:
        if e.response['Error']['Code'] == 'BucketAlreadyOwnedByYou':
            print(f"Bucket '{BUCKET_NAME}' already exists.")
        else:
            raise


def upload_file():
    """Uploads a sample file to the S3 bucket."""
    file_name = "sample.txt"
    object_key = "uploaded/sample.txt"

    # Create a dummy file
    with open(file_name, "w") as f:
        f.write("Hello from LocalStack!")

    try:
        s3_client.upload_file(file_name, BUCKET_NAME, object_key)
        print(f"File '{file_name}' uploaded to '{BUCKET_NAME}/{object_key}'.")
    except ClientError as e:
        print(f"Error uploading file: {e}")
        raise


def list_objects():
    """Lists all objects in the S3 bucket."""
    try:
        response = s3_client.list_objects_v2(Bucket=BUCKET_NAME)
        if 'Contents' in response:
            print("\nObjects in bucket:")
            for obj in response['Contents']:
                print(f"  - {obj['Key']} (Size: {obj['Size']} bytes)")
        else:
            print(f"\nBucket '{BUCKET_NAME}' is empty.")
    except ClientError as e:
        print(f"Error listing objects: {e}")
        raise


if __name__ == "__main__":
    print("--- Starting S3 operations with LocalStack ---")
    create_bucket()
    upload_file()
    list_objects()
    print("--- Operations complete ---")
```
Step 6: Run the Script
Now, run your Python script from the terminal (making sure your environment variables are set).
```shell
python s3_example.py
```
You should see output similar to this:
```
--- Starting S3 operations with LocalStack ---
Bucket 'my-test-bucket-localstack' created successfully.
File 'sample.txt' uploaded to 'my-test-bucket-localstack/uploaded/sample.txt'.

Objects in bucket:
  - uploaded/sample.txt (Size: 22 bytes)
--- Operations complete ---
```
Congratulations! You have successfully used Python to interact with AWS services running on your local machine.
Testing with Pytest: A Best Practice
Writing scripts is good, but for robust applications, you need unit and integration tests. LocalStack is perfect for this.
Let's create a test for our s3_example.py code.
1. Install `moto`. `moto` is a library that provides in-process mock implementations of AWS services. It's the de facto standard for unit-testing `boto3` code, and it complements LocalStack: `moto` mocks AWS inside your Python process, while LocalStack emulates the services in Docker.

```shell
pip install moto
```

2. Create a test file named `test_s3_example.py`:

```python
# test_s3_example.py
import boto3
from moto import mock_aws


# The @mock_aws decorator patches boto3 so every AWS call hits moto's
# in-memory backend instead of the network.
@mock_aws
def test_s3_workflow():
    # 1. Setup: Create the S3 client and bucket
    s3_client = boto3.client('s3', region_name='us-east-1')
    bucket_name = "my-test-bucket-for-tests"
    s3_client.create_bucket(Bucket=bucket_name)

    # 2. Action: Upload a file
    file_content = b"Hello from a pytest test!"
    s3_client.put_object(Bucket=bucket_name, Key='test.txt', Body=file_content)

    # 3. Assertion: Verify the file was uploaded
    response = s3_client.list_objects_v2(Bucket=bucket_name)
    assert 'Contents' in response
    assert len(response['Contents']) == 1
    assert response['Contents'][0]['Key'] == 'test.txt'

    # Get the object back and verify its content
    get_object_response = s3_client.get_object(Bucket=bucket_name, Key='test.txt')
    assert get_object_response['Body'].read() == file_content
```

3. Run the test:

```shell
pytest test_s3_example.py -v
```
You'll see output indicating the test passed. Notice that you didn't even need LocalStack running: `moto` provides the mock environment in-process, which makes it the fastest option for unit tests. For integration tests that exercise the full LocalStack environment, configure `pytest` to start and stop the LocalStack container, or point your clients at `http://localhost:4566`.
Common Services & Useful Commands
Here's a quick reference for other common services.
| Service | LocalStack Endpoint Port | Python (boto3) Example |
|---|---|---|
| S3 | 4566 | s3 = boto3.client('s3', endpoint_url='http://localhost:4566') |
| SQS | 4566 | sqs = boto3.client('sqs', endpoint_url='http://localhost:4566') |
| DynamoDB | 4566 | dynamodb = boto3.resource('dynamodb', endpoint_url='http://localhost:4566') |
| Lambda | 4566 | lambda_client = boto3.client('lambda', endpoint_url='http://localhost:4566') |
| SES | 4566 | ses = boto3.client('ses', endpoint_url='http://localhost:4566') |
| CloudWatch | 4566 | cloudwatch = boto3.client('cloudwatch', endpoint_url='http://localhost:4566') |
Useful docker commands for LocalStack:
```shell
# View the container logs
docker logs localstack

# Stop the container
docker stop localstack

# Remove the container
docker rm localstack

# Run with a custom configuration file (advanced)
docker run -d -p 4566:4566 --env-file ./localstack.env \
  -v /var/run/docker.sock:/var/run/docker.sock localstack/localstack
```
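If you prefer not to memorize `docker run` flags, a Docker Compose file is a common alternative. This is one possible sketch (the `SERVICES` variable is optional and simply limits which emulators start):

```yaml
# docker-compose.yml -- a minimal LocalStack setup (sketch; adjust to taste)
services:
  localstack:
    image: localstack/localstack
    ports:
      - "4566:4566"
    environment:
      - SERVICES=s3,sqs,dynamodb   # optional: start only these services
    volumes:
      - "/var/run/docker.sock:/var/run/docker.sock"  # needed for Lambda
```

Start it with `docker compose up -d` and stop it with `docker compose down`.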
Key Takeaways
- LocalStack is a Docker container that simulates AWS services on `localhost:4566`.
- Use `boto3` as you normally would, but configure it with the `AWS_ENDPOINT_URL` environment variable or an in-code `endpoint_url` parameter.
- Use dummy credentials like `test`/`test` for development.
- For unit testing, use `moto`: it provides fast, in-memory mocks of AWS services, is the industry standard for testing `boto3` code, and integrates well with `pytest`. Reserve LocalStack itself for integration tests.
- It's not 100% perfect: while LocalStack covers the most common use cases, some niche AWS services or very specific API operations may be missing or behave differently. Check LocalStack's feature coverage documentation for known limitations.
