
How can a Python "work request" handle concurrent tasks efficiently?

The term "Python work request" can mean a few different things. Below are the most common interpretations, from a simple concept to a more advanced architectural pattern.


Let's break it down:

  1. The Simple Concept: A basic script that takes input (a "request") and does some "work".
  2. The Architectural Pattern: A robust system for managing background tasks, often using a message queue (like RabbitMQ or Redis). This is a very common and important pattern.
  3. A Specific Library: A Python library that implements this pattern for you.

The Simple Concept: A Basic Work Request Script

This is the most straightforward interpretation. You have a script that performs a task based on some input. This input could come from command-line arguments, user input, or a configuration file.

Example: A script that processes a file.

Let's say we have a script that takes a filename as a "request" and counts the number of lines in it.


work_request.py

import sys
def do_work(filename):
    """
    This is the "work" function. It takes a request (filename)
    and performs an action.
    """
    print(f"Starting work on: {filename}")
    try:
        with open(filename, 'r') as f:
            line_count = sum(1 for _ in f)
        print(f"Work finished. '{filename}' has {line_count} lines.")
        return line_count
    except FileNotFoundError:
        print(f"Error: File '{filename}' not found.")
        return None
if __name__ == "__main__":
    # The "request" comes from the command line arguments
    if len(sys.argv) != 2:
        print("Usage: python work_request.py <filename>")
        sys.exit(1) # Exit with an error code
    request_data = sys.argv[1]
    do_work(request_data)

How to run it:

# Create a dummy file
printf "Hello\nWorld\nThis is a test.\n" > my_file.txt
# Run the script with the file as the "request"
python work_request.py my_file.txt

Output:

Starting work on: my_file.txt
Work finished. 'my_file.txt' has 3 lines.

This is a simple, synchronous work request. The script runs, does the work, and finishes.
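Since the original question is about concurrency: a synchronous script like this can be extended to handle many "requests" at once using the standard library's `concurrent.futures`. The sketch below is illustrative (the throwaway files and names are invented for the demo, not part of the script above):

```python
import concurrent.futures
import os
import tempfile

def count_lines(filename):
    """The same "work" as before: count the lines in one file."""
    with open(filename, 'r') as f:
        return filename, sum(1 for _ in f)

# Create a few throwaway files to act as incoming "requests"
tmpdir = tempfile.mkdtemp()
requests = []
for i in range(3):
    path = os.path.join(tmpdir, f"file_{i}.txt")
    with open(path, 'w') as f:
        f.write("line\n" * (i + 1))   # file_0 has 1 line, file_2 has 3
    requests.append(path)

# A thread pool runs several work requests concurrently; threads suit
# I/O-bound work like file reads (the GIL is released during I/O).
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(count_lines, requests))

for filename, line_count in sorted(results.items()):
    print(f"{os.path.basename(filename)}: {line_count} lines")
```

For CPU-bound work, `ProcessPoolExecutor` is a drop-in replacement with the same interface.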


The Architectural Pattern: Background Task Queues

In real-world applications, you often need to perform long-running tasks (like sending emails, processing videos, generating reports) without making the user wait. The user clicks a button ("Send Report"), and the application immediately shows "Report is being prepared." The actual work is done in the background.

This is where the Work Request Pattern (or Task Queue Pattern) comes in. It's a fundamental concept in building scalable applications.

The core components are:

  1. Client: The application that receives the user's request. Instead of doing the work itself, it creates a "work request" (a small message) and puts it into a queue.
  2. Queue: A message broker (such as RabbitMQ or Redis) that holds the work requests. It acts as a buffer between the client and the worker.
  3. Worker: A separate, dedicated process (or multiple processes) that constantly watches the queue. When a new request appears, the worker picks it up, runs the task, and then signals that it's done.

Why use this pattern?

  • Decoupling: The client doesn't need to know how the work is done or who does it.
  • Asynchronous: The user gets an immediate response.
  • Scalability: If you have too much work, you can just add more workers to handle the load.
  • Durability: If a worker crashes, the message can be put back in the queue for another worker to try later.
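The three components above can be sketched in-process with the standard library's `queue` and `threading` modules. This is a toy illustration of the pattern, not a substitute for a real broker (it has no durability, since everything lives in one process):

```python
import queue
import threading

work_queue = queue.Queue()   # the "queue": buffers work requests
results = []

def worker():
    """The "worker": loops forever, pulling requests off the queue."""
    while True:
        request = work_queue.get()
        if request is None:           # sentinel value: shut this worker down
            break
        results.append(request * 2)   # the "work": double the number
        work_queue.task_done()

# Start two workers, as if scaling out to handle more load
threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()

# The "client": enqueues requests and moves on immediately
for n in [1, 2, 3]:
    work_queue.put(n)

work_queue.join()           # wait until every request has been processed
for _ in threads:
    work_queue.put(None)    # tell each worker to stop
for t in threads:
    t.join()

print(sorted(results))      # -> [2, 4, 6]
```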

Implementing the Pattern with Celery (The Easy Way)

Manually setting up RabbitMQ and writing your own workers is complex. The most popular library for handling this in Python is Celery.

Celery is a powerful, distributed task queue that handles all the complexity for you.

Let's build a simple "Hello World" example with Celery.

Step 1: Install Celery and a message broker (Redis is easy to start with).

pip install celery redis

You'll also need to run a Redis server. If you have Docker, it's easiest:

docker run -d -p 6379:6379 redis

Step 2: Create a file for your tasks (tasks.py). This file defines the "work" functions.

# tasks.py
from celery import Celery
# Define the Celery app: the broker carries task messages, and the
# result backend stores return values so clients can fetch them later.
app = Celery('tasks',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0')
@app.task
def add(x, y):
    """
    This is our work request. The @app.task decorator turns this
    regular Python function into a Celery task that can be sent
    to the queue.
    """
    print(f"Worker received a task to add {x} and {y}")
    result = x + y
    print(f"Worker finished: {x} + {y} = {result}")
    return result
@app.task
def send_welcome_email(user_email):
    """A more realistic work request."""
    print(f"Simulating sending email to {user_email}...")
    # In a real app, you'd use an email library here (e.g. smtplib)
    # This would take a few seconds, so we simulate it.
    import time
    time.sleep(3)
    print(f"Email successfully sent to {user_email}!")
    return f"Email sent to {user_email}"

Step 3: Start the Celery Worker. Open a new terminal and run this command. This worker will listen for tasks on the Redis queue and execute them.

celery -A tasks worker --loglevel=info

You should see output indicating the worker is online and waiting for tasks.
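Relevant to the concurrency question in the title: a single worker process can itself run many tasks at once. The numbers below are placeholders to tune for your workload:

```shell
# Run 4 worker subprocesses (the default 'prefork' pool uses multiprocessing,
# which suits CPU-bound tasks)
celery -A tasks worker --loglevel=info --concurrency=4

# For I/O-bound tasks, a gevent pool can handle far more concurrent
# tasks per process (requires: pip install gevent)
celery -A tasks worker --loglevel=info --pool=gevent --concurrency=100
```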

Step 4: Send a "Work Request" from a Client. Now, open another terminal (or use a Python interpreter) to act as the client. This code will create and send a task to the queue.

# client.py
from tasks import add, send_welcome_email
# This line does NOT run the task. It creates a "work request"
# (a message) and sends it to the queue. The function returns
# an "AsyncResult" object immediately.
result_async = add.delay(5, 7)
print(f"Task sent! Task ID is: {result_async.id}")
print(f"Task status is: {result_async.status}") # Should be 'PENDING'
# Now, let's check the worker's terminal. You should see it
# pick up the task and print the logs.
# You can get the final result when it's ready
# This will block until the task is complete.
try:
    final_result = result_async.get(timeout=10)
    print(f"The final result is: {final_result}")
    print(f"Task status is now: {result_async.status}") # Should be 'SUCCESS'
except Exception as e:
    print(f"An error occurred: {e}")
# Let's send another task, this time for the email
email_result = send_welcome_email.delay('new.user@example.com')
print(f"\nEmail task sent! Task ID is: {email_result.id}")
print(f"Email task status is: {email_result.status}")

When you run client.py, you'll see it finishes instantly. The real work happens in the worker terminal.
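The `delay()`-then-`get()` flow mirrors what the standard library's `concurrent.futures` does in-process: submitting returns a handle immediately, and the result is collected later. A broker-free analogue of the client code above (a sketch, with its own toy `add` standing in for the Celery task):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    """Stand-in for the Celery task: a deliberately slow addition."""
    time.sleep(0.1)
    return x + y

with ThreadPoolExecutor() as pool:
    # Like add.delay(5, 7): returns a handle immediately, not a result
    future = pool.submit(add, 5, 7)
    print(f"Task submitted! Done yet? {future.done()}")
    # Like result_async.get(timeout=10): blocks until the result is ready
    result = future.result(timeout=10)
    print(f"The final result is: {result}")
```

The key difference: Celery's queue survives across processes and machines, while a `ThreadPoolExecutor` only exists inside one Python process.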


Summary

| Term | Meaning | When to use |
| --- | --- | --- |
| Simple work request | A script that takes input and does a job synchronously. | Quick scripts, data processing, or simple command-line tools. |
| Work request pattern | An architectural design for asynchronous background tasks using a queue. | Essential for building responsive, scalable web apps and services. |
| Celery | A Python library that implements the work request pattern for you. | Almost any time you need background tasks in a Python application; it's the de facto standard. |

For any serious application, learning about and using the Work Request Pattern with Celery is a highly valuable skill.
