Python parallel task queue framework: tasks.py


tasks.py is a simple and fast task queue framework for running multiple tasks in parallel. All you need to do is supply the task you want to run as a simple function that takes a single argument, and you get parallelism right away.

Installation

  1. Install Redis and start the server; tasks uses Redis for queueing jobs. If you already have a Redis server set up, call tasks.set_redis and pass it a redis connection object that uses a different database/namespace from the one your application normally uses (a short sketch follows this list).

  2. Install the redis-py and eventlet libraries.

    pip install redis eventlet

  3. Install tasks or copy this package to your source code.

    pip install tasks-py
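
If you take the set_redis route from step 1, the call looks roughly like the sketch below, which assumes the redis-py client; the host, port, and db=1 values are placeholders for a database separate from the one your application uses.

    import redis
    import tasks

    # Placeholder connection details; db=1 keeps tasks' keys out of the
    # application's default database (db=0).
    tasks.set_redis(redis.StrictRedis(host='localhost', port=6379, db=1))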

Usage

Import tasks and call eventlet's monkey-patch function at the very top of your module. Call tasks.set_func to register your function. The function will receive a string as its argument, and its return value is ignored. To indicate that a task failed, raise an error or exception inside the function. Call tasks.main() to get the interactive command-line options.

    import eventlet
    eventlet.monkey_patch()

    import tasks

    from urllib2 import urlopen

    def fetch(url):
        # Open the output file for writing (the original snippet omitted the mode).
        f = open('/tmp/download', 'w')
        body = urlopen(url).read()
        f.write(body)
        f.close()

    tasks.set_func(fetch)
    tasks.main()
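
As noted above, a task signals failure by raising; the variant below adds a hypothetical empty-body check to the fetch example purely to illustrate that, and is not part of the original snippet.

    def fetch(url):
        body = urlopen(url).read()
        if not body:
            # Any uncaught exception here marks the job as failed.
            raise ValueError('empty response from %s' % url)
        with open('/tmp/download', 'w') as f:
            f.write(body)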

Now to add jobs, create a file with one argument per line and use this command.

$ python yourfile.py add <list_of_jobs.txt>
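
For the fetch example above, the jobs file is simply one URL per line; the filename and URLs below are made up for illustration (save them as, say, urls.txt and pass that file to the add command shown above).

    http://example.com/page1.html
    http://example.com/page2.html
    http://example.com/page3.html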

To start (or restart) job processing, run this in a screen session or with the input stream closed (two possible ways are sketched after the command):

$ python yourfile.py run
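
As a sketch, either start a GNU screen session first (the session name is arbitrary) and run the command inside it, or close the input stream and put the process in the background:

$ screen -S tasks
$ python yourfile.py run

$ python yourfile.py run </dev/null &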

tasks has resume support, so it will start where you left off the last time.

To view the current status while it is running:

$ python yourfile.py status

Once you are done, you can clear the logs and the completed tasks by calling reset.

$ python yourfile.py reset

See the code or the test.py file for more information. Feel free to fork and modify this.

Project homepage: http://www.open-open.com/lib/view/home/1418016693292