@twein89
2016-07-11T03:31:43.000000Z
celery
scrapy
Celery is a distributed task queue. Its broker can be RabbitMQ, Redis, or another backend; here we use Redis as the message-queue broker.
pip install celery
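The Redis broker also needs a Redis client library. If it is not already installed, the celery[redis] bundle (available in recent Celery releases) pulls it in:
pip install "celery[redis]"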
Create a task file tasks.py:
from celery import Celery

app = Celery('tasks', broker='redis://username:password@localhost:6379')

@app.task
def add(x, y):
    return x + y
In the same directory as tasks.py, start the celery worker service from the command line:
celery -A tasks worker --loglevel=info
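With the worker running, tasks can be sent to it from any Python shell or script in the same directory. A minimal sketch (no result backend is configured here, so the return value cannot be fetched, but the worker log will show the task executing):
from tasks import add
add.delay(4, 4)  # sends the task to the Redis broker; the worker picks it up and runs it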
Add a periodic task to CELERYBEAT_SCHEDULE in the Celery configuration.
Example: run the tasks.add task every 30 seconds.
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'tasks.add',
        'schedule': timedelta(seconds=30),
        'args': (16, 16)
    },
}

CELERY_TIMEZONE = 'Asia/Shanghai'
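Where these settings live depends on how the app is configured. One common layout (a sketch, assuming the schedule above is saved as a celeryconfig.py next to tasks.py) loads them with config_from_object:
# in tasks.py, after creating the app
app.config_from_object('celeryconfig')  # celeryconfig.py holds CELERYBEAT_SCHEDULE, CELERY_TIMEZONE, etc.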
A crontab schedule gives finer control over when a task runs, for example at a particular time of day or on a particular day of the week.
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    # Executes every Monday morning at 7:30 A.M.
    'add-every-monday-morning': {
        'task': 'tasks.add',
        'schedule': crontab(hour=7, minute=30, day_of_week=1),
        'args': (16, 16),
    },
}
Example | Meaning |
---|---|
crontab() | Execute every minute. |
crontab(minute=0, hour=0) | Execute daily at midnight. |
crontab(minute=0, hour='*/3') | Execute every three hours: midnight, 3am, 6am, 9am, noon, 3pm, 6pm, 9pm. |
crontab(minute=0, hour='0,3,6,9,12,15,18,21') | Same as previous. |
crontab(minute='*/15') | Execute every 15 minutes. |
crontab(day_of_week='sunday') | Execute every minute (!) on Sundays. |
crontab(minute='*', hour='*', day_of_week='sun') | Same as previous. |
Start the celery beat service:
celery -A tasks beat
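During development, the beat scheduler can also be embedded in the worker with the -B flag, so a single process handles both (convenient locally, but not recommended for production):
celery -A tasks worker -B --loglevel=info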