celery-cloudwatch | Publishes Celery task statistics to AWS CloudWatch | Cloud Storage library
kandi X-RAY | celery-cloudwatch Summary
Monitor your Celery application from within AWS CloudWatch!
Top functions reviewed by kandi - BETA
- Print the metrics from the given state
- Build the metrics
- Add a metric
- Add celery events to the metrics
- Send metrics
- Helper function to walk the dimensions
- Yield successive n-sized chunks
- Serialize this metric to a dictionary
- Start celery
- Set the state
- Cancel the timer
- Freeze the given function
- Print the time to stdout
- Return the total number of waiting tasks
- Return the average rate
- Find the package version
- Read a file
- Send the shutter signal
- Send metrics to CloudWatch
celery-cloudwatch Key Features
celery-cloudwatch Examples and Code Snippets
Community Discussions
Trending Discussions on celery-cloudwatch
QUESTION
Current setup: Celery running in Docker containers (with our product's code) on an EC2 node, creating and processing tasks. Our backend/broker is Redis, running in AWS ElastiCache.
Goal: being able to see the queue size at any given time (similar to Flower's monitoring), ideally through AWS CloudWatch, though that isn't required. The content of the tasks isn't pertinent; I'm familiar with making a backup of the Redis instance and can parse the backup with local tools for any analysis needed. Short-lived historical data is highly preferred (CloudWatch goes back 2 weeks with a granularity of 1-minute datapoints, which is quite nice).
Based on how I understand Flower works, it wouldn't be feasible to use due to the number of security groups/restrictions we currently have in place. Additionally, Flower only monitors while you're on the page, so no historical data is saved.
ElastiCache already has built-in CloudWatch metrics for the number of items in Redis. This seems like the best route to the goal. However, the queue currently appears as a single item in Redis (no matter how many tasks are in the queue). Here is a sample of the Redis backup parsed to JSON:
ANSWER
Answered 2019-Feb-06 at 22:56
To see the queue length of a queue using a Redis broker, just use `llen` in Redis, e.g. `llen celery`.
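In Python this check can be scripted with redis-py, whose `llen()` method maps to the same `LLEN` command; the helper below is a minimal sketch (the client is passed in, and the queue name `celery` is an assumption based on Celery's default queue):

```python
def queue_length(client, queue="celery"):
    """Return the number of tasks waiting in a Celery queue on a Redis broker.

    `client` is any object exposing redis-py's llen(); Celery stores pending
    tasks in a Redis list named after the queue ("celery" by default).
    """
    return client.llen(queue)

# Usage with redis-py against a reachable broker (hypothetical host):
#   import redis
#   client = redis.Redis(host="my-elasticache-endpoint", port=6379)
#   print(queue_length(client))
```

Polling this value on a schedule and publishing it as a custom metric would give the CloudWatch history the question asks for.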
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install celery-cloudwatch
Set up an IAM Role for your instance. It must include a policy allowing the `cloudwatch:PutMetricData` action, e.g.:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["cloudwatch:PutMetricData"],
      "Resource": ["*"]
    }
  ]
}
```

(Note: alternatively, you can set up an IAM User with the same policy and provide its access keys instead.)
Install via python-pip (upgrading pip and boto first):

```sh
sudo apt-get install -y python-pip
sudo pip install --upgrade pip boto

# Install directly
sudo pip install celery-cloudwatch

# OR, install in a virtualenv
sudo apt-get install -y python-virtualenv
mkdir /var/python-envs
virtualenv /var/python-envs/ccwatch
source /var/python-envs/ccwatch/bin/activate
pip install celery-cloudwatch
```
Create your own boto.cfg at /etc/boto.cfg:

```ini
[Credentials]
# if not using an IAM Role - provide aws key/secret
aws_access_key_id = xxx
aws_secret_access_key = yyy

[Boto]
cloudwatch_region_name = my-region
cloudwatch_region_endpoint = monitoring.my-region.amazonaws.com
```
Create your own config file in /etc/ccwatch.yaml:

```yaml
ccwatch:
  broker: null
  camera: celery_cloudwatch.CloudWatchCamera
  verbose: no

camera:
  frequency: 60.0
  verbose: no

cloudwatch-camera:
  dryrun: no
  namespace: celery
  tasks:
    - myapp.mytasks.taskname
    - myapp.mytasks.anothertask
    - myapp.mytasks.thirdtask
    - name: myapp.secondarytasks
      dimensions:
        task: myapp.secondarytasks
        customDim: value
    - name: myapp.tertiarytasks
      dimensions:
        task: myapp.tertiarytasks
        customDim: value
```
Install upstart. Create a file /etc/init/celery-cloudwatch.conf:

```
description "Celery CloudWatch"
author "nathan muir <ndmuir@gmail.com>"

setuid nobody
setgid nogroup

start on runlevel [234]
stop on runlevel [0156]

exec /var/python-envs/ccwatch/bin/ccwatch
respawn
```

then:

```sh
sudo initctl reload-configuration
sudo service celery-cloudwatch start
```
Start your Celery workers with the `-E` flag (or set `CELERY_SEND_EVENTS=1` and `CELERY_TRACK_STARTED=1`), and start Celery clients with `CELERY_SEND_TASK_SENT_EVENT=1`.
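In config-module form, the same event settings can be sketched like this, using the old-style uppercase Celery setting names referenced above (whether your app loads a module named `celeryconfig.py` is an assumption):

```python
# celeryconfig.py (sketch) - enable the events celery-cloudwatch consumes
CELERY_SEND_EVENTS = True           # workers emit task events (same as -E)
CELERY_TRACK_STARTED = True         # workers also emit task-started events
CELERY_SEND_TASK_SENT_EVENT = True  # clients emit task-sent events
```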
All done! Head over to your CloudWatch monitoring page to see the results.
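Under the hood, metrics reach CloudWatch through the PutMetricData API. As a rough illustration (not the library's actual code), a helper that builds one MetricData entry might look like this; the boto3 call in the comment assumes boto3 is installed and AWS credentials are configured:

```python
def build_metric(name, value, dimensions=None):
    """Build a single CloudWatch MetricData entry for put_metric_data()."""
    entry = {"MetricName": name, "Value": float(value), "Unit": "Count"}
    if dimensions:
        # CloudWatch expects dimensions as a list of Name/Value pairs
        entry["Dimensions"] = [
            {"Name": k, "Value": v} for k, v in sorted(dimensions.items())
        ]
    return entry

# Publishing (hypothetical usage, assumes configured AWS credentials):
#   import boto3
#   boto3.client("cloudwatch").put_metric_data(
#       Namespace="celery",
#       MetricData=[build_metric("waiting", 5,
#                                {"task": "myapp.mytasks.taskname"})],
#   )
```

Each task name configured in ccwatch.yaml becomes a `task` dimension on the published metrics, which is what makes per-task graphs possible in the CloudWatch console.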