
celery | Distributed Task Queue | Pub Sub library

by celery | Python Version: 5.3.0b1 | License: Non-SPDX

kandi X-RAY | celery Summary

celery is a Python library typically used in Messaging and Pub Sub applications. A build file is available and the project has medium support. However, kandi's analysis reports 16 bugs and 2 vulnerabilities, and the project uses a Non-SPDX license. You can install it with 'pip install celery' or download it from GitHub or PyPI.
Distributed Task Queue (development branch)
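
For orientation, here is a minimal sketch of what a celery application looks like. It is illustrative only: the module name tasks.py, the add task, and the Redis broker URL are assumptions, not details taken from this page.

    # tasks.py -- minimal celery app (illustrative; the Redis broker URL is an assumption)
    from celery import Celery

    app = Celery(
        "tasks",
        broker="redis://localhost:6379/0",
        backend="redis://localhost:6379/0",
    )

    @app.task
    def add(x, y):
        # Executed by a worker process started with: celery -A tasks worker --loglevel=INFO
        return x + y

Starting a worker with celery -A tasks worker --loglevel=INFO makes the task callable from any process that can reach the broker.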

Support

  • celery has a moderately active ecosystem.
  • It has 20864 stars and 4423 forks. There are 476 watchers for this library.
  • There were 6 major releases in the last 12 months.
  • There are 533 open issues and 4273 have been closed. On average, issues are closed in 333 days. There are 61 open pull requests and 0 closed pull requests.
  • It has a neutral sentiment in the developer community.
  • The latest version of celery is 5.3.0b1.

Quality

  • celery has 16 bugs (4 blocker, 1 critical, 3 major, 8 minor) and 937 code smells.

Security

  • celery has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
  • celery code analysis shows 2 unresolved vulnerabilities (2 blocker, 0 critical, 0 major, 0 minor).
  • There are 30 security hotspots that need review.

License

  • celery has a Non-SPDX License.
  • A Non-SPDX license may be an open-source license that simply lacks an SPDX identifier, or it may not be an open-source license at all; review it closely before use.

Reuse

  • celery releases are available to install and integrate.
  • Deployable package is available in PyPI.
  • Build file is available. You can build the component from source.
  • celery saves you 43207 person hours of effort in developing the same functionality from scratch.
  • It has 51047 lines of code, 5998 functions and 329 files.
  • It has medium code complexity. Code complexity directly impacts maintainability of the code.
Top functions reviewed by kandi - BETA

kandi has reviewed celery and identified the functions below as its top functions. This is intended to give you an instant insight into the functionality celery implements and to help you decide whether it suits your requirements.

  • Build a tracer.
  • Create write handlers.
  • List of worker threads.
  • Prepare steps for processing.
  • Create the event dispatcher.
  • Create a task sender.
  • Set the TTL for the table.
  • Calls the task.
  • Send a single task.
  • Moves messages from a queue.


celery Key Features

Distributed Task Queue (development branch)

celery Examples and Code Snippets

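A common next step, sketched below under stated assumptions, is to send a task to a worker and collect its result. The snippet assumes the illustrative tasks.py module from the sketch near the top of this page and a worker that is already running; delay, apply_async, and AsyncResult.get are standard celery calls.

    # Illustrative only: assumes the tasks.py app sketched earlier and a running worker.
    from tasks import add

    # delay() is shorthand for apply_async() with just positional arguments.
    result = add.delay(2, 3)

    # apply_async() exposes execution options, e.g. a countdown before the task runs.
    later = add.apply_async((10, 20), countdown=5)

    # get() blocks until the worker stores a result in the configured result backend.
    print(result.get(timeout=10))   # -> 5
    print(later.get(timeout=30))    # -> 30

delay() is the convenient form; apply_async() is the full-featured one when you need routing, countdowns, or other execution options.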

Community Discussions

Trending Discussions on celery
  • Cannot install additional requirements to apache airflow
  • run two celery task
  • How to give celery enough permission to run a root file without compromising security?
  • No live output from containers running on Docker Desktop with WSL2
  • Transform nested collection in laravel
  • How can I properly kill a celery task in a kubernetes environment?
  • Celery showing django's runserver logs instead of celery logs
  • Use existing celery workers for Airflow's Celeryexecutor workers
  • Django-Celery No Periodic Outputs
  • Using Celery for long running async jobs

QUESTION

Cannot install additional requirements to apache airflow

Asked 2021-Jun-14 at 16:35

I am using the following docker-compose file, which I got from: https://github.com/apache/airflow/blob/main/docs/apache-airflow/start/docker-compose.yaml

                      version: "3"
                      x-airflow-common: &airflow-common
                        image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.0.0-python3.7}
                        environment: &airflow-common-env
                          AIRFLOW__CORE__EXECUTOR: CeleryExecutor
                          AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
                          AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
                          AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
                          AIRFLOW__CORE__FERNET_KEY: ""
                          AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: "true"
                          AIRFLOW__CORE__LOAD_EXAMPLES: "false"
                          AIRFLOW__API__AUTH_BACKEND: "airflow.api.auth.backend.basic_auth"
                          _PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-apache-airflow-providers-apache-spark}
                        volumes:
                          - ./dags:/opt/airflow/dags
                          - ./logs:/opt/airflow/logs
                          - ./plugins:/opt/airflow/plugins
                        user: "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-50000}"
                        depends_on: &airflow-common-depends-on
                          redis:
                            condition: service_healthy
                          postgres:
                            condition: service_healthy
                      
                      services:
                        postgres:
                          image: postgres:13
                          environment:
                            POSTGRES_USER: airflow
                            POSTGRES_PASSWORD: airflow
                            POSTGRES_DB: airflow
                          volumes:
                            - postgres-db-volume:/var/lib/postgresql/data
                          healthcheck:
                            test: ["CMD", "pg_isready", "-U", "airflow"]
                            interval: 5s
                            retries: 5
                          restart: always
                      
                        redis:
                          image: redis:latest
                          ports:
                            - 6379:6379
                          healthcheck:
                            test: ["CMD", "redis-cli", "ping"]
                            interval: 5s
                            timeout: 30s
                            retries: 50
                          restart: always
                      
                        airflow-webserver:
                          <<: *airflow-common
                          command: webserver
                          ports:
                            - 8080:8080
                          healthcheck:
                            test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
                            interval: 10s
                            timeout: 10s
                            retries: 5
                          restart: always
                          depends_on:
                            <<: *airflow-common-depends-on
                            airflow-init:
                              condition: service_completed_successfully
                      
                        airflow-scheduler:
                          <<: *airflow-common
                          command: scheduler
                          healthcheck:
                            test:
                              [
                                "CMD-SHELL",
                                'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"',
                              ]
                            interval: 10s
                            timeout: 10s
                            retries: 5
                          restart: always
                          depends_on:
                            <<: *airflow-common-depends-on
                            airflow-init:
                              condition: service_completed_successfully
                      
                        airflow-worker:
                          <<: *airflow-common
                          command: celery worker
                          healthcheck:
                            test:
                              - "CMD-SHELL"
                              - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
                            interval: 10s
                            timeout: 10s
                            retries: 5
                          restart: always
                          depends_on:
                            <<: *airflow-common-depends-on
                            airflow-init:
                              condition: service_completed_successfully
                      
                        airflow-init:
                          <<: *airflow-common
                          command: version
                          environment:
                            <<: *airflow-common-env
                            _AIRFLOW_DB_UPGRADE: "true"
                            _AIRFLOW_WWW_USER_CREATE: "true"
                            _AIRFLOW_WWW_USER_USERNAME: ${_AIRFLOW_WWW_USER_USERNAME:-airflow}
                            _AIRFLOW_WWW_USER_PASSWORD: ${_AIRFLOW_WWW_USER_PASSWORD:-airflow}
                      
                        flower:
                          <<: *airflow-common
                          command: celery flower
                          ports:
                            - 5555:5555
                          healthcheck:
                            test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
                            interval: 10s
                            timeout: 10s
                            retries: 5
                          restart: always
                          depends_on:
                            <<: *airflow-common-depends-on
                            airflow-init:
                              condition: service_completed_successfully
                      
                        ######################################################
                        # SPARK SERVICES
                        ######################################################
                      
                        jupyterlab:
                          image: andreper/jupyterlab:3.0.0-spark-3.0.0
                          container_name: jupyterlab
                          ports:
                            - 8888:8888
                            - 4040:4040
                          volumes:
                            - shared-workspace:/opt/workspace
                        spark-master:
                          image: andreper/spark-master:3.0.0
                          container_name: spark-master
                          ports:
                            - 8081:8080
                            - 7077:7077
                          volumes:
                            - shared-workspace:/opt/workspace
                        spark-worker-1:
                          image: andreper/spark-worker:3.0.0
                          container_name: spark-worker-1
                          environment:
                            - SPARK_WORKER_CORES=1
                            - SPARK_WORKER_MEMORY=512m
                          ports:
                            - 8082:8081
                          volumes:
                            - shared-workspace:/opt/workspace
                          depends_on:
                            - spark-master
                        spark-worker-2:
                          image: andreper/spark-worker:3.0.0
                          container_name: spark-worker-2
                          environment:
                            - SPARK_WORKER_CORES=1
                            - SPARK_WORKER_MEMORY=512m
                          ports:
                            - 8083:8081
                          volumes:
                            - shared-workspace:/opt/workspace
                          depends_on:
                            - spark-master
                      
                      volumes:
                        postgres-db-volume:
                        shared-workspace:
                          name: "jordi_airflow"
                          driver: local
                          driver_opts:
                            type: "none"
                            o: "bind"
                            device: "/Users/jordicrespoguzman/Projects/custom_airflow_spark/spark_folder"
                      

I am trying to run the following DAG:

                      from airflow import DAG
                      from airflow.providers.http.sensors.http import HttpSensor
                      from airflow.sensors.filesystem import FileSensor
                      from airflow.operators.python import PythonOperator
                      from airflow.operators.bash import BashOperator
                      
                      from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator
                      from airflow.operators.email import EmailOperator
                      
                      from datetime import datetime, timedelta
                      import csv
                      import requests
                      import json
                      
                      default_args = {
                          "owner": "airflow",
                          "email_on_failure": False,
                          "email_on_retry": False,
                          "email": "admin@localhost.com",
                          "retries": 1,
                          "retry_delay": timedelta(minutes=5),
                      }
                      
                      
                      def printar():
                          print("success!")
                      
                      
                      with DAG(
                          "forex_data_pipeline",
                          start_date=datetime(2021, 1, 1),
                          schedule_interval="@daily",
                          default_args=default_args,
                          catchup=False,
                      ) as dag:
                      
                          downloading_rates = PythonOperator(task_id="test1", python_callable=printar)
                      
                          forex_processing = SparkSubmitOperator(
                              task_id="spark1",
                              application="/opt/airflow/dags/test.py",
                              conn_id="spark_conn",
                              verbose=False,
                          )
                      
                          downloading_rates  >> forex_processing
                      

But I see this error in the Airflow UI:

                      Broken DAG: [/opt/airflow/dags/dag_spark.py] Traceback (most recent call last):
                        File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
                        File "/opt/airflow/dags/dag_spark.py", line 7, in <module>
                          from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator
                      ModuleNotFoundError: No module named 'airflow.providers.apache'
                      

I have specified the additional requirements to install in the docker-compose file:

                      _PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-apache-airflow-providers-apache-spark}
                      

Am I writing it wrong? How should I specify the additional requirements I want to install in Airflow? Can I pass a requirements.txt file? If so, how do I specify its path?

ANSWER

Answered 2021-Jun-14 at 16:35

Support for the _PIP_ADDITIONAL_REQUIREMENTS environment variable has not been released yet. It is only supported by the developer/unreleased version of the Docker image. This feature is planned for Airflow 2.1.1. For more information, see: Adding extra requirements for build and runtime of the PROD image.

For older versions, you should build a new image and set that image in docker-compose.yaml. To do this, follow these steps:

1. Create a new Dockerfile with the following content:

       FROM apache/airflow:2.0.0
       RUN pip install --no-cache-dir apache-airflow-providers-apache-spark

2. Build a new image:

       docker build . --tag my-company-airflow:2.0.0

3. Set this image in the docker-compose.yaml file:

       echo "AIRFLOW_IMAGE_NAME=my-company-airflow:2.0.0" >> .env

For more information, see: Official guide about running Airflow in a docker-compose environment.

In particular, I recommend this fragment, which describes what to do when you need to install a new pip package:

ModuleNotFoundError: No module named 'XYZ'

The Docker Compose file uses the latest Airflow image (apache/airflow). If you need to install a new Python library or system library, you can customize and extend it.

I recommend you check out the guide about building the Docker image. It explains how to install even more complex dependencies.

I also recommend only using the docker-compose files from the official website that are intended for your specific version. Docker-compose files from newer versions may not work with older versions of Airflow, because we are making many improvements to these files all the time to improve stability, reliability, and user experience.

Source: https://stackoverflow.com/questions/67851351

Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.

Vulnerabilities

No vulnerabilities reported

Install celery

You can install celery using 'pip install celery' or download it from GitHub or PyPI.
You can use celery like any standard Python library. You will need a development environment consisting of a Python distribution (including header files), a compiler, pip, and git. Make sure that your pip, setuptools, and wheel are up to date. When using pip, it is generally recommended to install packages in a virtual environment to avoid changing the system installation.
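
A quick way to confirm the installation in the active environment is to import the package and print its version; this is a minimal sketch, assuming celery has already been installed as described above.

    # Sanity check after installation (illustrative).
    import celery

    # Prints the installed version string, e.g. "5.3.0b1" if the pre-release listed above was installed.
    print(celery.__version__)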

Support

For any new features, suggestions, and bugs, create an issue on GitHub. If you have any questions, check and ask on the Stack Overflow community page.


Install

  • pip install celery

Clone

  • https://github.com/celery/celery.git

  • gh repo clone celery/celery

  • git@github.com:celery/celery.git
