airflow-docker | Apache Airflow Docker Image | Continuous Deployment library

 by Shinichi-Nakagawa | Python | Version: Current | License: Apache-2.0

kandi X-RAY | airflow-docker Summary

airflow-docker is a Python library typically used in DevOps, Continuous Deployment, and Docker applications. It has no reported bugs or vulnerabilities, has a build file available, carries a permissive license, and has low community support. You can download it from GitHub.

Apache Airflow Docker Image.

            Support

              airflow-docker has a low active ecosystem.
              It has 18 stars and 10 forks. There is 1 watcher for this library.
              It had no major release in the last 6 months.
              There is 1 open issue and 0 have been closed. On average, issues are closed in 1122 days. There are no pull requests.
              It has a neutral sentiment in the developer community.
              The latest version of airflow-docker is current.

            Quality

              airflow-docker has 0 bugs and 0 code smells.

            Security

              airflow-docker has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
              airflow-docker code analysis shows 0 unresolved vulnerabilities.
              There are 0 security hotspots that need review.

            License

              airflow-docker is licensed under the Apache-2.0 License. This license is Permissive.
              Permissive licenses have the least restrictions, and you can use them in most projects.

            Reuse

              airflow-docker releases are not available. You will need to build from source code and install.
              Build file is available. You can build the component from source.
              Installation instructions, examples and code snippets are available.
              airflow-docker saves you 40 person hours of effort in developing the same functionality from scratch.
              It has 106 lines of code, 4 functions and 3 files.
              It has medium code complexity. Code complexity directly impacts maintainability of the code.

            Top functions reviewed by kandi - BETA

            kandi has reviewed airflow-docker and identified the functions below as its top functions. This is intended to give you an instant insight into the functionality airflow-docker implements, and to help you decide whether it suits your requirements.
            • Create an Airflow configuration file.
            • Create a new Airflow user.
            • Create a password user.
            • Initialize parameters.

            airflow-docker Key Features

            No Key Features are available at this moment for airflow-docker.

            airflow-docker Examples and Code Snippets

            No Code Snippets are available at this moment for airflow-docker.

            Community Discussions

            QUESTION

            How to Connect a Database (Postgres) to Airflow Composer on Google Cloud Platform?
            Asked 2022-Mar-10 at 08:09

            I have an Airflow setup on my local machine. The DAGs are written in such a way that they need to access a Postgres database. I am trying to set up something similar on Google Cloud Platform, but I am not able to connect the database to Airflow in Composer. I keep getting the error "no host postgres". Any suggestions for setting up Airflow on GCP or connecting a database to an Airflow Composer environment?

            Here is a link to my complete Airflow folder (this setup works fine on my local machine with Docker):

            https://github.com/digvijay13873/airflow-docker.git

            I am using GCP Composer. The Postgres database is in a Cloud SQL instance. My table-creation DAG is here: https://github.com/digvijay13873/airflow-docker/blob/main/dags/tablecreation.py

            What changes should I make to my existing DAG to connect it to Postgres in the SQL instance? I tried giving the public IP address of Postgres in the host parameter.

            ...

            ANSWER

            Answered 2022-Mar-10 at 08:09

            Answering your main question: connecting a Cloud SQL instance from GCP in a Cloud Composer environment can be done in two ways:

            • Using the public IP
            • Using the Cloud SQL proxy (recommended): secure access without the need for authorized networks and SSL configuration

            Connecting using the public IP: for Postgres, connect directly via TCP (non-SSL).
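
            As a rough sketch of the public-IP approach (this code is not from the original answer; the connection id, DAG id, and query are placeholders), a task can use Airflow's PostgresHook with a connection whose host is set to the Cloud SQL instance's public IP:

                # Illustrative sketch only. Assumes an Airflow connection named
                # "cloudsql_postgres" (a placeholder) whose host is the Cloud SQL
                # instance's public IP on port 5432 with valid credentials, and that
                # the apache-airflow-providers-postgres package is installed.
                from datetime import datetime

                from airflow import DAG
                from airflow.operators.python import PythonOperator
                from airflow.providers.postgres.hooks.postgres import PostgresHook


                def check_connection():
                    hook = PostgresHook(postgres_conn_id="cloudsql_postgres")  # placeholder conn id
                    print(hook.get_records("SELECT 1;"))  # trivial connectivity check


                with DAG(
                    dag_id="cloudsql_public_ip_check",  # placeholder DAG id
                    start_date=datetime(2022, 1, 1),
                    schedule_interval=None,
                    catchup=False,
                ) as dag:
                    PythonOperator(task_id="check_connection", python_callable=check_connection)

            With the Cloud SQL proxy approach, the connection host would instead point at the proxy (for example 127.0.0.1 when the proxy runs as a sidecar).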

            Source https://stackoverflow.com/questions/71089074

            QUESTION

            When running Apache Airflow in Docker, how can I fix the issue where my DAGs remain broken even after fixing them?
            Asked 2022-Feb-03 at 21:40


            So in my case I had previously run Airflow locally, directly on my machine, and now I'm trying to run it in containers using Docker while also keeping the history of my previous DAGs. However, I've been having some issues.
            A slight bit of background: when I first used docker-compose to bring up my containers, Airflow was reporting an error saying that the column dag_has_import_errors doesn't exist. So I just went ahead and created it, and everything seemed to work fine.
            Now, however, my DAGs are all broken, and when I modify one without fixing the issue I can see the updated line of code in the brief error information that shows up at the top of the webserver.
            However, when I do resolve the issue, the code doesn't change and the DAG remains broken. I'll provide an image of the error and an image of the code.

            Also, the following is my docker-compose file (I commented out airflow db init, but maybe I should have kept it with the db upgrade parameter set to true?). My compose file is based on this template.

            ...

            ANSWER

            Answered 2022-Feb-03 at 21:40

            LET'S GO! Piece of cake!
            I finally got it to work :). The main issue was the fact that I didn't have all the required packages. I tried doing just pip install configparser in the container, and this actually helped for one of the DAGs I had to run. However, this didn't seem sustainable or practical, so I decided to go ahead with the Dockerfile method, in effect extending the image (I believe that is what they call it). So here's my Dockerfile:

            Source https://stackoverflow.com/questions/70944153
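
            As an aside (not part of the answer above), one way to check from inside the container whether DAG import errors like these are really fixed is to load the DAG folder with Airflow's DagBag and print any remaining import errors; /opt/airflow/dags is the default DAG path in the official image:

                # Run inside the Airflow container (e.g. via `docker exec` into the
                # scheduler) to list DAG files that still fail to import.
                from airflow.models import DagBag

                dag_bag = DagBag(dag_folder="/opt/airflow/dags", include_examples=False)
                if dag_bag.import_errors:
                    for path, error in dag_bag.import_errors.items():
                        print(f"Broken DAG: {path}\n{error}")
                else:
                    print("No import errors - all DAGs parsed successfully.")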

            QUESTION

            How to add a new user to the Docker image when running a distributed Airflow architecture using docker-compose
            Asked 2021-Oct-12 at 11:56

            (THE ORIGINAL QUESTION WAS EDITED TO MAKE IT MORE CLEAR)

            1. SOLUTION AT THE END OF THE QUESTION
            2. ANOTHER SOLUTION IN THE ANSWER

            The goal and the setup

            The main goal is to run container-based processing (using the DockerOperator) when the Airflow Celery worker is also running inside a Docker container. At the moment I'm testing the setup on one machine, but in the end I'll run the Celery worker containers on separate machines operating in the same network, sharing some of the Airflow-specific mount points (dags, logs, plugins), user ids, etc.

            I'm launching the whole setup from a docker-compose.yml where I set AIRFLOW_UID to match my UID on the host machine and AIRFLOW_GID to 0, as suggested in the Airflow documentation. On the host, my UID belongs to the docker group, but it doesn't belong to group 0. The /var/run/docker.sock is mounted into the containers.

            TEST 1

            I followed the example described here: https://towardsdatascience.com/using-apache-airflow-dockeroperator-with-docker-compose-57d0217c8219 , using the above-mentioned setup with the official Airflow 2.1.4 image and the DockerOperator. The task run fails, which is related to the fact that the default user doesn't have the needed permissions on /var/run/docker.sock. (I still need to check whether adding the user to group 0 on the host would solve the issue, as pointed out by @JarekPotiuk in his comment. The problem is that group 0 is the root group, and most likely I won't get permission to add the user to it.)

            ...

            ANSWER

            Answered 2021-Sep-24 at 14:13

            I'm launching the whole setup from a docker-compose.yml where I set AIRFLOW_UID=1234 and AIRFLOW_GID=0. I'm using a docker image based on the official airflow image with the addition that I have created 'newuser' with gid=1234 and 'docker' group with gid that matches the one at the host.

            You should not do it at all. The user will be created automatically by the Airflow image's entrypoint when you use a different UID than the default - see https://airflow.apache.org/docs/docker-stack/entrypoint.html#allowing-arbitrary-user-to-run-the-container. In fact, everything you want to do should be possible without having to extend the Airflow image.

            What you need to do is create the user that you want to run inside the container ON THE HOST - not in the container. And it should belong to the docker group ON THE HOST - not in the container.

            Docker works in such a way that it uses the same kernel/users that are defined on the host system, so when you run something as a user in the container, it runs with that "host" user's privileges. So if you map your Docker socket into the container, the container will be able to use the socket and run docker commands, because it will have the right permissions on the host.

            Therefore (in case you run your docker-compose as a regular user who already belongs to the docker group), the best way is the one suggested in the quick-start - i.e. run Airflow with the "host" user that you are logged in with: https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html

            This also makes all the files created in the container belong to the "logged-in" user (if they are created in directories mounted inside - such as the logs directory).

            But if your goal is to use it in an "unattended" environment, then creating the new user on your host and adding that user to both group 0 and the docker group should likely solve the problem.
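
            As a small illustration of the quick-start suggestion (the Airflow docs use a one-line shell command for this; the Python below is only an equivalent sketch and assumes a POSIX host), the idea is simply to put the host user's UID into the .env file that docker-compose reads:

                # Sketch: write the host UID into the .env file next to docker-compose.yml,
                # roughly equivalent to the documented shell one-liner
                #   echo -e "AIRFLOW_UID=$(id -u)" > .env
                # os.getuid() is POSIX-only, so this will not run on Windows.
                import os
                from pathlib import Path

                env_file = Path(".env")  # assumed to sit next to docker-compose.yml
                env_file.write_text(f"AIRFLOW_UID={os.getuid()}\n")
                print(env_file.read_text())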

            Source https://stackoverflow.com/questions/69316093

            QUESTION

            Airflow: how to get pip packages installed via their docker-compose.yml?
            Asked 2021-May-11 at 14:24

            OK, I am probably very stupid, but anyway: how can I install additional pip packages via the Airflow docker-compose file?

            I am assuming that there should be standard functionality to pick up a requirements.txt or something. When inspecting their repo, I do see some ENV variables like ADDITIONAL_PYTHON_DEPS that hint that this should be possible, but setting these in the docker-compose file doesn't actually install the libraries.

            ...

            ANSWER

            Answered 2021-May-11 at 14:24

            There is a pretty detailed guide on how to achieve what you are looking for in the Airflow docs. Depending on your requirements, this may be as easy as extending the original image using a FROM directive in a new Dockerfile, or you may need to customize the image to suit your needs.

            If you go with the "extending the image" approach, your new Dockerfile will be something like this:

            Source https://stackoverflow.com/questions/66699394

            QUESTION

            See the Airflow UI in Docker
            Asked 2020-Apr-03 at 13:34

            This is probably going to sound stupid, but...

            I'm trying to run Airflow on a Windows machine. I'm aware that Airflow doesn't work on Windows, so I thought I'd use Docker.

            So after installing Docker on Windows, I opened up my cmd and typed:

            ...

            ANSWER

            Answered 2020-Apr-03 at 13:34

            If you want port 8080 to be exposed to the host, you can use the -p parameter in the docker run command. You can also set the command webserver directly when starting the container. This will start Airflow with the SequentialExecutor.
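
            For illustration only (this is not part of the original answer), here is a rough equivalent of that docker run invocation using the Python Docker SDK instead of the CLI; the image tag is a placeholder and the Airflow metadata database is assumed to be already initialized:

                # Sketch using the Python Docker SDK (`pip install docker`). Roughly
                # equivalent to: docker run -d -p 8080:8080 <airflow-image> webserver
                # Assumes the Docker daemon is reachable and the metadata DB exists.
                import docker

                client = docker.from_env()
                container = client.containers.run(
                    "apache/airflow:2.7.3",      # placeholder image tag
                    command="webserver",         # start the webserver directly
                    ports={"8080/tcp": 8080},    # publish container port 8080 on the host
                    detach=True,
                )
                print(container.short_id)        # UI should then be reachable at http://localhost:8080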

            Source https://stackoverflow.com/questions/61012760

            QUESTION

            Airflow doesn't recognise my S3 Connection setting
            Asked 2020-Mar-31 at 08:29

            I am using Airflow with the Kubernetes executor and testing it out locally (using minikube). While I was able to get it up and running, I can't seem to store my logs in S3. I have tried all the solutions that are described and I am still getting the following error:

            ...

            ANSWER

            Answered 2020-Mar-31 at 08:29

            I am pretty certain this issue is because the S3 logging configuration has not been set on the worker pods. The worker pods are not given configuration that is set using environment variables such as AIRFLOW__CORE__REMOTE_LOGGING: True. If you wish to set such a variable in the worker pod, then you must copy the variable and prefix the copied environment variable name with AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__, giving AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__REMOTE_LOGGING: True.

            In this case you would need to duplicate all of the variables specifying your S3 logging config and prefix the copies with AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__.
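
            As an illustrative sketch (not from the original answer; the variable names and values below are typical examples for S3 remote logging, not a complete list), the duplication can be expressed like this:

                # Duplicate Airflow env vars so the KubernetesExecutor forwards them to
                # worker pods by prefixing copies with AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__.
                PREFIX = "AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__"

                s3_logging_env = {
                    "AIRFLOW__CORE__REMOTE_LOGGING": "True",
                    "AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER": "s3://my-bucket/airflow/logs",  # placeholder bucket
                    "AIRFLOW__CORE__REMOTE_LOG_CONN_ID": "my_s3_conn",                       # placeholder conn id
                }

                # Keep the originals and add a prefixed copy of each for the worker pods.
                worker_env = {**s3_logging_env, **{PREFIX + key: value for key, value in s3_logging_env.items()}}

                for name, value in sorted(worker_env.items()):
                    print(f"{name}={value}")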

            Source https://stackoverflow.com/questions/60935008

            Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install airflow-docker

            Pull the image from the Docker repository.

            Support

            For any new features, suggestions, and bugs, create an issue on GitHub. If you have any questions, check and ask questions on the Stack Overflow community page.

            CLONE
          • HTTPS

            https://github.com/Shinichi-Nakagawa/airflow-docker.git

          • CLI

            gh repo clone Shinichi-Nakagawa/airflow-docker

          • SSH URL

            git@github.com:Shinichi-Nakagawa/airflow-docker.git

