docker-airflow | building docker based airflow image | Continuous Deployment library

 by abhioncbr | Python Version: Current | License: Apache-2.0

kandi X-RAY | docker-airflow Summary

docker-airflow is a Python library typically used in Telecommunications, Media, Entertainment, DevOps, Continuous Deployment, and Docker applications. docker-airflow has no bugs or vulnerabilities, it has a Permissive license, and it has low support. However, a build file is not available. You can download it from GitHub.

Repo for building a Docker-based Airflow image. Containers support multiple features, such as writing logs to a local or S3 folder and initializing GCP while the container boots. https://abhioncbr.github.io/docker-airflow/

            kandi-support Support

              docker-airflow has a low active ecosystem.
              It has 30 stars, 5 forks, and 2 watchers.
              It had no major release in the last 6 months.
              There are 3 open issues; 13 have been closed. On average, issues are closed in 77 days. There are no pull requests.
              It has a neutral sentiment in the developer community.
              The latest version of docker-airflow is current.

            kandi-Quality Quality

              docker-airflow has 0 bugs and 0 code smells.

            kandi-Security Security

              docker-airflow has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
              docker-airflow code analysis shows 0 unresolved vulnerabilities.
              There are 0 security hotspots that need review.

            kandi-License License

              docker-airflow is licensed under the Apache-2.0 License. This license is Permissive.
              Permissive licenses have the least restrictions, and you can use them in most projects.

            kandi-Reuse Reuse

              docker-airflow releases are not available. You will need to build from source code and install.
              docker-airflow has no build file. You will need to create the build yourself to build the component from source.
              Installation instructions, examples and code snippets are available.
              docker-airflow saves you 8543 person hours of effort in developing the same functionality from scratch.
              It has 17527 lines of code, 1005 functions and 15 files.
              It has high code complexity. Code complexity directly impacts maintainability of the code.

            Top functions reviewed by kandi - BETA

            kandi has reviewed docker-airflow and discovered the below as its top functions. This is intended to give you an instant insight into docker-airflow implemented functionality, and help decide if they suit your requirements.
            • Start the task
            • Return a generator of all the failed dependencies of this task
            • Update the state of the job
            • Check if dependencies met
            • Generate a graph
            • Return a state token
            • Returns the number of days of the given dag run
            • Get the duration of the workflow
            • Gets the height of the chart
            • Decorator to require authentication
            • Get a dag by its dag id
            • Return the TaskInstance for this task
            • Set the value of a variable
            • Returns the headers
            • Get the value of the key
            • Clear a dag run
            • Render an object
            • Return the string representation of this job
            • The main entry point
            • Update the state of all tasks
            • Get landing times
            • Collect all DAGs from the given dag
            • Return a list of task instances
            • Retrieve the number of attempts
            • Clear the contents of a DAG
            • Login

            docker-airflow Key Features

            No Key Features are available at this moment for docker-airflow.

            docker-airflow Examples and Code Snippets

            No Code Snippets are available at this moment for docker-airflow.

            Community Discussions

            QUESTION

            Connection Airflow Docker to PostgreSql Docker
            Asked 2022-Mar-08 at 18:02

            I am using Airflow and PostgreSQL in Docker.

            So I set up a PostgreSQL database on port 5433, in container 384eaa7b6efb. This is where I have the data that I want to fetch with my dag in Airflow.

            docker ps

            ...

            ANSWER

            Answered 2021-Oct-17 at 15:37
            Short Answer

            Change the host to host.docker.internal.

            Long Answer

            This depends on the OS you are using. In order to access the host's network from within a container, you need to use the host's IP address inside Docker. Conveniently, on Windows and Mac this is resolved using the domain host.docker.internal from within the container. As specified in Docker's documentation:

            I want to connect from a container to a service on the host

            The host has a changing IP address (or none if you have no network access). We recommend that you connect to the special DNS name host.docker.internal which resolves to the internal IP address used by the host. This is for development purpose and will not work in a production environment outside of Docker Desktop for Mac.

            There is also a workaround for this on Linux, which has been answered in What is linux equivalent of "host.docker.internal"
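To make the fix concrete, here is a minimal sketch of the connection URI a DAG running inside the container would use to reach the host's Postgres on port 5433 (the port from the question). The function name and credentials are hypothetical; the point is that the host is host.docker.internal, not localhost.

```python
# Minimal sketch: a container reaching a Postgres instance published on the
# *host* at port 5433 must use host.docker.internal (on Docker Desktop for
# Windows/Mac), because "localhost" inside the container is the container itself.
def build_postgres_uri(user: str, password: str, db: str,
                       host: str = "host.docker.internal",
                       port: int = 5433) -> str:
    """Assemble a SQLAlchemy-style Postgres URI (helper name is hypothetical)."""
    return f"postgresql://{user}:{password}@{host}:{port}/{db}"

uri = build_postgres_uri("airflow", "secret", "mydb")
print(uri)  # postgresql://airflow:secret@host.docker.internal:5433/mydb
```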

            Source https://stackoverflow.com/questions/69605527

            QUESTION

            Airflow on Docker: Can't Write to Volume (Permission Denied)
            Asked 2022-Jan-02 at 11:51

            Goal

            I'm trying to run a simple DAG which creates a pandas DataFrame and writes to a file. The DAG is being run in a Docker container with Airflow, and the file is being written to a named volume.

            Problem

            When I start the container, I get the error:

            ...

            ANSWER

            Answered 2022-Jan-02 at 11:51

            Don't use the Puckel Docker image. It has not been maintained for years, and Airflow 1.10 reached end of life in June 2021. You should only look at Airflow 2, which has an official reference image you can use:

            Airflow 2 also has quick-start guides, based on the image and Docker Compose: https://airflow.apache.org/docs/apache-airflow/stable/start/index.html

            It also has a Helm chart that can be used to productionize your setup: https://airflow.apache.org/docs/helm-chart/stable/index.html

            Don't waste your (and others') time on Puckel and Airflow 1.10.

            Source https://stackoverflow.com/questions/70551163

            QUESTION

            How to add airflow variables in docker compose file?
            Asked 2021-Nov-23 at 11:35

            I have a docker compose file which spins up a local airflow instance, as below:

            ...

            ANSWER

            Answered 2021-Nov-22 at 15:40

            If you add an environment variable named AIRFLOW_VAR_CONFIG_BUCKET to the list under environment:, it should be accessible by Airflow. Sounds like you're doing that correctly.

            Two things to note:

            • Variables (& connections) set via environment variables are not visible in the Airflow UI. You can test if they exist by executing Variable.get("config_bucket") in code.
            • The Airflow scheduler/worker (depending on Airflow executor) require access to the variable while running a task. Adding a variable to the webserver is not required.
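The naming rule behind this answer can be sketched in plain Python. This is an illustration of the convention only (an env var named AIRFLOW_VAR_<KEY> backs Variable.get("<key>")), not Airflow's actual implementation; the helper name is hypothetical.

```python
import os

# Sketch of Airflow's environment-backed Variable naming convention:
# Variable.get("config_bucket") is resolved from AIRFLOW_VAR_CONFIG_BUCKET.
def env_var_name(variable_key: str) -> str:
    return f"AIRFLOW_VAR_{variable_key.upper()}"

# Simulate what the compose file's environment: section would set:
os.environ["AIRFLOW_VAR_CONFIG_BUCKET"] = "s3://my-bucket"

value = os.environ.get(env_var_name("config_bucket"))
print(value)  # s3://my-bucket
```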

            Source https://stackoverflow.com/questions/70068360

            QUESTION

            Difference in restart and restart_policy in docker_compose.yml
            Asked 2021-Nov-05 at 12:30

            I have a docker-compose file for some services, among them an airflow-webserver. I realized that I can both add restart and deploy-restart_policy to the compose file. I tried searching for a difference between the two, but could only find posts discussing the individual settings (like on-failure or always).

            • What is the difference of setting the configuration?
            • Which should I use?
            • Is it a versioning issue, e.g. restart is from older versions and deploy-restart_policy is the newer one?

            Example docker-compose.yml:

            ...

            ANSWER

            Answered 2021-Nov-05 at 12:30

            The restart and deploy.restart_policy options configure the same thing but depend on the way you run your containers:

            • restart is used by Docker Compose
            • deploy.restart_policy is used by Docker Swarm

            The deploy option is used for Docker Swarm only and is ignored by Docker Compose.

            From the documentation on deploy.restart_policy:

            Configures if and how to restart containers when they exit. Replaces restart.

            And here about restart:

            The restart option is ignored when deploying a stack in swarm mode.
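The distinction above can be sketched in a minimal compose fragment. The service name and image tag are hypothetical; only the placement of the two options matters.

```yaml
services:
  airflow-webserver:
    image: apache/airflow:2.3.0   # hypothetical tag
    # Honored by Docker Compose; ignored when deploying a stack in swarm mode:
    restart: always
    # Honored by Docker Swarm (docker stack deploy); the deploy key is
    # ignored by Docker Compose, per the answer above:
    deploy:
      restart_policy:
        condition: on-failure
        max_attempts: 3
```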

            Source https://stackoverflow.com/questions/69853088

            QUESTION

            FileNotFoundError: [Errno 2] No such file or directory when a task tried to save a file
            Asked 2021-Feb-25 at 13:52

            I'm trying to copy the contents of a csv file into a postgres database in two tasks: the first task downloads the csv file and saves it in the /temp folder, and the other is a postgres task that copies the elements into the database. However, the first task fails with a FileNotFoundError when trying to save the file outside of the dag folder.

            The callable function that saves the file :

            ...

            ANSWER

            Answered 2021-Feb-25 at 13:52

            You need to map a volume to share data between the container and the host.

            see answer here copy file from docker to host system using python script
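A hedged sketch of such a volume mapping in docker-compose.yml; the host paths and the container's /temp path are assumptions based on the question, not the asker's actual setup.

```yaml
services:
  airflow-webserver:
    volumes:
      # host path : container path — files the DAG writes to /temp inside the
      # container now persist on the host (paths are hypothetical)
      - ./temp:/temp
      - ./dags:/usr/local/airflow/dags
```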

            Source https://stackoverflow.com/questions/66313677

            QUESTION

            Airflow 2.0 Docker setup
            Asked 2021-Jan-30 at 14:59

            Recently been trying to learn Airflow, but a majority of resources online depended on this repo https://github.com/puckel/docker-airflow which unfortunately has been removed.

            I am not familiar with docker so I'm just trying to set up locally and play around with Airflow. I'm on a windows setup and have already gotten docker working on my computer. Does Airflow have a quick-set-up file for a docker-compose? Or is there any other resources I can look at? Thanks.

            ...

            ANSWER

            Answered 2021-Jan-30 at 06:32

            I recently added a quick-start guide to the official Apache Airflow documentation. Unfortunately, this guide has not been released yet; it will be released in Airflow 2.0.1. For now, you can use the development version, and when a stable version is released it will be very easy for you to migrate. I don't expect any major changes to our docker-compose.yaml files. http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/docs/apache-airflow/latest/start/docker.html

            Source https://stackoverflow.com/questions/65963934

            QUESTION

            Reading files on local machine from docker container
            Asked 2021-Jan-15 at 19:13

            I am running airflow from a docker image called puckel/docker-airflow. I synchronized my local dags folder with the container's airflow folder when I started the container, using the following docker command:

            ...

            ANSWER

            Answered 2021-Jan-15 at 19:13

            Within the container, you have to point to the config file in the mounted volume, so change config_parser.read("C:/dags/file.ini") to config_parser.read("/usr/local/airflow/dags/file.ini").
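The fix above can be demonstrated with a runnable sketch. A temporary directory stands in for the container's mounted /usr/local/airflow/dags folder (the puckel image's default), and the .ini contents are hypothetical; the point is that configparser must be given the container-side path, not the host's C:/dags.

```python
import configparser
import os
import tempfile

# Stand-in for the container's mounted DAGs folder (/usr/local/airflow/dags):
dags_dir = tempfile.mkdtemp()
ini_path = os.path.join(dags_dir, "file.ini")
with open(ini_path, "w") as f:
    f.write("[s3]\nbucket = my-bucket\n")  # hypothetical config contents

config_parser = configparser.ConfigParser()
config_parser.read(ini_path)  # read via the container path, not the host path
print(config_parser["s3"]["bucket"])  # my-bucket
```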

            Source https://stackoverflow.com/questions/65741547

            QUESTION

            Running puckel/docker-airflow image on Raspberry Pi
            Asked 2021-Jan-02 at 18:09
            1. Why are some docker images incompatible with platforms like Raspberry Pi (linux/arm/v7)?
            2. Furthermore, can you modify a Dockerfile or another config file such that it is compatible?

            Thanks for any suggestions!

            So far I've installed docker and docker-compose then followed the puckel/docker-airflow readme, skipping the optional build, then tried to run the container by:

            ...

            ANSWER

            Answered 2021-Jan-02 at 09:25

            Executable (binary) files depend on the computer's architecture (amd64, arm, ...), and a Docker image contains binary files, so a Docker image is architecture dependent.

            Therefore, a Docker registry specifies the OS and architecture of each image. If you look at dockerhub/puckel/docker-airflow, you can see it supports only linux/amd64; in other words, it doesn't work on an ARM architecture. There are several ways to run it on ARM, but they all come down to one point: building the original code for ARM, not amd64, as a Docker image.

            In github.com/puckel/docker-airflow, the build guidelines are well documented.

            First, if you look at the Dockerfile provided by the GitHub repo, it starts FROM python:3.7-slim-buster. The python:3.7-slim-buster image supports linux/arm/v5, linux/arm/v7, and linux/arm64/v8: dockerhub/python/3.7-slim-buster

            In other words, you can build it for an ARM architecture.

            I have experience creating images for multiple architectures through the docker buildx command. Of course, other methods exist, but I will only briefly introduce the commands below.

            dockerhub/buildx

            • docker buildx is an experimental feature, and experimental features are not recommended for use in production environments.

            Source https://stackoverflow.com/questions/65537248

            QUESTION

            BashOperator not found on Airflow install on Raspberry Pi
            Asked 2020-Dec-23 at 09:43

            I'm not sure how to approach this issue -- other posts to do with ModuleNotFoundError are solved by reinstalling the relevant package, however it's clear that this is not the issue because the example bash operator DAG runs. So is my issue to do with how Airflow was installed? At this point I'm looking at reinstalling Airflow via the puckel Docker container.

            ...

            ANSWER

            Answered 2020-Dec-23 at 09:43

            Since you are using Airflow 1.10.14, the import should be

            Source https://stackoverflow.com/questions/65419191

            QUESTION

            Use Java with Airflow and Docker
            Asked 2020-Dec-16 at 10:53

            I have a use case where I want to run a jar file via Airflow, all of which has to live in a Docker Container on Mac.

            I have tried installing java separately and also, tried mounting my JAVA_HOME(host) onto the container.

            This is my docker-compose.yaml:

            ...

            ANSWER

            Answered 2020-Dec-16 at 10:53

            I think you're getting Permission denied because you are running Docker with the airflow user.

            Can you try to run it as root? (This is risky! Don't use it in production; it is just a temporary workaround.) In general, avoid running as the root user.

            Source https://stackoverflow.com/questions/65288026

            Community Discussions and Code Snippets contain sources that include the Stack Exchange Network

            Vulnerabilities

            No vulnerabilities reported

            Install docker-airflow

            The DockerFile uses airflow-version as a build-arg.
            To build the image, with optional customization, run:

            docker build -t abhioncbr/docker-airflow:$IMAGE_VERSION --build-arg AIRFLOW_VERSION=$AIRFLOW_VERSION --build-arg AIRFLOW_PATCH_VERSION=$AIRFLOW_PATCH_VERSION -f ~/docker-airflow/docker-files/DockerFile .

            The IMAGE_VERSION value should be the Airflow version, for example 1.10.3 or 1.10.2. The AIRFLOW_PATCH_VERSION value should be the major release version of Airflow; for example, for 1.10.2 it should be 1.10.

            Support

            For any new features, suggestions, and bugs, create an issue on GitHub. If you have any questions, check and ask on Stack Overflow.

            CLONE
          • HTTPS

            https://github.com/abhioncbr/docker-airflow.git

          • CLI

            gh repo clone abhioncbr/docker-airflow

          • sshUrl

            git@github.com:abhioncbr/docker-airflow.git
