Apache Airflow CeleryExecutor PostgreSQL Redis: Start the great environment using Docker-Compose in 5 minutes!

In this post I will show you how to create, in 5 minutes, a fully operational Apache Airflow environment based on the CeleryExecutor, PostgreSQL and Redis. It will include:


  • Apache Airflow WebServer
  • Apache Airflow Worker
  • Apache Airflow Scheduler
  • Flower – a web-based tool for monitoring and administering Celery clusters
  • Redis – an open-source (BSD-licensed), in-memory data structure store, used as a database, cache and message broker
  • PostgreSQL – the metadata database for Airflow

docker-compose.yml script for Apache Airflow

Create the docker-compose.yml file and paste the script below. Then run the docker-compose up -d command.

(The script below comes from the Puckel docker-airflow project.)

version: '2.1'
services:
    redis:
        image: 'redis:5.0.5'
        # command: redis-server --requirepass redispass

    postgres:
        image: postgres:9.6
        environment:
            - POSTGRES_USER=airflow
            - POSTGRES_PASSWORD=airflow
            - POSTGRES_DB=airflow
        # Uncomment these lines to persist data on the local filesystem.
        #     - PGDATA=/var/lib/postgresql/data/pgdata
        # volumes:
        #     - ./pgdata:/var/lib/postgresql/data/pgdata

    webserver:
        image: puckel/docker-airflow:1.10.4
        restart: always
        depends_on:
            - postgres
            - redis
        environment:
            - LOAD_EX=n
            - FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=
            - EXECUTOR=Celery
            # - POSTGRES_USER=airflow
            # - POSTGRES_PASSWORD=airflow
            # - POSTGRES_DB=airflow
            # - REDIS_PASSWORD=redispass
        volumes:
            - ./dags:/usr/local/airflow/dags
            # Uncomment to include custom plugins
            # - ./plugins:/usr/local/airflow/plugins
        ports:
            - "8080:8080"
        command: webserver
        healthcheck:
            test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
            interval: 30s
            timeout: 30s
            retries: 3

    flower:
        image: puckel/docker-airflow:1.10.4
        restart: always
        depends_on:
            - redis
        environment:
            - EXECUTOR=Celery
            # - REDIS_PASSWORD=redispass
        ports:
            - "5555:5555"
        command: flower

    scheduler:
        image: puckel/docker-airflow:1.10.4
        restart: always
        depends_on:
            - webserver
        volumes:
            - ./dags:/usr/local/airflow/dags
            # Uncomment to include custom plugins
            # - ./plugins:/usr/local/airflow/plugins
        environment:
            - LOAD_EX=n
            - FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=
            - EXECUTOR=Celery
            # - POSTGRES_USER=airflow
            # - POSTGRES_PASSWORD=airflow
            # - POSTGRES_DB=airflow
            # - REDIS_PASSWORD=redispass
        command: scheduler

    worker:
        image: puckel/docker-airflow:1.10.4
        restart: always
        depends_on:
            - scheduler
        volumes:
            - ./dags:/usr/local/airflow/dags
            # Uncomment to include custom plugins
            # - ./plugins:/usr/local/airflow/plugins
        environment:
            - FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=
            - EXECUTOR=Celery
            # - POSTGRES_USER=airflow
            # - POSTGRES_PASSWORD=airflow
            # - POSTGRES_DB=airflow
            # - REDIS_PASSWORD=redispass
        command: worker
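
Note that the FERNET_KEY above is a publicly known example value, so for anything beyond a throwaway local demo you should generate your own and use it in all three Airflow services. A Fernet key (the format used by the cryptography library that Airflow relies on) is simply 32 random bytes encoded as URL-safe base64, which you can produce with the Python standard library alone:

```python
import base64
import os

def generate_fernet_key() -> str:
    """Generate a Fernet key: 32 random bytes, URL-safe base64-encoded.

    The result is a 44-character ASCII string suitable for the
    FERNET_KEY environment variable in the docker-compose.yml above.
    """
    return base64.urlsafe_b64encode(os.urandom(32)).decode("ascii")

if __name__ == "__main__":
    print(generate_fernet_key())
```

Paste the printed value into the FERNET_KEY variable of the webserver, scheduler and worker services (they must all share the same key, since it is used to encrypt connection passwords in the metadata database).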

Check the container status

Before navigating to the user interface pages, check that all containers are in the "Up" state. To do this, use the command:

docker-compose ps
           Name                         Command                  State                         Ports                   
-----------------------------------------------------------------------------------------------------------------------
airflow-docker_flower_1      /entrypoint.sh flower            Up             0.0.0.0:5555->5555/tcp, 8080/tcp, 8793/tcp
airflow-docker_postgres_1    docker-entrypoint.sh postgres    Up             5432/tcp                                  
airflow-docker_redis_1       docker-entrypoint.sh redis ...   Up             6379/tcp                                  
airflow-docker_scheduler_1   /entrypoint.sh scheduler         Up             5555/tcp, 8080/tcp, 8793/tcp              
airflow-docker_webserver_1   /entrypoint.sh webserver         Up (healthy)   5555/tcp, 0.0.0.0:8080->8080/tcp, 8793/tcp
airflow-docker_worker_1      /entrypoint.sh worker            Up             5555/tcp, 8080/tcp, 8793/tcp          

User Interface

When all containers are running, we can open in turn the Airflow web UI at http://localhost:8080 and the Flower UI at http://localhost:5555 (the ports published in the docker-compose.yml above).

Test DAG

A “dags” directory has been created in the directory where we ran the docker-compose up command (it is mounted into the containers via the ./dags volume). Let’s create our test DAG in it. For details, I will direct you to my other post, where I described exactly how to do it.

In short: create a test DAG (a Python file) in the “dags” directory. It will automatically appear in the Airflow UI. Then just trigger it, and additionally watch the task execution from the Flower UI.
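
As a minimal sketch, a test DAG could look like the file below (the dag_id, task_id and bash command are placeholders of my own choosing; the imports match the Airflow 1.10.x series that the puckel/docker-airflow:1.10.4 image ships):

```python
# dags/test_dag.py - a minimal test DAG for Airflow 1.10.x
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    "owner": "airflow",
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

# The context manager assigns the DAG to the operators defined inside it.
with DAG(
    dag_id="test_dag",
    default_args=default_args,
    start_date=datetime(2019, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    say_hello = BashOperator(
        task_id="say_hello",
        bash_command='echo "Hello from the Celery worker!"',
    )
```

Save the file, wait a moment for the scheduler to pick it up, and the DAG will appear in the web UI at http://localhost:8080, ready to be switched on and triggered.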


That’s all on how to start an Apache Airflow CeleryExecutor environment with PostgreSQL and Redis using Docker-Compose in 5 minutes!

