Get a Fully Configured Apache Airflow Docker Dev Stack with Bitnami
I’ve been using it for around 2 years now to build out custom workflow interfaces, like those used for Laboratory Information Management Systems (LIMS), computer vision pre- and post-processing pipelines, and to set and forget other genomics pipelines.
My favorite feature of Airflow is how completely agnostic it is to the work you are doing or where that work is taking place. It could take place locally, in a Docker container, on Kubernetes, on any number of AWS services, on an HPC system, etc. Using Airflow allows me to concentrate on the business logic of what I’m trying to accomplish without getting too bogged down in implementation details.
During that time I’ve adopted a set of systems that I use to quickly build out the main development stack with Docker and Docker Compose, using the Bitnami Apache Airflow stack. Generally, I then deploy to production either with the same Docker Compose stack, if it’s a small enough instance that is isolated, or with Kubernetes when I need to interact with other services or file systems.
Bitnami vs Roll Your Own
I used to roll my own Airflow containers using Conda. I still use this approach for most of my other containers, including microservices that interact with my Airflow system, but configuring Airflow is a lot more than just installing packages. Even just installing those packages is a pain, and I could rarely count on a rebuild actually working without some hassle. Then, on top of the packages, you need to configure database connections and a message queue.
Enter the Bitnami Apache Airflow Docker Compose stack for dev and the Bitnami Apache Airflow Helm chart for prod!
Bitnami, in their own words:
Bitnami makes it easy to get your favorite open source software up and running on any platform, including your laptop, Kubernetes and all the major clouds. In addition to popular community offerings, Bitnami, now part of VMware, provides IT organizations with an enterprise offering that is secure, compliant, continuously maintained and customizable to your organizational policies. https://bitnami.com/
Bitnami stacks (usually) work exactly the same from their Docker Compose stacks to their Helm charts. This means I can test and develop locally using my Compose stack; build out new images, versions, packages, etc.; and then deploy to Kubernetes. The configuration, environment variables, and everything else act the same. It would be a fairly large undertaking to do all this from scratch, so I use Bitnami.
They have plenty of enterprise offerings, but everything included here is open source and there is no paywall involved.
And no, I am not affiliated with Bitnami, although I have kids that eat a lot and don’t have any particular ethical aversions to selling out. ;-) I’ve just found their offerings to be excellent.
Project Structure
I like to have my projects organized so that I can run tree and have a general idea of what's happening.
Apache Airflow has 3 main components: the application, the worker, and the scheduler. Each of these has its own Docker image to separate out the services. Additionally, there is a database and a message queue, but we won’t be doing any customization to these.
.
└── docker
    └── bitnami-apache-airflow-1.10.10
        ├── airflow
        │   └── Dockerfile
        ├── airflow-scheduler
        │   └── Dockerfile
        ├── airflow-worker
        │   └── Dockerfile
        ├── dags
        │   └── tutorial.py
        └── docker-compose.yml
So what we have here is a directory called bitnami-apache-airflow-1.10.10, which brings us to a very important point: pin your versions! It will save you so, so much pain and frustration!
Then we have one Dockerfile per Airflow component.
Create this directory structure with:
mkdir -p docker/bitnami-apache-airflow-1.10.10/{airflow,airflow-scheduler,airflow-worker,dags}

The Docker Compose File
This is my preference for the docker-compose.yml file. I made a few changes to suit my own preferences: I pin versions, I build my own Docker images, and I have volume mounts for the dags, plugins, and database backups, along with mounting the Docker socket so I can run DockerOperators from within my stack (there’s a short sketch of that right after the compose file).
You can always go and grab the original docker-compose here.
version: '2'

services:
  postgresql:
    image: 'docker.io/bitnami/postgresql:10-debian-10'
    volumes:
      - 'postgresql_data:/bitnami/postgresql'
    environment:
      - POSTGRESQL_DATABASE=bitnami_airflow
      - POSTGRESQL_USERNAME=bn_airflow
      - POSTGRESQL_PASSWORD=bitnami1
      - ALLOW_EMPTY_PASSWORD=yes

  redis:
    image: docker.io/bitnami/redis:5.0-debian-10
    volumes:
      - 'redis_data:/bitnami'
    environment:
      - ALLOW_EMPTY_PASSWORD=yes

  airflow-scheduler:
    # image: docker.io/bitnami/airflow-scheduler:1-debian-10
    build:
      context: airflow-scheduler
    environment:
      - AIRFLOW_DATABASE_NAME=bitnami_airflow
      - AIRFLOW_DATABASE_USERNAME=bn_airflow
      - AIRFLOW_DATABASE_PASSWORD=bitnami1
      - AIRFLOW_EXECUTOR=CeleryExecutor
      # If you'd like to load the example DAGs change this to yes!
      - AIRFLOW_LOAD_EXAMPLES=no
      # only works with 1.10.11
      #- AIRFLOW__WEBSERVER__RELOAD_ON_PLUGIN_CHANGE=true
      #- AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION=False
    volumes:
      - airflow_scheduler_data:/bitnami
      - ./plugins:/opt/bitnami/airflow/plugins
      - ./dags:/opt/bitnami/airflow/dags
      - ./db_backups:/opt/bitnami/airflow/db_backups
      - /var/run/docker.sock:/var/run/docker.sock

  airflow-worker:
    # image: docker.io/bitnami/airflow-worker:1-debian-10
    build:
      context: airflow-worker
    environment:
      - AIRFLOW_DATABASE_NAME=bitnami_airflow
      - AIRFLOW_DATABASE_USERNAME=bn_airflow
      - AIRFLOW_DATABASE_PASSWORD=bitnami1
      - AIRFLOW_EXECUTOR=CeleryExecutor
      - AIRFLOW_LOAD_EXAMPLES=no
      # only works with 1.10.11
      #- AIRFLOW__WEBSERVER__RELOAD_ON_PLUGIN_CHANGE=true
      #- AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION=False
    volumes:
      - airflow_worker_data:/bitnami
      - ./plugins:/opt/bitnami/airflow/plugins
      - ./dags:/opt/bitnami/airflow/dags
      - ./db_backups:/opt/bitnami/airflow/db_backups
      - /var/run/docker.sock:/var/run/docker.sock

  airflow:
    # image: docker.io/bitnami/airflow:1-debian-10
    build:
      # You can also specify the build context
      # as cwd and point to a different Dockerfile
      context: .
      dockerfile: airflow/Dockerfile
    environment:
      - AIRFLOW_DATABASE_NAME=bitnami_airflow
      - AIRFLOW_DATABASE_USERNAME=bn_airflow
      - AIRFLOW_DATABASE_PASSWORD=bitnami1
      - AIRFLOW_EXECUTOR=CeleryExecutor
      - AIRFLOW_LOAD_EXAMPLES=no
      # only works with 1.10.11
      #- AIRFLOW__WEBSERVER__RELOAD_ON_PLUGIN_CHANGE=True
      #- AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION=False
    ports:
      - '8080:8080'
    volumes:
      - airflow_data:/bitnami
      - ./dags:/opt/bitnami/airflow/dags
      - ./plugins:/opt/bitnami/airflow/plugins
      - ./db_backups:/opt/bitnami/airflow/db_backups
      - /var/run/docker.sock:/var/run/docker.sock

volumes:
  airflow_scheduler_data:
    driver: local
  airflow_worker_data:
    driver: local
  airflow_data:
    driver: local
  postgresql_data:
    driver: local
  redis_data:
    driver: local
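Since the compose file mounts /var/run/docker.sock into every Airflow container, your DAGs can launch sibling containers on the host’s Docker daemon. Here is a minimal sketch of what that can look like with the DockerOperator in Airflow 1.10 — the DAG id, image, and command are placeholders, and it assumes the docker Python client library is available in the worker image (you can bake it in the same way we add flask-restful later on).

from datetime import timedelta

from airflow import DAG
from airflow.operators.docker_operator import DockerOperator
from airflow.utils.dates import days_ago

dag = DAG(
    'docker_example',  # hypothetical DAG, for illustration only
    default_args={'owner': 'airflow', 'start_date': days_ago(1)},
    schedule_interval=timedelta(days=1),
)

# Runs a throwaway container through the mounted Docker socket.
# 'python:3.8-slim' and the command are placeholders; swap in your
# own image and business logic.
hello = DockerOperator(
    task_id='hello_from_docker',
    image='python:3.8-slim',
    command='python -c "print(\'hello from a sibling container\')"',
    docker_url='unix://var/run/docker.sock',
    auto_remove=True,
    dag=dag,
)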
Pin your versions
The version of Apache Airflow used here is 1.10.10. Version 1.10.11 has some cool updates I would like to incorporate, so I will keep an eye on it!
You can always keep up with the latest Apache Airflow versions by checking out the changelog on the main site.
We are using Bitnami, which has bots that automatically build and update their images as new releases come along.
While this approach is great for bots, I strongly recommend against just hoping that the latest version will be backwards compatible and work with your setup.
Instead, pin a version, and when a new version comes along test it out in your dev stack. At the time of writing the most recent version is 1.10.11, but it doesn't quite work out of the box, so we are using 1.10.10.
Bitnami Apache Airflow Docker Tags
Generally speaking, a Docker tag corresponds to the application version. Sometimes there are other variants as well, such as the base OS. Here we can just go with the application version.
Bitnami Apache Airflow Scheduler Image Tags
Bitnami Apache Airflow Worker Image Tags
Bitnami Apache Airflow Web Image Tags
Build Custom Images
In our docker-compose.yml we have placeholders so that we can build custom images.
We’ll just create a minimal Dockerfile for now. Later I’ll show you how to customize your Docker containers with extra system or Python packages.
Airflow Application
echo "FROM docker.io/bitnami/airflow:1.10.10" > docker/bitnami-apache-airflow-1.10.10/airflow/DockerfileWill give you this airflow application docker file.
將為您提供此氣流應(yīng)用程序docker文件。
FROM docker.io/bitnami/airflow:1.10.10氣流調(diào)度器 (Airflow Scheduler)
echo "FROM docker.io/bitnami/airflow-scheduler:1.10.10" > docker/bitnami-apache-airflow-1.10.10/airflow-scheduler/DockerfileWill give you this airflow scheduler docker file.
將為您提供此氣流調(diào)度程序docker文件。
FROM docker.io/bitnami/airflow-scheduler:1.10.10氣流工人 (Airflow Worker)
echo "FROM docker.io/bitnami/airflow-worker:1.10.10" > docker/bitnami-apache-airflow-1.10.10/airflow-worker/DockerfileWill give you this airflow worker docker file.
會(huì)給你這個(gè)氣流工人泊塢窗文件。
FROM docker.io/bitnami/airflow-worker:1.10.10調(diào)高堆棧 (Bring Up The Stack)
Grab the docker-compose file above and let's get rolling!
cd docker/bitnami-apache-airflow-1.10.10
docker-compose up
If this is your first time running the command this will take some time. Docker will fetch any images it doesn’t already have, and build all the airflow-* images.
Navigate to the UI
Once everything is up and running navigate to the UI at http://localhost:8080.
Unless you changed the configuration, your default username/password is user/bitnami.
Login to check out your Airflow web UI!
Add in a Custom DAG
Here’s a DAG that I grabbed from the Apache Airflow Tutorial. I’ve only included it here for the sake of completeness.
from datetime import timedelta

# The DAG object; we'll need this to instantiate a DAG
from airflow import DAG
# Operators; we need this to operate!
from airflow.operators.bash_operator import BashOperator
from airflow.utils.dates import days_ago

# These args will get passed on to each operator
# You can override them on a per-task basis during operator initialization
default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': days_ago(2),
    'email': ['airflow@example.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
    # 'wait_for_downstream': False,
    # 'dag': dag,
    # 'sla': timedelta(hours=2),
    # 'execution_timeout': timedelta(seconds=300),
    # 'on_failure_callback': some_function,
    # 'on_success_callback': some_other_function,
    # 'on_retry_callback': another_function,
    # 'sla_miss_callback': yet_another_function,
    # 'trigger_rule': 'all_success'
}

dag = DAG(
    'tutorial',
    default_args=default_args,
    description='A simple tutorial DAG',
    schedule_interval=timedelta(days=1),
)

# t1, t2 and t3 are examples of tasks created by instantiating operators
t1 = BashOperator(
    task_id='print_date',
    bash_command='date',
    dag=dag,
)

t2 = BashOperator(
    task_id='sleep',
    depends_on_past=False,
    bash_command='sleep 5',
    retries=3,
    dag=dag,
)

dag.doc_md = __doc__

t1.doc_md = """\
#### Task Documentation
You can document your task using the attributes `doc_md` (markdown),
`doc` (plain text), `doc_rst`, `doc_json`, `doc_yaml` which gets
rendered in the UI's Task Instance Details page.
"""

templated_command = """
{% for i in range(5) %}
    echo "{{ ds }}"
    echo "{{ macros.ds_add(ds, 7) }}"
    echo "{{ params.my_param }}"
{% endfor %}
"""

t3 = BashOperator(
    task_id='templated',
    depends_on_past=False,
    bash_command=templated_command,
    params={'my_param': 'Parameter I passed in'},
    dag=dag,
)

t1 >> [t2, t3]
Anyways, grab this file and put it in your docker/bitnami-apache-airflow-1.10.10/dags folder. The name of the file itself doesn't matter. The DAG name will be whatever you set in the file.
Airflow will restart itself automatically, and if you refresh the UI you should see your new tutorial DAG listed.
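If the DAG doesn’t show up, a quick sanity check is to parse the DAG folder yourself from inside one of the containers — for example with docker-compose exec airflow /opt/bitnami/airflow/venv/bin/python (that venv path is where the custom Dockerfile below activates Airflow from, but treat it as an assumption for your image). A minimal sketch using the mount path from the compose file:

# Parse the mounted DAG folder and report any import errors
from airflow.models import DagBag

bag = DagBag('/opt/bitnami/airflow/dags')  # the ./dags mount from docker-compose.yml
print(bag.import_errors or 'no import errors')
print(list(bag.dags))  # 'tutorial' should appear in this list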
Build Custom Airflow Docker Containers
If you’d like to add additional system or Python packages, you can do so.
# docker/bitnami-apache-airflow-1.10.10/airflow/Dockerfile
FROM docker.io/bitnami/airflow:1.10.10

# From here - https://github.com/bitnami/bitnami-docker-airflow/blob/master/1/debian-10/Dockerfile
USER root

RUN apt-get update && apt-get upgrade -y && \
    apt-get install -y vim && \
    rm -r /var/lib/apt/lists /var/cache/apt/archives

RUN bash -c "source /opt/bitnami/airflow/venv/bin/activate && \
    pip install flask-restful && \
    deactivate"
To be clear, I don’t especially endorse this approach anymore, except that I like to add flask-restful for creating custom REST API plugins.
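For reference, here is roughly what such a plugin looks like in Airflow 1.10 — a hypothetical hello endpoint dropped into the mounted plugins folder, relying on the flask-restful install from the Dockerfile above. The plugin name and route are made up.

# plugins/hello_api.py - a hypothetical REST API plugin sketch
from airflow.plugins_manager import AirflowPlugin
from flask import Blueprint
from flask_restful import Api, Resource

blueprint = Blueprint('hello_api', __name__, url_prefix='/api/v1/hello')
api = Api(blueprint)

class Hello(Resource):
    def get(self):
        # Keep plugin logic thin; real work belongs in your services
        return {'status': 'ok'}

api.add_resource(Hello, '/')

class HelloApiPlugin(AirflowPlugin):
    # Airflow discovers this class in the plugins folder and registers
    # the blueprint on the webserver
    name = 'hello_api'
    flask_blueprints = [blueprint]

Once the webserver restarts, the endpoint should be reachable at http://localhost:8080/api/v1/hello/, depending on your auth setup.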
I like to treat Apache Airflow the way I treat web applications. I’ve been burned too many times, so now my web apps take care of routing and rendering views, and absolutely nothing else.
Airflow is about the same, except it handles the business logic of my workflows and absolutely nothing else. If I have some crazy pandas/tensorflow/opencv/whatever stuff I need to do, I’ll build that into a separate microservice and not touch my main business logic (there’s a sketch of that pattern below). I like to think of Airflow as the spider that sits in the web.
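To make that concrete, a task can hand the heavy lifting to a (hypothetical) microservice over HTTP with the stock SimpleHttpOperator — the connection id, endpoint, and payload below are all made up for illustration.

from airflow import DAG
from airflow.operators.http_operator import SimpleHttpOperator
from airflow.utils.dates import days_ago

dag = DAG(
    'cv_pipeline',  # hypothetical DAG, for illustration only
    default_args={'owner': 'airflow', 'start_date': days_ago(1)},
    schedule_interval=None,
)

# 'cv_service' is a hypothetical Airflow connection pointing at the
# microservice's base URL; 'process' is a made-up endpoint
run_cv_step = SimpleHttpOperator(
    task_id='run_cv_preprocessing',
    http_conn_id='cv_service',
    endpoint='process',
    method='POST',
    data='{"sample_id": "S001"}',
    headers={'Content-Type': 'application/json'},
    dag=dag,
)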
Still, I’m paranoid enough that I like to build my own images so I can then push them to my own docker repo.
Wrap Up and Where to Go from Here
Now that you have your foundation, it’s time to build out your data science workflows! Add some custom DAGs, create some custom plugins, and generally build stuff.
If you’d like to request a tutorial, please feel free to reach out to me at jillian@dabbleofdevops.com or on Twitter.
Cheat Sheet
Here are some hopefully helpful commands and resources.
Log into your Apache Airflow Instance
The default username and password is user and bitnami.
Docker Compose Commands
Build
cd docker/bitnami-apache-airflow-1.10.10
docker-compose build
Bring up your stack! Running docker-compose up makes all your logs come up on STDERR/STDOUT.
cd docker/bitnami-apache-airflow-1.10.10
docker-compose build && docker-compose up
If you’d like to run it in the background instead use -d.
cd docker/bitnami-apache-airflow-1.10.10
docker-compose build && docker-compose up -d
Bitnami Apache Airflow Configuration
You can further customize your Airflow instance using environment variables that you pass into the docker-compose file. Check out the README for details.
Load DAG files
Custom DAG files can be mounted to /opt/bitnami/airflow/dags or copied during the Docker build phase.
Specifying Environment Variables using Docker Compose
version: '2'

services:
  airflow:
    image: bitnami/airflow:latest
    environment:
      - AIRFLOW_FERNET_KEY=46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=
      - AIRFLOW_EXECUTOR=CeleryExecutor
      - AIRFLOW_DATABASE_NAME=bitnami_airflow
      - AIRFLOW_DATABASE_USERNAME=bn_airflow
      - AIRFLOW_DATABASE_PASSWORD=bitnami1
      - AIRFLOW_PASSWORD=bitnami123
      - AIRFLOW_USERNAME=user
      - AIRFLOW_EMAIL=user@example.com
Clean up after Docker
Docker can take up a lot of room on your filesystem.
If you’d like to clean up just the Airflow stack then:
cd docker/bitnami-apache-airflow-1.10.10
docker-compose stop
docker-compose rm -f -v
Running docker-compose rm -f forcibly removes all the containers, and the -v also removes all data volumes.
Remove all docker images everywhere
This will stop all running containers and remove them.
docker container stop $(docker container ls -aq)
docker system prune -f -a
This will remove all containers AND data volumes:
docker system prune -f -a --volumes

Originally published at https://www.dabbleofdevops.com.