Command airflow db clean does not work anymore with BashOperator in Airflow 3+ #56281
-
I use Airflow 3.1.0 with Docker Compose (the official compose file). If I had to do this by hand, I would open a shell in the worker container and run `airflow db clean` there, and that works. I can also confirm that the container knows the DB connection string.
However, when I run the same command from a BashOperator, I get an error: the command is explicitly blocked with a deliberately malformed connection string. My question is why the command fails from a BashOperator when it works by hand in the same container.

Related question I opened on SO: https://stackoverflow.com/questions/79779432/how-can-i-run-airflow-db-clean-command-using-bashoperator-with-airlfow-3
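For reference, the task boils down to something like this (a minimal sketch; the `dag_id`, schedule, and timestamp are illustrative):

```python
from datetime import datetime

from airflow.sdk import DAG
from airflow.providers.standard.operators.bash import BashOperator

# Minimal reproduction: wrap the CLI call in a BashOperator.
with DAG(
    dag_id="db_cleanup",
    start_date=datetime(2025, 1, 1),
    schedule=None,
    catchup=False,
):
    BashOperator(
        task_id="clean_db",
        bash_command="airflow db clean --clean-before-timestamp 20250930 --yes",
    )
```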
-
You have a few ways to handle this:
**Option 1 – keep `airflow db clean` outside the DAG.** The simplest approach is to keep DB cleaning as a manual or scheduled job outside Airflow: use a cron job or systemd timer that executes the command inside the container. Example:

`docker exec airflow-airflow-worker-1 airflow db clean --clean-before-timestamp 20250930 --yes`
**Option 2 – run it from a dedicated container.** Create a small Docker container (or script) that runs the cleaning command, and schedule it via a KubernetesPodOperator or DockerOperator instead of a BashOperator. That container can safely hold the real DB connection string, bypassing the blocked environment.
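For the DockerOperator route, a sketch along these lines; the image tag, network name, and connection string are assumptions based on the official compose file and must be adapted to your deployment:

```python
from airflow.providers.docker.operators.docker import DockerOperator

# Sketch: run the cleanup in a throwaway container that is allowed
# to see the real metadata-DB connection string.
clean_db = DockerOperator(
    task_id="clean_db",
    image="apache/airflow:3.1.0",  # assumed tag; match your deployment
    command="airflow db clean --clean-before-timestamp 20250930 --yes",
    environment={
        # Default URL from the official compose file; adjust to yours.
        "AIRFLOW__DATABASE__SQL_ALCHEMY_CONN": "postgresql+psycopg2://airflow:airflow@postgres/airflow",
    },
    network_mode="airflow_default",  # the compose network; adjust to yours
)
```

This needs the `apache-airflow-providers-docker` package and access to the Docker socket from the worker.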
**Option 3 – override the environment in BashOperator.** If you really want to use a BashOperator, you can override the environment the task sees.
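A minimal, untested sketch: it assumes the compose-default connection string (pull it from a secrets backend in practice) and relies on `append_env=True` so the task-supplied value overrides the blocked one:

```python
from airflow.providers.standard.operators.bash import BashOperator

# Untested sketch: re-inject the real connection string (here the
# default from the official compose file) over the blocked value.
clean_db = BashOperator(
    task_id="clean_db",
    bash_command="airflow db clean --clean-before-timestamp 20250930 --yes",
    env={
        "AIRFLOW__DATABASE__SQL_ALCHEMY_CONN": "postgresql+psycopg2://airflow:airflow@postgres/airflow",
    },
    append_env=True,  # merge over the inherited (blocked) environment
)
```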
Caution: This works, but it bypasses the security protection, so use it only if you understand the risks.
**Option 4 – use the Python API instead.** If your goal is to clean old DAG runs or logs, you can use Airflow's Python API (e.g. deleting `airflow.models.DagRun` rows) in a PythonOperator, which is safer than calling destructive CLI commands.

**Recommended approach.** For production, the safest and most maintainable way is:

- Keep `airflow db clean` outside DAGs, running it either via cron or as a one-off container job.
- If you need to automate it from a DAG, use a DockerOperator or KubernetesPodOperator with an explicit DB connection, not a BashOperator.
- Avoid passing destructive commands directly inside DAGs; Airflow 3 intentionally prevents it for safety.