Apache Airflow version
2.6.0
What happened
I upgraded Airflow from 2.5.3 to 2.6.0 using the official Helm chart 1.9.0 on a Kubernetes cluster. The DB migration job fails with a circular import of "TaskInstanceKey". The image I'm using is apache/airflow:2.6.0-python3.10, and my configuration uses the CeleryKubernetesExecutor.
Here is the stack trace:
Traceback (most recent call last):
File "/home/airflow/.local/bin/airflow", line 8, in <module>
sys.exit(main())
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/__main__.py", line 48, in main
args.func(args)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/cli/cli_config.py", line 51, in command
return func(*args, **kwargs)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/cli.py", line 112, in wrapper
return f(*args, **kwargs)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/cli/commands/db_command.py", line 84, in upgradedb
db.upgradedb(
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/session.py", line 76, in wrapper
return func(*args, session=session, **kwargs)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/db.py", line 1544, in upgradedb
import_all_models()
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/__init__.py", line 60, in import_all_models
__getattr__(name)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/__init__.py", line 78, in __getattr__
val = import_string(f"{path}.{name}")
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/module_loading.py", line 36, in import_string
module = import_module(module_path)
File "/usr/local/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/dag.py", line 82, in <module>
from airflow.models.dagrun import DagRun
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/dagrun.py", line 57, in <module>
from airflow.models.taskinstance import TaskInstance as TI
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/taskinstance.py", line 99, in <module>
from airflow.sentry import Sentry
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/sentry.py", line 195, in <module>
Sentry = ConfiguredSentry()
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/sentry.py", line 92, in __init__
executor_class, _ = ExecutorLoader.import_default_executor_cls()
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/executors/executor_loader.py", line 158, in import_default_executor_cls
executor, source = cls.import_executor_cls(executor_name)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/executors/executor_loader.py", line 134, in import_executor_cls
return _import_and_validate(cls.executors[executor_name]), ConnectorSource.CORE
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/executors/executor_loader.py", line 129, in _import_and_validate
executor = import_string(path)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/module_loading.py", line 36, in import_string
module = import_module(module_path)
File "/usr/local/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/executors/celery_kubernetes_executor.py", line 26, in <module>
from airflow.executors.kubernetes_executor import KubernetesExecutor
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/executors/kubernetes_executor.py", line 44, in <module>
from airflow.kubernetes import pod_generator
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/kubernetes/pod_generator.py", line 46, in <module>
from airflow.kubernetes.kubernetes_helper_functions import add_pod_suffix, rand_str
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/kubernetes/kubernetes_helper_functions.py", line 26, in <module>
from airflow.models.taskinstance import TaskInstanceKey
ImportError: cannot import name 'TaskInstanceKey' from partially initialized module 'airflow.models.taskinstance' (most likely due to a circular import) (/home/airflow/.local/lib/python3.10/site-packages/airflow/models/taskinstance.py)
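For context, the chain in the traceback is dag.py -> dagrun.py -> taskinstance.py -> sentry.py -> executor loader -> celery_kubernetes_executor.py -> kubernetes_helper_functions.py, which imports TaskInstanceKey from taskinstance.py while that module is still being initialized. The sketch below is not Airflow code; it is just a two-module toy illustration of why Python reports a "partially initialized module" in this situation:
# toy_a.py -- stands in for airflow/models/taskinstance.py
import toy_b  # taskinstance.py indirectly pulls in kubernetes_helper_functions.py

TaskInstanceKey = "defined too late"  # only exists once toy_a finishes executing

# toy_b.py -- stands in for airflow/kubernetes/kubernetes_helper_functions.py
from toy_a import TaskInstanceKey  # toy_a is still initializing -> ImportError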
What you think should happen instead
The DB migration job should complete without a circular import error.
How to reproduce
I have a complex automation pipeline with many configuration options, so for now I will not include my detailed configuration here; please let me know if you need specific details.
I installed Airflow on my Kubernetes cluster using the official Helm chart 1.9.0 with a Postgres database. The DB migration job starts, but it fails with the error above. A minimal sketch of what I believe triggers the same import chain is below.
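This is only a sketch under assumptions, not a verified reproduction: the environment variable names follow the usual AIRFLOW__{SECTION}__{KEY} convention, and I'm assuming Sentry being enabled is what makes sentry.py load the executor at import time, since the traceback goes through ConfiguredSentry.
# repro_sketch.py -- hedged sketch, assuming Sentry is enabled and the executor
# is CeleryKubernetesExecutor, as in my deployment.
import os

# Standard AIRFLOW__{SECTION}__{KEY} overrides; values assumed to match my Helm config.
os.environ["AIRFLOW__CORE__EXECUTOR"] = "CeleryKubernetesExecutor"
os.environ["AIRFLOW__SENTRY__SENTRY_ON"] = "True"

# Importing the DAG model should walk the same chain as the migration job's
# import_all_models(): dag -> dagrun -> taskinstance -> sentry -> executor loader
# -> celery_kubernetes_executor -> kubernetes_helper_functions -> taskinstance again.
import airflow.models.dag  # noqa: F401  (expected to fail with the ImportError above)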
Operating System
Linux, AWS EKS-based Kubernetes
Versions of Apache Airflow Providers
No response
Deployment
Official Apache Airflow Helm Chart
Deployment details
My full configuration is large and contains sensitive data; please let me know if you need specific details.
As described under "How to reproduce", Airflow is installed on my Kubernetes cluster via the official Helm chart 1.9.0 with a Postgres database, and the DB migration job fails with the error above.
Anything else
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct