
astro dev parse fails when parsing the SnowflakeOperator in a DAG #601

@ReadytoRocc

Description


Describe the bug
astro dev parse fails when parsing the SnowflakeOperator in a DAG. The error returned is below:

8eecb7-test-1  | ______________ test_file_imports[dags/snowflake-dag.py] ______________
8eecb7-test-1  | 
8eecb7-test-1  | rel_path = 'dags/snowflake-dag.py'
8eecb7-test-1  | rv = 'Traceback (most recent call last):\n  File "/usr/local/lib/python3.9/logging/__init__.py", line 1146, in __init__\n  ...r: [Errno 2] No such file or directory: \'/usr/local/airflow/NON_DEFAULT_OS_ENV_VALUE/snowflake_ssm_rt_telemetry.log\''
8eecb7-test-1  | 
8eecb7-test-1  |     @pytest.mark.parametrize("rel_path,rv", get_import_errors(), ids=[x[0] for x in get_import_errors()])
8eecb7-test-1  |     def test_file_imports(rel_path,rv):
8eecb7-test-1  |        """ Test for import errors on a file """
8eecb7-test-1  |        if rel_path and rv : #Make sure our no op test doesn't raise an error
8eecb7-test-1  | >              raise Exception(f"{rel_path} failed to import with message \n {rv}")
8eecb7-test-1  | E     Exception: dags/snowflake-dag.py failed to import with message 
8eecb7-test-1  | E      Traceback (most recent call last):
8eecb7-test-1  | E       File "/usr/local/lib/python3.9/logging/__init__.py", line 1146, in __init__
8eecb7-test-1  | E         StreamHandler.__init__(self, self._open())
8eecb7-test-1  | E       File "/usr/local/lib/python3.9/logging/__init__.py", line 1175, in _open
8eecb7-test-1  | E         return open(self.baseFilename, self.mode, encoding=self.encoding,
8eecb7-test-1  | E     FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/airflow/NON_DEFAULT_OS_ENV_VALUE/snowflake_ssm_rt_telemetry.log'
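The root failure is independent of Airflow itself: Python's logging.FileHandler opens its target file at construction time, so a handler pointed at a nonexistent directory raises FileNotFoundError immediately. A minimal sketch of that failure mode (not the Snowflake provider's actual telemetry code; the path here is illustrative):

```python
import logging
import os
import tempfile

# By default (delay=False), logging.FileHandler opens its file during
# __init__, so a missing parent directory fails right away -- the same
# FileNotFoundError seen in the traceback above, where the telemetry log
# directory does not exist inside the container.
missing_dir = os.path.join(tempfile.gettempdir(), "no_such_dir_astro_601")
try:
    logging.FileHandler(os.path.join(missing_dir, "telemetry.log"))
    error = None
except FileNotFoundError as exc:
    error = exc

print(error)  # [Errno 2] No such file or directory: '...'
```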

What CLI Version did you experience this bug?

Astro CLI Version: 1.0.2

What Operating System is the above CLI installed on?
macOS

🪜 Steps To Reproduce

Run astro dev parse with the following DAG in your dags directory:

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator
from datetime import datetime, timedelta

default_args = {
    "owner": "airflow",
    "depends_on_past": False,
    "email_on_failure": False,
    "email_on_retry": False,
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}
with DAG(
    "snowflake_dag",
    start_date=datetime(2022, 3, 16),
    default_args=default_args,
    schedule_interval=None,
    max_active_tasks=1,
    catchup=False,
) as dag:

    sf = SnowflakeOperator(
        task_id="sf",
        snowflake_conn_id="sf_fe_sa",
        sql="SYSTEM$WAIT(0)",
    )
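One possible workaround, inferred only from the traceback and untested against the provider: pre-create the directory the telemetry log handler expects before the DAG file imports the operator, so logging.FileHandler can open its file. The directory name "NON_DEFAULT_OS_ENV_VALUE" is copied verbatim from the error message (it presumably comes from an environment variable); a temp dir stands in here for illustration.

```python
import os
import tempfile

# Hypothetical workaround sketch: ensure the expected log directory exists
# before the provider's logging handler initializes. In the real container
# the path would be "/usr/local/airflow/NON_DEFAULT_OS_ENV_VALUE" per the
# traceback; a temp dir is used here so the snippet runs anywhere.
log_dir = os.path.join(tempfile.gettempdir(), "NON_DEFAULT_OS_ENV_VALUE")
os.makedirs(log_dir, exist_ok=True)  # idempotent: no error if it already exists
```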
