google-cloud-composer

Is it possible to trigger a DAG from an on_failure_callback?

Question: I would like to trigger a DAG when a task fails. I want to use the "on_failure_callback"; however, I have not found information about it. Do you know if it is possible to trigger the DAG from the "on_failure_callback"? In the past, I …

Total answers: 2
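A minimal sketch of the pattern the question asks about: an `on_failure_callback` that starts another DAG. In Airflow 2.x the real call would be `airflow.api.common.trigger_dag.trigger_dag`; a stand-in is used here (and `recovery_dag` is a made-up DAG id) so the shape is runnable without Airflow installed.

```python
def trigger_dag(dag_id, run_id=None, conf=None):
    """Stand-in for airflow.api.common.trigger_dag.trigger_dag (Airflow 2.x)."""
    return {"dag_id": dag_id, "run_id": run_id, "conf": conf}

def on_failure_callback(context):
    # Airflow passes the task context to the callback; the failed task
    # instance tells the recovery DAG what went wrong.
    ti = context["task_instance"]
    return trigger_dag(
        dag_id="recovery_dag",  # hypothetical DAG to run on failure
        conf={"failed_task": ti["task_id"], "failed_dag": ti["dag_id"]},
    )

# Simulated context, shaped like what Airflow hands to the callback
# (in real Airflow, context["task_instance"] is a TaskInstance object).
result = on_failure_callback(
    {"task_instance": {"task_id": "extract", "dag_id": "etl"}}
)
print(result["dag_id"])  # recovery_dag
```

The callback is attached via `default_args={"on_failure_callback": on_failure_callback}` or per task; an alternative, if the failure handling can live in the same DAG, is a `TriggerDagRunOperator` with `trigger_rule="one_failed"`.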

Passing results of BigQuery query task to the next task while using template macro

Question: This seems a peculiar struggle, so I’m sure I’m missing something. Somehow I can’t seem to pass values using XCom, unless I’m using functions to execute the tasks that provide and use the information and call them from PythonOperator. This …

Total answers: 2
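A sketch of the XCom push/pull flow the question revolves around. `XComStub` is a stand-in for Airflow's task-instance XCom storage, and the query result is hard-coded; in a real DAG, returning a value from a `PythonOperator` callable pushes it to XCom automatically, and a templated field can pull it with `{{ ti.xcom_pull(task_ids='query') }}`.

```python
class XComStub:
    """Stand-in for the XCom storage a TaskInstance exposes."""
    def __init__(self):
        self._store = {}

    def xcom_push(self, task_id, key, value):
        self._store[(task_id, key)] = value

    def xcom_pull(self, task_ids, key="return_value"):
        return self._store[(task_ids, key)]

def run_query(ti):
    # In Airflow this value would come from the BigQuery client or a
    # BigQueryInsertJobOperator; here it is hard-coded for illustration.
    rows = [{"date": "2023-01-01", "count": 42}]
    ti.xcom_push(task_id="query", key="return_value", value=rows)

def use_query_result(ti):
    # Downstream task pulls the upstream task's return value by task id.
    rows = ti.xcom_pull(task_ids="query")
    return rows[0]["count"]

ti = XComStub()
run_query(ti)
print(use_query_result(ti))  # 42
```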

Why do os.getppid() and multiprocessing.parent_process().pid return different results when using multiprocessing in Airflow 2.x?

Question: I found that using multiprocessing under Airflow causes an assertion error. I solved my error ( this discussion and this discussion ), but I was curious about how processes actually work in an Airflow job, so I ran the code. def process_function(i): …

Total answers: 1
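The comparison the question makes can be reproduced with the standard library alone. In a plain script both values are the launching process's pid; inside an Airflow task the runner forks and can re-parent the worker, so the OS-level parent pid and the pid `multiprocessing` recorded for the creating process can diverge.

```python
import multiprocessing
import os

def _report(queue):
    # Runs in the child: compare the OS-level parent pid with the pid
    # multiprocessing recorded for the process that started this one
    # (parent_process() requires Python 3.8+).
    queue.put((os.getppid(), multiprocessing.parent_process().pid))

def compare_pids():
    queue = multiprocessing.Queue()
    child = multiprocessing.Process(target=_report, args=(queue,))
    child.start()
    pids = queue.get()
    child.join()
    return pids

if __name__ == "__main__":
    os_ppid, mp_ppid = compare_pids()
    print(os_ppid, mp_ppid, os.getpid())
```

Run standalone, both reported values equal `os.getpid()` of the launching script, which is the baseline against which the Airflow behaviour looks surprising.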

Trigger Cloud Composer Using Google Cloud Function

Trigger Cloud Composer Using Google Cloud Function Question: I have run the exact code below but get an error when attempting to trigger the DAG using a Cloud Function. The error and code are described below: gcs-dag-trigger-function 8bhxprce8hze Traceback (most recent call last): File "/layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py", line 2073, in wsgi_app response = self.full_dispatch_request() File "/layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py", …

Total answers: 1
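For Composer 2 / Airflow 2, triggering a DAG from a Cloud Function goes through the stable REST API (`POST …/api/v1/dags/{dag_id}/dagRuns`). A sketch of the request construction, assuming a made-up web server URL and DAG id; a real Cloud Function would also attach an identity token (fetched via `google.auth`) in an `Authorization: Bearer` header before sending.

```python
import json
import urllib.request

def build_trigger_request(web_server_url, dag_id, conf=None):
    """Build the dagRuns POST request for Airflow's stable REST API."""
    endpoint = f"{web_server_url}/api/v1/dags/{dag_id}/dagRuns"
    body = json.dumps({"conf": conf or {}}).encode()
    return urllib.request.Request(
        endpoint,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Placeholder URL; the real one comes from the Composer environment config.
req = build_trigger_request(
    "https://example-dot-us-central1.composer.googleusercontent.com",
    "gcs_dag",
    conf={"bucket": "my-bucket"},
)
print(req.get_method(), req.full_url)
```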

Cloud Composer / Airflow: start a new task only when the Cloud DataFusion task has really finished

Question: I have the following task in Airflow (Cloud Composer) that triggers a Cloud DataFusion pipeline. The problem is: Airflow considers this task a success as soon as (within DataFusion) the DataProc cluster has been provisioned and the actual job has …

Total answers: 1
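The underlying fix is to wait for a terminal pipeline state rather than returning when provisioning succeeds. A generic polling sketch of that idea; `get_pipeline_state` is a hypothetical stand-in for a call to the CDAP REST API (in the DataFusion provider operators, parameters such as `success_states` and `pipeline_timeout` serve a similar purpose).

```python
import time

# Terminal CDAP pipeline states; anything else means "still working".
TERMINAL = {"COMPLETED", "FAILED", "KILLED"}

def wait_for_pipeline(get_pipeline_state, poll_interval=30, timeout=3600):
    """Poll until the pipeline reaches a terminal state or timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state = get_pipeline_state()
        if state in TERMINAL:
            return state
        time.sleep(poll_interval)
    raise TimeoutError("pipeline did not reach a terminal state")

# Simulate a pipeline that finishes on the third poll.
states = iter(["PROVISIONING", "RUNNING", "COMPLETED"])
print(wait_for_pipeline(lambda: next(states), poll_interval=0))  # COMPLETED
```

A downstream Airflow task placed after a wrapper like this only starts once the pipeline is genuinely done, not merely provisioned.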

Configure volumes in the Airflow GKEStartPodOperator

Question: I have a Google Cloud Composer environment. In my DAG I want to create a pod in GKE. When I deploy a simple app based on a Docker container that doesn’t need any volume configuration or secrets, everything works fine, for example: kubernetes_max = GKEStartPodOperator( # …

Total answers: 1
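A sketch of the volume wiring the operator expects. In a real DAG these would be `kubernetes.client.models.V1Volume` / `V1VolumeMount` objects passed to the operator's `volumes=` and `volume_mounts=` arguments; plain dicts stand in here (with a made-up PVC name) so the shape is visible without the `kubernetes` package installed.

```python
# Stand-in for a V1Volume backed by a PersistentVolumeClaim.
volume = {
    "name": "data-volume",
    "persistentVolumeClaim": {"claimName": "my-pvc"},  # assumed PVC name
}

# Stand-in for a V1VolumeMount; "name" must match the volume's name.
volume_mount = {
    "name": "data-volume",
    "mountPath": "/mnt/data",
    "readOnly": False,
}

# Keyword arguments as they would be passed to GKEStartPodOperator(...).
pod_kwargs = dict(
    task_id="pod_with_volume",
    name="pod-with-volume",
    namespace="default",
    image="busybox",
    volumes=[volume],
    volume_mounts=[volume_mount],
)
print(pod_kwargs["volumes"][0]["name"])  # data-volume
```

The key constraint, and the usual source of errors, is that each mount's `name` must match a declared volume's `name` exactly.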

Google Cloud Composer: get the Airflow webserver_id

Question: I have a GCP project my_project_id containing a Composer instance my_project_id_cmpsr_id. In order to access the Airflow REST API I need to retrieve the so-called webserver_id. The GCP Airflow web server URL is of the form {webserver-id}.appspot.com, as specified here in the documentation # This should …

Total answers: 3
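Since the Composer 1 web server URL has the form `https://{webserver-id}.appspot.com`, the id is just the first label of the hostname returned in the environment's `config.airflowUri`. A small sketch, with a made-up `airflow_uri` value; in practice the URI would come from a Composer `environments.get` API call.

```python
from urllib.parse import urlparse

def webserver_id_from_uri(airflow_uri):
    """Extract {webserver-id} from a Composer 1 Airflow URI."""
    host = urlparse(airflow_uri).hostname   # e.g. "abc123-tp.appspot.com"
    return host.split(".")[0]

# Made-up example of the airflowUri field from environments.get.
airflow_uri = "https://a1b2c3d4e5f6g7h8-tp.appspot.com"
print(webserver_id_from_uri(airflow_uri))  # a1b2c3d4e5f6g7h8-tp
```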