Airflow cannot find JSON file
Question:
I have the following structure in my Airflow project:
dags/mainDag.py
dags/BigQuery/deleteData.py
dags/BigQuery/insertData.py
dags/support/gcp.json
dags/support/__init__.py
My mainDag.py calls deleteData.py and insertData.py, and this works! But here is my problem: in both of these files I use the gcp.json like this:
credentialsPath = "~/airflow/dags/support/gqp.json"
bigqueryClient = bigquery.Client.from_service_account_json(credentialsPath)
And in the Airflow webserver I get this error:
FileNotFoundError: [Errno 2] No such file or directory: '~/airflow/dags/support/gqp.json'
But I can cat the file at this exact path successfully from my bash shell.
I read these two questions on Stack Overflow, [airflow: how can i put the method for read a json file in a local library] and [Airflow – Python file NOT in the same DAG folder], but neither works!
Does anyone know how to solve this?
Answers:
If you try:
import os
credentialsPath = "~/airflow/dags/support/gqp.json"
print(os.path.isfile(credentialsPath))
you will see that the output is False. This is because Python does not expand the ~ to your home directory; that expansion is done by the shell, which is why cat succeeds on the same path. You can perform the expansion yourself with the os.path.expanduser function:
import os
credentialsPath = os.path.expanduser("~/airflow/dags/support/gqp.json")
print(os.path.isfile(credentialsPath))
Now the output is True, because the path has been expanded with your home directory.
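An equivalent expansion using pathlib, which does the same "~" substitution as os.path.expanduser (a sketch; the filename is taken verbatim from the question):

```python
from pathlib import Path

# Path.expanduser() replaces a leading "~" with the user's home directory,
# just like os.path.expanduser does for strings.
credentialsPath = Path("~/airflow/dags/support/gqp.json").expanduser()
print(credentialsPath)
```

Either form works; pathlib is handy if you go on to build more paths relative to the dags folder.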
Alternatively, build the path from the AIRFLOW_HOME environment variable instead of hard-coding ~:
import os
credentialsPath = os.path.join(
    os.getenv("AIRFLOW_HOME", ""), "dags/support/gqp.json"
)
This should work.
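Since AIRFLOW_HOME may be unset in some environments (Airflow itself defaults it to ~/airflow), a more defensive sketch combines both answers, falling back to the expanded default:

```python
import os

# Use AIRFLOW_HOME when set; otherwise fall back to Airflow's default
# home directory (~/airflow), expanding the "~" ourselves since Python
# does not do it automatically.
airflow_home = os.getenv("AIRFLOW_HOME", os.path.expanduser("~/airflow"))
credentialsPath = os.path.join(airflow_home, "dags", "support", "gqp.json")
print(credentialsPath)
```

This way the same DAG code works both on a machine where AIRFLOW_HOME is exported and on one where it is not.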