google-cloud-storage

Signed Url is working after the expiration date Cloud Storage python

Signed Url is working after the expiration date Cloud Storage python Question: I have found this question that seems dead, so I will add more context. I am using the Python google-cloud-storage SDK to generate signed URLs using blob.generate_signed_url(). Here is the full call I use: blob.generate_signed_url(expiration=datetime.now() + timedelta(minutes=1), version="v4") I live in France …
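
A likely cause here is that the signing helpers treat a naive datetime as UTC, so datetime.now() in France (UTC+1/+2) pushes the deadline one or two hours into the future. A minimal sketch of the usual workaround, with hypothetical bucket and object names, is to pass a timedelta or a timezone-aware datetime:

```python
from datetime import datetime, timedelta, timezone

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")          # hypothetical bucket name
blob = bucket.blob("path/to/object.txt")     # hypothetical object name

# A timedelta lets the library compute the deadline itself, so local
# timezone drift cannot extend the URL's lifetime.
url = blob.generate_signed_url(expiration=timedelta(minutes=1), version="v4")

# Equivalent with an explicit timezone-aware datetime:
url = blob.generate_signed_url(
    expiration=datetime.now(timezone.utc) + timedelta(minutes=1),
    version="v4",
)
```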

Total answers: 1

Overwrite single file in a Google Cloud Storage bucket, via Python code

Overwrite single file in a Google Cloud Storage bucket, via Python code Question: I have a logs.txt file at a certain location in a Compute Engine VM instance. I want to periodically back up (i.e. overwrite) logs.txt in a Google Cloud Storage bucket. Since logs.txt is the result of some preprocessing done inside a Python script, I …
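
Cloud Storage has no separate "overwrite" call: uploading to an existing object name simply replaces its contents. A minimal sketch, with hypothetical bucket and paths, that could run at the end of the preprocessing script or from cron:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-backup-bucket")   # hypothetical bucket name
blob = bucket.blob("backups/logs.txt")       # hypothetical object name

# Each upload replaces the previous version of backups/logs.txt.
blob.upload_from_filename("/var/log/myapp/logs.txt")
```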

Total answers: 2

Sending large file to gcloud worked on another internet connection but not mine

Sending large file to gcloud worked on another internet connection but not mine Question: So I am doing this to send my 400-megabyte AI model to the cloud: model_file = pickle.dumps(model) blob = bucket.blob("models/{user_id}.pickle") blob.upload_from_string(model_file) It takes a long time to process, then I get three errors: ssl.SSLWantWriteError: The operation did not complete (write) …
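
A commonly suggested mitigation for this kind of failure on a slow connection is to set a chunk size on the blob, which switches the transfer to a chunked, resumable upload instead of one single 400 MB request; recent library versions also accept a longer timeout. A sketch with hypothetical bucket, user_id and model placeholders:

```python
import pickle

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-models-bucket")   # hypothetical bucket name

user_id = "example-user"                     # placeholders for the objects
model = {"weights": [0.1, 0.2, 0.3]}         # referenced in the question

# chunk_size must be a multiple of 256 KB; 5 MB chunks keep each request
# small enough to survive a flaky connection.
blob = bucket.blob(f"models/{user_id}.pickle", chunk_size=5 * 1024 * 1024)

blob.upload_from_string(pickle.dumps(model), timeout=600)
```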

Total answers: 1

How to list objects of one depth level without listing sub-objects by GCP Cloud Storage Python API?

How to list objects of one depth level without listing sub-objects by GCP Cloud Storage Python API? Question: The Cloud Storage Python API allows listing objects using a prefix, which limits the listing to certain sub-branches of objects in the bucket. bucket_name = "my-bucket" folders = "logs/app" storage_client.list_blobs(bucket_name, prefix=folders) This operation will return all objects …
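
The usual answer is the delimiter parameter of list_blobs, which makes the listing behave like a single directory level. A minimal sketch, reusing the bucket name and prefix from the question:

```python
from google.cloud import storage

client = storage.Client()

# With delimiter="/", only objects directly under the prefix are returned;
# deeper "sub-folders" are reported separately in iterator.prefixes.
iterator = client.list_blobs("my-bucket", prefix="logs/app/", delimiter="/")

for blob in iterator:
    print("object:", blob.name)

# prefixes is populated only after the iterator has been consumed.
for sub_prefix in iterator.prefixes:
    print("sub-folder:", sub_prefix)
```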

Total answers: 1

Recursively copy a child directory to the parent in Google Cloud Storage

Recursively copy a child directory to the parent in Google Cloud Storage Question: I need to recursively move the contents of a sub-folder to a parent folder in Google Cloud Storage. This code works for moving a single file from the sub-folder to the parent. client = storage.Client() bucket = client.get_bucket(BUCKET_NAME) source_path = Path(parent_dir, sub_folder, filename).as_posix() …
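
Since Cloud Storage has no server-side "move", the recursive version is usually a loop that lists everything under the sub-folder prefix, copies each object to the parent path, and deletes the original. A sketch with a hypothetical bucket name and the parent/sub-folder names as placeholders:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("my-bucket")      # hypothetical bucket name

parent_dir = "parent"                         # placeholder folder names
sub_folder = "child"
prefix = f"{parent_dir}/{sub_folder}/"

for blob in client.list_blobs(bucket, prefix=prefix):
    relative = blob.name[len(prefix):]
    if not relative:                          # skip a zero-byte "folder" placeholder, if any
        continue
    # Copy into the parent folder, then remove the original object.
    bucket.copy_blob(blob, bucket, f"{parent_dir}/{relative}")
    blob.delete()
```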

Total answers: 1

Unable To Update Image On Google Cloud Storage via API

Unable To Update Image On Google Cloud Storage via API Question: I am trying to overwrite an image in Cloud Storage via the Python API, but after I overwrite it and refresh (and clear the browser cache) the Cloud console page or the public link, the image is still the same, even the next day, but sometimes …
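
A plausible explanation is caching: publicly readable objects are served with Cache-Control: public, max-age=3600 by default, so an overwritten image can keep appearing unchanged for a while. A sketch, with hypothetical names, that sets no-cache metadata when re-uploading:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-images-bucket")   # hypothetical bucket name
blob = bucket.blob("images/avatar.png")      # hypothetical object name

# Setting cache_control before the upload stores it as object metadata,
# so browsers and edge caches revalidate instead of serving a stale copy.
blob.cache_control = "no-cache"
blob.upload_from_filename("avatar.png", content_type="image/png")
```

For an object that already exists, assigning blob.cache_control followed by blob.patch() updates the metadata without re-uploading the data.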

Total answers: 1

Access google storage client using dictionary

Access google storage client using dictionary Question: I have a service account in the form of a dictionary. Below is the service account service_account = { "type": "service_account", "project_id": "project_id", "private_key_id": "private_key_id", "private_key": "PRIVATE KEY", "client_email": "email", "client_id": "111111", "auth_uri": "https://auth.com", "token_uri": "https://token.com", "auth_provider_x509_cert_url": "https://certs.com", "client_x509_cert_url": "https://www.cert.com" } The above details are simulated. I want to …
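
Credentials can be built directly from an in-memory dictionary with google-auth, without writing a key file. A minimal sketch, assuming the dictionary holds the (simulated) fields shown in the question:

```python
from google.cloud import storage
from google.oauth2 import service_account

service_account_info = {
    "type": "service_account",
    "project_id": "project_id",
    # ... the remaining key/value pairs from the dictionary above
}

# Build credentials from the dictionary and hand them to the storage client.
credentials = service_account.Credentials.from_service_account_info(service_account_info)
client = storage.Client(
    project=service_account_info["project_id"],
    credentials=credentials,
)
```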

Total answers: 1

How to load a BigQuery table from a file in GCS Bucket using Airflow?

How to load a BigQuery table from a file in GCS Bucket using Airflow? Question: I am new to Airflow, and I am wondering, how do I load a file from a GCS Bucket to BigQuery? So far, I have managed to do BigQuery to GCS Bucket: bq_recent_questions_query = bigquery_operator.BigQueryOperator( task_id='bq_recent_questions_query', sql=""" SELECT owner_display_name, title, …
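
The usual counterpart for the GCS-to-BigQuery direction is the GCSToBigQueryOperator from the Google provider package (older Airflow versions ship it as GoogleCloudStorageToBigQueryOperator in airflow.contrib). A sketch with hypothetical bucket, file, and table names, assuming apache-airflow-providers-google is installed:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG("gcs_to_bq_example", start_date=datetime(2023, 1, 1), schedule_interval=None) as dag:
    load_gcs_to_bq = GCSToBigQueryOperator(
        task_id="load_gcs_to_bq",
        bucket="my-bucket",                                # hypothetical bucket
        source_objects=["exports/recent_questions.csv"],   # hypothetical file
        destination_project_dataset_table="my_project.my_dataset.recent_questions",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )
```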

Total answers: 2

Exporting a BigQuery table with Airflow: "extract_table() missing 1 required positional argument: 'self'; 1764166" error

Exporting a BigQuery table with Airflow: "extract_table() missing 1 required positional argument: 'self'; 1764166" error Question: I'm trying to use an Airflow task to export a BigQuery table to Google Cloud Storage, but I'm getting the following error message: {standard_task_runner.py:93} ERROR – Failed to execute job 1819 for task export_objs_table_to_bucket (extract_table() missing 1 required …
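
A "missing 1 required positional argument: 'self'" error typically means extract_table() was called on the bigquery.Client class itself instead of on a client instance. A sketch of the instance-based call, with hypothetical table and destination names:

```python
from google.cloud import bigquery

# Instantiate the client first, then call extract_table on the instance.
client = bigquery.Client()

extract_job = client.extract_table(
    "my_project.my_dataset.objs",           # hypothetical source table
    "gs://my-bucket/exports/objs-*.csv",    # hypothetical destination URI
    location="US",
)
extract_job.result()  # block until the export job finishes
```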

Total answers: 2