Uploading a file with Python returns ('Request failed with status code', 403, 'Expected one of', <HTTPStatus.OK: 200>)

Question:

blob.upload_from_filename(source) gives the error

    raise exceptions.from_http_status(response.status_code, message, response=response)
    google.api_core.exceptions.Forbidden: 403 POST https://www.googleapis.com/upload/storage/v1/b/bucket1-newsdata-bluetechsoft/o?uploadType=multipart: ('Request failed with status code', 403, 'Expected one of', <HTTPStatus.OK: 200>)

I am following the example of google cloud written in python here!

    from google.cloud import storage

    def upload_blob(bucket, source, des):
        # use the client built from the service-account JSON; creating a
        # second default Client() here would silently ignore these credentials
        client = storage.Client.from_service_account_json('/path')
        bucket = client.get_bucket(bucket)
        blob = bucket.blob(des)
        blob.upload_from_filename(source)

I used gsutil to upload files, which works fine.
Listing the bucket names with a Python script also works fine.
I have the necessary permissions and GOOGLE_APPLICATION_CREDENTIALS set.

Asked By: user9730761


Answers:

This question is more appropriate for a support case.

As you are getting a 403, you are most likely missing a permission in IAM; the Google Cloud Platform support team will be able to inspect your resources and configuration.

This whole thing wasn't working because the service account I was using in GCP didn't have the Storage Admin permission.

Granting Storage Admin to my service account solved my problem.
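If you want to grant the role from the command line rather than the console, a sketch like the following should work; the project ID and service-account email here are hypothetical placeholders, so substitute your own values.

```shell
# Hypothetical project ID and service-account email -- replace with your own.
PROJECT_ID="my-project"
SA_EMAIL="my-uploader@my-project.iam.gserviceaccount.com"

# Grant the Storage Admin role to the service account at the project level.
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:$SA_EMAIL" \
    --role="roles/storage.admin"

# Verify the binding took effect by listing the roles bound to that account.
gcloud projects get-iam-policy "$PROJECT_ID" \
    --flatten="bindings[].members" \
    --filter="bindings.members:serviceAccount:$SA_EMAIL" \
    --format="value(bindings.role)"
```

Note that roles/storage.admin is broad; if the account only needs to upload objects, roles/storage.objectAdmin or roles/storage.objectCreator may be a tighter fit.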

Answered By: user9730761

This is what worked for me when the Google documentation didn't. I was getting the same error even with the appropriate permissions.

    import pathlib
    import google.cloud.storage as gcs

    client = gcs.Client()

    # set target file to write to
    target = pathlib.Path("local_file.txt")

    # set file to download
    FULL_FILE_PATH = "gs://bucket_name/folder_name/file_name.txt"

    # open file stream with write permissions
    with target.open(mode="wb") as downloaded_file:
        # download and write the file locally
        client.download_blob_to_file(FULL_FILE_PATH, downloaded_file)

Answered By: tb.

As other answers have indicated, this is a permissions issue. I have found the following command a useful way to create default application credentials for the currently logged-in user.

Assuming you got this error while running the code on some machine, the following steps should be sufficient:

  • SSH to the VM where the code is running or will be running. Make sure you are logged in as a user who has permission to upload to Google Storage.
  • Run the following command:
    gcloud auth application-default login
  • The command above will ask you to create a token by visiting a URL. Generate the token and paste it into the SSH console.

That's it. Every Python application started as that user will use these as the default credentials for storage bucket interaction.

Happy GCP’ing 🙂

Answered By: sandeepsign

For me, the issue was that I used different service accounts for the Eventarc trigger and the runtime when creating the cloud function. I didn't realize they could be the same, so changing permissions in only one of them of course did not work.

Go to "cloud functions" -> click the name of the function -> "edit" -> set the service account under the "Eventarc trigger" -> "service account" to the same email address as for the service account in "Runtime, build, connections and security settings" -> "Runtime" -> "runtime service account".

Answered By: chilifan