Sending a large file to Google Cloud Storage worked on another internet connection but not on mine

Question:

So I am doing this to send my 400 MB AI model to the cloud:

import pickle

model_file = pickle.dumps(model)  # serialize the model to bytes

blob = bucket.blob(f"models/{user_id}.pickle")  # f-string so user_id is actually interpolated
blob.upload_from_string(model_file)

It takes a long time to process, and then I get three errors:

  • ssl.SSLWantWriteError: The operation did not complete (write) (_ssl.c:2483)
  • requests.exceptions.SSLError: HTTPSConnectionPool(host='storage.googleapis.com', port=443): Max retries exceeded with url:
  • urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='storage.googleapis.com', port=443): Max retries exceeded with url:

However, when I was on another connection, the upload went through, although it still took a while.

Any help would be appreciated, thanks.

Asked By: tidekis doritos


Answers:

I had the same issue. blob.upload_from_string() sends the whole payload in a single request by default, so one transient network failure aborts the entire transfer. Setting the blob's chunk_size attribute switches it to a resumable upload that sends the data in chunks; setting it to e.g. 8 MiB (it must be a multiple of 256 KB) solved my issue. Here's an example:

blob = bucket.blob(f"models/{user_id}.pickle")
blob.chunk_size = 8 * 1024 * 1024  # 8 MiB; must be a multiple of 256 KB (262144 bytes)
blob.upload_from_string(model_file)
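
If the connection is still flaky, it can also help to stream the pickled bytes from an in-memory buffer with an explicit per-request timeout, instead of materializing one 400 MB string. A minimal sketch, assuming model and user_id as in the question; the bucket name and the 300-second timeout are placeholders, and the timeout keyword requires a reasonably recent google-cloud-storage:

import io
import pickle
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-model-bucket")  # hypothetical bucket name

blob = bucket.blob(f"models/{user_id}.pickle")
blob.chunk_size = 8 * 1024 * 1024  # resumable upload in 8 MiB chunks

# Pickle into a buffer and stream it, rather than building one big string.
buffer = io.BytesIO()
pickle.dump(model, buffer)
buffer.seek(0)

# With chunk_size set, each chunk is its own HTTP request, so a transient
# SSL error only costs one 8 MiB chunk rather than the whole payload.
blob.upload_from_file(buffer, timeout=300)

The timeout applies per request, not to the whole upload, so a slow connection gets the full 300 seconds for each 8 MiB chunk.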
Answered By: mpa