put method for storage throws error after authenticating server as service account #58

Gwali-1 opened this issue Jan 6, 2023 · 0 comments


Gwali-1 commented Jan 6, 2023

Hello,
First of all, thanks for maintaining this project. This is my first time using Pyrebase and it has been immensely helpful.
I came across an error when I tried to upload files to my storage bucket after authenticating my server as admin by providing the path to my serviceAccount credentials in my config object. This was supposed to make Firebase ignore security rules, and that checked out, but I got this error:
ERROR:
total bytes could not be determined. Please pass an explicit size, or supply a chunk size for a streaming transfer.

I looked into the error and found some discussions on GitHub and Stack Overflow that gave me a rough idea of what was going on. I looked at the put method in the Storage class in the source code and realized an upload_from_file method was being called on the blob instance:

blob = self.bucket.blob(path)  # note: no chunk_size is passed here
if isinstance(file, str):
    # file is a path on disk, so the size can be read from the filesystem
    return blob.upload_from_filename(filename=file)
else:
    # file is a file object; with no size and no chunk_size, this raises the error
    return blob.upload_from_file(file_obj=file)

I traced that method into another file and saw it was implemented under the Blob class in the google-cloud-storage source code. When upload_from_file is called on the blob instantiated in the put method, its chunk_size attribute is checked first; if that is not set, the method then checks whether the file size was provided, and if that fails too, we get the error:

...

if self.chunk_size is not None:
    upload.chunksize = self.chunk_size

    if total_bytes is None:
        upload.strategy = RESUMABLE_UPLOAD
elif total_bytes is None:
    raise ValueError('total bytes could not be determined. Please '
                     'pass an explicit size, or supply a chunk size '
                     'for a streaming transfer.')

...

When the blob object is instantiated in the put method above with the path argument, no chunk_size argument is specified. (My guess is that this attribute directs how the file is uploaded, in chunks of whatever size you specify.) The value provided must be a multiple of 256 KB per the API specification.
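For example (a quick illustration, with 256 KB meaning 262144 bytes):

KB_256 = 256 * 1024                  # 262144 bytes, the required base unit
for chunk_size in (KB_256, 4 * KB_256, 16 * KB_256):
    assert chunk_size % KB_256 == 0  # valid chunk sizes are exact multiples of 256 KB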
These are links to some of the discussions and answers I found on Stack Overflow and GitHub around this problem:
https://stackoverflow.com/questions/59210646/uploading-image-file-to-google-bucket-in-python

googleapis/google-cloud-python#3429

Solution 1
A fix to this problem is simply providing a chunk_size argument when instantiating the blob object. If you look into the upload_from_file method source code, I believe you'll see why this works.
The code will now look like this:

blob = self.bucket.blob(path, chunk_size=262144)  # an int, e.g. 256 KB; must be a multiple of 256 KB
if isinstance(file, str):
    return blob.upload_from_filename(filename=file)
else:
    return blob.upload_from_file(file_obj=file)
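If you would rather not modify the pyrebase source, the same fix can be applied from the calling side by going through the underlying google-cloud-storage bucket directly (a minimal sketch, assuming the admin-authenticated Storage object exposes the same bucket attribute the put method uses; the config values and file paths are placeholders):

import pyrebase

config = {
    "apiKey": "...",
    "authDomain": "...",
    "databaseURL": "...",
    "storageBucket": "...",
    "serviceAccount": "path/to/serviceAccount.json",  # admin credentials, as in the issue
}
firebase = pyrebase.initialize_app(config)
storage = firebase.storage()

# Build the blob ourselves so chunk_size (a multiple of 256 KB) can be passed,
# then upload through google-cloud-storage instead of pyrebase's put().
blob = storage.bucket.blob("images/photo.jpg", chunk_size=262144)
with open("photo.jpg", "rb") as f:
    blob.upload_from_file(file_obj=f)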

Solution 2
Providing a size argument to the upload_from_file method will also resolve the error, since the error is only raised when neither chunk_size nor size is available. I suspect this solution causes the file to be uploaded in a single request rather than in small chunks.

return blob.upload_from_file(file_obj=file, size=file_size)  # file_size: total size of the file in bytes
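If you go with this option, the size of a seekable file object can be measured before the upload (a minimal sketch; file here is any binary file object the caller opened, and blob is the same blob as in the put method):

import os

file.seek(0, os.SEEK_END)  # jump to the end of the stream
file_size = file.tell()    # the position at the end equals the total size in bytes
file.seek(0)               # rewind so the upload starts from byte 0

blob.upload_from_file(file_obj=file, size=file_size)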
