Insert Files into Firebase Cloud Storage with Put Bytes

We use Google Cloud Storage to store .parquet files as part of our data processing. We often want to load a .parquet file into memory and read it directly into a pandas DataFrame, without downloading it to disk first.
The code below does the trick. My question is: would it be useful to include a download_as_buffer method in storage.blob?
When running on Google Cloud Platform, no action needs to be taken to authenticate. Otherwise, the simplest way of authenticating your API calls is to download a service account JSON file then set the GOOGLE_APPLICATION_CREDENTIALS environment variable to refer to it. The credentials will automatically be used to authenticate. See the Getting Started With Authentication guide for more details.
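On a development machine, that setup is a single environment variable; the key path below is hypothetical:

```shell
# Point the client libraries at a downloaded service account key
# (the path is a placeholder for illustration).
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"
```

Client libraries such as google-cloud-storage then pick up the credentials automatically, with no extra code.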
Without the cloud, it can be difficult to develop and manage a server that lets users upload image files, especially at scale. You have to queue requests to the process responsible for handling uploads to control the flow rate, and you have to prevent the system from going down under request overload. You also need to set appropriate limits on the finite resources (such as RAM) of each server involved.
Things are slightly easier with digital files. Sure, the volume of digital assets produced by individuals and businesses is going through the roof, and everybody seems to be on the lookout for convenient data storage. Yet the much-professed data-geddon still seems unlikely, given how many cloud storage services are on the market.
When you perform a simple upload, basic metadata is created and some attributes are inferred from the file, such as the MIME type or modifiedTime. You can use a simple upload in cases where you have small files and file metadata isn’t important.
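With the google-cloud-storage client, a simple upload of this kind might be sketched as below: a single request carries the contents plus minimal metadata, with the content type passed explicitly (the bucket and object names are illustrative):

```python
def upload_bytes(bucket_name: str, blob_name: str, data: bytes,
                 content_type: str = "application/octet-stream") -> None:
    """Simple upload: send contents and basic metadata in one request."""
    # Imported lazily; requires the google-cloud-storage package.
    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    # content_type is stored as the object's MIME type metadata.
    blob.upload_from_string(data, content_type=content_type)


# Usage (illustrative names):
# upload_bytes("my-bucket", "data/example.parquet", parquet_bytes,
#              content_type="application/octet-stream")
```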