How to Upload a File to Google Cloud Storage on Python 3

How to upload a file to Google Cloud Storage on Python 3?

Use the standard gcloud library, which supports both Python 2 and Python 3. (That library has since been renamed; the same API is now published as google-cloud-storage and imported from google.cloud.)

Example of Uploading File to Cloud Storage

from gcloud import storage
from oauth2client.service_account import ServiceAccountCredentials
import os

credentials_dict = {
    'type': 'service_account',
    'client_id': os.environ['BACKUP_CLIENT_ID'],
    'client_email': os.environ['BACKUP_CLIENT_EMAIL'],
    'private_key_id': os.environ['BACKUP_PRIVATE_KEY_ID'],
    'private_key': os.environ['BACKUP_PRIVATE_KEY'],
}
credentials = ServiceAccountCredentials.from_json_keyfile_dict(
    credentials_dict
)
client = storage.Client(credentials=credentials, project='myproject')
bucket = client.get_bucket('mybucket')
blob = bucket.blob('myfile')
blob.upload_from_filename('myfile')
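
If you are on the newer google-cloud-storage package rather than the legacy gcloud one, the flow is the same. Here is a minimal sketch assuming the 'myproject'/'mybucket'/'myfile' names and BACKUP_* environment variables from the example above; the token_uri entry is a required default for this credentials class, not something from the original snippet.

import os
from google.cloud import storage
from google.oauth2 import service_account

credentials_dict = {
    'type': 'service_account',
    'client_id': os.environ['BACKUP_CLIENT_ID'],
    'client_email': os.environ['BACKUP_CLIENT_EMAIL'],
    'private_key_id': os.environ['BACKUP_PRIVATE_KEY_ID'],
    'private_key': os.environ['BACKUP_PRIVATE_KEY'],
    # token_uri is required by from_service_account_info()
    'token_uri': 'https://oauth2.googleapis.com/token',
}
credentials = service_account.Credentials.from_service_account_info(credentials_dict)
client = storage.Client(credentials=credentials, project='myproject')
bucket = client.get_bucket('mybucket')
blob = bucket.blob('myfile')
blob.upload_from_filename('myfile')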

File upload, using Python(Local System) to Google Cloud Storage

You can do it this way:

from gcloud import storage

client = storage.Client()
bucket = client.get_bucket('<your-bucket-name>')

blob = bucket.blob('my-test-file.txt')

# To place the object under a "folder" (prefix), build the blob name from a
# folder and file name (placeholders shown here)
folder = 'my-folder'
filename = 'my-test-file.txt'
blob = bucket.blob("%s/%s" % (folder, filename))

# Uploading a string of text
blob.upload_from_string('this is test content!')

# Uploading from a local file using open()
with open('photo.jpg', 'rb') as photo:
    blob.upload_from_file(photo)

# Uploading from a local file without open()
blob.upload_from_filename('photo.jpg')

blob.make_public()
url = blob.public_url

For an explanation of each of the lines above, check out this blog post (the example above is taken from it):
https://riptutorial.com/google-cloud-storage/example/28256/upload-files-using-python
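
If you want to upload every file in a local directory under a bucket prefix (which is what the folder/filename pattern above is building toward), here is a minimal sketch using the same gcloud import. The upload_directory name and its arguments are placeholders of mine, not from the blog.

import os
from gcloud import storage

def upload_directory(bucket_name, local_dir, prefix):
    client = storage.Client()
    bucket = client.get_bucket(bucket_name)
    for name in os.listdir(local_dir):
        local_path = os.path.join(local_dir, name)
        if os.path.isfile(local_path):
            # A blob named "prefix/name" is displayed as a folder in the console
            bucket.blob("%s/%s" % (prefix, name)).upload_from_filename(local_path)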

upload multiple files at once from google drive to google cloud storage

The error message is telling you the issue:

'Resource' object has no attribute 'get'

It's not service.get, it's service.files().get:

source_file_name = service.files().get(fileId=fileId, fields="name").execute()["name"]

You actually had it right with service.files().list; you just forgot the files() part in the get request.
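
For context, here is a rough sketch of copying Drive files into a Cloud Storage bucket. It assumes `service` is an authenticated Drive v3 client, `storage_client` a google.cloud.storage client, and `file_ids` a list of Drive file IDs; the function name and arguments are mine, not from the question.

import io
from googleapiclient.http import MediaIoBaseDownload

def copy_drive_files_to_gcs(service, storage_client, file_ids, bucket_name):
    bucket = storage_client.get_bucket(bucket_name)
    for file_id in file_ids:
        # Fetch the file's name with files().get (not service.get)
        name = service.files().get(fileId=file_id, fields="name").execute()["name"]
        # Download the Drive file's contents into memory
        buffer = io.BytesIO()
        downloader = MediaIoBaseDownload(buffer, service.files().get_media(fileId=file_id))
        done = False
        while not done:
            _, done = downloader.next_chunk()
        buffer.seek(0)
        # Upload the downloaded bytes under the same name
        bucket.blob(name).upload_from_file(buffer)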

google cloud storage compose objects and upload the composite one

I was able to reproduce your error when the composed object already exists in your bucket.

The list_blobs function returns every object in the bucket, including the composed object ("feeds/file.csv") if it is already there. The fix is to build a list variable inside list_blobs and remove the feeds/file.csv entry when it is present.

Below is the snippet code on list_blobs function:

def list_blobs(storage_client, bucket_name):
    blobs = storage_client.list_blobs(bucket_name)
    # Create a blob_list variable
    blob_list = [blob.name for blob in blobs]

    # Check if BUCKET_FILE_NAME or "feeds/file.csv" is in the list, else do nothing
    if BUCKET_FILE_NAME in blob_list:
        # If BUCKET_FILE_NAME exists, remove it from the list
        blob_list.remove(BUCKET_FILE_NAME)

    # Your previous code: return [blob.name for blob in blobs]
    return blob_list

Sample Output:

Bucket <Bucket_NAME> is now publicly readable
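
For reference, here is a sketch of the compose step itself using the filtered list, assuming the same storage_client, list_blobs and BUCKET_FILE_NAME names as above; compose_blobs and destination_name are names I made up for the sketch.

def compose_blobs(storage_client, bucket_name, destination_name):
    bucket = storage_client.bucket(bucket_name)
    # Build Blob objects from the filtered name list returned by list_blobs()
    sources = [bucket.blob(name) for name in list_blobs(storage_client, bucket_name)]
    destination = bucket.blob(destination_name)
    # compose() concatenates up to 32 source objects into the destination object
    destination.compose(sources)
    return destination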

Python GCP- Upload Files to GCP Bucket

Your error says: cannot use a string pattern on a bytes-like object.

In Python 3 the data you get back is already bytes, so you can't call .encode() on it. Change the code

file_data = base64.urlsafe_b64decode(response.get('data').encode('UTF-8'))

to

file_data = base64.urlsafe_b64decode(response.get('data'))
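
Put together, here is a hedged sketch of the corrected decode-and-upload step; response is assumed to be a dict whose 'data' key holds URL-safe base64 content, and the bucket and object names are placeholders.

import base64
from google.cloud import storage

def upload_decoded(response, bucket_name, destination_name):
    # urlsafe_b64decode() accepts str or bytes in Python 3, so no .encode() is needed
    file_data = base64.urlsafe_b64decode(response.get('data'))
    client = storage.Client()
    # upload_from_string() also accepts raw bytes
    client.bucket(bucket_name).blob(destination_name).upload_from_string(file_data)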

Upload CSV file to Google Cloud Storage using Python

We can use the Google API Python client (google-api-python-client) to upload files to Google Cloud Storage.

First, install the API client as follows.

>pip install --upgrade google-api-python-client

Then, authenticate to obtain application default credentials (older gcloud releases used gcloud beta auth application-default login).

>gcloud auth application-default login

Below is sample code which uploads a local file to Google Cloud Storage using application default credentials.

from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
service = discovery.build('storage', 'v1', credentials=credentials)

filename = 'C:\\MyFiles\\sample.csv'
bucket = 'my_bucket'

body = {'name': 'dest_file_name.csv'}
req = service.objects().insert(bucket=bucket, body=body, media_body=filename)
resp = req.execute()

This will upload the file into my_bucket. The full Google Storage URL for the uploaded file would be gs://my_bucket/dest_file_name.csv.
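
For larger files, the same service object also supports resumable uploads via MediaFileUpload. This sketch reuses the filename, bucket and body variables above; the chunked loop is a common pattern, not part of the original example.

from googleapiclient.http import MediaFileUpload

media = MediaFileUpload(filename, mimetype='text/csv', resumable=True)
req = service.objects().insert(bucket=bucket, body=body, media_body=media)
resp = None
while resp is None:
    # next_chunk() uploads one chunk per call and returns (status, response);
    # response stays None until the upload completes
    status, resp = req.next_chunk()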

How to create text file in Google Cloud Storage without saving it locally?

Using @johnhanley's suggestion, this is the code to implement blob.upload_from_string():

from google.cloud import storage

def write_to_blob(bucket_name, file_name):
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(file_name)
    blob.upload_from_string("written in python")

write_to_blob(bucket_name="test-bucket", file_name="from_string.txt")

Saved in Google Cloud Storage:

[Screenshot: the from_string.txt object listed in the test-bucket bucket]

Inside from_string.txt:

[Screenshot: the file contents, "written in python"]
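
To check the contents programmatically instead of in the console, the blob can be read back. Here is a small sketch assuming the same bucket and file names; download_as_text() is available in recent google-cloud-storage releases.

from google.cloud import storage

def read_blob(bucket_name, file_name):
    storage_client = storage.Client()
    blob = storage_client.bucket(bucket_name).blob(file_name)
    # Returns the object's contents as a str, e.g. "written in python"
    return blob.download_as_text()

print(read_blob(bucket_name="test-bucket", file_name="from_string.txt"))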


