Uploading downloaded files to a Google Cloud Storage bucket and deleting successfully uploaded files from local storage.

Google Cloud Storage

This is a slightly different service from Google Drive and can be thought of as the same thing as Amazon’s AWS S3 service. Basically, it’s a remote storage space that is pretty cheap to rent. One of these days, we’ll do a small comparison of Google Cloud solutions vs. AWS (spoiler alert: they’re basically the same). For reasons we’ll talk about in a future post, we’re gradually migrating away from AWS and over to Google Cloud.

See what we did there? Priming some Future Post topics.

This post is also slightly out of order with the general series since we’ve skipped over setting up the Google Cloud environment, server, etc…

The code snippet for uploading files to a bucket comes straight from the documentation. A minimal useful example is below; it requires having Google's Python client library installed (`pip install google-cloud-storage`):

from google.cloud import storage

def upload_blob(source_file_name, destination_blob_name):
    """Uploads a file to the bucket."""
    storage_client = storage.Client()
    bucket_name = "works-in-progress" 
    bucket = storage_client.get_bucket(bucket_name) #find the bucket that holds your files
    blob = bucket.blob(destination_blob_name) #create a file on the server with the name of the file you want saved there
    blob.upload_from_filename(source_file_name) #upload your local file to the server file in previous line
    return source_file_name, destination_blob_name, storage_client #you don't need to return any of this, but we use it later

def check_blob_exists(filePath, storage_client):
    """Checks if a file exists on the cloud storage"""
    bucket = storage_client.get_bucket("works-in-progress")
    blobs = bucket.list_blobs() #list all existing blobs (files) in storage. This is a generator of Blob objects
    blob_names = [blob.name for blob in blobs] #make a list of all the filenames
    for blob in blob_names:
        print("Blob name: {}, filePath: {}".format(blob, filePath)) #sanity check
        if filePath in blob: #in case the file we are checking against the server one has a partial name match ("2019-01-01_hello-world.png" vs "works-in-progress/folder1/folder2/folder3/2019-01-01_hello-world.png")
            print("successful upload")
            return True
    #only reached if no blob matched -- the original version had this inside the loop after the return, where it could never run
    print("unsuccessful upload")
    return False
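The substring-match logic is easy to factor into a small pure helper, which also makes it unit-testable without touching the bucket at all. A minimal sketch; the helper name `blob_matches` is ours, not part of Google's client library:

```python
def blob_matches(local_name, blob_names):
    """Return the first blob name containing local_name, else None.

    A substring match handles the case where the server-side name
    includes folders, e.g. "2019-01-01_hello-world.png" vs
    "folder1/folder2/folder3/2019-01-01_hello-world.png".
    """
    for blob_name in blob_names:
        if local_name in blob_name:
            return blob_name
    return None
```

For larger buckets, note that the client library can also filter server-side with `bucket.list_blobs(prefix=...)`, or check a single known path with `blob.exists()`, which avoids pulling down the full listing.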

Now we can just wrap everything up into a nice little pipeline of: [get local file name] => [upload to cloud] => [check if upload was successful] => [if yes, delete local file], using built-in Python.

import os #needed for the local file check/delete below

source_file_name, destination_blob_name, storage_client = upload_blob(localPath, localPath) #upload the file
if check_blob_exists(localPath, storage_client): #was upload successful?
    await client.send_message(log_channel, "File {} uploaded to {}".format(source_file_name, destination_blob_name))
    if os.path.isfile(localPath): #make sure the file exists locally...don't want the program to crash
        os.remove(localPath) #delete file
        await client.send_message(log_channel, "Deleted local file!") #print some logging info
    else:
        await client.send_message(log_channel, "Failed to delete local file {}".format(localPath))
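The delete step is worth isolating too: a tiny helper that only removes the file when it actually exists keeps the pipeline from crashing if it runs twice on the same file. A minimal sketch (the helper name `safe_delete` is ours):

```python
import os

def safe_delete(path):
    """Delete path if it is an existing file; return True if removed."""
    if os.path.isfile(path):
        os.remove(path)
        return True
    return False #nothing to delete -- already gone or never existed
```

The boolean return makes it trivial to decide which log message to send, instead of branching on `os.path.isfile` inline.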