Using Google Cloud Storage with a Loopback (Node.js) app

Currently I am developing my music streaming API/services/app using the Loopback framework. The last thing I need to complete my API is the ability to upload music tracks to Google Cloud Storage through a Loopback storage component. I searched the Internet for how to connect Google Cloud Storage with a Loopback component but, unfortunately, could not find anything. So, in this post, I will write about how to use Google Cloud Storage with the Loopback framework, as a reference for myself.

I will skip the parts about creating a new Google Cloud project and setting up a Loopback app.

Create a storage bucket

A Google Cloud Storage bucket can be created in a few ways; here I will use the Google Cloud Platform Console to create a new one.
1. Open the cloud storage browser
2. Click Create bucket.
3. Specify a Name for our bucket, the default Storage class for the bucket, and a Location where the bucket data will be stored.
4. Then, click Create.

You can skip this part if you already have one.
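If you prefer doing this from code, the bucket can also be created with the Node.js client we install later in this post. A minimal sketch, assuming the client is already installed and that the project ID, key file and bucket name below are placeholders for your own values:

// Creating a bucket from Node.js instead of the console (all names are placeholders).
const storage = require('@google-cloud/storage');
const gcs = storage({
  projectId: 'xxxx-300989',
  keyFilename: './xxxx.json' // the service account key generated in the Authentication section
});

gcs.createBucket('xxxx-bucket')
  .then(([bucket]) => {
    console.log(`Bucket ${bucket.name} created.`);
  })
  .catch((err) => {
    console.error('ERROR:', err);
  });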

Authentication

Two types of Authentication flows

When it comes to authentication in Google Cloud Storage, there are two general types of flow:

  1. A server-centric flow that allows an app to directly hold the credentials of a service account to complete authentication.

  2. A user-centric flow that allows an app to obtain credentials from an end user.

Authentication scopes

These scopes control what an authenticated identity is authorised to do with specified resources. There are five scopes with which GCS requests can be authorised (the corresponding OAuth scope URLs are sketched right after the list):

  1. read-only - access to read data and bucket lists
  2. read-write - access to read and change data but not metadata like IAM policies
  3. full-control - full control over data, including the ability to modify IAM policies
  4. cloud-platform.read-only - a higher level of read-only that also allows viewing data across Google Cloud Platform services
  5. cloud-platform - view and manage data across all GCP services
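For reference, these scopes map onto the standard OAuth scope URLs that get passed to a client library when requesting a token. A small sketch (the constant name is mine; the URLs are the standard Cloud Storage scope URLs):

// OAuth scope URLs corresponding to the five scopes above (the constant name is my own).
const GCS_SCOPES = {
  readOnly: 'https://www.googleapis.com/auth/devstorage.read_only',
  readWrite: 'https://www.googleapis.com/auth/devstorage.read_write',
  fullControl: 'https://www.googleapis.com/auth/devstorage.full_control',
  cloudPlatformReadOnly: 'https://www.googleapis.com/auth/cloud-platform.read-only',
  cloudPlatform: 'https://www.googleapis.com/auth/cloud-platform'
};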

Service account credentials

The most common way to authenticate with Google Cloud Platform is with service accounts. Service accounts represent software rather than people. Every project has service accounts associated with it, and we do not need a user to authenticate in order to get an access token. We just need a private key from the Google Cloud Platform Console, which we can use to send a signed request for an access token.
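To make the "signed request for an access token" part concrete, here is a minimal sketch using the google-auth-library package; the key path refers to the private key we generate in the next step, and the scope is the read-write scope listed above:

// Exchanging a service account key for an access token (the key path is a placeholder).
const { GoogleAuth } = require('google-auth-library');

const auth = new GoogleAuth({
  keyFilename: './xxxx.json', // the downloaded service account key
  scopes: ['https://www.googleapis.com/auth/devstorage.read_write']
});

auth.getAccessToken()
  .then((token) => {
    console.log('Access token:', token);
  })
  .catch((err) => {
    console.error('ERROR:', err);
  });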

Generating a service account credential

Now, let's generate a service account credential by creating a private key in the Cloud Platform Console.

The private key can be produced in JSON or PKCS12 format, but as a JavaScript developer, I prefer JSON.

  1. Open Credentials under API Manager in the Google Cloud Platform Console.
  2. Click Create credentials (if a dialog prompts you to select a project, select one and continue).
  3. Select Service account key.
  4. On the Create service account key page, choose New service account from the Service account drop-down.
  5. Then, specify a Name for the service account.
  6. For Role, you can select as many roles as you like. I chose Project owner in my case, as I would like to use this account for other Google Cloud services.
  7. Leave the default Service account ID, or generate a different one.
  8. Select Key type: JSON and click Create.
  9. A notification tells you that the service account was created, and the private key is automatically downloaded as a JSON file.
  10. Move the downloaded JSON file to the root of your project (a quick sanity check is shown below).
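A quick way to confirm the key file is readable from the project root is to require it and print a couple of the fields every service account key contains (the file name below is a placeholder for your own key):

// Sanity check for the downloaded key (the file name is a placeholder).
const key = require('./xxxx.json');
console.log(key.project_id);   // the project this key belongs to
console.log(key.client_email); // the service account's email address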

That's all we need to do in the Google Cloud Platform Console to get authentication set up. Now, let's come back to the terminal and our app to implement GCS authentication using a few different methods, in simple steps.

Implementation

gsutil authentication

Using the service account's private key file, type the following command in the terminal to authenticate the Cloud SDK (including gsutil) as the service account.

gcloud auth activate-service-account --key-file=[key_file_path.json]

To authenticate with user account credentials instead, type the following:

gcloud auth application-default login

This command takes you to the browser, where you authenticate with your Google account.

gcloud auth uses the cloud-platform scope when getting an access token.
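Once application default credentials are in place, the Node.js client we install in the next section can be constructed without pointing it at a key file. A minimal sketch, in the same style as the client code later in this post (the bucket name is a placeholder):

// With application default credentials set up, no keyFilename is needed (bucket name is a placeholder).
const storage = require('@google-cloud/storage');
const bucket = storage({ projectId: process.env.GCLOUD_PROJECT }).bucket('xxxx-bucket');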

Client Library Authentication

Install Cloud Node.js Client (Loopback)

Cloud Node.js Client is the idiomatic Node.js client for Google Cloud Platform services. Although it supports almost all Google Cloud Platform services, we are going to install only the storage client for the Cloud Storage service. Install the package with npm as follows.

$ npm install --save @google-cloud/storage

The Cloud Node.js Client is still a beta release; however, I have had no difficulties using it in my project so far.

Configure Cloud Services' Variables

It is not essential, but it is good practice to use Loopback's ability to define environment-specific variables. You can also do the same in a plain Node.js app in a similar way.

In my case, I have a config.local.js in the root of my Loopback server directory where I add the Google Cloud configuration. The complete config.local.js looks like the following code.

/*
 * Local configuration for datasources and host.
 */
"use strict";

var env = process.env.NODE_ENV || "development";
var isDevEnv = env === "development" || env === "test";

if (!isDevEnv) {
  // NODE_ENV for production
  process.env["GCLOUD_PROJECT"] = "xxxx-300989";
  process.env["GCLOUD_KEY_FILE"] = "../xxxx.json";
  process.env["GCLOUD_BUCKET"] = "xxxx-bucket"; // placeholder bucket name
} else {
  // NODE_ENV for development
  process.env["GCLOUD_PROJECT"] = "xxx-300989";
  process.env["GCLOUD_KEY_FILE"] = "../xxx.json";
  process.env["GCLOUD_BUCKET"] = "xxx-bucket"; // placeholder bucket name
}

You can read more about Loopback's environment-specific configuration in the Loopback documentation.

Then, all we need in order to authenticate and manage data in Google Cloud Storage is the following:

// Authenticating on a per-API basis
// Choose the Pancasikha bucket (creating a read stream is shown further below)
const storage = require('@google-cloud/storage');
const bucket = storage({
  projectId: process.env.GCLOUD_PROJECT,
  keyFilename: process.env.GCLOUD_KEY_FILE
}).bucket(process.env.GCLOUD_BUCKET);

// Uploads the file
const localFilePath = "local_file_path_to_upload";
bucket
  .upload(localFilePath)
  .then(() => {
    console.log(`${localFilePath} uploaded to ${process.env.GCLOUD_BUCKET}.`);
  })
  .catch((err) => {
    console.error('ERROR:', err);
  });

// Downloads the file
const srcFilename = "file_path_on_gcs_to_download";
// The local path to which the file should be downloaded, e.g. "./file.txt"
const destFilename = "./file.txt";
bucket
  .file(srcFilename)
  .download({ destination: destFilename })
  .then(() => {
    console.log(`gs://${process.env.GCLOUD_BUCKET}/${srcFilename} downloaded to ${destFilename}.`);
  })
  .catch((err) => {
    console.error('ERROR:', err);
  });

// Moves the file (copy to the new path, then delete the original)
var gFile = bucket.file("old_file_path_on_gcs");
var newFilePath = "new_file_path_on_gcs";
gFile.copy(newFilePath, function (err, copiedFile, apiResponse) {
  if (!err) { gFile.delete(); }
});
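Since this is a music streaming API, it is also worth sketching the read stream mentioned in the comment above. This assumes an Express-style response object res available from your Loopback route, and the object path is a placeholder:

// Stream a track straight from GCS to the HTTP response (the path and res are placeholders).
bucket
  .file("tracks/some_track.mp3")
  .createReadStream()
  .on("error", (err) => {
    console.error("ERROR:", err);
  })
  .pipe(res);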

For other GCS object operations, take a look at Object Basics in the How-to Guides of the Cloud Storage documentation.

API authentication

API authentication can be done over both the JSON and XML APIs, but Google strongly recommends using a verified client library instead, due to the complexity of managing and refreshing access tokens.

This is how listing object resources can be done with the JSON API.

GET /storage/v1/b/example-bucket/o HTTP/1.1  
Host: www.googleapis.com  
Authorization: Bearer ya29.AHES6ZRVmB7fkLtd1XTmq6mo0S1wqZZi3-Lh_s-6Uw7p8vtgSwg  
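If you do want to call the JSON API directly from Node.js, the same request can be sketched as follows; accessToken is a placeholder for a valid token (for example, one obtained with the google-auth-library sketch earlier), and example-bucket is a placeholder bucket name:

// Listing objects via the JSON API directly (accessToken and the bucket name are placeholders).
const https = require('https');

https.get({
  host: 'www.googleapis.com',
  path: '/storage/v1/b/example-bucket/o',
  headers: { Authorization: `Bearer ${accessToken}` }
}, (res) => {
  let body = '';
  res.on('data', (chunk) => { body += chunk; });
  res.on('end', () => { console.log(JSON.parse(body)); });
});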

Thank you so much for taking the time to read this post; I hope you enjoyed it. You might also be interested in my next post, Using Google Cloud Storage from JavaScript client side, Android and iOS app.

Nay Win Myint

Founder and CEO of Pancasikha Music Streaming Provider, JavaScript full-stack and Android developer and Graphic designer.

Rangoon, Myanmar