Signed URLs for file sharing over Google Cloud CDN

By Richard Albayaty

Securely sharing files with users over Google Cloud CDN using Node.js.


Anvil believes that privacy and a seamless user experience do not have to be mutually exclusive. Many workflows require uploading supporting materials or downloading files: signed PDFs of a business agreement, a regulatory memo, or simply a spreadsheet of data. Passing these files back and forth between users and servers needs to be both fast and secure.

While secure transport protocols (TLS/HTTPS) are becoming more pervasive, and some browsers even highlight non-HTTPS sites as insecure, security does not stop at the transport layer. Application-level authorization and access controls are also important mechanisms for restricting access to user data.

When providing a download link to users, an application developer must consider a few things:

  1. Who is allowed to access this link?
  2. Could someone other than the intended recipient of the link access it?
  3. How long should the link allow access to the resource?

Expiring signed URLs are a tool that helps you restrict access to files and answer these questions at the application level.

What does this mean practically? User A is given a link through the CDN to a server nearest them to download someSignedDoc.pdf. This link expires in some predetermined amount of time, let's say two minutes. After those two minutes pass, the link will no longer be authorized to return the resource behind it. If an extremely sensitive link meant for User A were exposed to User B within that time window, the signing key used to create it could be rotated to immediately revoke access.

In this post, we will go over how to create a signed CDN URL in Node.js that allows secure, time-limited access to a file in Google Cloud Storage.

Project setup

Getting your GCP project set up is outside the scope of this post, but thankfully Google does a great job walking you through the steps to get up and running using either gcloud, the Cloud Console, or even snippets of Node.js code.

  • Setting up a CDN with Cloud Storage provides good step-by-step instructions on setting up a bucket, copying a file into the bucket, making the bucket public, and creating an HTTPS load balancer with an external IP to leverage a CDN.
  • Using Signed URLs walks you through creating the signing key you need to sign URLs (a short Node.js sketch for generating such a key follows this list).
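
The signing key itself is 16 bytes of random data, base64 encoded with a URL-safe alphabet. If you would rather generate one in Node.js than with the gcloud recipe, a minimal sketch could look like the following; you still need to register the resulting key with your backend bucket or backend service through gcloud or the console.

import { randomBytes } from 'crypto'

// Cloud CDN signing keys are 128-bit (16-byte) values, base64url encoded
const newSigningKey = randomBytes(16)
  .toString('base64')
  .replace(/\+/g, '-')
  .replace(/\//g, '_')
console.log(newSigningKey)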

Generate a signed URL for Cloud Storage

Prep

To get started, let’s import the dependencies we will be using (this assumes ES modules).

import { createHmac } from 'crypto'
import moment from 'moment'
import fetch from 'node-fetch'
import base64url from 'base64url'
import fs from 'fs'
import path from 'path'
import { fileURLToPath } from 'url'

From the project we will need the signing key name, the key itself, and the signing URL for your CDN. Note that the signing key was provided to you as a base64 string, so it needs to be converted into a Buffer to perform the URL signing later on.

const keyName = process.env.GCP_SIGNING_KEYNAME
const key = Buffer.from(process.env.GCP_SIGNING_KEY, 'base64')
const signingURL = process.env.GCP_SIGNING_URL

Define the filename, path to the file in your bucket, and how long you want the link to be accessible for this particular resource.

const filename = 'somefile.png'
const pathToFile = `path/to/${filename}`
const linkDurationInSeconds = 60

The good stuff

Here we append the file path to the signing URL, which gets encoded as a valid URI (to account for things like spaces in a filename, for instance). The link expiration is calculated as a Unix timestamp in seconds: the current time plus the number of seconds you want the link to remain valid.

To create the signature for our signed URL, we append two query parameters to the encoded URI: Expires (the expiration) and KeyName (the name of the signing key from our project). Finally, we compute an HMAC-SHA1 of this string using the signing key, then base64url encode the result to get the signature needed for the last step.

// Join the base URL and file path ourselves; path.join would collapse
// the '//' after the protocol (https:// -> https:/)
const encodedURI = encodeURI(`${signingURL.replace(/\/+$/, '')}/${pathToFile}`)
const expiration = parseInt(
  moment().utc().add(linkDurationInSeconds, 'seconds').format('X')
)
const urlToSign = `${encodedURI}?Expires=${expiration}&KeyName=${keyName}`
const signedURLInBase64 = createHmac('sha1', key)
  .update(urlToSign)
  .digest('base64')
const signature = base64url.fromBase64(signedURLInBase64)
const signedURL = `${urlToSign}&Signature=${signature}`
console.log({ signedURL })

The result should look something like

https://some.cdn.bucket.domain.com/path/to/some%20document.pdf?Expires=<>&KeyName=<>&Signature=<>
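
As a side note, the expiration timestamp does not strictly require moment; if you would rather skip the dependency, plain Date arithmetic produces the same Unix timestamp in seconds:

// Equivalent expiration without moment: current time plus the link duration
const expirationWithoutMoment = Math.floor(Date.now() / 1000) + linkDurationInSeconds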

Now we test this signed URL by using it to fetch the data. The helper below deals with the various status codes that could be returned when making the request. If you wait longer than the 60 seconds we defined for link expiration, requesting the link should return a 403 status code.

Try it for yourself!

function handleResponse(response) {
  const status = response.status
  switch (status) {
    case 200:
      // node-fetch v2: returns a Promise that resolves to a Buffer
      return response.buffer()
    case 403:
      throw new Error('Unauthorized: bad signing key|keyName or expired URL')
    case 404:
      throw new Error('File not found')
    default:
      const reason = response.statusText
      throw new Error(`${status}:${reason}`)
  }
}

// __dirname is not defined in ES modules; derive it from import.meta.url
const __dirname = path.dirname(fileURLToPath(import.meta.url))

try {
  const response = await fetch(signedURL, { method: 'get' })
  const data = await handleResponse(response)
  fs.writeFileSync(path.join(__dirname, filename), data, { encoding: null })
} catch (error) {
  console.error(error)
}

The whole #!

Here is the example code that we highlighted above all together. If you intend to share the signed URLs directly with end users, you won’t need the data fetching portion.

import { createHmac } from 'crypto'
import moment from 'moment'
import fetch from 'node-fetch'
import base64url from 'base64url'
import fs from 'fs'
import path from 'path'
import { fileURLToPath } from 'url'

const keyName = process.env.GCP_SIGNING_KEYNAME
const key = Buffer.from(process.env.GCP_SIGNING_KEY, 'base64')
const signingURL = process.env.GCP_SIGNING_URL

const filename = 'somefile.png'
const pathToFile = `path/to/${filename}`
const linkDurationInSeconds = 60

// Generate the signed URL
// Join the base URL and file path ourselves; path.join would collapse
// the '//' after the protocol (https:// -> https:/)
const encodedURI = encodeURI(`${signingURL.replace(/\/+$/, '')}/${pathToFile}`)
const expiration = parseInt(
  moment().utc().add(linkDurationInSeconds, 'seconds').format('X')
)
const urlToSign = `${encodedURI}?Expires=${expiration}&KeyName=${keyName}`
const signedURLInBase64 = createHmac('sha1', key)
  .update(urlToSign)
  .digest('base64')
const signature = base64url.fromBase64(signedURLInBase64)
const signedURL = `${urlToSign}&Signature=${signature}`
console.log({ signedURL })

// Fetch the asset for testing purposes
function handleResponse(response) {
  const status = response.status
  switch (status) {
    case 200:
      // node-fetch v2: returns a Promise that resolves to a Buffer
      return response.buffer()
    case 403:
      throw new Error('Unauthorized: bad signing key|keyName or expired URL')
    case 404:
      throw new Error('File not found')
    default:
      const reason = response.statusText
      throw new Error(`${status}:${reason}`)
  }
}

// __dirname is not defined in ES modules; derive it from import.meta.url
const __dirname = path.dirname(fileURLToPath(import.meta.url))

try {
  const response = await fetch(signedURL, { method: 'get' })
  const data = await handleResponse(response)
  fs.writeFileSync(path.join(__dirname, filename), data, { encoding: null })
} catch (error) {
  console.error(error)
}
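
In an application you would likely wrap the signing logic into a small reusable helper. Here is one possible shape for such a helper, reusing the keyName, key, and signingURL constants from above; the function name and the plain-Date expiration noted earlier are illustrative choices, not anything prescribed by Google.

// Illustrative helper: returns a signed CDN URL for a file path in the bucket
function generateSignedURL(filePath, durationInSeconds) {
  const encodedURI = encodeURI(`${signingURL.replace(/\/+$/, '')}/${filePath}`)
  const expires = Math.floor(Date.now() / 1000) + durationInSeconds
  const urlToSign = `${encodedURI}?Expires=${expires}&KeyName=${keyName}`
  const signature = base64url.fromBase64(
    createHmac('sha1', key).update(urlToSign).digest('base64')
  )
  return `${urlToSign}&Signature=${signature}`
}

// Usage: a two-minute link to the example file
const twoMinuteLink = generateSignedURL('path/to/somefile.png', 120)
console.log(twoMinuteLink)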

But wait, there's more!

A quick note for power users: it can be cumbersome to generate a separate signed URL for every file in a nested set of resources: catImages/gray.png, catImages/yellow.png, catImages/white.png. One solution is to include a URLPrefix query parameter (in this case covering catImages/) and sign only the query parameters URLPrefix, Expires, and KeyName. Because the full file URL is excluded from the Signature, the consumer can swap in any file path that matches the URLPrefix and reuse the same set of query parameters.
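
Here is a rough sketch of that variant, reusing the constants from the full example above. Note that Cloud CDN expects the URLPrefix value to be the base64url-encoded prefix URL (scheme and host included), so verify the exact format against Google's signed URL documentation:

// Sign a URL prefix instead of a single file (sketch; check the exact
// URLPrefix requirements in the Cloud CDN docs)
const prefix = `${signingURL.replace(/\/+$/, '')}/catImages/`
const encodedPrefix = base64url(prefix)
const paramsToSign = `URLPrefix=${encodedPrefix}&Expires=${expiration}&KeyName=${keyName}`
const prefixSignature = base64url.fromBase64(
  createHmac('sha1', key).update(paramsToSign).digest('base64')
)
// Any file under catImages/ can now reuse the same query string
const grayCatURL = `${prefix}gray.png?${paramsToSign}&Signature=${prefixSignature}`
console.log({ grayCatURL })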

To sign up for our free developer sandbox or learn more about our API, head over to our developer center at www.useanvil.com/developers. There, you will find comprehensive documentation, simple tutorials, and client libraries to help you get started quickly and easily.

If you have questions, please do not hesitate to contact us at: developers@useanvil.com
