AWS video publishing

Setup

  1. Download and install the following, and add each executable to your system's PATH environment variable. You will need admin credentials to do this.
    1. The Amazon Web Services Command Line Interface: https://aws.amazon.com/cli/
    2. The latest version of Python: https://www.python.org/downloads/
    3. Docker: https://hub.docker.com/editions/community/docker-ce-desktop-windows
      1. Or here for Mac: https://hub.docker.com/editions/community/docker-ce-desktop-mac 
  2. Set up Duo multi-factor authentication, if you haven't already. We all received emails about how to set this up.

    We will use the aws-adfs tool to log in, as described here: https://github.com/SpokCCP/aws-adfs. The following instructions are adapted from the README.md file in that GitHub repository, up to the "Troubleshooting" section.
  3. On the command line, run the following command:
    docker pull spok/aws-adfs
  4. Run the tool (inside its Docker container) using the following command:
    docker run -it -v ~/.aws:/root/.aws spok/aws-adfs --profile spokdev --region us-east-1
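    The -v ~/.aws:/root/.aws option mounts your local AWS configuration directory into the container, so the temporary credentials the tool writes to ~/.aws persist on your machine after the container exits.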
    
  5. The tool will prompt you for your corporate password.
    [Screenshot: aws-adfs-prompt.png]
    1. You can ignore the error message on the 2nd line, if you encounter it.
  6. After you enter your password, the tool will prompt you to select an AWS role. We have our own SpokDocumentation role. Select the one in the spokdev profile (option 0).
    [Screenshot: aws-adfs-roles.png]
  7. The tool will use your corporate credentials to log you in with temporary AWS credentials.
    [Screenshot: aws-adfs-result.png]
  8. Now you can use the AWS CLI to explore the services we have set up and to upload videos.
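
If you want to confirm that the login worked, one quick check (using the spokdev profile configured above) is to ask AWS who you are:

    aws sts get-caller-identity --profile spokdev

If the temporary credentials are valid, this prints your account ID and the assumed-role ARN; if they have expired, it returns an error and you can rerun the docker command above.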

The Spok Video On Demand Workflow

We are using the CloudFormation template here: https://docs.aws.amazon.com/solutions/latest/video-on-demand/welcome.html

CloudFormation allows us to string together multiple AWS services into a single workflow. Here is the workflow architecture:

[Architecture diagram, taken from https://docs.aws.amazon.com/solutions/latest/video-on-demand/architecture.html]

This template is even more complex than what is shown above, but we don't need to worry about most of it. Here is how the essential pieces work:

  1. First, you drop a video into an S3 bucket. S3 is a generic object storage service. Objects placed in S3 can be of almost any type.
    1. Our S3 buckets are the following:

      1. spok-vids-source
      2. spok-vids-destination
    2. We will use the AWS CLI to upload our video files. DevOps has strongly encouraged us to do it this way, instead of using the browser-based AWS Management Console.
    3. Everything after this point is automated.
  2. Lambda-backed Step Functions watching the S3 bucket trigger an ingest process, which does a few things:
    1. Validates the input
    2. Gets a signed S3 URL for the source video and retrieves its metadata.
    3. Sends the information about each step of the process to a DynamoDB database for future reference.
    4. Kicks off the processing workflow.
  3. In the processing workflow, more Lambda-backed Step Functions:
    1. Get the dimensions of the video and select an encoding template based on the video's height.
    2. Submit an encoding job to AWS Elemental MediaConvert.
    3. Update the DynamoDB table.
  4. AWS Elemental MediaConvert converts the source video into several formats and sends back the output. Here are the media types that it outputs:
    1. Two adaptive-bitrate streaming formats, each consisting of file segments plus a playlist file that tells a player how to string them together.
      1. HLS for Apple devices
      2. MPEG-DASH (more generic and codec-agnostic than HLS)
    2. The source video again, in its original format.
    3. These three outputs land in the spok-vids-destination bucket, inside a new folder with an automatically generated name (a GUID, as in the example notification below). So for example, here is the full output hierarchy for a video:
      [Screenshot: s3-buckets.png]
  5. At this point, you can link to the output files using a CloudFront URL.
    1. The CloudFront base URL is http://d1sbu4ajmsqv5n.cloudfront.net. This represents the spok-vids-destination bucket.
    2. The full URL of an item would then be http://d1sbu4ajmsqv5n.cloudfront.net/<path within spok-vids-destination>, as in the sketch below.
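
As a concrete sketch of that mapping: given the S3 path of an output file in spok-vids-destination, you can build its CloudFront URL by swapping the bucket prefix for the CloudFront domain. The playlist path below is taken from the example notification later on this page:

    # S3 path of an HLS playlist in the destination bucket
    s3path="s3://spok-vids-destination/90b63549-ab11-413d-909e-2460a818f65e/hls/Spok Go Quick Start - Signing Out.m3u8"
    # Strip the bucket prefix and prepend the CloudFront domain
    key="${s3path#s3://spok-vids-destination/}"
    echo "http://d1sbu4ajmsqv5n.cloudfront.net/${key}"

Note that the spaces in the file name would need to be URL-encoded (as %20) before handing the URL to a player.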

Exploring the S3 Buckets with the AWS Management Console

Do not use the AWS Management Console to upload videos or otherwise modify the buckets or their contents. DevOps has strongly encouraged us to use the AWS CLI to do this instead. Use the Management Console only to view the contents of the buckets.

If you want to use the graphical, browser-based AWS Management Console to explore the contents of our S3 buckets, do the following:

  1. In your browser, navigate to https://fs.spok.com/adfs/ls/idpinitiatedsignon.aspx
  2. If you see the option to Sign out from all the sites that you have accessed, do so and reload the page before continuing. You might see an error after clicking Sign Out. Disregard it and reload.
  3. Choose Sign in to one of the following sites: and select AWS AD SignIn Integration from the dropdown menu.
    [Screenshot: aws-vod-screen1.png]
  4. Sign in with your corporate credentials.
  5. Select the first role in the list: AWS-SpokDocumentation in the spokdev account.
    [Screenshot: aws-vod-screen2.png]
  6. In the AWS Management Console, navigate to S3 by going to the Services dropdown on the top menu bar, then Storage > S3.
  7. Search for our buckets and click around to view their contents.
    1. spok-vids-source
    2. spok-vids-destination

Exploring the S3 Buckets with the AWS CLI

To list the files in a bucket, run the following command:
    aws s3 ls --profile spokdev s3://<bucket path>

For example, to view the HLS files that were output in the spok-vids-destination bucket:

    aws s3 ls --profile spokdev s3://spok-vids-destination/9dc75eeb-...c62feba2a/hls/
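
If you don't know the GUID folder names yet, list the top level of the bucket first, or walk it recursively (--recursive and --summarize are standard AWS CLI options):

    aws s3 ls --profile spokdev s3://spok-vids-destination/
    aws s3 ls --profile spokdev --recursive --summarize s3://spok-vids-destination/

The first command shows the GUID-named output folders; the second lists every object and prints a total count and size at the end.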

Uploading a File to the Source Bucket

To upload a file to a bucket, run the following command:

    aws s3 cp --profile spokdev <local file path> s3://<bucket name>
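
For example, to upload the source video from the notification example below (the quotes are needed because the file name contains spaces):

    aws s3 cp --profile spokdev "Spok Go Quick Start - Signing Out.mp4" s3://spok-vids-source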

The VOD workflow runs automatically. When it finishes, you (or possibly just I) will receive an email from no-reply@sns.amazonaws.com containing JSON data about how the workflow ran. For example:

{
  "destBucket": "spok-vids-destination",
  "ecodeJobId": "1556125603120-gr5zg2",
  "workflowStatus": "Complete",
  "frameCapture": false,
  "workflowName": "spok-vod",
  "workflowTrigger": "Video",
  "encodingProfile": 720,
  "cloudFront": "d1sbu4ajmsqv5n.cloudfront.net",
  "archiveSource": false,
  "startTime": "2019-04-24 17:06.3",
  "jobTemplate": "spok-vod_Ott_720p_Avc_Aac_16x9_qvbr",
  "srcVideo": "Spok Go Quick Start - Signing Out.mp4",
  "srcBucket": "spok-vids-source",
  "srcHeight": 700,
  "srcWidth": 436,
  "EndTime": "2019-04-24 17:07.4",
  "mp4Outputs": [
    "s3://spok-vids-destination/90b63549-ab11-413d-909e-2460a818f65e/mp4/Spok Go Quick Start - Signing Out_Mp4_Avc_Aac_16x9_1280x720p_24Hz_4.5Mbps_qvbr.mp4"
  ],
  "mp4Urls": [
    "https://d1sbu4ajmsqv5n.cloudfront.net/90b63549-ab11-413d-909e-2460a818f65e/mp4/Spok Go Quick Start - Signing Out_Mp4_Avc_Aac_16x9_1280x720p_24Hz_4.5Mbps_qvbr.mp4"
  ],
  "hlsPlaylist": "s3://spok-vids-destination/90b63549-ab11-413d-909e-2460a818f65e/hls/Spok Go Quick Start - Signing Out.m3u8",
  "hlsUrl": "https://d1sbu4ajmsqv5n.cloudfront.net/90b63549-ab11-413d-909e-2460a818f65e/hls/Spok Go Quick Start - Signing Out.m3u8",
  "dashPlaylist": "s3://spok-vids-destination/90b63549-ab11-413d-909e-2460a818f65e/dash/Spok Go Quick Start - Signing Out.mpd",
  "dashUrl": "https://d1sbu4ajmsqv5n.cloudfront.net/90b63549-ab11-413d-909e-2460a818f65e/dash/Spok Go Quick Start - Signing Out.mpd",
  "guid": "90b63549-ab11-413d-909e-2460a818f65e"
}
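
If you save the body of one of these emails to a file, a JSON tool such as jq (assumed to be installed separately; it is not part of the setup above) can pull out the playable URLs directly:

    # Print the HLS, DASH, and MP4 URLs from a saved notification
    jq -r '.hlsUrl, .dashUrl, .mp4Urls[]' notification.json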