Creating/Submitting Digital Video (D5.b)

Digitizing Analog Video Sources

This document does not tell you how to digitize video from analog sources. ITS and MPC staff can advise you on the hardware and software required for doing that. Some resources that can help you with this are listed in the document Digital Audio and Video Resources.

The purpose of this document is to present the basics of digital video and to present the specifications for digital video files for purposes of long-term preservation (archiving) and for submitting to CONTENTdm for Web delivery.

The Basics of Digital Video

When creating digital video, the technician should be aware of the influence of frame size, frame rate, and bit depth on the quality and size of the resulting digital file.

  • Frame Size is the height and width of the video window measured in pixels. The larger the frame size, the larger the number of pixels and the larger the file size. The larger the file size, the greater the amount of bandwidth it takes to download.

  • Frame Rate is the number of frames displayed per second. NTSC television runs at about 30 frames/sec and film at 24 frames/sec; rates in this range are what humans perceive as full motion. Lower frame rates may make the video appear jerky.

  • Bit-depth determines the number of colors that will be used to view the movie. The higher the bit-depth, the higher the number of colors and, thus, the larger the file size.
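The combined effect of these three factors can be sketched with a little arithmetic. The following Python snippet is an illustrative calculation (not part of any capture tool) of the raw, uncompressed bandwidth a video stream requires:

```python
def uncompressed_bandwidth_mbps(width, height, bit_depth, fps):
    """Raw (uncompressed) video bandwidth in megabits per second."""
    return width * height * bit_depth * fps / 1_000_000

# Full-color, full-motion video at a common capture size:
print(uncompressed_bandwidth_mbps(640, 480, 24, 30))  # 221.184 Mbit/s
# A small 8-bit web clip at a reduced frame rate:
print(uncompressed_bandwidth_mbps(160, 120, 8, 15))   # 2.304 Mbit/s
```

The two examples show why larger frames, deeper color, and higher frame rates multiply together: the full-motion clip needs roughly a hundred times the bandwidth of the small web clip.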

The Video File Package

Each digital video file consists of the moving image itself, the audio stream, information about how to decode the data (playback and resynchronization of the audio and video) and a description of the data stream. Each kind of data is stored separately within the file.

File Size

Even without compression, there are ways to reduce file size and bandwidth requirements: reduce the audio from stereo to mono, lower the audio sample rate to 22 kHz or less, lower the frame rate to 15 frames per second or less, and shrink the frame size to 320 x 240 or 160 x 120. Experiment with various frame rates and frame sizes to find the best trade-offs among sound quality (audio that does not sound muffled or fuzzy), visual clarity, and smooth, crisp movement.
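As an illustrative sketch (the one-minute clip length and the audio settings below are hypothetical examples, not values from this document's specifications), the following Python snippet estimates how much those reductions shrink an uncompressed clip:

```python
def raw_clip_megabytes(width, height, bit_depth, fps, seconds,
                       audio_hz, audio_bit_depth, audio_channels):
    """Approximate uncompressed size of a clip (video plus audio), in megabytes."""
    video_bits = width * height * bit_depth * fps * seconds
    audio_bits = audio_hz * audio_bit_depth * audio_channels * seconds
    return (video_bits + audio_bits) / 8 / 1_000_000

# A 60-second clip before and after the reductions suggested above:
full = raw_clip_megabytes(640, 480, 24, 30, 60, 44_100, 16, 2)  # stereo, 44.1 kHz
slim = raw_clip_megabytes(320, 240, 24, 15, 60, 22_050, 16, 1)  # mono, 22 kHz
print(full, slim)  # roughly 1,670 MB versus 210 MB -- about an 8x saving
```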

Decisions on file size and video quality must take into consideration the playback mechanism used by the end user. Over the Internet, it may not be possible to provide the viewer with high-quality digital output. The creation of streaming video for Internet usage must also consider bandwidth usage. The majority of Internet content uses an 8-bit screen of 160 x 120 pixels, at 10-15 frames per second. The following table demonstrates how the increase in screen size, bit-depth and frames-per-second will affect the file size and thus bandwidth required to play it properly.

Figure 1: The influence of frame size, bit depth, and frame rate on bandwidth (in megabits per second) and possible purpose, for frame sizes ranging from 720 x 480 at 24-bit full color down to 160 x 120. (Adapted from Choosing A Suitable Digital Video Format (AHDS).)

As of June 2007, YouTube encodes videos at 320 x 240, limits the data rate to 750 kbps, and caps uploads at a 100 MB maximum file size. Reduce the frame rate to 15 fps and boost the color saturation slightly in your editor to counteract fading during the encoding process.


So much information packed together can create very large files, so most digital video file formats use some form of compression (usually "lossy" compression) to shorten transmission time. For example, a 15-megabyte WAVE (.wav) audio file can be reduced to just 150 kilobytes when converted to the RealAudio (.ra) format. Compression allows most videos to be viewed on the Web, but a decompression algorithm is required to "undo" the compression on the user's end. Unfortunately, compression usually degrades the quality of the video, and compressed files themselves often are not editable. Compressed files are not suitable for long-term preservation.
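The practical payoff of a 100:1 reduction like the WAVE-to-RealAudio example is shorter transfer time. The link speed below is a hypothetical example for illustration:

```python
def transfer_seconds(size_megabytes, link_mbps):
    """Seconds to transfer a file of the given size over a link of the given speed."""
    return size_megabytes * 8 / link_mbps

# Over a hypothetical 1 Mbit/s connection:
print(transfer_seconds(15, 1))    # 120.0 seconds for the uncompressed file
print(transfer_seconds(0.15, 1))  # 1.2 seconds for the 100:1 compressed copy
```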

File Formats

The MPEG-s and MPEG-4 formats are suitable to high-quality digital video. MPEG-4 has a lower transfer rate than MPEG-2 and is intended for streaming video. Other codecs, such as QuickTime, Windows Media, and Real Video are useful for specific purposes, but not suitable for preservation.

Streaming vs. Downloading

Digital video may be delivered to the viewer's computer in two ways: streaming or non-streaming. Streaming video is preferred over non-streaming, but it requires a video server and client software. A streamed video begins playback on client software as soon as enough of the video has loaded to begin and sustain playback at a continuous rate. A cache is established from Random Access Memory (RAM) on the client computer and is used to receive the file, ensure that frames are in the correct order, establish timing, refresh compressed frames, and check for dropped packets. The video file continues to download into the client cache even as the beginning of the video is being viewed. Examples of streaming video players are QuickTime, Windows Media Player, and RealPlayer.
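A simplified model of the buffering decision a streaming client makes can be sketched in Python. This is an illustrative calculation only; real players use more elaborate heuristics, and the bit rates and durations below are hypothetical:

```python
def startup_delay_seconds(video_mbps, link_mbps, duration_s):
    """Minimum pre-buffering time so playback never stalls.

    If the link is at least as fast as the video bit rate, playback can
    begin almost immediately; otherwise the client must pre-load the
    shortfall before starting.
    """
    if link_mbps >= video_mbps:
        return 0.0
    deficit_mbits = (video_mbps - link_mbps) * duration_s  # data still needed
    return deficit_mbits / link_mbps

# A 5-minute, 1 Mbit/s stream over two hypothetical connections:
print(startup_delay_seconds(1.0, 2.0, 300))  # 0.0: link outruns the stream
print(startup_delay_seconds(1.0, 0.5, 300))  # 300.0: half the file must pre-load
```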

Tips for Digital Video Production

  • To optimize digital video quality (for preservation or for creating a very high quality, short clip), you need to…

    • Use software designed to capture/produce video from the kind of recording you are using as your source

    • Use an uncompressed file format (such as AVI) or the MPEG-2/DVD format

    • Record the data at a frame size of 720 x 480

    • Record the audio at a sample rate of 48 kHz

    • Record the data at a 24-bit depth (3-channel, full color)

    • Record the data at a frame rate of 30 frames per second

    • If the original is low quality or even compressed, save a preservation copy as AVI or MPEG-2/DVD to protect it from further degradation.

  • To optimize delivery of digital video files over the Internet, you need to…

    • Shorten the clip as much as possible

    • Reduce the frame size, frame rate, and bit depth (some software packages have preset options to help you optimize for Web delivery).

    • Reduce audio sampling frequency to 22 kHz

    • Use a compressed file format (such as MPEG-4, QuickTime or the non-proprietary Ogg Theora)

  • Quality Control

    • Review the final product, especially any fast-moving segments, for "slicing"-type artifacts, blurriness, or choppiness. These may result when you encode video from interlaced sources such as most DV and HDV camcorders. To fix these problems, search your software's help files for "deinterlace" and follow the instructions there. One tool that can help with this is ArcSoft MediaConvert. Adobe Premiere Elements has a Field Option called "Always Deinterlace."
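The general idea behind options like "Always Deinterlace" can be sketched as a minimal line-averaging deinterlacer in Python. Real deinterlacers are considerably more sophisticated, and the tiny frame below is a toy example:

```python
def deinterlace(frame):
    """Replace each odd (bottom-field) row with the average of its
    neighboring even (top-field) rows -- simple line averaging, which
    removes the "combing" caused by fields captured at different moments."""
    out = [row[:] for row in frame]
    for y in range(1, len(frame), 2):
        above = frame[y - 1]
        below = frame[y + 1] if y + 1 < len(frame) else frame[y - 1]
        out[y] = [(a + b) // 2 for a, b in zip(above, below)]
    return out

# A 4-row, 2-column frame whose odd rows came from a later field:
frame = [[10, 10], [90, 90], [20, 20], [80, 80]]
print(deinterlace(frame))  # [[10, 10], [15, 15], [20, 20], [20, 20]]
```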


Software for Creating and Managing Video

Simple Video Editing Software

  • iMovie (Macintosh)
  • Windows Movie Maker
  • Adobe Premiere Elements

Full Video Editors


Figure 2: Digital Video Specifications

View Figure 2: Digital Video Specifications.


(Reviewed: October 1, 2013)