A still image, or frame, is one of the most basic elements of a video; playing multiple frames in sequence produces a moving image. When it comes to broadcasting, a video often contains thousands of frames, and each uncompressed frame carries a large amount of data, which makes it nearly impossible to stream raw, uncompressed footage.
With the advent of video compression technology, there are a number of techniques for manipulating image data in individual frames so that your footage can be played on a wide variety of bandwidths and devices. One of these techniques involves using inter-frame prediction with what are known as keyframes (sometimes referred to as i-frames).
What is a Keyframe?
If you were to look at several side-by-side frames in a video, you might notice that many of the elements in the background do not change their position or coloring from one frame to the next. Even though the pixels that make up these elements remain stationary and unchanged, uncompressed frames render these elements with new pixels every time. This results in more data being used to render these elements than would be necessary if keyframes were utilized.
A keyframe is a single, complete image from your video used to predict which elements of the next frame will remain unchanged. By setting keyframes at regular intervals throughout your broadcast, your encoder can use these images to create what are known as p-frames (predicted or predictive frames). These p-frames are incomplete images that use the previous keyframe to fill in the missing pixels and create a full image without using as much data.
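The keyframe/p-frame relationship described above can be illustrated with a toy sketch in Python (this is a simplified model for illustration, not a real video codec): the encoder stores one full keyframe, then for each following frame stores only the pixels that changed, and the decoder rebuilds complete frames by applying those deltas to the previously reconstructed image.

```python
# Toy sketch of keyframe + p-frame encoding (illustrative, not a real codec).
# A "frame" is a flat list of pixel values; a p-frame stores only the
# (index, value) pairs that differ from the previous frame.

def encode(frames):
    """Return [keyframe, p-frame, p-frame, ...] for a list of frames."""
    encoded = [("key", list(frames[0]))]  # first frame stored in full
    prev = frames[0]
    for frame in frames[1:]:
        # Keep only the pixels that changed since the previous frame.
        delta = [(i, v) for i, (p, v) in enumerate(zip(prev, frame)) if p != v]
        encoded.append(("p", delta))
        prev = frame
    return encoded

def decode(encoded):
    """Rebuild full frames from a keyframe plus deltas."""
    frames = []
    current = None
    for kind, data in encoded:
        if kind == "key":
            current = list(data)          # complete image: use as-is
        else:
            current = list(current)       # start from the previous frame
            for i, v in data:             # fill in only the changed pixels
                current[i] = v
        frames.append(current)
    return frames
```

For mostly static footage, each p-frame holds only a handful of changed pixels instead of the whole image, which is why this approach saves so much data.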
Set Keyframe Distance
When you upload a video to the Library, we recommend your video has a keyframe interval of 2-3 seconds. For example, for a video encoded at 60 frames per second, 120 would be an optimal keyframe interval (meaning that a new keyframe is used every 120 frames, or every 2 seconds). One of the reasons we recommend a lower interval is that the adaptive streaming technology used in most video players also relies on keyframe intervals to determine when changes in streaming bitrate need to occur. A low keyframe interval ensures a more stable and higher-quality stream at variable bitrates. A higher keyframe interval might result in a video with blurry background images and other video artifacts.
For example, let’s say that a viewer is traveling on a train while streaming your video on a mobile phone. If they were to travel to an area with a weaker signal, adaptive streaming would kick in and lower the bitrate. This would allow your viewer to continue streaming without interruption. However, the switch to a lower bitrate is only made at the next keyframe. This means that a longer keyframe interval can cause your video to buffer if the next keyframe is too far away.
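Because the player can only switch bitrates at a keyframe, the worst-case wait before a switch is almost a full keyframe interval. The following sketch (illustrative values; not tied to any specific player) computes that delay:

```python
# Sketch: delay before a player can switch bitrate, assuming switches
# happen only at keyframe boundaries (a simplified model for illustration).

def frames_until_next_keyframe(current_frame, keyframe_interval):
    """Frames remaining until the next keyframe boundary."""
    remainder = current_frame % keyframe_interval
    return 0 if remainder == 0 else keyframe_interval - remainder

def worst_case_switch_delay(keyframe_interval, fps):
    """Longest possible wait, in seconds, before a bitrate switch."""
    return (keyframe_interval - 1) / fps

# With a 2-second interval at 60 fps (120 frames), the player waits
# under 2 seconds; with a 10-second interval it could stall for nearly 10.
print(worst_case_switch_delay(120, 60))
print(worst_case_switch_delay(600, 60))
```

This is why a shorter interval keeps adaptive streaming responsive: the next switching opportunity is never far away.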
In order to optimize your keyframe distance for an exported video, you can use the following table to determine which interval is best according to your video's frame rate:
| Frame rate | Audio sample rate | Keyframe interval |
| --- | --- | --- |
| 24 fps | 48 kHz | 48 |
| 25 fps | 48 kHz | 50 |
| 30 fps | 48 kHz | 60 |
| 50 fps | 48 kHz | 100 |
| 60 fps | 48 kHz | 120 |
While any audio sample rate between 24 and 96 kHz will work with these keyframe intervals, 48 kHz is recommended for optimal efficiency.