A codec is software used to encode and decode a digital video signal. Engineers try various solutions to maintain video quality while reducing the amount of data, using state-of-the-art compression algorithm design.
A large portion of your work will comprise preparing and testing various configurations.
Codecs
At the time of this writing, AIR for Android supports the On2 VP6, Sorenson Spark (H.263), and H.264 codecs.
H.264, also called MPEG-4 Part 10 or AVC (Advanced Video Coding), delivers high-quality video at lower bit rates than H.263 and On2. It is more complicated to decode, however, and requires native GPU playback or a fast CPU to ensure smooth playback.
H.264 supports the following profiles: Baseline, Extended, Main, and various flavors of High. Test the profiles, as not all of them work with hardware-accelerated media decoding; at the time of this writing, only the Baseline profile appears to use it.
AAC (Advanced Audio Coding) is the audio codec generally paired with H.264. Nellymoser and Speex are supported, but do not utilize hardware decoding.
MPEG-4 (Moving Picture Experts Group) H.264 is an industry-standard video compression format. MPEG-4 also refers to the container format, which can hold several tracks. The file synchronizes and interleaves the data. In addition to video and audio, the container includes metadata that can store information such as subtitles. A container may hold more than one video track, but AIR only recognizes one.
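As a point of reference, here is a minimal playback sketch showing how AIR consumes such a file through the NetStream API; the file name sample.mp4 and the 480×360 display size are placeholders for your own encode:

import flash.events.AsyncErrorEvent;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

// A null connection is used for progressive (non-RTMP) playback.
var connection:NetConnection = new NetConnection();
connection.connect(null);

var stream:NetStream = new NetStream(connection);
// Ignore callbacks we do not handle, such as onXMPData.
stream.addEventListener(AsyncErrorEvent.ASYNC_ERROR,
    function(event:AsyncErrorEvent):void {});
// The client object receives onMetaData from the container.
stream.client = { onMetaData: function(info:Object):void {
    trace(info.width, info.height, info.duration);
}};

var video:Video = new Video(480, 360);
video.attachNetStream(stream);
addChild(video);

// Plays the one video track AIR recognizes in the container.
stream.play("sample.mp4");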
Encoding
You can use Adobe Media Encoder CS5 or a third-party tool such as Sorenson Squeeze or On2 Flix to encode your video.
It is difficult to encode video for every device capacity and display size. Adobe recommends grouping devices into three categories: low-end, mid-range, and high-end.
If your video is embedded or attached to your application, prepare and provide only one file and use a medium-quality solution to serve all your users. If your video is served over a network, prepare multiple streams.
Gather as much information as possible from the user before selecting the video to play. The criteria are the speed of the network connection and the performance of the device.
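A minimal sketch of that selection logic follows. The rendition URLs are hypothetical, and the onWifi flag stands in for whatever connectivity check you perform; only Capabilities.screenResolutionX is part of the runtime API.

import flash.system.Capabilities;

// Hypothetical renditions of the same video prepared at different bit rates.
const LOW:String  = "http://example.com/video_2g5_320x240.mp4";
const MED:String  = "http://example.com/video_3g_480x320.mp4";
const HIGH:String = "http://example.com/video_wifi_480x320.mp4";

function pickStream(onWifi:Boolean):String {
    var screenWidth:Number = Capabilities.screenResolutionX;
    if (!onWifi) {
        // On a slower cellular connection, favor the smaller encodes.
        return (screenWidth < 480) ? LOW : MED;
    }
    return HIGH;
}

var url:String = pickStream(true); // onWifi comes from your own check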
Decoding
Containers are wrappers around video and audio tracks holding metadata. MP4 is a common wrapper for the MPEG-4 format and is widely compatible. F4V is Adobe’s own format, which builds on the open MPEG-4 standard media file format and supports H.264/AAC-based content. FLV, Adobe’s original video container file format, supports codecs such as Sorenson Spark and On2 VP6, and can include an alpha channel and additional metadata such as cue points.
Video decoding is a multithreaded operation. H.264 and AAC are decoded using hardware acceleration on mobile devices to improve frame rate and reduce battery consumption. Rendering is still done on the CPU.
Bit Rate
Bit rate is the number of bits dedicated to the video in one second (measured in kilobits per second, or kbps). During the encoding process, the encoder varies the number of bits given to various portions of the video based on how complicated they are, while keeping the average as close as possible to the bit rate you set.
Because the average is calculated on the fly and is not always accurate, it is best to select the two-pass mode even though it takes longer. The first pass analyzes the video and records a statistics log; the second pass encodes the video using the log to stay as close to the desired average bit rate as possible.
Use the network connection speed as a guide for your encoding. The recommendation is to use 80% to 90% of the available bandwidth for video/audio combined, and keep the rest for network fluctuations. Try the following H.264/AAC rates as a starting point:
- WiFi: 500 to 1,000 kbps, audio up to 160 kbps
- 3G: 350 to 450 kbps, audio up to 128 kbps
- 2.5G: 100 kbps, audio up to 32 kbps
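As a worked example of this budget, assume a 3G connection measured at roughly 500 kbps: 85% of that leaves 425 kbps for audio and video combined, which comfortably fits a 350 kbps video track plus a 64 kbps AAC track. A small helper makes the arithmetic explicit (the measured bandwidth is a value you would supply yourself):

// Video bit rate (kbps) that fits the 80-90% rule once audio is accounted for.
function videoBitRateBudget(measuredKbps:Number, audioKbps:Number,
                            share:Number = 0.85):Number {
    return measuredKbps * share - audioKbps;
}

trace(videoBitRateBudget(500, 64)); // 361, room for a ~350 kbps 3G encode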
Frame Rate
Reduce high frame rates whenever possible. Downsampling by a whole factor guarantees a better result. For instance, a film at 30 fps can be downsampled to 15 fps, and a film at 24 fps to 12 fps.
Do not use content encoded at a high frame rate and assume that a lower frame rate in AIR will adjust it. It will not.
If your footage was captured at a frame rate greater than 24 fps and you want to keep the existing frame rate, look at reducing other settings such as the bit rate.
If your video is the only moving content in your application, you can use a frame rate as low as 12 fps, because the video plays at its native frame rate regardless of the application's frame rate. A low frame rate reduces drain on the battery.
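As a minimal sketch, assuming stream is the NetStream created earlier and 24 fps is your application's normal rate, you can lower the frame rate while the video plays and restore it when playback stops:

import flash.events.NetStatusEvent;

stage.frameRate = 12; // the video keeps playing at its native frame rate

stream.addEventListener(NetStatusEvent.NET_STATUS,
    function(event:NetStatusEvent):void {
        if (event.info.code == "NetStream.Play.Stop") {
            stage.frameRate = 24; // restore the application's usual rate
        }
    });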
Resolution
The pixel resolution is simply the width and height of your video. Never use a video that is larger than the intended display size. Prepare the video at the dimension you need.
High resolution has a greater impact on mobile video playback performance than bit rate. A conservative resolution of 480×360 plays very well; 640×480 is still good. A higher resolution will be challenging on most devices and will result in a poor viewing experience on devices that are not using the GPU for decoding or on devices with a 500 MHz CPU. Resolution recommendations are:
- WiFi or 3G: 480×320
- 2.5G: 320×240
In fact, you can often encode at a smaller size and scale the video up without a noticeable decrease in picture quality. The high pixel density (PPI) of most device screens will still deliver a high-quality picture.
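As a brief sketch, assuming a 480×320 encode and the Video object from the earlier example, scaling up to the stage width looks like this:

// Scale the 480x320 encode to fill the stage width, preserving its 3:2 ratio.
video.width = stage.stageWidth;
video.height = stage.stageWidth * 320 / 480;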
Keep your video dimensions at even divisors of 16. MPEG video encoders work by dividing each frame into blocks of 16 by 16 pixels, called macroblocks. If a dimension does not divide evenly into 16, the encoder must do extra work, and this may impact the overall encoding target. If multiples of 16 are not possible, fall back to multiples of eight rather than four. For example, 480×320 divides evenly into macroblocks, whereas 480×360 only falls on a multiple of eight for its height. This is an important practice for achieving maximum compression efficiency.
As for all mobile content, get rid of superfluous content. If necessary, crop the video to a smaller dimension or edit its content, such as trimming a long introduction.
For more information on mobile encoding guidelines, read Adobe’s white paper at http://download.macromedia.com/flashmediaserver/mobile-encoding-android-v2_7.pdf.
Performance
Hardware is improving quickly, but each device's architecture is a little different. If you want to target only the high end of the market, you can note the device requirements in your application's description when submitting it to the Android Market.
In addition to your encoding settings, there are some best practices to obey for optimal video playback. They are all simple to apply:
- Do not put anything on top of or behind the video, even if it is transparent. This would need to be calculated in the rendering process and would negatively affect video playback.
- Make sure your video window is placed on a full pixel (no half-pixel boundaries).
- Do not use bitmap caching on the video or any other objects on the stage. Do not use filters such as drop shadows or pixel benders. Do not skew or rotate the video. Do not use color transformation or objects with alpha.
- Do not show more than one video at the same time.
- Stop all other processes unless they are absolutely necessary. If you use a progress bar, update the progress with a Timer firing every second rather than on the enterFrame event (see the sketch after this list).
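The sketch below illustrates two of these points, again assuming video and stream are the objects created earlier: the video is snapped to whole pixels, and playback progress is polled by a one-second Timer instead of on every frame.

import flash.events.TimerEvent;
import flash.utils.Timer;

// Snap the video window to full pixels to avoid half-pixel boundaries.
video.x = Math.round(video.x);
video.y = Math.round(video.y);

// Poll playback progress once per second instead of on the enterFrame event.
var progressTimer:Timer = new Timer(1000);
progressTimer.addEventListener(TimerEvent.TIMER,
    function(event:TimerEvent):void {
        trace("seconds played:", stream.time);
    });
progressTimer.start();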