Blograby

Playing Video

You can play videos stored on your device or loaded from a remote server.

Embedded Video

You can embed a video in your application using Flash Professional. Embedded video will appear in the library as a symbol. Create a MovieClip and add the video content to it. You can then control its playback by calling the standard MovieClip navigation methods.
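For instance, if the library symbol containing the video is placed on the stage as a MovieClip with the instance name videoClip (a hypothetical name), the standard timeline methods control playback:

[code]

// the embedded video lives on the MovieClip's timeline,
// so the usual navigation methods control playback
videoClip.gotoAndPlay(1); // start playing from the first frame
videoClip.stop();         // pause on the current frame
videoClip.gotoAndStop(1); // rewind to the beginning

[/code]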

Using this approach is simple, but it has disadvantages. The video is compiled into the application and adds to its size. Also, it is always loaded in memory and cannot be removed.

As an alternative, you can embed the video in an external .swf file which you load using the Loader class.
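A minimal sketch of that approach, assuming the video is embedded in an external file called assets.swf (a hypothetical name):

[code]

import flash.display.Loader;
import flash.net.URLRequest;
import flash.events.Event;

var loader:Loader = new Loader();
loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onLoaded);
loader.load(new URLRequest("assets.swf"));

function onLoaded(event:Event):void {
    // the loaded swf's main timeline contains the video
    addChild(loader.content);
}

// later, the swf can be removed and its memory reclaimed:
// removeChild(loader.content);
// loader.unloadAndStop();

[/code]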

External Video

You can package the video with your application. It is placed in the application directory. The application will not display until all of the assets are loaded. You can also serve the video from a remote web server. The code is identical in both cases.

Progressive Video

To load video locally, you need to know the path of the file in your application directory.

NetConnection creates a connection with the local filesystem when you call its connect method. Pass null to connect to indicate that you are not using a streaming server.

Within the connection, NetStream opens the channel between AIR and the local filesystem. Pass the connection object as a parameter in its construction, and use its play method to receive video data. Note that this object needs its client property defined as well as the onMetaData method to prevent a runtime error.

The Video object displays the video data.

In this example, the Video object dimensions are hardcoded:

[code]

import flash.net.NetConnection;
import flash.net.NetStream;
import flash.media.Video;
import flash.events.NetStatusEvent;

var connection:NetConnection;
var video:Video;

video = new Video();
video.width = 480;
video.height = 320;

connection = new NetConnection();
connection.addEventListener(NetStatusEvent.NET_STATUS, netConnectionEvent);
connection.connect(null);

function netConnectionEvent(event:NetStatusEvent):void {
    event.target.removeEventListener(NetStatusEvent.NET_STATUS,
        netConnectionEvent);
    if (event.info.code == "NetConnection.Connect.Success") {
        var stream:NetStream = new NetStream(connection);
        stream.addEventListener(NetStatusEvent.NET_STATUS, netStreamEvent);
        var client:Object = new Object();
        client.onMetaData = onMetaData;
        stream.client = client;
        // attach the stream to the video to display it
        video.attachNetStream(stream);
        stream.play("someVideo.flv");
        addChild(video);
    }
}

function netStreamEvent(event:NetStatusEvent):void {}

function onMetaData(info:Object):void {}

[/code]

At the time of this writing, video.smoothing is always false. This is consistent with the AIR runtime's default settings, but it does not provide the best video experience; setting video.smoothing to true has no effect.

SD card

You can play videos from the SD card. Playback is nearly as fast as playing back from the device.

You need to resolve the path to where the video is located before playing it. In this example, there is a directory called myVideos on the SD card and a video called myVideo inside it:

[code]

var videosPath:File = File.documentsDirectory.resolvePath("myVideos");
var videoName:String = "myVideo.mp4";
stream.play(videosPath.url + "/" + videoName);

[/code]

Browsing for video

You cannot use CameraRoll to browse for videos, but you can use the filesystem.

You could create a custom video player for the user to play videos installed on the device or on the SD card. The browseForOpen method opens a dialog for browsing the filesystem for videos:

[code]

import flash.filesystem.File;
import flash.net.FileFilter;
import flash.media.Video;
import flash.events.Event;

var video:Video;
var filter:FileFilter = new FileFilter("video", "*.mp4;*.flv;*.mov;*.f4v");
var file:File = new File();
file.addEventListener(Event.SELECT, fileSelected);
file.browseForOpen("open", [filter]);

[/code]

At the time of this writing, it seems that only the FLV format is recognized when browsing the filesystem using AIR.

A list of the video files found appears. The following code is executed when the user selects one of the files. The video file is passed in the Event.SELECT event as event.target and is played using its url property. Note how the video is sized and displayed in the onMetaData function. We will cover this technique next:

[code]

import flash.net.NetConnection;
import flash.net.NetStream;

function fileSelected(event:Event):void {
    video = new Video();
    var connection:NetConnection = new NetConnection();
    connection.connect(null);
    var stream:NetStream = new NetStream(connection);
    var client:Object = new Object();
    client.onMetaData = onMetaData;
    stream.client = client;
    video.attachNetStream(stream);
    stream.play(event.target.url);
}

function onMetaData(info:Object):void {
    video.width = info.width;
    video.height = info.height;
    addChild(video);
}

[/code]

Metadata

The client property of NetStream is used to listen to onMetaData. In this example, we use the video stream width and height, received in the metadata, to scale the Video object. Other useful information is the duration, the frame rate, and the codec:

[code]

// define the stream client to receive callbacks
var client:Object = new Object();
client.onMetaData = onMetaData;
stream.client = client;

// attach the stream to the video
video.attachNetStream(stream);
stream.play("someVideo.flv");

// size the video object based on the metadata information
function onMetaData(info:Object):void {
    video.width = info.width;
    video.height = info.height;
    addChild(video);
    trace(info.duration);
    trace(info.framerate);
    trace(info.codec);
    for (var prop:String in info) {
        trace(prop, info[prop]);
    }
}

[/code]

Cue points

The FLVPlayback component gives us the ability to add cue points to a video. The component listens to the current time code and compares it to a dictionary of cue points. When it finds a match, it dispatches an event with the cue point information.

The cue points come in two forms. Navigation cue points are used as markers for chapters or time-specific commentary. Event cue points are used to trigger events such as calling an ActionScript function. The cue point object looks like this:

[code]

var cuePoint:Object = {time:5, name:"cue1", type:"actionscript",
                       parameters:{prop:value}};

[/code]

This component is not available in AIR for Android. If you want to use something similar, you need to write the functionality yourself. It can be a nice addition to bridge your video to your AIR content if you keep your cue points to a minimum. Use them sparingly, as they have an impact on performance.
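A minimal sketch of such a do-it-yourself approach, assuming a NetStream named stream is already playing: keep the cue points sorted by time, poll stream.time on a timer, and fire a callback when playback passes the next cue point. All names here are hypothetical.

[code]

import flash.events.TimerEvent;
import flash.utils.Timer;

// hypothetical cue point list, sorted by time (in seconds)
var cuePoints:Array = [
    {time:5,  name:"cue1", type:"actionscript"},
    {time:12, name:"cue2", type:"navigation"}
];
var nextCue:int = 0;

// poll the stream's time code four times per second
var cueTimer:Timer = new Timer(250);
cueTimer.addEventListener(TimerEvent.TIMER, checkCuePoints);
cueTimer.start();

function checkCuePoints(event:TimerEvent):void {
    while (nextCue < cuePoints.length
            && stream.time >= cuePoints[nextCue].time) {
        onCuePoint(cuePoints[nextCue]);
        nextCue++;
    }
}

function onCuePoint(cuePoint:Object):void {
    trace("cue point reached:", cuePoint.name);
}

[/code]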

Cue points can be embedded dynamically server-side if you are recording the file on Flash Media Server.

Buffering

The moov atom, the video metadata that holds index information, needs to be placed at the beginning of the file for progressive download. Otherwise, the whole file must be loaded into memory before it can play. This is not an issue for streaming. Look at Renaun Erickson’s wrapper to fix the problem, at http://renaun.com/blog/code/qtindexswapper/.

By default, the application uses an input buffer. To modify the default buffering time, use the following:

[code]var stream:NetStream = new NetStream(connection);
stream.bufferTime = 5; // value in seconds[/code]

When using a streaming server, managing bandwidth fluctuation is a good strategy:

[code]

var stream:NetStream = new NetStream(connection);
stream.addEventListener(NetStatusEvent.NET_STATUS, netStreamEvent);

function netStreamEvent(event:NetStatusEvent):void {
    var buffTime:Number;
    switch (event.info.code) {
        case "NetStream.Buffer.Full":
            buffTime = 15.0;
            break;
        case "NetStream.Buffer.Empty":
            buffTime = 2.0;
            break;
    }
    stream.bufferTime = buffTime;
}

[/code]

Read Fabio Sonnati’s article on using dual-threshold buffering, at http://www.adobe.com/devnet/flashmediaserver/articles/fms_dual_buffering.html.
