
Audio Assets

As with visual assets, there are different methods for using audio assets in your application. We will go over the available options next.

Embedding Files

You can embed sounds in your application by adding them to your Flash library or your Flash Builder project. Embedded files should be small, like the ones used for sound effects or user interface audio feedback.

Your application will not display until all of its assets are loaded, so test it: if the application sits on a black screen for too long, you may want to group the sounds in an external .swf file that you load as a separate process, as shown in the sketch below.
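The following is a minimal sketch of that approach. It assumes the sounds live in a separate sounds.swf whose library exports a class named MySound; both names are placeholders:

import flash.display.Loader;
import flash.events.Event;
import flash.media.Sound;
import flash.net.URLRequest;
import flash.system.ApplicationDomain;

var loader:Loader = new Loader();
loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onSoundsLoaded);
loader.load(new URLRequest("sounds.swf"));

function onSoundsLoaded(event:Event):void {
    // look up the exported sound class inside the loaded swf
    var domain:ApplicationDomain = loader.contentLoaderInfo.applicationDomain;
    var SoundClass:Class = domain.getDefinition("MySound") as Class;
    var sound:Sound = new SoundClass() as Sound;
    sound.play();
}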

Using Flash Professional

Unless you place audio on the timeline, you need to give it a linkage name. Go to Library→Properties→Linkage and click on Export for ActionScript. Change the name so that it doesn’t include an extension, and add a capital letter to conform to class naming conventions. For instance, “mySound.mp3” should be “MySound”. Note that the Base class becomes flash.media.Sound:

var mySound:MySound = new MySound();
mySound.play();

Using Flash Builder

Place your audio file in your project folder. Embed it and assign it to a class so that you can create an instance of it:

import flash.media.Sound;
[Embed(source="mySound.mp3")]
public var Simple:Class;
var mySound:Sound = new Simple() as Sound;
mySound.play();

Using External Files

Using external files is best for long sounds or if you want the flexibility to replace the files without recompiling your application.

import flash.media.Sound;
import flash.net.URLRequest;
var urlRequest:URLRequest = new URLRequest("mySound.mp3");
var sound:Sound = new Sound();
sound.load(urlRequest);
sound.play();

This example works for a small file, which loads quickly. We will cover how to handle larger files in the section “Loading Sounds.”
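As a preview, here is a minimal sketch for a larger file: wait for the load to complete, and listen for errors, before playing. The file name longTrack.mp3 is a placeholder:

import flash.events.Event;
import flash.events.IOErrorEvent;
import flash.media.Sound;
import flash.net.URLRequest;

var sound:Sound = new Sound();
sound.addEventListener(Event.COMPLETE, onLoaded);
sound.addEventListener(IOErrorEvent.IO_ERROR, onError);
sound.load(new URLRequest("longTrack.mp3"));

function onLoaded(event:Event):void {
    sound.play();
}

function onError(event:IOErrorEvent):void {
    trace("could not load sound:", event.text);
}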

Settings and the Audio Codec

The Flash Authoring tool offers the option to modify audio files directly in the library. You can change compression, convert from stereo to mono, and choose a sample rate without requiring an external audio tool. Settings are defined globally in the Publish Settings panel and can be overridden for individual files in the library.

If you own Soundbooth, or another external audio application, you can launch it for an individual sound from within the development tools and make changes, which will be applied to the sound in your project. You can, for instance, change the track from stereo to mono or adjust the volume.

In Flash Professional, select the track in the library, click the top pull-down menu, and select “Edit with” to launch the audio editing application. In Flash Builder, single-click the asset, right-click, and select “Open with” to launch the sound application.

The most professional approach, of course, is to work in an audio application directly, as you have more control over your sound design: all files can be opened together and you can apply uniform settings such as volume. Prepare your audio carefully beforehand to remove any unnecessary bytes. For background music, use a short file that loops rather than a long track.
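A short loop can be repeated indefinitely by passing a large loop count to play(). This sketch assumes an embedded sound class named MusicLoop, which is a placeholder:

import flash.media.Sound;
import flash.media.SoundChannel;

var music:Sound = new MusicLoop();                 // MusicLoop: embedded class (assumed)
var channel:SoundChannel = music.play(0, int.MAX_VALUE);
// later, to stop the background music:
// channel.stop();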

Compression

The supported formats are MP3 (MPEG-1 Audio Layer 3) and AAC (Advanced Audio Coding) for compressed audio, and WAV (Waveform Audio File Format) and AIFF (Audio Interchange File Format) for uncompressed audio.

MP3 can be imported dynamically using the Sound object. MP3 adds a small but problematic silence at the beginning of the track: the format encodes incoming audio data in blocks, and if the data does not fill a complete block, the encoder adds padding at the beginning and the end of the track. Read André Michelle’s blog on the issue, and a potential solution, at http://blog.andre-michelle.com/2010/playback-mp3-loop-gapless/.
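The workaround sketched below follows the approach described in that post: extract the decoded samples once, then feed them back through a dynamic Sound so that the encoder padding is never heard. The loop length and padding offset used here are placeholders you would need to determine for your own track:

import flash.events.SampleDataEvent;
import flash.media.Sound;
import flash.utils.ByteArray;

var mp3:Sound;                          // the already-loaded MP3 loop
var samples:ByteArray = new ByteArray();
var totalFrames:uint = 44100 * 2;       // placeholder: loop length in sample frames
var padding:Number = 2257;              // placeholder: encoder padding offset in frames
var position:uint = 0;

function startGaplessLoop():void {
    mp3.extract(samples, totalFrames, padding);  // copy the clean loop once
    var output:Sound = new Sound();
    output.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
    output.play();
}

function onSampleData(event:SampleDataEvent):void {
    for (var i:int = 0; i < 8192; i++) {             // feed 8,192 frames per request
        if (position >= totalFrames) position = 0;   // wrap around the loop
        samples.position = position * 8;             // 8 bytes per frame (two floats)
        event.data.writeFloat(samples.readFloat());  // left channel
        event.data.writeFloat(samples.readFloat());  // right channel
        position++;
    }
}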

AAC audio can also be loaded dynamically using the NetStream class. AAC is considered the successor of the MP3 format. It is particularly interesting to us because it is hardware-decoded in AIR for Android:

import flash.net.NetConnection;
import flash.net.NetStream;
var connection:NetConnection = new NetConnection();
connection.connect(null);
var stream:NetStream = new NetStream(connection);
var client:Object = new Object();
client.onMetaData = onMetaData;
stream.client = client;
stream.play("someAudio.m4a");
// the client object needs an onMetaData handler; it receives the duration and other metadata
function onMetaData(info:Object):void {
    trace(info.duration);
}

To control or manipulate an AAC file, refer to the section “Playing Sounds.” Here is some sample code:

import flash.media.SoundTransform;
var mySound:SoundTransform;
stream.play("someAudio.m4a");
mySound = stream.soundTransform;
// change volume
mySound.volume = 0.75;
stream.soundTransform = mySound;

You can embed WAV or AIFF files in your project or library. Or you can use one of the third-party tools mentioned earlier.

Supported uncompressed settings are Adaptive Differential Pulse Code Modulation (ADPCM) and Raw, which uses no compression at all. Uncompressed formats must be embedded.

Bit rate

The bit rate represents the amount of data encoded for one second of a sound file. The higher the bit rate, the better the audio fidelity, but the bigger the file size. For mobile applications, consider reducing the bit rate that you would normally choose for desktop applications.

Bit rate is represented in kilobits per second (kbps), and ranges from 8 to 160 kbps. The default audio publish setting in Flash Professional is 16 kbps Mono.
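For example, a 30-second loop encoded at 48 kbps takes roughly 48 × 30 / 8 = 180 KB, while the same loop encoded at 128 kbps takes about 480 KB.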

Sampling rate

The sampling rate is the number of samples taken from an analog audio signal to make a digital signal; 44.1 kHz represents 44,100 samples per second. The most common rates are 11.025, 22.05, and 44.1 kHz; 44.1 kHz/16-bit is referred to as CD quality and is the sampling rate Flash Player always assumes is used.
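For example, publishing at 22.05 kHz instead of 44.1 kHz halves the amount of sample data, at the cost of losing the higher frequencies.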

Stereo or mono

The external speaker on Android devices is monophonic. The headphones are usually stereo, although the output may not be true stereo.
