Playing Audio on an iPad using AVAudioPlayer (Xcode 4)





The iOS 4 SDK provides a number of mechanisms for implementing audio playback from within iPad applications. The easiest technique from the perspective of the application developer is to use the AVAudioPlayer class which is part of the AV Foundation Framework.

The goal of this chapter is to provide an overview of audio playback using the AVAudioPlayer class. Once the basics have been covered, a tutorial is worked through step by step. The topic of recording audio from within an iPad application is covered in the next chapter entitled Recording Audio on an iPad with AVAudioRecorder.

Supported Audio Formats

The AV Foundation framework supports the playback of a variety of different audio formats and codecs including both software and hardware based decoding. Codecs and formats currently supported are as follows:

  • AAC (MPEG-4 Advanced Audio Coding)
  • ALAC (Apple Lossless)
  • AMR (Adaptive Multi-rate)
  • HE-AAC (MPEG-4 High Efficiency AAC)
  • iLBC (internet Low Bit Rate Codec)
  • Linear PCM (uncompressed, linear pulse code modulation)
  • MP3 (MPEG-1 audio layer 3)
  • µ-law and a-law

If an audio file is to be included as part of the resource bundle for an application it may be converted to a supported audio format prior to inclusion in the application project using the Mac OS X afconvert command-line tool. For details on how to use this tool, run the following command in a Terminal window:

afconvert -h
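For example, a WAV file might be converted to a CAF file containing 16-bit little-endian linear PCM using a command along the following lines (the file names here are purely illustrative):

afconvert -f caff -d LEI16 sourcefile.wav destination.caf

The -f flag specifies the destination file format and the -d flag the destination data format; running afconvert -h lists the full set of supported formats and options.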

Receiving Playback Notifications

An application receives notifications from an AVAudioPlayer instance by declaring itself as the object’s delegate and implementing some or all of the following AVAudioPlayerDelegate protocol methods:

  • audioPlayerDidFinishPlaying:successfully: – Called when audio playback finishes. A Boolean argument passed through to the method indicates whether the playback completed successfully or failed due to an error.
  • audioPlayerDecodeErrorDidOccur:error: – Called when the AVAudioPlayer object encounters a decoding error during audio playback. An error object containing information about the nature of the problem is passed through to this method as an argument.
  • audioPlayerBeginInterruption: – Called when audio playback has been interrupted by a system event such as an incoming phone call. Playback is automatically paused and the current audio session deactivated.
  • audioPlayerEndInterruption: – Called after an interruption ends. The current audio session is automatically activated and playback may be resumed by calling the play method of the corresponding AVAudioPlayer instance.

Controlling and Monitoring Playback

Once an AVAudioPlayer instance has been created, the playback of audio may be controlled and monitored programmatically via the methods and properties of the instance. For example, the self-explanatory play, pause and stop methods may be used to control playback. Similarly, the volume property may be used to adjust the volume level of the audio playback, whilst the playing property may be accessed to identify whether or not the AVAudioPlayer object is currently playing audio.

In addition, playback may be scheduled to begin at a later time using the playAtTime: instance method, which takes as an argument the time (as an NSTimeInterval value, measured in seconds relative to the player’s deviceCurrentTime property) at which playback should begin.

The length of the current audio playback may be obtained via the duration property, whilst the current point in the playback is stored in the currentTime property.

Playback may also be programmed to loop back and repeatedly play a specified number of times using the numberOfLoops property.
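By way of illustration, the above methods and properties might be combined as in the following short sketch. This fragment is not part of the tutorial project and assumes an already initialized AVAudioPlayer instance named player:

// Assumes 'player' is an initialized AVAudioPlayer instance.
player.volume = 0.5;        // play back at half volume
player.numberOfLoops = 2;   // play the audio three times in total
[player play];

// Alternatively, schedule playback to begin five seconds from now:
// [player playAtTime:player.deviceCurrentTime + 5.0];

if (player.playing) {
    NSLog(@"At %.1f of %.1f seconds",
          player.currentTime, player.duration);
}

[player pause];   // pause, retaining the current playback position
[player stop];    // stop playback entirely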

Creating the iPad Audio Example Application

The remainder of this chapter will work through the creation of a simple iPad iOS application that plays an audio file. The user interface of the application will consist of play and stop buttons to control playback and a slider to adjust the playback volume level. In addition, the total duration of the audio file will be displayed and the current playback time updated in real-time. Begin the project by launching Xcode and creating a new iPad iOS view-based application named audio.

Adding the AVFoundation Framework

Since the iOS 4 AVAudioPlayer class is part of the AV Foundation framework it will be necessary to add the framework to the project. This can be achieved by selecting the product target entry from the project navigator panel (the top item named audio) and clicking on the Build Phases tab in the main panel. In the Link Binary with Libraries section click on the ‘+’ button, select the AVFoundation.framework entry from the resulting panel and click on the Add button.

Adding an Audio File to the Project Resources

In order to experience audio playback it will be necessary to add an audio file to the project resources. For this purpose, any supported audio format file will be suitable. Having identified a suitable audio file, drag and drop it into the Supporting Files category of the project navigator panel of the main Xcode window. For the purposes of this tutorial we will be using an MP3 file named Kalimba.mp3.


Creating Actions and Outlets

The application is going to need action methods for the play and stop buttons in addition to the volume control. Since we will need to be able to read the current value of the volume slider control it will also be necessary to declare a corresponding outlet. In order to display the current playback time a label object will be added to the user interface for which an outlet will be required. We also need to declare a reference to an AVAudioPlayer audioPlayer object and specify that the view controller class implements the AVAudioPlayerDelegate protocol. Finally, an NSTimer object will be needed to update the playback time label at regular intervals.

With the above requirements in mind, select the audioViewController.h file and modify it to import the <AVFoundation/AVFoundation.h> file and declare these references and actions:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface audioViewController : UIViewController
    <AVAudioPlayerDelegate>
{
    AVAudioPlayer *audioPlayer;
    UISlider *volumeControl;
    UILabel *timerLabel;
    NSTimer *playbackTimer;
}
@property (nonatomic, retain) IBOutlet UISlider *volumeControl;
@property (nonatomic, retain) IBOutlet UILabel *timerLabel;
@property (nonatomic, retain) NSTimer *playbackTimer;
-(IBAction) playAudio;
-(IBAction) stopAudio;
-(IBAction) adjustVolume;
@end

Implementing the Action Methods

The next step in our iPad audio player tutorial is to implement the action methods for the two buttons and the slider. Select the audioViewController.m file and add these methods as outlined in the following code fragment (note also the addition of the synthesize directive for the volume control, timer label and playback timer):

#import "audioViewController.h"

@implementation audioViewController
@synthesize volumeControl, timerLabel, playbackTimer;

-(IBAction)playAudio
{
    playbackTimer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                target:self
                                selector:@selector(updateTime)
                                userInfo:nil
                                repeats:YES];
    [audioPlayer play];
}
-(IBAction)stopAudio
{
    [playbackTimer invalidate];
    [audioPlayer stop];
}
-(IBAction)adjustVolume
{
    if (audioPlayer != nil)
    {
        audioPlayer.volume = volumeControl.value;
    }
}

The above playAudio and stopAudio action methods simply call the appropriate methods of the audioPlayer object to start and stop playback. In the case of the playAudio method, an NSTimer object is configured to repeatedly call a method named updateTime every second. It will be the responsibility of this method to update the timerLabel object with the elapsed playback time. The timer is then invalidated when the stopAudio method is triggered.

Updating the Playback Time

The NSTimer object initialized in the playAudio method will cause the updateTime method to be called once every second until playback is stopped. It is the job of this method to identify both the total duration of the audio file and the current playback time and update the timerLabel accordingly. The code for this method, to be placed in the audioViewController.m file, reads as follows:

-(void)updateTime
{
    float minutes = floor(audioPlayer.currentTime/60);
    float seconds = audioPlayer.currentTime - (minutes * 60);

    float duration_minutes = floor(audioPlayer.duration/60);
    float duration_seconds = 
       audioPlayer.duration - (duration_minutes * 60);

    NSString *timeInfoString = [[NSString alloc] 
       initWithFormat:@"%0.0f.%0.0f / %0.0f.%0.0f",
       minutes, seconds, 
       duration_minutes, duration_seconds];

    timerLabel.text = timeInfoString;
    [timeInfoString release];
}

The code in this method converts both the duration and currentTime properties of the audioPlayer object to minutes and seconds, formats a string using these values and displays it on the timerLabel object.

Creating and Initializing the AVAudioPlayer Object

Now that we have an audio file to play and appropriate action methods written the next step is to create an AVAudioPlayer instance and initialize it with a reference to the audio file. Since we only need to initialize the object once when the application launches a good place to write this code is in the viewDidLoad method of the audioViewController.m file. Remove the comment markers (/* and */) from around this method and modify it as follows:

- (void)viewDidLoad {
    [super viewDidLoad];
    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
            pathForResource:@"Kalimba"
            ofType:@"mp3"]];

    NSError *error = nil;
    audioPlayer = [[AVAudioPlayer alloc]
                     initWithContentsOfURL:url
                     error:&error];
    if (audioPlayer == nil)
    {
        NSLog(@"Error in audioPlayer: %@",
              [error localizedDescription]);
    } else {
        audioPlayer.delegate = self;
        [audioPlayer prepareToPlay];
    }
}

In the above code we create an NSURL reference using the filename and type of the audio file added to the project resources. Keep in mind that this will need to be modified to reflect the audio file used in your own project.

Next, an AVAudioPlayer instance is created using the URL of the audio file. Assuming no errors were detected, the current class is designated as the delegate for the audio player object. Finally a call is made to the audioPlayer object’s prepareToPlay method. This performs initial buffering tasks so that there is no buffering delay when the play button is subsequently selected by the user.

Implementing the AVAudioPlayerDelegate Protocol Methods

As previously discussed, by declaring our view controller as the delegate for our AVAudioPlayer instance, our application will be able to receive notifications relating to playback. Templates of these methods, which may be placed in the audioViewController.m file, are as follows:

-(void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player
                      successfully:(BOOL)flag
{
}
-(void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player
                                error:(NSError *)error
{
}
-(void)audioPlayerBeginInterruption:(AVAudioPlayer *)player
{
}
-(void)audioPlayerEndInterruption:(AVAudioPlayer *)player
{
}

For the purposes of this tutorial it is not necessary to implement any code for these methods and they are provided solely for completeness.

Designing the User Interface

Select the audioViewController.xib file and display the Object library (View -> Utilities -> Object Library). Drag and drop components from the library onto the View window and modify their properties so that the interface appears as illustrated in the following figure:


The user interface design of an example AVAudioPlayer based iPad application


Ctrl-click on the File’s Owner object, drag the blue line to the slider control in the View window and select the volumeControl outlet from the resulting menu to connect the control to the outlet. Repeat this task to connect the timerLabel outlet to the label object.

Select the Play button in the View window and display the connections inspector (View -> Utilities -> Connections Inspector). Click on the small circle to the right of the Touch Up Inside event and drag the line to the File’s Owner object. Release the pointer and select the playAudio action from the resulting menu. Repeat these steps to connect the Touch Up Inside event of the Stop button to the stopAudio action and the Value Changed event of the slider control to the adjustVolume method.

Releasing Memory

The last step before trying out the application is to release any memory allocated during the application lifecycle:

- (void)viewDidUnload {
    // Release any retained subviews of the main view.
    self.volumeControl = nil;
    self.timerLabel = nil;
}
- (void)dealloc {
    [audioPlayer release];
    [volumeControl release];
    [timerLabel release];
    [super dealloc];
}

Building and Running the Application

Once all the requisite changes have been made and saved, test the application in the iOS iPad Simulator by clicking on the Run button located in the Xcode toolbar. Once the application appears, click on the Play button to begin playback. Watch the playback timer update, adjust the volume using the slider and stop playback using the Stop button:

An AVAudioPlayer example app running on an iPad


Now that the basics of audio playback have been covered the next chapter will look at Recording Audio on an iPad with AVAudioRecorder.




