Recording Audio on an iPad with AVAudioRecorder
In addition to audio playback, the iOS AV Foundation framework provides the ability to record sound on the iPad using the AVAudioRecorder class. This chapter will work step-by-step through a tutorial demonstrating the use of the AVAudioRecorder class to record audio.
An Overview of the iPad AVAudioRecorder Tutorial
The goal of this chapter is to create an iOS 5 iPad application that will record and play back audio. It will do so by creating an instance of the AVAudioRecorder class and configuring it with a file to contain the audio and a range of settings dictating the quality and format of the recording. Playback of the recorded audio file will be performed using the AVAudioPlayer class, which was covered in detail in the chapter entitled Playing Audio on an iPad using AVAudioPlayer.
Audio recording and playback will be controlled by buttons in the user interface that are connected to action methods which, in turn, will make appropriate calls to the instance methods of the AVAudioRecorder and AVAudioPlayer objects respectively. The view controller of the example application will also implement the AVAudioRecorderDelegate and AVAudioPlayerDelegate protocols and a number of corresponding delegate methods in order to receive notification of events relating to playback and recording.
Creating the Recorder Project
Begin by launching Xcode and creating a new iPad iOS single view-based application named record, with a corresponding class prefix.
Since the iOS 5 AVAudioRecorder class is part of the AV Foundation framework it will be necessary to add the framework to the project. This can be achieved by selecting the product target entry from the project navigator panel (the top item named record) and clicking on the Build Phases tab in the main panel. In the Link Binary with Libraries section click on the ‘+’ button, select the AVFoundation.framework entry from the resulting panel and click on the Add button.
Declarations, Actions and Outlets
The completed application will need instances of the AVAudioPlayer and AVAudioRecorder classes, a range of outlets for the user interface buttons and play, stop and record action methods. It will also be necessary to import the <AVFoundation/AVFoundation.h> file and declare that the view controller is going to implement the AVAudioRecorderDelegate and AVAudioPlayerDelegate protocols. Bringing all of these requirements together results in a modified recordViewController.h file that reads as follows:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface recordViewController : UIViewController
    <AVAudioRecorderDelegate, AVAudioPlayerDelegate>

@property (strong, nonatomic) AVAudioRecorder *audioRecorder;
@property (strong, nonatomic) AVAudioPlayer *audioPlayer;
@property (strong, nonatomic) IBOutlet UIButton *playButton;
@property (strong, nonatomic) IBOutlet UIButton *recordButton;
@property (strong, nonatomic) IBOutlet UIButton *stopButton;

-(IBAction) recordAudio;
-(IBAction) playAudio;
-(IBAction) stop;
@end
Creating the AVAudioRecorder Instance
When the application is first launched, an instance of the AVAudioRecorder class needs to be created. This will be initialized with the URL of a file into which the recorded audio is to be saved. Also passed as an argument to the initialization method is an NSDictionary object indicating the settings for the recording such as bit rate, sample rate and audio quality. A full description of the settings available may be found in the appropriate Apple iOS reference materials.
As is often the case, a good location to initialize the AVAudioRecorder instance is the viewDidLoad method of the view controller located in the recordViewController.m file. Select the file in the project navigator, locate this method and modify it so that it reads as follows:
- (void)viewDidLoad
{
    [super viewDidLoad];
    self.playButton.enabled = NO;
    self.stopButton.enabled = NO;

    NSArray *dirPaths;
    NSString *docsDir;

    dirPaths = NSSearchPathForDirectoriesInDomains(
        NSDocumentDirectory, NSUserDomainMask, YES);
    docsDir = [dirPaths objectAtIndex:0];
    NSString *soundFilePath = [docsDir
        stringByAppendingPathComponent:@"sound.caf"];

    NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];

    NSDictionary *recordSettings = [NSDictionary
        dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:AVAudioQualityMin],
        AVEncoderAudioQualityKey,
        [NSNumber numberWithInt:128000], // encoder bit rate in bits per second
        AVEncoderBitRateKey,
        [NSNumber numberWithInt:2],
        AVNumberOfChannelsKey,
        [NSNumber numberWithFloat:44100.0],
        AVSampleRateKey,
        nil];

    NSError *error = nil;

    audioRecorder = [[AVAudioRecorder alloc]
        initWithURL:soundFileURL
        settings:recordSettings
        error:&error];

    if (error)
    {
        NSLog(@"Error: %@", [error localizedDescription]);
    }
    else
    {
        // Assign the delegate so that the recorder delegate methods
        // implemented later in this chapter are called.
        audioRecorder.delegate = self;
        [audioRecorder prepareToRecord];
    }
}
Since no audio has yet been recorded, the above method disables the play and stop buttons. It then identifies the application’s documents directory and constructs a URL to a file in that location named sound.caf. An NSDictionary object is then created containing the recording quality settings, before an instance of the AVAudioRecorder class is created. Assuming no errors are encountered, the audioRecorder instance is prepared to begin recording when requested to do so by the user.
Implementing the Action Methods
Once the user interface has been designed later in this chapter the various buttons will be connected to the action methods declared previously in the recordViewController.h file. The next step, therefore, is to implement these action methods. Now is also a good opportunity to add the synthesize directive for the button outlets. Select the recordViewController.m file and modify it as outlined in the following code excerpt:
#import "recordViewController.h"

@interface recordViewController ()
@end

@implementation recordViewController
@synthesize playButton, stopButton, recordButton,
            audioRecorder, audioPlayer;

-(void) recordAudio
{
    if (!audioRecorder.recording)
    {
        self.playButton.enabled = NO;
        self.stopButton.enabled = YES;
        [audioRecorder record];
    }
}

-(void)stop
{
    self.stopButton.enabled = NO;
    self.playButton.enabled = YES;
    self.recordButton.enabled = YES;

    if (audioRecorder.recording)
    {
        [audioRecorder stop];
    }
    else if (audioPlayer.playing)
    {
        [audioPlayer stop];
    }
}

-(void) playAudio
{
    if (!audioRecorder.recording)
    {
        self.stopButton.enabled = YES;
        self.recordButton.enabled = NO;

        NSError *error;

        audioPlayer = [[AVAudioPlayer alloc]
            initWithContentsOfURL:audioRecorder.url
            error:&error];

        audioPlayer.delegate = self;

        if (error)
            NSLog(@"Error: %@", [error localizedDescription]);
        else
            [audioPlayer play];
    }
}
.
.
.
@end
Each of the above methods performs the steps necessary to enable and disable the appropriate buttons in the user interface and to interact with the AVAudioRecorder and AVAudioPlayer object instances to record or play back audio.
Implementing the Delegate Methods
In order to receive notification of the success or failure of recording and playback, it is necessary to implement some delegate methods. For the purposes of this tutorial we will need to implement the methods that indicate errors have occurred and that playback has finished. Once again, edit the recordViewController.m file and add these methods as follows:
-(void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player
        successfully:(BOOL)flag
{
    self.recordButton.enabled = YES;
    self.stopButton.enabled = NO;
}

-(void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player
        error:(NSError *)error
{
    NSLog(@"Decode Error occurred");
}

-(void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder
        successfully:(BOOL)flag
{
}

-(void)audioRecorderEncodeErrorDidOccur:(AVAudioRecorder *)recorder
        error:(NSError *)error
{
    NSLog(@"Encode Error occurred");
}
Designing the User Interface
Select the recordViewController.xib file to load it into the Interface Builder tool. Once loaded, drag UIButton objects from the Library window (View -> Utilities -> Object Library) and position them on the View window. Once placed in the view, modify the text on each button so that the user interface appears as illustrated in Figure 60-1:
Figure 60-1
To connect the Record button to the recordButton outlet, Ctrl-click on the File’s Owner object and drag the resulting blue line to the Record button in the view window. Release the pointer and select the recordButton outlet from the menu. Repeat these steps to connect the playButton and stopButton outlets to the corresponding buttons.
Select the Record button on the View window and display the connections inspector (View -> Utilities -> Show Connections Inspector). Click in the small circle to the right of the Touch Up Inside event, drag the line to the File’s Owner and select the recordAudio action from the menu. Repeat these steps to connect the Stop and Play buttons to the stop and playAudio actions.
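Should you prefer to wire up the buttons in code rather than in Interface Builder, the same connections can be made with the UIControl addTarget:action:forControlEvents: method. The following is a sketch of the equivalent wiring, not part of the original tutorial; it assumes the three outlets are already connected and would be placed in the viewDidLoad method:

```objc
// Hypothetical alternative to the Interface Builder connections above:
// attach each action method to its button's Touch Up Inside event in code.
[self.recordButton addTarget:self
                      action:@selector(recordAudio)
            forControlEvents:UIControlEventTouchUpInside];
[self.playButton addTarget:self
                    action:@selector(playAudio)
          forControlEvents:UIControlEventTouchUpInside];
[self.stopButton addTarget:self
                    action:@selector(stop)
          forControlEvents:UIControlEventTouchUpInside];
```

Either approach results in the same runtime behavior; the Interface Builder connections described above are simply the more common convention for xib-based projects.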
Updating the viewDidUnload Method
Locate and update the viewDidUnload method as follows:
- (void)viewDidUnload
{
    // Release any retained subviews of the main view.
    // e.g. self.myOutlet = nil;
    self.audioPlayer = nil;
    self.audioRecorder = nil;
    self.stopButton = nil;
    self.recordButton = nil;
    self.playButton = nil;
}
Testing the Application
Follow the steps outlined in Testing iOS 5 Apps on the iPad – Developer Certificates and Provisioning Profiles to configure the application for installation on an iPad device. Configure Xcode to install the application on the connected iPad device and build and run the application by clicking on the Run button in the main toolbar. Once loaded onto the device, touch the Record button and record some sound. Touch the Stop button when the recording is completed and use the Play button to play back the audio. In the event that audio playback is not audible, make sure that the mute switch on the side of the device is not set.