<table border="0" cellspacing="0" width="100%">
<tr>
<td width="20%">[[Playing Audio on iOS 8 using AVAudioPlayer|Previous]]<td align="center">[[iOS 8 App Development Essentials|Table of Contents]]<td width="20%" align="right">[[Integrating Twitter and Facebook into iOS 8 Applications using Swift|Next]]</td>
<tr>
<td width="20%">Playing Audio on iOS 8 using AVAudioPlayer<td align="center"><td width="20%" align="right">Integrating Twitter and Facebook into iOS 8 Applications using Swift</td>
</table>
<hr>
<google>BUY_IOS8</google>
In addition to audio playback, the iOS AV Foundation Framework provides the ability to record sound on iOS using the AVAudioRecorder class. This chapter will work step-by-step through a tutorial demonstrating the use of the AVAudioRecorder class to record audio.
== An Overview of the AVAudioRecorder Tutorial ==
The goal of this chapter is to create an iOS 8 application that will record and play back audio. It will do so by creating an instance of the AVAudioRecorder class and configuring it with a file to contain the audio and a range of settings dictating the quality and format of the audio. Playback of the recorded audio file will be performed using the AVAudioPlayer class, which was covered in detail in the chapter entitled [[Playing Audio on iOS 8 using AVAudioPlayer]].
Audio recording and playback will be controlled by buttons in the user interface that are connected to action methods which, in turn, will make appropriate calls to the instance methods of the AVAudioRecorder and AVAudioPlayer objects respectively.
The view controller of the example application will also implement the AVAudioRecorderDelegate and AVAudioPlayerDelegate protocols and a number of corresponding delegate methods in order to receive notification of events relating to playback and recording.
== Creating the Recorder Project ==
Begin by launching Xcode and creating a new Universal single view-based application named Record using the Swift programming language.
== Designing the User Interface ==
Select the Main.storyboard file and, once loaded, drag Button objects from the Object Library window (View -> Utilities -> Show Object Library) and position them on the View window. Once placed in the view, modify the text on each button so that the user interface appears as illustrated in Figure 89-1:
[[Image:ios_8_recording_demo_ui.png]]
Figure 89-1
With the scene view selected within the storyboard canvas, display the Auto Layout Resolve Auto Layout Issues menu and select the Reset to Suggested Constraints menu option listed in the All Views in View Controller section of the menu.
Select the “Record” button object in the view canvas, display the Assistant Editor panel and verify that the editor is displaying the contents of the ViewController.swift file. Ctrl-click on the Record button object and drag to a position just below the class declaration line in the Assistant Editor. Release the line and, in the resulting connection dialog, establish an outlet connection named recordButton. Repeat these steps to establish outlet connections for the “Play” and “Stop” buttons named playButton and stopButton respectively.
Continuing to use the Assistant Editor, establish Action connections from the three buttons to methods named recordAudio, playAudio and stopAudio.
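Once these connections have been made, and assuming the outlet and action names given above, the declarations generated in the ViewController.swift file should appear along the following lines (shown here for reference only; the exact generated form may vary slightly):
<pre>
@IBOutlet weak var recordButton: UIButton!
@IBOutlet weak var playButton: UIButton!
@IBOutlet weak var stopButton: UIButton!

@IBAction func recordAudio(sender: AnyObject) {
}

@IBAction func playAudio(sender: AnyObject) {
}

@IBAction func stopAudio(sender: AnyObject) {
}
</pre>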
Close the Assistant Editor panel, select the ViewController.swift file and modify it to import the AVFoundation framework, declare adherence to the AVAudioPlayerDelegate and AVAudioRecorderDelegate protocols and add properties to store references to AVAudioRecorder and AVAudioPlayer instances:
<pre>
import UIKit
import AVFoundation
class ViewController: UIViewController, AVAudioPlayerDelegate, AVAudioRecorderDelegate {
    var audioPlayer: AVAudioPlayer?
    var audioRecorder: AVAudioRecorder?
.
.
</pre>
<google>BUY_IOS8</google>
== Creating the AVAudioRecorder Instance ==
When the application is first launched, an instance of the AVAudioRecorder class needs to be created. This will be initialized with the URL of a file into which the recorded audio is to be saved. Also passed as an argument to the initialization method is an NSDictionary object indicating the settings for the recording such as bit rate, sample rate and audio quality. A full description of the settings available may be found in the appropriate Apple iOS reference materials.
As is often the case, a good location to initialize the AVAudioRecorder instance is the viewDidLoad method of the view controller located in the ViewController.swift file. Select the file in the project navigator, locate this method and modify it so that it reads as follows:
<pre>
override func viewDidLoad() {
    super.viewDidLoad()

    playButton.enabled = false
    stopButton.enabled = false

    let dirPaths =
        NSSearchPathForDirectoriesInDomains(.DocumentDirectory,
            .UserDomainMask, true)

    let docsDir = dirPaths[0] as String

    let soundFilePath =
        docsDir.stringByAppendingPathComponent("sound.caf")

    let soundFileURL = NSURL(fileURLWithPath: soundFilePath)

    let recordSettings =
        [AVEncoderAudioQualityKey: AVAudioQuality.Min.rawValue,
         AVEncoderBitRateKey: 16,
         AVNumberOfChannelsKey: 2,
         AVSampleRateKey: 44100.0]

    var error: NSError?

    let audioSession = AVAudioSession.sharedInstance()
    audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord,
        error: &error)

    if let err = error {
        println("audioSession error: \(err.localizedDescription)")
    }

    audioRecorder = AVAudioRecorder(URL: soundFileURL,
        settings: recordSettings, error: &error)

    if let err = error {
        println("audioRecorder error: \(err.localizedDescription)")
    } else {
        audioRecorder?.prepareToRecord()
    }
}
</pre>
Since no audio has yet been recorded, the above method disables the play and stop buttons. It then identifies the application’s documents directory and constructs a URL to a file in that location named sound.caf. An NSDictionary object is then created containing the recording quality settings before an audio session and an instance of the AVAudioRecorder class are created. Assuming no errors are encountered, the audioRecorder instance is prepared to begin recording when requested to do so by the user.
== Implementing the Action Methods ==
The next step is to implement the action methods connected to the three button objects. Select the ViewController.swift file and modify it as outlined in the following code excerpt:
<pre>
@IBAction func recordAudio(sender: AnyObject) {
    if audioRecorder?.recording == false {
        playButton.enabled = false
        stopButton.enabled = true
        audioRecorder?.record()
    }
}

@IBAction func stopAudio(sender: AnyObject) {
    stopButton.enabled = false
    playButton.enabled = true
    recordButton.enabled = true

    if audioRecorder?.recording == true {
        audioRecorder?.stop()
    } else {
        audioPlayer?.stop()
    }
}

@IBAction func playAudio(sender: AnyObject) {
    if audioRecorder?.recording == false {
        stopButton.enabled = true
        recordButton.enabled = false

        var error: NSError?

        audioPlayer = AVAudioPlayer(contentsOfURL: audioRecorder?.url,
            error: &error)

        audioPlayer?.delegate = self

        if let err = error {
            println("audioPlayer error: \(err.localizedDescription)")
        } else {
            audioPlayer?.play()
        }
    }
}
</pre>
Each of the above methods performs the steps necessary to enable and disable appropriate buttons in the user interface and to interact with the AVAudioRecorder and AVAudioPlayer object instances to record or play back audio.
== Implementing the Delegate Methods ==
In order to receive notifications about the success or failure of recording or playback it is necessary to implement some delegate methods. For the purposes of this tutorial we will need to implement the methods that report errors and the method that indicates when playback has finished. Once again, edit the ViewController.swift file and add these methods as follows:
<pre>
func audioPlayerDidFinishPlaying(player: AVAudioPlayer!, successfully flag: Bool) {
    recordButton.enabled = true
    stopButton.enabled = false
}

func audioPlayerDecodeErrorDidOccur(player: AVAudioPlayer!, error: NSError!) {
    println("Audio Play Decode Error")
}

func audioRecorderDidFinishRecording(recorder: AVAudioRecorder!, successfully flag: Bool) {
}

func audioRecorderEncodeErrorDidOccur(recorder: AVAudioRecorder!, error: NSError!) {
    println("Audio Record Encode Error")
}
</pre>
== Testing the Application ==
Follow the steps outlined in [[Testing Apps on iOS 8 Devices with Xcode 6]] to configure the application for installation on an iOS device. Configure Xcode to install the application on the connected device and build and run the application by clicking on the run button in the main toolbar. Once the application has loaded onto the device, the operating system will seek permission for the app to record audio. Select “OK” and touch the Record button to record some sound. Touch the Stop button when the recording is complete and use the Play button to play back the audio.
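The recording permission prompt is presented by the operating system the first time recording is attempted. If you prefer to trigger the request explicitly (for example, when the view loads), one possible approach is to call the requestRecordPermission method of the audio session. The following is a minimal sketch only and is not part of the original tutorial code:
<pre>
// Hypothetical addition: explicitly request microphone access
// before any recording is attempted (Swift 1.x style syntax).
let session = AVAudioSession.sharedInstance()
session.requestRecordPermission({(granted: Bool) -> Void in
    if granted {
        println("Recording permission granted")
    } else {
        // Recording will not be possible; the Record button
        // could, for example, be disabled at this point.
        println("Recording permission denied")
    }
})
</pre>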
<google>BUY_IOS8</google>
<hr>
<table border="0" cellspacing="0" width="100%">
<tr>
<td width="20%">[[Playing Audio on iOS 8 using AVAudioPlayer|Previous]]<td align="center">[[iOS 8 App Development Essentials|Table of Contents]]<td width="20%" align="right">[[Integrating Twitter and Facebook into iOS 8 Applications using Swift|Next]]</td>
<tr>
<td width="20%">Playing Audio on iOS 8 using AVAudioPlayer<td align="center"><td width="20%" align="right">Integrating Twitter and Facebook into iOS 8 Applications using Swift</td>
</table>