Monday 15 September 2014

More AVAudioPlayerNode with Swift and CompletionHandlers

Continuing from my previous posts on the new AVFoundation audio classes in OS X 10.10/iOS 8 with Swift, I finally found the error, and a relatively obvious one at that. It wasn't closures specifically, but reading about them extended my knowledge and helped me find the cause of the problem.

I had the type alias for AVAudioNodeCompletionHandler all wrong. I'm not sure where I got that definition from, but my unfamiliarity with the terminology of the error report put me off the scent. Xcode was quite clearly saying what the problem was:




Taking out atTime and options brings the function down to the simpler case of:

func scheduleBuffer(_ buffer: AVAudioPCMBuffer!,
  completionHandler completionHandler: AVAudioNodeCompletionHandler!)

What the error message was saying (when I finally understood it) was that the completion handler's parameter tuple was wrong: the correct type is () and I had used (AVAudioPCMBuffer!, AVAudioTime!). Reading about closures helped me understand this, even though closures weren't the cause of the problem, and it does help in understanding the syntax and concepts of Swift in a good deal more detail.

The type alias of AVAudioNodeCompletionHandler is far simpler, and for completeness is shown below:

typealias AVAudioNodeCompletionHandler = @objc_block () -> Void

Putting this into the code (again not in a Playground, as that is too slow), you get something like this:

func handler() -> Void
{
    // do some audio work
}

player.scheduleBuffer(buffer, atTime: nil, options: nil, completionHandler: handler)

Or, now with my new found understanding of closures, like this:

player.scheduleBuffer(buffer, atTime: nil, options: nil,
    completionHandler: { () -> Void in
        // do some audio work
    })
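
As a plain-Swift aside (the schedule function here is a made-up stand-in, not an AVFoundation API), any call whose last parameter is a () -> Void handler can also take the closure in trailing form:

```swift
import Foundation

// Hypothetical stand-in for a function taking a completion handler,
// just to show the two call syntaxes.
func schedule(_ work: () -> Void) {
    work()
}

var done = false

// Explicit closure argument, as used with scheduleBuffer above:
schedule({ () -> Void in done = true })

// Equivalent trailing-closure form:
done = false
schedule { done = true }
```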

Trying this again with the completion-handler trick works nicely this time, but it still annoyingly beats, so there is some other effect here that isn't working:

//
//  main.swift
//  Audio
//
//  Created by hondrou on 11/09/2014.
//  Copyright (c) 2014 hondrou. All rights reserved.
//

import Foundation
import AVFoundation

let twopi: Float = 2.0 * 3.14159   // approximation of 2π

var freq: Float = 440.00           // tone frequency in Hz
var sampleRate: Float = 44100.00   // samples per second

var engine = AVAudioEngine()
var player: AVAudioPlayerNode = AVAudioPlayerNode()
var mixer = engine.mainMixerNode


var length = 4000                  // frames per buffer

var buffer = AVAudioPCMBuffer(PCMFormat: player.outputFormatForBus(0), frameCapacity: AVAudioFrameCount(length))

buffer.frameLength = AVAudioFrameCount(length)


engine.attachNode(player)
engine.connect(player, to: mixer, format: mixer.outputFormatForBus(0))

var error: NSErrorPointer = nil
engine.startAndReturnError(error)

var j: Int = 0   // running sample index, kept across buffers for phase continuity

func handler() -> Void
{
    // Refill the buffer with the next chunk of the sine wave...
    for (var i = 0; i < length; i++)
    {
        var val: Float = 5.0 * sin(Float(j) * twopi * freq / sampleRate)   // NB: samples outside ±1.0 will clip
        buffer.floatChannelData.memory[i] = val
        j++
    }

    // ...and schedule it again, with this same function as the completion handler
    player.scheduleBuffer(buffer, atTime: nil, options: .InterruptsAtLoop, completionHandler: handler)
}

handler()

player.play()


while (true)
{
    NSThread.sleepForTimeInterval(2)
    
    freq += 10
}

Hmph.... more still to get sorted


Now, just in case you wondered (I did) where I cooked up the handler function parameters, it was all down to mixing up two function type aliases that I'd been looking at. The previous, incorrect handler function for the completionHandler:

func handler(buffer:AVAudioPCMBuffer!,time:AVAudioTime!) -> Void
{
}

is exactly the right type of function if you are installing an audio node tap block:

typealias AVAudioNodeTapBlock = (AVAudioPCMBuffer!, AVAudioTime!) -> Void

Which I'd also been thinking about at the time (and we will most likely be coming to next in our investigations).

For completeness, this is used in the AudioNode function installTapOnBus below:

func installTapOnBus(_ bus: AVAudioNodeBus,
          bufferSize bufferSize: AVAudioFrameCount,
              format format: AVAudioFormat!,
               block tapBlock: AVAudioNodeTapBlock!)
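
A minimal sketch of how this might be called, reusing the mixer from the code above (the bufferSize of 1024 and the println are purely illustrative, written in the Swift 1.0-era syntax of this post):

```swift
mixer.installTapOnBus(0, bufferSize: 1024, format: mixer.outputFormatForBus(0),
    block: { (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in
        // called with each buffer (and a timestamp) as audio passes the bus
        println("tapped \(buffer.frameLength) frames")
    })
```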

8 comments:

  1. Hi. Not sure my last post was sent correctly??

    Do you know how to speed up the installTapOnBus callback? bufferSize: seems to be hard-locked to 18000, which is approx 373 milliseconds round-trip time, far too slow to analyse the buffer for realtime pitch detection from the microphone.

    Any advice would be much appreciated.

    1. Hi Geoff, sorry for not getting to your comment earlier, I've had some other things on that have distracted my time from being able to take a look before.

      I just had a dabble this morning into how installing taps works, as I hadn't had a chance until now. I was also a bit put off by another posting in a similar vein to this and the audio generation I have been trying. My next plan was to try to install a tap and see if I could do something tricky that way.

      Anyway, it seems that I could get a smaller bufferSize working ok. It's pretty kludgy code, but I just tried reading the input and averaging out the values for a small buffer. I tried this and adjusted some audio going in and the numbers go up and down, so it's no more scientific than that.

      I posted the code today, so take a look at the most recent:
      http://hondrouthoughts.blogspot.com/2014/09/avfoundation-audio-monitoring.html

      Please let me know how you are getting on, the more people looking at this and sharing their experiences the better.

  2. Hey there. Thanks for getting back to me. I found a solution in the end and managed to speed up the tap down to 40ms (better than the fixed 374ms). I posted a solution on the Apple support forums here:

    https://devforums.apple.com/thread/249510?tstart=0

    Hope you find it useful.

  3. It's not easy working in Swift in the latest GM versions of Xcode. Lots of bugs, like errors appearing as you type rather than after you type and then hanging around in the gutter, code colouring gone mad, engine crashes, etc. I am finding Swift a great language, but the Xcode regressions are becoming annoying.

    Anyway, I managed to get a basic level metering working using AVAudioEngine and an Accelerate function but for some reason the screen updates (visual db meter) were about 1/2 second late. Are you interested in the swift code to play with?

  4. Hi Geoff, thank you for updating. I'm having exactly the same experience using Xcode 6.1 GM seed 1: crashing all the time while typing, which is very frustrating. I have had to put things on the back-burner for the past week or so due to some work commitments and couldn't face fighting with Xcode in the evening when it should have been fun.

    I'd really appreciate looking at the Swift code you have. All learning is good at the moment, as there seems to be very little out there on audio use. I'm still bashing my head against how to do synthesis in Swift, as the language and compiler seem fast enough but the AVFoundation API bindings seem too limited to make it easy.

  5. I see this post is quite old, but I'll leave this here in case someone else gets stuck with it (I've spent ~5 hours on that).

    You can fix the choppy sound by using two independent buffers:

    let buffer = AVAudioPCMBuffer(PCMFormat: player.outputFormatForBus(0), frameCapacity: AVAudioFrameCount(length))
    let buffer2 = AVAudioPCMBuffer(PCMFormat: player.outputFormatForBus(0), frameCapacity: AVAudioFrameCount(length))
    buffer.frameLength = AVAudioFrameCount(length)
    buffer2.frameLength = AVAudioFrameCount(length)

    func handler() -> Void
    {
        let qc = length
        if (currentBuffer) {
            // fill and reschedule the first buffer while the second is playing...
            for (var i = 0; i < qc; i = i + 1)
            {
                let val: Float = 1.0 * sin(Float(j) * (2.0 * 3.14159265) * frequency / sampleRate)
                buffer.floatChannelData.memory[i] = val
                j += 1
            }
            player.scheduleBuffer(buffer, atTime: nil, options: .InterruptsAtLoop, completionHandler: handler)
        } else {
            // ...and vice versa
            for (var i = 0; i < qc; i = i + 1)
            {
                let val: Float = 1.0 * sin(Float(j) * (2.0 * 3.14159265) * frequency / sampleRate)
                buffer2.floatChannelData.memory[i] = val
                j += 1
            }
            player.scheduleBuffer(buffer2, atTime: nil, options: .InterruptsAtLoop, completionHandler: handler)
        }
        nextTime = nextTime + length
        currentBuffer = !currentBuffer
    }

    handler()
    handler()

    This sounds totally smooth.

  6. Good to see an update. When I get time I will revisit all this and see how things have improved. I feel the need to switch to Swift more often than not these days.

    1. Hello Geoff! I hope you still look at this site. Any chance you still have the audio metering code you were talking about earlier?
