Calling MusicDeviceMIDIEvent from the audio unit's render thread

Problem Description

There's one thing I don't understand about MusicDeviceMIDIEvent. In every single example I ever seen (searched Github and Apple examples) it was always used from the main thread. Now, in order to use the sample offset parameter the documentation states:

inOffsetSampleFrame: If you are scheduling the MIDI Event from the audio unit's render thread, then you can supply a sample offset that the audio unit may apply when applying that event in its next audio unit render. This allows you to schedule to the sample, the time when a MIDI command is applied and is particularly important when starting new notes. If you are not scheduling in the audio unit's render thread, then you should set this value to 0
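In other words, the offset is just the distance, in frames, between the start of the upcoming render cycle and the moment the event should sound. A small standalone helper makes the arithmetic concrete (plain C with no CoreAudio dependency; the function name and signature are mine, not part of any API):

```c
#include <stdint.h>

// Convert an absolute event time (in samples) into the inOffsetSampleFrame
// value for a render cycle that starts at renderStart and spans numFrames.
// Returns -1 if the event does not fall inside this cycle, in which case it
// should be held back and retried on a later cycle.
static int32_t offsetForEvent(int64_t eventSampleTime,
                              int64_t renderStart,
                              uint32_t numFrames) {
    int64_t offset = eventSampleTime - renderStart;
    if (offset < 0 || offset >= (int64_t)numFrames) return -1;
    return (int32_t)offset;
}
```

For example, an event due at sample 44100 inside a 512-frame cycle that starts at sample 44032 gets an offset of 68.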

Still, even in the simplest case, where you have only a sampler audio unit and an I/O unit, how can you schedule MIDI events from the audio unit's render thread? The sampler doesn't accept a render callback, and even if it did (or if you tapped in via the I/O unit's callback), it would feel hackish, since a render callback is not intended for scheduling MIDI events.

How does one correctly call this function from the audio unit's render thread?

Answer

A renderNotify callback is a perfect place to do scheduling from the render thread. You can even set the renderNotify on the MusicDevice itself. Here's what it might look like on an AUSampler.

OSStatus status = AudioUnitAddRenderNotify(sampler, renderNotify, sampler);

In this example I pass the sampler in as a reference via the inRefCon argument, and just send a note-on (status 144) for note 64 every 44100 samples; in a real application you would pass a C struct into inRefCon holding a reference to your MIDI device and all the values your scheduling needs. Note the check of the render flag for pre-render.

static OSStatus renderNotify(void                         * inRefCon,
                             AudioUnitRenderActionFlags   * ioActionFlags,
                             const AudioTimeStamp         * inTimeStamp,
                             UInt32                       inBusNumber,
                             UInt32                       inNumberFrames,
                             AudioBufferList              * ioData) {

    AudioUnit sampler = (AudioUnit)inRefCon;
    // Only schedule during the pre-render phase. ioActionFlags is a pointer,
    // so it must be dereferenced before testing the flag.
    if (*ioActionFlags & kAudioUnitRenderAction_PreRender) {
        for (UInt32 i = 0; i < inNumberFrames; i++) {
            // Fire a note-on once every 44100 samples (once per second at 44.1 kHz).
            if (fmod(inTimeStamp->mSampleTime + i, 44100) == 0) {
                // i is the offset from the start of this render cycle,
                // so pass it as the inOffsetSampleFrame argument.
                MusicDeviceMIDIEvent(sampler, 144, 64, 127, i);
            }
        }
    }

    return noErr;
}
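The C-struct pattern mentioned above can be sketched like this; the struct name, its fields, and the helper are all illustrative, with the trigger test factored into a pure function so it can be checked on its own:

```c
#include <stdint.h>
#include <stdbool.h>

// Hypothetical refCon struct: everything the render-notify callback needs.
typedef struct {
    void    *sampler;        // the AudioUnit (MusicDevice) to send events to
    uint8_t  note;           // MIDI note number to trigger
    uint8_t  velocity;       // note-on velocity
    uint32_t intervalFrames; // fire a note-on every this many frames
} MIDIScheduler;

// Pure trigger test: does frame (renderStart + i) land on an interval boundary?
static bool shouldTrigger(const MIDIScheduler *s, int64_t renderStart, uint32_t i) {
    return (renderStart + (int64_t)i) % (int64_t)s->intervalFrames == 0;
}
```

Inside the callback you would recover the struct with `MIDIScheduler *s = (MIDIScheduler *)inRefCon;` and call `MusicDeviceMIDIEvent(s->sampler, 144, s->note, s->velocity, i)` whenever `shouldTrigger(s, (int64_t)inTimeStamp->mSampleTime, i)` is true.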
