Can an audio unit (v3) replace inter-app audio to send audio to a host app?


Question

My music performance app plays audio with AVAudioEngine and uses Inter-App Audio (IAA) to publish the engine's output to other apps. This lets the user feed the audio into a mixer app running on the same device. Since IAA is deprecated on iOS and is not supported on the Mac, I'm trying to replace this capability with an Audio Unit.

I've added an Audio Unit extension of type augn (generator) using the Xcode template, and I understand that internalRenderBlock is what actually returns the audio data. But how can the extension access the audio being played in the containing (main) app?

Is this even possible? I'd expect it to be a common use case, since Audio Units are positioned as the replacement for IAA, but I haven't seen a single example of anyone doing anything like this. I don't want to process input from the host app, and I don't want to generate sound from scratch; I need to tap the sound that the containing app is playing.

Update

I just read the App Extension Programming Guide. It doesn't look promising:

An app extension communicates directly only with the host app. There is no direct communication between an app extension and its containing app; typically, the containing app isn't even running while a contained extension is running.

Also:

A Today widget (and no other app extension type) can ask the system to open its containing app by calling the openURL:completionHandler: method of the NSExtensionContext class. Any app extension and its containing app can access shared data in a privately defined shared container.

If that's the extent of the data sharing between the container and the extension, I don't see how this could work. The extension would need to access an AVAudioEngine node in real time, so that if the user of the containing app changes sounds, plays, pauses, changes volume, etc., that would all be reflected in the output the host app receives.

And yet I feel that taking away IAA, if AUv3 doesn't have this capability, leaves a big gap in the platform. Hopefully there's another approach I'm not thinking of.

Maybe this would need to work the other way around: in my situation, the mixer app would offer the Audio Unit extension, and my app (an audio player) would be the host and provide the audio to the mixer's extension. But then the mixer app would face the same problem of not being able to obtain the incoming audio from its extension.

Solution

In addition to playing the audio via AVAudioEngine, an app has to also publish its audio output in an Audio Unit extension. That app extension's output can potentially be made visible to the input of other apps, or to Audio Unit extensions contained in other apps.

Added: To send audio data from an app to its own app extension, you can try putting the app and its extension in the same App Group, creating a set of shared files, and perhaps memory mapping the shared file(s). Or use writeToFile:atomically: to put blocks of audio samples into a ring buffer of shared files.

Also, the original pre-IAA method in iOS was to use MIDI SysEx data packets to pass audio sample blocks between apps. This might be possible on macOS as well, with a fairly low latency.
