ARKit – How to implement SCNRenderer().audioListener?
Question
I'm trying to implement this new positional audio in my application.
With a tapGesture I insert a Drone into my scene, and I attach sound.mp3 to it:
func tapToPlace(node: SCNNode, recognizer: UITapGestureRecognizer, view: ARSCNView) {
    debugPrint("tap")
    if gameState == .placeObject {
        DispatchQueue.global().async {
            let tapedScreen = recognizer.location(in: view)
            guard let query = view.raycastQuery(from: tapedScreen,
                                                allowing: .existingPlaneGeometry,
                                                alignment: .horizontal) else { return }
            let result = view.session.raycast(query).first
            guard let worldTransform = result?.worldTransform else { return } // simd_float4x4
            let newNode = node.clone() // duplicate the node created at app start-up
            newNode.position = SCNVector3(worldTransform.columns.3.x,
                                          worldTransform.columns.3.y,
                                          worldTransform.columns.3.z) // place it at the tapped position

            // set up positional audio
            let audio = SCNAudioSource(fileNamed: "sound.mp3")! // add audio file
            audio.loops = true
            audio.volume = 0.3
            audio.rate = 0.1
            audio.isPositional = true
            audio.shouldStream = false
            audio.load()
            let player = SCNAudioPlayer(source: audio)
            newNode.addAudioPlayer(player)
            view.scene.rootNode.addChildNode(newNode)
        }
    }
}
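One detail worth checking first: SceneKit only spatializes mono audio, so a stereo sound.mp3 will play with no positional effect at all. Separately, since ARSCNView conforms to SCNSceneRenderer, a minimal listener setup can simply reuse the view's own camera node. This is a hedged sketch (the `configureAudioListener` helper name is my own, not from the question):

```swift
import ARKit

// Minimal sketch: ARSCNView.pointOfView is the node SceneKit renders
// from, and ARKit moves it to follow the device camera automatically.
// Assigning it as the audio listener avoids any manual per-frame
// tracking. Call once after the view is set up, e.g. in viewDidLoad.
func configureAudioListener(on sceneView: ARSCNView) {
    sceneView.audioListener = sceneView.pointOfView
}
```

With this in place, any node that has an `SCNAudioPlayer` with a positional, mono source should pan and attenuate as the device moves around it.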
Reading Apple's documentation, it looks like I need to implement this audioListener: SCNNode. How can I do that?
Here is what I tried. First, I get the camera's current location:
func trackCameraLocation(arView: ARSCNView) -> simd_float4x4 {
    var cameraloc: simd_float4x4!
    if let camera = arView.session.currentFrame?.camera.transform {
        cameraloc = camera
    }
    return cameraloc
}
I call this inside the did-update-frame delegate method, in order to have an accurate user location:
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    cameraLocation = trackCameraLocation(arView: sceneView)
}
Once I have the camera location, I tried to set the audioListener inside the didAdd node method:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    let cameraNode = SCNNode()
    cameraNode.position = SCNVector3(cameraLocation.columns.3.x,
                                     cameraLocation.columns.3.y,
                                     cameraLocation.columns.3.z)
    renderer.audioListener = cameraNode
}
But nothing works; I can't hear any audio. I just see my Drone sitting silently on the floor of my house.

Looking for some help or an explanation of how to implement this new feature of ARKit. Thanks in advance for your help.
Here is where I put my audio file:
Answer
Try this solution. It works in a VR app as well as in an AR app.
import SceneKit

extension ViewController: SCNSceneRendererDelegate {

    func renderer(_ renderer: SCNSceneRenderer,
                  updateAtTime time: TimeInterval) {
        listener.position.z = -20 // change listener's position here
        renderer.audioListener = self.listener
    }
}

...

class ViewController: UIViewController {

    let scene = SCNScene()
    let audioNode = SCNNode()
    let listener = SCNNode()

    override func viewDidLoad() {
        super.viewDidLoad()

        let sceneView = self.view as! SCNView
        sceneView.scene = self.scene
        sceneView.backgroundColor = .black
        sceneView.delegate = self

        let node = SCNNode()
        node.geometry = SCNSphere(radius: 0.05)
        node.position = SCNVector3(0, 0, -2)
        self.scene.rootNode.addChildNode(node)

        let path = Bundle.main.path(forResource: "art.scnassets/audio",
                                    ofType: "mp3") // MONO AUDIO
        let url = URL(fileURLWithPath: path!)
        let source = SCNAudioSource(url: url)!
        source.isPositional = true
        source.shouldStream = false
        source.load()

        let player = SCNAudioPlayer(source: source)
        node.addChildNode(audioNode)

        // THE LOCATION OF THIS LINE IS IMPORTANT
        audioNode.addAudioPlayer(player)
        audioNode.addChildNode(self.listener)
    }
}
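Applied to the AR case from the question, the same pattern might look like the sketch below, replacing the VR version of renderer(_:updateAtTime:) shown above. The key difference from the question's didAdd approach is that didAdd fires only once per anchor, while updateAtTime fires every frame, so the listener actually follows the device. The `sceneView` and `listener` property names are assumptions carried over from the question and answer:

```swift
import ARKit
import SceneKit

extension ViewController: ARSCNViewDelegate {

    // Sketch for the AR case: move the single listener node to the
    // camera's current position on every rendered frame, then hand
    // it to the renderer as the audio listener.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let transform = sceneView.session.currentFrame?.camera.transform else { return }
        listener.simdPosition = SIMD3(transform.columns.3.x,
                                      transform.columns.3.y,
                                      transform.columns.3.z)
        renderer.audioListener = listener
    }
}
```

Alternatively, assigning `sceneView.pointOfView` as the listener once achieves the same effect without per-frame updates, since ARKit already keeps that node aligned with the camera.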