Xamarin OpenEars Native Binding Not working on Device but works on Simulator


Problem description


I have been working on using the OpenEars v2.03 iOS framework in a Xamarin iOS binding project. Let me explain what I have done so far. I'm new to Xcode, Xamarin, and all these binding things. This is going to be a big question, so hold your breath…

1) Built the OpenEars framework project in Xcode for the Simulator. Copied the "OpenEars" binary from Framework/OpenEars.framework/Versions/Current/ and renamed it to "libOpenEars-i386.a".

Likewise, built the same library for an iPhone 4s by connecting the device to the Mac and choosing my iPhone as the build target. Finally, copied the generated OpenEars binary and renamed it to "libOpenEars-armv7.a".

2) Used the lipo command to bundle the two files (libOpenEars-i386.a, libOpenEars-armv7.a) into a single file, "libOpenEars.a":

lipo -create -output libOpenEars.a libOpenEars-i386.a libOpenEars-armv7.a 

3) Created a binding project in Xamarin Studio and added libOpenEars.a; a libOpenEars.linkwith.cs is generated automatically. It contains the following code:

using System;
using ObjCRuntime;

[assembly: LinkWith ("libOpenEars.a", LinkTarget.ArmV7 | LinkTarget.Simulator, SmartLink = true, ForceLoad = true, Frameworks="AudioToolbox AVFoundation", IsCxx=true, LinkerFlags = "-lstdc++")]

I also tried changing the linker flags to LinkerFlags = "-lstdc++ -lc++ -ObjC" and setting SmartLink = false.
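For reference, that variant of the attribute looked like this (shown with both changes applied at once, though I also tried them separately):

[assembly: LinkWith ("libOpenEars.a", LinkTarget.ArmV7 | LinkTarget.Simulator, SmartLink = false, ForceLoad = true, Frameworks = "AudioToolbox AVFoundation", IsCxx = true, LinkerFlags = "-lstdc++ -lc++ -ObjC")]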

4) My ApiDefinition file contains all the interfaces for OpenEars; I'm including just one of them here.

[BaseType(typeof(NSObject))]
[Protocol]
interface OEEventsObserver
{
    [Wrap ("WeakDelegate")]
    OEEventsObserverDelegate Delegate { get; set; }

    [Export ("delegate", ArgumentSemantic.Assign), NullAllowed]
    NSObject WeakDelegate { get; set; }
}
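
The delegate protocol that the sample code below overrides is bound in the same file. As a minimal sketch of that binding (only two of the callbacks are shown; the selector names are taken from the OpenEars OEEventsObserverDelegate header):

[BaseType (typeof (NSObject))]
[Model]
[Protocol]
interface OEEventsObserverDelegate
{
    [Export ("pocketsphinxDidStartListening")]
    void PocketsphinxDidStartListening ();

    [Export ("pocketsphinxDidReceiveHypothesis:recognitionScore:utteranceID:")]
    void PocketsphinxDidReceiveHypothesis (NSString hypothesis, NSString recognitionScore, NSString utteranceID);
}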

5) Referenced OpenEars.dll from my iOS sample project.

6) Added the language model and acoustic model to the binding library itself. (Even though this is not needed with dynamic language model generation, I used the old OpenEars sample project from this OpenEars Xamarin git; I didn't use the new dynamic language model generator but modified the example for the latest changes.)
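(Not part of the original sample: a quick sanity check along these lines, using the same file names as below, can rule out missing bundle resources as a separate failure mode — a sketch only:)

// Hypothetical check: confirm the model files were actually copied into the app bundle.
foreach (var name in new [] { "OpenEars1.languagemodel", "OpenEars1.dic" }) {
    var path = System.IO.Path.Combine (NSBundle.MainBundle.ResourcePath, name);
    if (!System.IO.File.Exists (path))
        Console.WriteLine ("Missing bundle resource: " + path);
}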

View Controller:

public partial class OpenEarsNewApiViewController : UIViewController
{
    OEEventsObserver observer;
    OEFliteController fliteController;
    OEPocketsphinxController pocketSphinxController;


    String pathToLanguageModel;
    String pathToDictionary;
    String pathToAcousticModel;

    String firstVoiceToUse;
    String secondVoiceToUse;

    static bool UserInterfaceIdiomIsPhone {
        get { return UIDevice.CurrentDevice.UserInterfaceIdiom == UIUserInterfaceIdiom.Phone; }
    }

    public void init()
    {
        try
        {
            observer = new OEEventsObserver();
            observer.Delegate = new OpenEarsEventsObserverDelegate (this);
            pocketSphinxController = new OEPocketsphinxController ();

            fliteController = new OEFliteController();

            firstVoiceToUse = "cmu_us_slt";
            secondVoiceToUse = "cmu_us_rms";

            pathToLanguageModel = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.languagemodel";
            pathToDictionary = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.dic";
            pathToAcousticModel = NSBundle.MainBundle.ResourcePath;
        }
        catch(Exception e) {
            Console.WriteLine ("Exception Message :"+e.Message);
            Console.WriteLine ("Inner Exception Mesage :"+e.InnerException.Message);
        }

    }

    public OpenEarsNewApiViewController (IntPtr handle) : base (handle)
    {
        init ();
    }

    #region Update

    public void UpdateStatus (String text)
    {
        txtStatus.Text = text;
    }

    public void UpdateText (String text)
    {
        txtOutput.Text = text;
    }

    public void UpdateButtonStates (bool hidden1, bool hidden2, bool hidden3, bool hidden4)
    {
        btnStartListening.Hidden = hidden1;
        btnStopListening.Hidden = hidden2;
        btnSuspend.Hidden = hidden3;
        btnResume.Hidden = hidden4;
    }

    public void Say (String text)
    {
        //fliteController.SaywithVoice (text, secondVoiceToUse);
    }

    public void StartListening ()
    {
        //pocketSphinxController.RequestMicPermission ();
        if (!pocketSphinxController.IsListening) {

            //NSString *correctPathToMyLanguageModelFile = [NSString stringWithFormat:@"%@/TheNameIChoseForMyLanguageModelAndDictionaryFile.%@",[NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0],@"DMP"];


            pocketSphinxController.StartListeningWithLanguageModelAtPath (
                pathToLanguageModel,
                pathToDictionary,
                pathToAcousticModel,
                false
            );
        } else {
            new UIAlertView ("Notify !!","Already Listening",null,"OK","Stop").Show();

        }

    }

    public void StopListening ()
    {
        //pocketSphinxController.StopListening ();
    }

    public void SuspendRecognition ()
    {
        pocketSphinxController.SuspendRecognition ();
    }

    public void ResumeRecognition ()
    {
        pocketSphinxController.ResumeRecognition ();
    }

    #endregion

    #region Event Handlers

    partial void btnStartListening_TouchUpInside (UIButton sender)
    {
        try
        {
            StartListening();
            //fliteController.Init();
            //Console.WriteLine("Speech in Progress :"+fliteController.SpeechInProgress);
            //fliteController.Say("Hai", new OEFliteVoice());

            UpdateButtonStates (true, false, false, true);
            Console.WriteLine("Speech in Progress :"+fliteController.SpeechInProgress);
        }
        catch(Exception e)
        {
            Console.WriteLine(e.Message);
        }
    }

    partial void btnStopListening_TouchUpInside (UIButton sender)
    {
        StopListening ();
        UpdateButtonStates (false, true, true, true);
    }

    partial void btnSuspend_TouchUpInside (UIButton sender)
    {
        SuspendRecognition ();
        UpdateButtonStates (true, false, true, false);
    }

    partial void btnResume_TouchUpInside (UIButton sender)
    {
        ResumeRecognition ();
        UpdateButtonStates (true, false, false, true);
    }
}

OpenEarsEventsObserverDelegate:

// nothing much here just to check the status and debugging 

public class OpenEarsEventsObserverDelegate:OEEventsObserverDelegate
{
    OpenEarsNewApiViewController _controller;

    public OpenEarsNewApiViewController controller {
        get {
            return _controller;
        }
        set {
            _controller = value;
        }
    }

    public OpenEarsEventsObserverDelegate (OpenEarsNewApiViewController ctrl)
    {
        controller = ctrl;
    }

    public override void PocketsphinxRecognitionLoopDidStart()
    {
        //base.PocketsphinxRecognitionLoopDidStart();

        Console.WriteLine ("Pocketsphinx is starting up");
        controller.UpdateStatus ("Pocketsphinx is starting up");
    }

    public override void PocketsphinxDidReceiveHypothesis (Foundation.NSString hypothesis, Foundation.NSString recognitionScore, Foundation.NSString utteranceID)
    {
        controller.UpdateText ("Heard: " + hypothesis);
        controller.Say ("You said: " + hypothesis);
    }

    public override void PocketSphinxContinuousSetupDidFail ()
    {

    }

    public override void PocketsphinxDidCompleteCalibration ()
    {
        Console.WriteLine ("Pocket calibration is complete");
        controller.UpdateStatus ("Pocket calibratio is complete");
    }

    public override void PocketsphinxDidDetectSpeech ()
    {

    }

    public override void PocketsphinxDidStartListening ()
    {
        Console.WriteLine ("Pocketsphinx is now listening");
        controller.UpdateStatus ("Pocketphinx is now listening");
        controller.UpdateButtonStates (true, false, false, true);
    }

    public override void PocketsphinxDidStopListening ()
    {

    }

    public override void PocketsphinxDidStartCalibration ()
    {
        Console.WriteLine ("Pocketsphinx calibration has started.");
        controller.UpdateStatus ("Pocketsphinx calibration has started");
    }

    public override void PocketsphinxDidResumeRecognition ()
    {

    }

    public override void PocketsphinxDidSuspendRecognition ()
    {

    }

    public override void PocketsphinxDidDetectFinishedSpeech ()
    {

    }

    public override void FliteDidStartSpeaking ()
    {

    }

    public override void FliteDidFinishSpeaking ()
    {

    }
}

This works perfectly on the iOS Simulator but does not run on a real device.

I got this error message while running on the device; I'm getting the same message for all the interfaces.

Exception Message :Wrapper type 'OpenEars.OEEventsObserver' is missing its native ObjectiveC class 'OEEventsObserver'.

2015-05-15 12:55:26.996 OpenEarsNewApi[1359:231264] Unhandled managed  exception: Exception has been thrown by the target of an invocation.  (System.Reflection.TargetInvocationException)
at System.Reflection.MonoCMethod.InternalInvoke (System.Object obj,   System.Object[] parameters) [0x00016] in   /Developer/MonoTouch/Source/mono/mcs/class/corlib/System.Reflection/MonoMethod.cs:543 

Am I missing anything related to binding for devices?

I also tried building the same .dll using a makefile, but got the same error message.

For building the OpenEars framework:

xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphonesimulator8.2 -arch i386 -configuration Release clean build

xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphoneos -arch armv7 -configuration Release clean build

Makefile for generating OpenEars.dll:

BTOUCH=/Developer/MonoTouch/usr/bin/btouch-native

all: OpenEars.dll

OpenEars.dll: AssemblyInfo.cs OpenEars.cs libOpenEars.a
	$(BTOUCH) -unsafe --new-style -out:$@ OpenEars.cs -x=AssemblyInfo.cs --link-with=libOpenEars.a,libOpenEars.a

clean:
	-rm -f *.dll

Check the complete mtouch error log here

$ lipo -info libOpenEars.a

Architectures in the fat file: libOpenEars.a are: i386 armv7 

Checked the symbols with $ nm -arch armv7 libOpenEars.a; the complete nm command output is here.

Checked that the OEEvent symbols exist in the simulator slice (i386):

$ nm -arch i386 libOpenEars.a | grep OEEvent

OUTPUT

U _OBJC_CLASS_$_OEEventsObserver
00006aa0 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000076f0 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libOpenEars.a(OEEventsObserver.o):
00002174 S _OBJC_CLASS_$_OEEventsObserver
00002170 S _OBJC_IVAR_$_OEEventsObserver._delegate
00002188 S _OBJC_METACLASS_$_OEEventsObserver
     U _OBJC_CLASS_$_OEEventsObserver
00002d90 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000035a0 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate

Checked that the OEEvent symbols exist in the armv7 slice:

$ nm -arch armv7 libOpenEars.a | grep OEEvent

OUTPUT

 U _OBJC_CLASS_$_OEEventsObserver
00005680 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000062d8 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning:    /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libOpenEars.a(OEEventsObserver.o):
00001cb4 S _OBJC_CLASS_$_OEEventsObserver
00001cb0 S _OBJC_IVAR_$_OEEventsObserver._delegate
00001cc8 S _OBJC_METACLASS_$_OEEventsObserver
     U _OBJC_CLASS_$_OEEventsObserver
00002638 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
00002e50 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate

I'm not sure what I am missing. Yes, there are a lot of grammar mistakes; thank you for the time you spent reading this.

Solution

Thanks @poupou and @Halle for your valuable comments. In the end I built the fat binary with all the architectures, including arm64 and x86_64 (a must), and used lipo to bundle everything into one package. Now it works like a charm! I also set Project Properties -> Advanced -> Supported Architectures to ARMv7 for running on devices like the iPad 2 and iPhone 4. I still need to test on the iPhone 6 and 6 Plus; I hope they will also work, since they are in the arm64 family. I'm not sure how this works on ARMv7s devices (iPhone 5, iPhone 5c, iPad 4); I don't see ARMv7s support in OpenEars v2.03.
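For anyone reproducing this, a minimal sketch of the full fat-binary build, assuming the remaining slices are built and named the same way as the armv7/i386 ones above (the SDK versions and the new slice file names are assumptions):

xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphonesimulator8.2 -arch x86_64 -configuration Release clean build

xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphoneos -arch arm64 -configuration Release clean build

lipo -create -output libOpenEars.a libOpenEars-i386.a libOpenEars-x86_64.a libOpenEars-armv7.a libOpenEars-arm64.a

The LinkWith attribute then presumably needs the matching targets (Arm64 and Simulator64 are valid LinkTarget values in ObjCRuntime):

[assembly: LinkWith ("libOpenEars.a", LinkTarget.ArmV7 | LinkTarget.Arm64 | LinkTarget.Simulator | LinkTarget.Simulator64, SmartLink = true, ForceLoad = true, Frameworks = "AudioToolbox AVFoundation", IsCxx = true, LinkerFlags = "-lstdc++")]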
