UIMA with Spark


Question


As said here, there is some overlap between UIMA and Spark in their distribution infrastructures. I was planning to use UIMA with Spark (I am now moving to uimaFIT). Can anyone tell me what problems we really face when developing UIMA with Spark, and what issues we are likely to encounter? (Sorry, I haven't done any research on this yet.)

Answer


The main problem is accessing objects, because UIMA re-instantiates objects when running its analysis engines. If an object holds local references, accessing it from a remote Spark cluster will fail. Some RDD functions might not work within a UIMA context. However, if you don't use a separate remote cluster, there won't be a problem. (I am talking about uimaFIT 2.2.)
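The serialization pitfall the answer describes can be sketched without Spark or UIMA at all: Spark ships closures to executors by serializing them, so an analysis-engine object holding a local resource (a lock, a file handle, a native model) cannot be sent as-is, while a factory that builds the engine on each executor can. This is a minimal Python sketch using `pickle` as a stand-in for Spark's closure serializer; `Engine` and `make_engine` are hypothetical names, not part of any UIMA or Spark API.

```python
import pickle
import threading


class Engine:
    """Stand-in for an analysis engine holding a local, non-serializable reference."""

    def __init__(self):
        self.lock = threading.Lock()  # local resource: cannot be pickled

    def process(self, doc):
        with self.lock:
            return doc.upper()


def make_engine():
    # Ship this factory to the cluster instead of a live Engine instance,
    # so each executor (e.g. inside mapPartitions) builds its own engine locally.
    return Engine()


# A live engine cannot be serialized for shipping to remote executors...
try:
    pickle.dumps(Engine())
    shippable = True
except TypeError:
    shippable = False

# ...but the factory function can, and each partition instantiates locally.
factory = pickle.loads(pickle.dumps(make_engine))
result = factory().process("hello")
```

This is also why the answer notes there is no problem without a separate remote cluster: in local mode nothing needs to cross a process boundary, so the local references never have to be serialized.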
