object databricks is not a member of package com


Problem description

I am trying to use the Stanford NLP library in Spark2 using Zeppelin (HDP 2.6). Apparently Databricks has built a wrapper around the Stanford NLP library for Spark. Link: https://github.com/databricks/spark-corenlp

I downloaded the jar for the wrapper above from here, and the Stanford NLP jars from here. I then added both sets of jars as dependencies in the Spark2 interpreter settings in Zeppelin and restarted the interpreter.
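As an alternative to registering the jars in Zeppelin's interpreter settings, the same dependencies can be supplied when launching a Spark shell. A minimal sketch, assuming the spark-packages coordinate for spark-corenlp and a locally downloaded CoreNLP models jar (the coordinate, version, and file name below are assumptions; check the spark-corenlp README for the ones matching your Spark and Scala versions):

```shell
# Sketch: resolve the Databricks wrapper from the spark-packages repository
# and add the Stanford CoreNLP models jar from a local path.
# Coordinate/version and jar name are assumptions -- verify before use.
spark-shell \
  --packages databricks:spark-corenlp:0.2.0-s_2.11 \
  --jars /path/to/stanford-corenlp-models.jar
```

Zeppelin's Spark interpreter accepts the same Maven-style coordinates in its dependency settings, which avoids copying jars by hand.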

Still, the sample program below gives the error "object databricks is not a member of package com" on the line "import com.databricks.spark.corenlp.functions._":

import org.apache.spark.sql.functions._
import com.databricks.spark.corenlp.functions._

import sqlContext.implicits._

val input = Seq(
  (1, "<xml>Stanford University is located in California. It is a great university.</xml>")
).toDF("id", "text")

val output = input
  .select(cleanxml('text).as('doc))        // strip the XML tags
  .select(explode(ssplit('doc)).as('sen))  // split the document into sentences
  .select('sen, tokenize('sen).as('words), ner('sen).as('nerTags), sentiment('sen).as('sentiment))

output.show(truncate = false)

Answer

The problem was related to the jar file I had downloaded for Databricks spark-corenlp. I downloaded it again from this location, and the problem was solved.
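Since the root cause was a wrong jar, a quick sanity check before adding a jar to the interpreter is to list its contents and confirm the wrapper's package is actually inside. A sketch, assuming a JDK is on the PATH (the jar file name is a placeholder, not the actual download):

```shell
# Sketch: verify the downloaded jar really contains the
# com.databricks.spark.corenlp classes before registering it in Zeppelin.
# Replace the file name with the jar you actually downloaded.
jar tf spark-corenlp.jar | grep 'com/databricks/spark/corenlp'
```

If the grep prints nothing, the jar does not contain the wrapper classes and the import will fail with exactly the "object databricks is not a member of package com" error.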
