HBase dependencies with SBT in Scala


Problem description

I am new to Scala, SBT and IntelliJ.

I am using the following sbt file:

name := "mycompany"

version := "0.0.1-SNAPSHOT"

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.1",
  "org.apache.spark" %% "spark-sql" % "2.0.1",
  "org.apache.spark" %% "spark-mllib" % "2.0.1",
  "org.apache.hbase" % "hbase-client" % "1.2.0",
  "com.typesafe.akka" %% "akka-http-experimental" % "2.4.11"
)

resolvers ++= Seq(
  "Apache Repository" at "https://repository.apache.org/content/repositories/releases/"
)

The three Apache Spark dependencies are underlined in red in IntelliJ with an 'Unresolved dependency' tag. However, I can import the Spark libraries, and my Spark jobs run in local mode without any issue.

I cannot import from the HBase library inside the IDE. None of the following imports can be resolved:

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.KeyValue
import org.apache.hadoop.hbase.MasterNotRunningException
import org.apache.hadoop.hbase.TableName
import org.apache.hadoop.hbase.ZooKeeperConnectionException
import org.apache.hadoop.hbase.client.Connection
import org.apache.hadoop.hbase.client.ConnectionFactory
import org.apache.hadoop.hbase.client.Get
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.client.Table
import org.apache.hadoop.hbase.util.Bytes
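
For context, here is a minimal sketch of how those imports are typically used against the HBase client API once they resolve; the table, column family, row key and object names below are made up for illustration:

// Hypothetical usage sketch; assumes an hbase-site.xml is on the classpath.
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Get}
import org.apache.hadoop.hbase.util.Bytes

object HBaseReadExample {
  def main(args: Array[String]): Unit = {
    val conf = HBaseConfiguration.create()                     // picks up hbase-site.xml
    val connection = ConnectionFactory.createConnection(conf)  // heavyweight, reuse it
    try {
      val table  = connection.getTable(TableName.valueOf("my_table"))
      val result = table.get(new Get(Bytes.toBytes("row-key")))
      val value  = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("col"))
      println(Option(value).map(Bytes.toString).getOrElse("no value"))
      table.close()
    } finally {
      connection.close()
    }
  }
}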

I have written code in Java using the above imports without any issue, with only these lines in Maven:

    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <scope>provided</scope>
        <version>1.2.0</version>
    </dependency>
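
For reference, the sbt equivalent of that Maven snippet, including the provided scope, would look roughly like this (a sketch, not part of the original build file):

// Rough sbt counterpart of the Maven dependency above, with "provided" scope.
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.2.0" % "provided"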

What am I doing wrong?

Thanks

EDIT

Thanks to pamu's post, I have replaced the resolvers with:

resolvers ++= Seq(
  "Apache Repository" at "https://repository.apache.org/content/repositories/releases/",
  "Cloudera repo" at "//repository.cloudera.com/artifactory/cloudera-repos/"
)

However, I still have some unresolved imports (the others above are now OK):

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.KeyValue
import org.apache.hadoop.hbase.util.TableName
import org.apache.hadoop.hbase.util.Bytes

Thanks for your help

EDIT

libraryDependencies ++= Seq(
  "org.apache.hbase" % "hbase-server" % 1.2.1,
  "org.apache.hbase" % "hbase-client" % 1.2.1,
  "org.apache.hbase" % "hbase-common" % 1.2.1,
  "org.apache.hadoop" % "hadoop-common" % 2.7.3
)

Recommended answer

There are no such classes in the mentioned hbase lib, version 1.2.0. You can check using jar -tvf. Those classes do exist in the 2.0.0 HBase API.
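
Following that suggestion, a build.sbt fragment pinning the HBase artifacts to the 2.x line might look like the sketch below; check which 2.x version is actually published in the resolvers you use:

// Sketch: move to an HBase release that ships the classes in question.
libraryDependencies ++= Seq(
  "org.apache.hbase" % "hbase-common" % "2.0.0",
  "org.apache.hbase" % "hbase-client" % "2.0.0"
)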
