How to define Kafka (data source) dependencies for Spark Streaming?


Question

I'm trying to consume a Kafka 0.8 topic using Spark Streaming 2.0.0, and I'm trying to identify the required dependencies. I have tried these dependencies in my build.sbt file:

libraryDependencies += "org.apache.spark" %% "spark-core_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-8_2.11" % "2.0.0"

When I run sbt package, I get unresolved dependencies for all three of these jars, but the jars do exist:

https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-8_2.11/2.0.0

Please help in debugging this issue. I'm new to Scala, so please let me know if I'm not doing something right.

Answer

The problem is that you're specifying the Scala version explicitly (the _2.11 suffix) and also using %%, which tries to infer which Scala version you're using.

Either remove one %:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-8_2.11" % "2.0.0"

Or remove the Scala version suffix from the artifact names:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"
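
If you go with the %% form, sbt derives the _2.11 suffix from your project's scalaVersion setting, so make sure that setting is declared. A minimal sketch of a complete build.sbt (the project name and Scala patch version here are assumptions):

// build.sbt -- minimal sketch; the name and exact Scala patch version are assumed
name := "kafka-streaming-example"

// %% appends the Scala binary version (_2.11) to each artifact name
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"                % "2.0.0",
  "org.apache.spark" %% "spark-streaming"           % "2.0.0",
  "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"
)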

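Once the dependencies resolve, one way to confirm the Kafka 0.8 connector is on the classpath is to consume the topic through its receiver-based KafkaUtils.createStream API. A sketch only; the ZooKeeper address, consumer group, and topic name below are placeholders:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaConsumerSketch {
  def main(args: Array[String]): Unit = {
    // local[2]: one core for the receiver, one for processing
    val conf = new SparkConf().setAppName("KafkaConsumerSketch").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Receiver-based stream from the 0-8 connector:
    // (ZooKeeper quorum, consumer group, Map(topic -> receiver thread count))
    val stream = KafkaUtils.createStream(
      ssc, "localhost:2181", "example-group", Map("example-topic" -> 1))

    // Records arrive as (key, message) pairs; print just the message values
    stream.map(_._2).print()

    ssc.start()
    ssc.awaitTermination()
  }
}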
