Feeding Apache Spark Streaming from Amazon SQS?


Question


Spark can be fed in many ways, as explained in the documentation (e.g. Kafka, Flume, Twitter, ZeroMQ, Kinesis, or plain old TCP sockets). Does anybody know how to feed Spark Streaming from Amazon SQS?

Answer


There's a GitHub project called spark-sqs-receiver. It has been published to the Maven repository under the groupId com.github.imapi with the artifactId spark-sqs-receiver_2.10; the current version is 1.0.1. By the looks of the GitHub project, it's being actively maintained as well. The following is some sample code shamelessly copied from the project's README.md file:

// Scala. SQSReceiver comes from the spark-sqs-receiver dependency;
// Regions is the AWS SDK enum (com.amazonaws.regions.Regions).
// ssc is an already-created org.apache.spark.streaming.StreamingContext.
ssc.receiverStream(new SQSReceiver("sample")
      .credentials(<key>, <secret>)
      .at(Regions.US_EAST_1)
      .withTimeout(2))
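
To use the receiver, the Maven coordinates quoted above can be declared as a dependency. The following is a sketch of the corresponding sbt line, based solely on the groupId, artifactId, and version stated in the answer:

```scala
// build.sbt — the _2.10 suffix indicates the artifact was built for Scala 2.10
libraryDependencies += "com.github.imapi" % "spark-sqs-receiver_2.10" % "1.0.1"
```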
