Scala Spark Filter RDD using Cassandra
Question
I am new to Spark-Cassandra and Scala. I have an existing RDD, say:
((url_hash, url, created_timestamp))
I want to filter this RDD based on url_hash: if a url_hash already exists in the Cassandra table, I want to filter that record out of the RDD so that I only process the new URLs.
The Cassandra table looks like the following:
url_hash| url | created_timestamp | updated_timestamp
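For reference, a table with this layout could be created with CQL roughly as follows. This is a sketch: the keyspace name and the choice of url_hash as the sole partition key are assumptions inferred from the question, not something the original post confirms.

```cql
CREATE TABLE keyspace.url_info (
    url_hash          text,
    url               text,
    created_timestamp timestamp,
    updated_timestamp timestamp,
    PRIMARY KEY (url_hash)
);
```

With url_hash as the partition key, single-key lookups and key-based joins against this table are efficient.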
Any pointers would be great.

I tried something like this:
import java.util.Date

case class UrlInfoT(url_sha256: String, full_url: String, created_ts: Date)

def timestamp = new java.util.Date()

val rdd1 = rdd.map(row => (calcSHA256(row(1)), (row(1), timestamp)))
val rdd2 = sc.cassandraTable[UrlInfoT]("keyspace", "url_info")
  .select("url_sha256", "full_url", "created_ts")
val rdd3 = rdd2.map(row => (row.url_sha256, (row.full_url, row.created_ts)))
val newUrlsRDD = rdd1.subtractByKey(rdd3)
I am getting a Cassandra error:

java.lang.NullPointerException: Unexpected null value of column full_url in keyspace.url_info. If you want to receive null values from Cassandra, please wrap the column type into Option or use JavaBeanColumnMapper
There are no null values in the Cassandra table.
Answer
Thanks prototypePaul!

I hope somebody finds this useful. I had to wrap the nullable columns in Option in the case class.

Looking forward to a better solution.
import java.util.Date

case class UrlInfoT(url_sha256: String, full_url: Option[String], created_ts: Option[Date])

def timestamp = new java.util.Date()

val rdd1 = rdd.map(row => (calcSHA256(row(1)), (row(1), timestamp)))
val rdd2 = sc.cassandraTable[UrlInfoT]("keyspace", "url_info")
  .select("url_sha256", "full_url", "created_ts")
val rdd3 = rdd2.map(row => (row.url_sha256, (row.full_url, row.created_ts)))
val newUrlsRDD = rdd1.subtractByKey(rdd3)
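As an alternative to subtractByKey, which reads the entire Cassandra table into an RDD, the spark-cassandra-connector (2.0 or later) provides leftJoinWithCassandraTable, which looks up each key directly by partition key instead of doing a full scan. The sketch below assumes the same names as the question (rdd1 keyed by hash, keyspace "keyspace", table "url_info", case class UrlInfoT) and would need a running Spark + Cassandra setup to execute:

```scala
import com.datastax.spark.connector._

// leftJoinWithCassandraTable pairs each left element with Some(row) when the
// key exists in Cassandra and None when it does not, using key-only lookups
// on the url_sha256 partition key rather than scanning the whole table.
val missingHashes = rdd1.keys
  .map(Tuple1(_))                                         // the connector expects tuples of the join columns
  .leftJoinWithCassandraTable[UrlInfoT]("keyspace", "url_info")
  .on(SomeColumns("url_sha256"))
  .collect { case (Tuple1(hash), None) => (hash, ()) }    // keep only hashes absent from Cassandra

// Keep only the entries of the original RDD whose hash was not found.
val newUrlsRDD = rdd1.join(missingHashes).mapValues { case (v, _) => v }
```

This trades a full table scan for one lookup per key, which tends to win when the incoming batch is small relative to the table; for very large batches the subtractByKey approach may still be competitive.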