Exception org.apache.spark.rdd.RDD[(scala.collection.immutable.Map[String,Any], Int)] in Scala/Spark


Problem description

Using the code below, I am getting tweets for a particular filter:

// Pair each tweet map with 1, then count occurrences over a 60-minute window.
val topCounts60 = tweetMap.map((_, 1))
  .reduceByKeyAndWindow(_ + _, Seconds(60 * 60))
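
tweetMap is not shown in the question; judging from the sample output below, it is presumably a DStream[Map[String, Any]] built from a Twitter stream. A minimal sketch of such a setup, with illustrative filter and field names:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.twitter.TwitterUtils

val conf = new SparkConf().setAppName("TweetCounts").setMaster("local[2]")
val ssc = new StreamingContext(conf, Seconds(10))

// OAuth credentials are assumed to be supplied via twitter4j system properties.
val stream = TwitterUtils.createStream(ssc, None, Seq("Kashmir"))

// Reduce each twitter4j.Status to a Map of fields, as in the sample output.
val tweetMap = stream.map { status =>
  Map[String, Any](
    "UserLang"       -> status.getUser.getLang,
    "UserScreenName" -> status.getUser.getScreenName,
    "Text"           -> status.getText
  )
}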

If I do topCounts60.println(), one sample output of topCounts60 is in the following format:

(Map(UserLang -> en, UserName -> Harmeet Singh, UserScreenName -> harmeetsingh060, HashTags -> , UserVerification -> false, Spam -> true, UserFollowersCount -> 44, UserLocation -> भारत, UserStatusCount -> 50, UserCreated -> 2016-07-04T06:32:49.000+0530, UserDescription -> Punjabi Music, TextLength -> 118, Text -> RT @PMOIndia: The Prime Minister is chairing a high level meeting on the situation in Kashmir, UserFollowersRatio -> 0.32116788625717163, UserFavouritesCount -> 67, UserFriendsCount -> 137, StatusCreatedAt -> 2016-07-12T21:07:30.000+0530, UserID -> 749770405867556865),1)

Now I am trying to print each key-value pair like below:

for ((k,v) <- topCounts60) printf("key: %s, value: %s\n", k, v)

I am getting the following exception:

Error:(125, 10) constructor cannot be instantiated to expected type;
 found   : (T1, T2)
 required: org.apache.spark.rdd.RDD[(scala.collection.immutable.Map[String,Any], Int)]
for ((k,v) <- topCounts60) printf("key: %s, value: %s\n", k, v)

How do I get output like the following:

UserLang -> en,
UserName -> Harmeet Singh

I am a beginner in Scala and have no clue how to print all the values separately; please help me with this.

Recommended answer

Since topCounts60 is a DStream rather than a plain collection, a for comprehension cannot destructure it directly; use foreachRDD to reach the pairs inside each micro-batch RDD. Try:

topCounts60.foreachRDD { rdd =>
  // collect() brings the micro-batch's pairs to the driver so they can be printed.
  for ((k, v) <- rdd.collect) printf("key: %s, value: %s\n", k, v)
}
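
The snippet above prints each (Map, count) pair as a single value. To get the key -> value lines shown in the question, a further step (a sketch that iterates over the map inside each pair) would be:

topCounts60.foreachRDD { rdd =>
  for ((tweetFields, count) <- rdd.collect) {
    // Print every field of the tweet map on its own line.
    tweetFields.foreach { case (k, v) => println(s"$k -> $v") }
    println(s"count: $count")
  }
}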
