How to join two datasets by key in Scala Spark

Question

I have two datasets and each dataset have two elements. Below are examples.

Data1: (name, animal)

('abc,def', 'monkey(1)')
('df,gh', 'zebra')
...

Data2: (name, fruit)

('a,efg', 'apple')
('abc,def', 'banana(1)')
...

Results expected: (name, animal, fruit)

('abc,def', 'monkey(1)', 'banana(1)')
... 

I want to join these two datasets on the first column, 'name'. I have tried for a couple of hours, but I couldn't figure it out. Can anyone help me?

val sparkConf = new SparkConf().setAppName("abc").setMaster("local[2]")
val sc = new SparkContext(sparkConf)
val text1 = sc.textFile(args(0))
val text2 = sc.textFile(args(1))

val joined = text1.join(text2)

The code above does not work!

Answer

join is defined on RDDs of pairs, that is, RDDs of type RDD[(K,V)]. The first step needed is to transform the input data into the right type.
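As a sketch of those join semantics in plain Scala (no Spark required): an inner join keeps only the keys present on both sides and pairs up their values. The helper name `innerJoin` below is hypothetical, written just to illustrate what `RDD.join` computes.

```scala
object JoinSemantics {
  // Minimal in-memory inner join over (key, value) pairs,
  // mirroring what RDD.join does on pair RDDs.
  def innerJoin[K, V, W](left: Seq[(K, V)], right: Seq[(K, W)]): Seq[(K, (V, W))] =
    for {
      (k1, v) <- left
      (k2, w) <- right
      if k1 == k2
    } yield (k1, (v, w))

  def main(args: Array[String]): Unit = {
    val animals = Seq(("abc,def", "monkey(1)"), ("df,gh", "zebra"))
    val fruits  = Seq(("a,efg", "apple"), ("abc,def", "banana(1)"))

    // Only "abc,def" appears on both sides, so only it survives the join.
    val joined = innerJoin(animals, fruits)
    assert(joined == Seq(("abc,def", ("monkey(1)", "banana(1)"))))
    println(joined)
  }
}
```

This is why the question's `text1.join(text2)` fails to compile: `join` comes from `PairRDDFunctions` and is only available once the RDD's element type is a pair `(K, V)`, not a raw `String`.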

We first need to transform the original data of type String into pairs of (Key, Value):

// Parse a line like "('abc,def', 'monkey(1)')" into a (key, value) pair.
val parse: String => (String, String) = s => {
  val regex = "^\\('([^']+)',[\\W]*'([^']+)'\\)$".r
  s match {
    case regex(k, v) => (k, v)
    case _           => ("", "")  // fallback for lines that don't match the pattern
  }
}

(Note that we can't use a simple split(",") expression because the key contains commas)
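A quick check in plain Scala (no Spark needed) shows what a naive `split(",")` would do to such a line, compared with the regex-based `parse` above:

```scala
object SplitDemo {
  def main(args: Array[String]): Unit = {
    val line = "('abc,def', 'monkey(1)')"

    // split(",") also cuts inside the quoted key, producing three fragments,
    // so the key "abc,def" is destroyed.
    val naive = line.split(",").toList
    assert(naive.length == 3)

    // The regex treats everything between the quotes as the key,
    // so the comma inside "abc,def" survives intact.
    val regex = "^\\('([^']+)',[\\W]*'([^']+)'\\)$".r
    val parsed = line match {
      case regex(k, v) => (k, v)
      case _           => ("", "")
    }
    assert(parsed == ("abc,def", "monkey(1)"))
    println(parsed)
  }
}
```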

Then we use that function to parse the text input data:

val s1 = Seq("('abc,def', 'monkey(1)')","('df,gh', 'zebra')")
val s2 = Seq("('a,efg', 'apple')","('abc,def', 'banana(1)')")

val rdd1 = sparkContext.parallelize(s1)
val rdd2 = sparkContext.parallelize(s2)

val kvRdd1 = rdd1.map(parse)
val kvRdd2 = rdd2.map(parse)

Finally, we use the join method to join the two RDDs

val joined = kvRdd1.join(kvRdd2)

// Let's check the result

joined.collect

// res31: Array[(String, (String, String))] = Array((abc,def,(monkey(1),banana(1))))
