Practicing with Spark's join and Scala on Array[String]
Question
I am new to both Spark and Scala, and I'm trying to practice the join command in Spark.
I have two csv files:
Ads.csv is:
5de3ae82-d56a-4f70-8738-7e787172c018,AdProvider1
f1b6c6f4-8221-443d-812e-de857b77b2f4,AdProvider2
aca88cd0-fe50-40eb-8bda-81965b377827,AdProvider1
940c138a-88d3-4248-911a-7dbe6a074d9f,AdProvider3
983bb5e5-6d5b-4489-85b3-00e1d62f6a3a,AdProvider3
00832901-21a6-4888-b06b-1f43b9d1acac,AdProvider1
9a1786e1-ab21-43e3-b4b2-4193f572acbc,AdProvider1
50a78218-d65a-4574-90de-0c46affbe7f3,AdProvider5
d9bb837f-c85d-45d4-95f2-97164c62aa42,AdProvider4
611cf585-a8cf-43e9-9914-c9d1dc30dab5,AdProvider1
Impression.csv is:
5de3ae82-d56a-4f70-8738-7e787172c018,Publisher1
f1b6c6f4-8221-443d-812e-de857b77b2f4,Publisher2
aca88cd0-fe50-40eb-8bda-81965b377827,Publisher1
940c138a-88d3-4248-911a-7dbe6a074d9f,Publisher3
983bb5e5-6d5b-4489-85b3-00e1d62f6a3a,Publisher3
00832901-21a6-4888-b06b-1f43b9d1acac,Publisher1
9a1786e1-ab21-43e3-b4b2-4193f572acbc,Publisher1
611cf585-a8cf-43e9-9914-c9d1dc30dab5,Publisher1
I want to join them with the first ID as the key and the two values.
So I read them in like this:
val ads = sc.textFile("ads.csv")
ads: org.apache.spark.rdd.RDD[String] = MapPartitionsRDD[1] at textFile at <console>:21
val impressions = sc.textFile("impressions.csv")
impressions: org.apache.spark.rdd.RDD[String] = MapPartitionsRDD[3] at textFile at <console>:21
Ok, so I have to make key-value pairs:

val adPairs = ads.map(line => line.split(","))
val impressionPairs = impressions.map(line => line.split(","))
res11: org.apache.spark.rdd.RDD[Array[String]] = MapPartitionsRDD[6] at map at <console>:23
res13: org.apache.spark.rdd.RDD[Array[String]] = MapPartitionsRDD[7] at map at <console>:23
But I can't join them:
val result = impressionPairs.join(adPairs)
<console>:29: error: value join is not a member of org.apache.spark.rdd.RDD[Array[String]]
val result = impressionPairs.join(adPairs)
Do I need to convert the pairs into another format?
Answer
You are almost there, but what you need is to transform the Array[String] into key-value pairs, like this:
val adPairs = ads.map(line => {
val substrings = line.split(",")
(substrings(0), substrings(1))
})
(and the same for impressionPairs)

That will give you RDDs of type RDD[(String, String)], which can then be joined :)
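The reason this fixes the error is that Spark's join is defined (via PairRDDFunctions) only for RDDs of 2-tuples, and it behaves like an inner join on the key. A minimal sketch of the same semantics with plain Scala collections, no cluster needed (the sample IDs are taken from the csv files above):

```scala
// Key-value pairs, as produced by the corrected map step
val adPairs = Seq(
  ("5de3ae82-d56a-4f70-8738-7e787172c018", "AdProvider1"),
  ("50a78218-d65a-4574-90de-0c46affbe7f3", "AdProvider5") // no matching impression
)
val impressionPairs = Seq(
  ("5de3ae82-d56a-4f70-8738-7e787172c018", "Publisher1")
)

// Inner join on the key, mirroring impressionPairs.join(adPairs):
// only keys present on both sides survive, and the two values are paired up.
val adsByKey = adPairs.toMap
val joined = impressionPairs.flatMap { case (id, publisher) =>
  adsByKey.get(id).map(provider => (id, (publisher, provider)))
}
// joined == Seq(("5de3ae82-d56a-4f70-8738-7e787172c018", ("Publisher1", "AdProvider1")))
```

In the Spark version the result has type RDD[(String, (String, String))], with the impression value first because join is called on impressionPairs; rows like AdProvider5 that have no matching impression are dropped, as in any inner join.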