How to add multiple columns in Apache Spark


Question


Here is my input data, with four columns and a space as the delimiter. I want to add the second and third columns together and print the result:

sachin 200 10 2
sachin 900 20 2
sachin 500 30 3
Raju 400 40 4
Mike 100 50 5
Raju 50 60 6

My code is at an intermediate stage:

from pyspark import SparkContext

sc = SparkContext()

def getLineInfo(lines):
    # Split each space-delimited line into its four fields
    spLine = lines.split(' ')
    name = str(spLine[0])
    cash = int(spLine[1])
    cash2 = int(spLine[2])
    cash3 = int(spLine[3])
    return (name, cash, cash2)

# Raw string avoids backslash-escape problems in the Windows path
myFile = sc.textFile(r"D:\PYSK\cash.txt")
rdd = myFile.map(getLineInfo)
print(rdd.collect())

From this I get the result:

[('sachin', 200, 10), ('sachin', 900, 20), ('sachin', 500, 30), ('Raju', 400, 40), ('Mike', 100, 50), ('Raju', 50, 60)]


Now the final result I need is as below: the 2nd and 3rd columns added together, with the remaining fields displayed as-is.

sachin 210 2
sachin 920 2
sachin 530 3
Raju 440 4
Mike 150 5
Raju 110 6

Answer

Use this:

def getLineInfo(lines):
    spLine = lines.split(' ')
    name = str(spLine[0])
    cash = int(spLine[1])
    cash2 = int(spLine[2])
    cash3 = int(spLine[3])
    # Sum the 2nd and 3rd columns, keep the name and the 4th column
    return (name, cash + cash2, cash3)
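As a quick sanity check without a Spark cluster, the corrected parser can be run over the sample lines with plain Python's built-in `map`; under Spark the same function simply goes into `rdd.map(getLineInfo)`. (The `sample` list below just reproduces the input data from the question.)

```python
def getLineInfo(lines):
    # Split a space-delimited line and sum the 2nd and 3rd columns
    spLine = lines.split(' ')
    name = str(spLine[0])
    cash = int(spLine[1])
    cash2 = int(spLine[2])
    cash3 = int(spLine[3])
    return (name, cash + cash2, cash3)

sample = [
    "sachin 200 10 2",
    "sachin 900 20 2",
    "sachin 500 30 3",
    "Raju 400 40 4",
    "Mike 100 50 5",
    "Raju 50 60 6",
]

for name, total, last in map(getLineInfo, sample):
    print(name, total, last)  # e.g. "sachin 210 2"
```

This prints exactly the final result requested in the question (e.g. `sachin 210 2`, `Raju 440 4`).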

