Spark Data Frames - Check if column is of type integer
Question
I am trying to figure out what data type my column in a Spark data frame is and manipulate the column based on that deduction.

Here is what I have so far:
import pyspark
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('MyApp').getOrCreate()
df = spark.read.csv('Path To csv File', inferSchema=True, header=True)

for x in df.columns:
    if type(x) == 'integer':
        print(x + ": inside if loop")
The print(x + ": inside if loop") statement never seems to get executed, but I am sure several of the columns are of integer data type. What am I missing here?
Answer
You are iterating over the names of your columns, so type(x) will never equal "integer" (it is always a string). You need to use pyspark.sql.DataFrame.dtypes instead:
for x, t in df.dtypes:
    if t == "int":
        print("{col} is integer type".format(col=x))
It can also be useful to look at the schema using df.printSchema().