Spark Data Frames - Check if column is of type integer
Question
I am trying to figure out what data type my column in a Spark data frame is, and manipulate the column based on that deduction.

Here is what I have so far:
```python
import pyspark
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('MyApp').getOrCreate()
df = spark.read.csv('Path To csv File', inferSchema=True, header=True)

for x in df.columns:
    if type(x) == 'integer':
        print(x + ": inside if loop")
```
The `print(x + ": inside if loop")` statement never seems to get executed, but I am sure several of the columns are of integer data type. What am I missing here?
Solution
You can try:
```python
dict(df.dtypes)['column name'] == 'int'
```
`df.dtypes` returns a list of `(column name, type string)` tuples, and the easiest way to look up each column's type as a string is to convert that list to a dict. (In your loop, `x` is the column *name*, a plain Python string, so `type(x)` is `str` and can never equal `'integer'`.)