Geo distance calculation using SparkR
Question
I have a Spark dataframe in R as follows:
head(df)
Lat1 Lng1 Lat2 Lng2
23.123 24.234 25.345 26.456
... ... ... ...
The DataFrame contains latitude and longitude coordinates for two points.
I would like to calculate the Geo distance between the nodes in each row and add it to a new column.
In R I am using the distCosine function from the geosphere library:
df$dist = distCosine(cbind(df$lng1,df$lat1),cbind(df$lng2,df$lat2))
I am wondering how I should calculate it in SparkR.
SparkR produces the following error:
Error in as.integer(length(x) > 0L) :
cannot coerce type 'S4' to vector of type 'integer'
Answer
You cannot use standard R functions directly on Spark DataFrames. If you use a recent Spark release you can use dapply, but it is a bit verbose and slowish:
df <- createDataFrame(data.frame(
  lat1 = c(23.123), lng1 = c(24.234), lat2 = c(25.345), lng2 = c(26.456)))

# Extend the input schema with the new output column
new_schema <- do.call(
  structType,
  c(schema(df)$fields(), list(structField("dist", "double", TRUE))))

# The UDF receives each partition as a local data.frame,
# so plain R functions like geosphere::distCosine work inside it
attach_dist <- function(df) {
  df$dist <- geosphere::distCosine(
    cbind(df$lng1, df$lat1), cbind(df$lng2, df$lat2))
  df
}

dapply(df, attach_dist, new_schema) %>% head()
lat1 lng1 lat2 lng2 dist
1 23.123 24.234 25.345 26.456 334733.4
In practice I would rather use the formula directly. It will be much faster, all required functions are already available, and it is not very complicated:
df %>% withColumn("dist", acos(
sin(toRadians(df$lat1)) * sin(toRadians(df$lat2)) +
cos(toRadians(df$lat1)) * cos(toRadians(df$lat2)) *
cos(toRadians(df$lng1) - toRadians(df$lng2))
) * 6378137) %>% head()
lat1 lng1 lat2 lng2 dist
1 23.123 24.234 25.345 26.456 334733.4
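The spherical law of cosines used above can be sanity-checked in plain R, without Spark. This is just a sketch using the sample row from the question; the radius 6378137 m is the default Earth radius in geosphere::distCosine, which is why both approaches agree:

```r
# Spherical law of cosines on the sample row, in base R.
# 6378137 m matches geosphere::distCosine's default radius.
deg2rad <- function(d) d * pi / 180

lat1 <- deg2rad(23.123); lng1 <- deg2rad(24.234)
lat2 <- deg2rad(25.345); lng2 <- deg2rad(26.456)

dist <- acos(
  sin(lat1) * sin(lat2) +
  cos(lat1) * cos(lat2) * cos(lng1 - lng2)
) * 6378137

dist  # ~334733.4 metres, matching the SparkR output above
```

If the two results ever diverge, check the argument order first: geosphere expects points as (lng, lat), not (lat, lng).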