Trouble running example of pyspark in VSCode
Problem description
Hi,
I've tried to follow: https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-for-vscode
The Hive examples worked fine, but I had trouble with pyspark; both PySpark Interactive and PySpark Batch throw errors. I'll focus on Interactive:
The first issue I ran into was "cannot import name DataError". After searching around I found that I had to downgrade pandas to 0.20.0, which meant downgrading Python to 3.6. Please specify this as a requirement.
Then I encountered a second issue, which I need help with:
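The version pins described above can be checked up front before launching the extension. This is a minimal sketch; the Python 3.6 and pandas 0.20.0 pins come from this post's experience, not from the extension's official requirements list:

```python
# Pre-flight check against the versions reported as working in this
# post (Python 3.6, pandas 0.20.0). These pins are taken from the
# question above, not from official extension documentation.
def version_issues(py_version, pandas_version):
    """Return which components differ from the reported-working pins."""
    issues = []
    if tuple(py_version[:2]) != (3, 6):
        issues.append("python != 3.6")
    if pandas_version != "0.20.0":
        issues.append("pandas != 0.20.0")
    return issues

# Example: a Python 3.7 environment with a newer pandas fails both checks.
print(version_issues((3, 7, 1), "0.23.4"))
# → ['python != 3.6', 'pandas != 0.20.0']
```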
The code failed because of a fatal error:
Invalid status code '400' from https://xxx.azurehdinsight.net//livy/sessions with error payload: "Invalid kind: pyspark3 (through reference chain: org.apache.livy.server.interactive.CreateInteractiveRequest[\"kind\"])".
Thanks,
Lukasz
PySpark3 is no longer supported in Livy 0.4 (the Livy version on HDI Spark 2.2 clusters). Only "PySpark" is supported for Python. It is a known issue that submissions to Spark 2.2 fail with Python 3.
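The fix the answer describes can be sketched as a guard on the session kind before POSTing to Livy's `/livy/sessions` endpoint. The allowed set below is illustrative for Livy 0.4 based on this answer, not an exhaustive list from the Livy documentation:

```python
import json

# Sketch of the JSON body for Livy's CreateInteractiveRequest.
# Per the answer above, Livy 0.4 rejects kind "pyspark3" with
# HTTP 400 ("Invalid kind: pyspark3"); "pyspark" is the supported
# Python kind. ALLOWED_KINDS is illustrative, not authoritative.
ALLOWED_KINDS = {"spark", "pyspark", "sparkr"}

def make_session_payload(kind="pyspark"):
    """Build the session-creation body, rejecting unsupported kinds."""
    if kind not in ALLOWED_KINDS:
        raise ValueError(f"Invalid kind: {kind}")
    return json.dumps({"kind": kind})

print(make_session_payload())  # → {"kind": "pyspark"}
```

Passing `kind="pyspark3"` raises locally, mirroring the 400 the server would return.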
For more details, refer to "Use Azure HDInsight Tools for Visual Studio Code".