Getting Spark, Python, and MongoDB to work together

Problem description

I'm having difficulty getting these components to knit together properly. I have Spark installed and working successfully, and I can run jobs locally, standalone, and also via YARN. I have followed the steps advised (to the best of my knowledge) here and in the mongo-hadoop Spark/Python README (https://github.com/mongodb/mongo-hadoop/blob/master/spark/src/main/python/README.rst).
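
To show concretely what I am trying to get working, this is roughly the kind of script described in that README; the database and collection name below ("test.coll") are placeholders from my local test setup, not anything the connector requires.

    # Minimal pymongo-spark example along the lines of the mongo-hadoop README.
    # Assumes a local mongod and a placeholder database.collection ("test.coll"),
    # and has to be launched with the connector jars on the Spark classpath.
    import pymongo_spark

    # Monkey-patches SparkContext with mongoRDD() and RDDs with saveToMongoDB().
    pymongo_spark.activate()

    from pyspark import SparkContext


    def main():
        sc = SparkContext(appName="pyspark-mongo-test")
        rdd = sc.mongoRDD("mongodb://localhost:27017/test.coll")
        print(rdd.count())
        sc.stop()


    if __name__ == "__main__":
        main()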

I'm working on Ubuntu and the various component versions I have are:


  • Spark spark-1.5.1-bin-hadoop2.6
  • Hadoop hadoop-2.6.1
  • Mongo 2.6.10
  • Mongo-Hadoop connector cloned from https://github.com/mongodb/mongo-hadoop.git
  • Python 2.7.10

I had some difficulty following the various steps, such as which jars to add to which path, so I have added jars to a few different locations.
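
As a rough illustration of the part I am unsure about, this is the kind of configuration I have been experimenting with from inside Python; the jar names and paths below are placeholders rather than my exact files.

    from pyspark import SparkConf, SparkContext

    # Placeholder paths: stand-ins for the jars built from the mongo-hadoop clone
    # plus the mongo-java-driver they depend on.
    connector_jars = [
        "/path/to/mongo-hadoop-core.jar",
        "/path/to/mongo-hadoop-spark.jar",
        "/path/to/mongo-java-driver.jar",
    ]

    conf = (
        SparkConf()
        .setAppName("mongo-classpath-test")
        # Comma-separated list of jars shipped to the driver and executors.
        .set("spark.jars", ",".join(connector_jars))
        # Extra classpath entries for the executor JVMs (colon-separated).
        .set("spark.executor.extraClassPath", ":".join(connector_jars))
    )

    sc = SparkContext(conf=conf)

As far as I can tell from the Spark documentation, spark.driver.extraClassPath cannot be set this way in client mode because the driver JVM has already started, so the driver side still seems to need --driver-class-path (or the classpath set some other way) at launch time, which is part of my confusion about which jar belongs on which path.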
