Work with Jupyter on Windows and Apache Toree Kernel for Spark compatibility


Problem description

I'm trying to install the Apache Toree kernel for Spark compatibility and I'm running into a strange environment-related error. This is the process I followed:

  1. Install the latest Anaconda release, which includes Jupyter 4.1.0
  2. Run: pip install --pre toree
  3. Run: jupyter toree install --interpreters=PySpark,SparkR,Scala,SQL

I'm only really interested in the Scala kernel, but I installed all the interpreters. The OS is Windows 7, and using a virtual machine or Linux is not an option.
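For reference, a quick way to check which kernelspecs step 3 actually registered, and where each kernel.json lives, is to ask jupyter_client directly. The snippet below is only a verification sketch; it assumes the Anaconda/Jupyter setup described above and uses the KernelSpecManager API shipped with jupyter_client:

# Verification sketch: list registered kernelspecs and the parsed contents of each kernel.json.
from __future__ import print_function
from jupyter_client.kernelspec import KernelSpecManager

ksm = KernelSpecManager()
for name, resource_dir in ksm.find_kernel_specs().items():
    spec = ksm.get_kernel_spec(name)          # parsed kernel.json
    print(name, '->', resource_dir)
    print('  argv:', spec.argv)
    print('  env :', spec.env)

The resource directory printed for the Toree entry should be the folder containing the kernel.json shown next.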

This is the kernel.json file, which I modified so that Cygwin executes the run.sh bash script:

{
  "language": "scala", 
  "display_name": "Apache Toree - Scala", 
  "env": {
    "__TOREE_SPARK_OPTS__": "", 
    "SPARK_HOME": "C:\\CDH\\spark", 
    "__TOREE_OPTS__": "", 
    "DEFAULT_INTERPRETER": "Scala", 
    "PYTHONPATH": "C:\\CDH\\spark\\python:C:\\CDH\\spark\\python\\lib\\py4j-0.8.2.1-src.zip", 
    "PYTHON_EXEC": "python"
  }, 
  "argv": [
    "C:\\cygwin64\\bin\\mintty.exe","-h","always","/bin/bash","-l","-e","C:\\ProgramData\\jupyter\\kernels\\apache_toree_scala\\bin\\run.sh", 
    "--profile", 
    "{connection_file}"
  ]
}

When running Jupyter, the kernel halts with this error:

TypeError: environment can only contain strings

Extended log:

[E 10:45:56.736 NotebookApp] Failed to run command:
    ['C:\\cygwin64\\bin\\mintty.exe', '-h', 'always', '/bin/bash', '-l', '-e', 'C:\\ProgramData\\jupyter\\kernels\\apache_toree_scala\\bin\\run.sh', '--profile', 'C:\\Users\\luis\\AppData\\Roaming\\jupyter\\runtime\\kernel-e02cac9b-15de-4c69-a8e5-e5b11919e1bc.json']
    with kwargs:
    {'stdin': -1, 'stdout': None, 'cwd': 'C:\\Users\\luis\\Documents', 'stderr': None, 'env': {'TMP': 'C:\\Users\\luis\\AppData\\Local\\Temp', 'COMPUTERNAME': 'laptop', 'USERDOMAIN': 'HOME', 'SPARK_HOME': u'C:\\CDH\\spark', 'DEFLOGDIR': 'C:\\ProgramData\\McAfee\\DesktopProtection', 'PSMODULEPATH': 'C:\\Windows\\system32\\WindowsPowerShell\\v1.0\\Modules\\', 'COMMONPROGRAMFILES': 'C:\\Program Files\\Common Files', 'PROCESSOR_IDENTIFIER':'Intel64 Family 6 Model 45 Stepping 7, GenuineIntel', u'DEFAULT_INTERPRETER': u'Scala', 'PROGRAMFILES': 'C:\\Program Files', 'PROCESSOR_REVISION': '2d07', 'SYSTEMROOT': 'C:\\Windows', 'PATH': 'C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\Library\\bin;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\Scripts;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\Library\\bin;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\Library\\bin;C:\\Program Files\\Java\\jdk1.7.0_76\\jre\\bin;C:\\Windows\\system32;C:\\Windows;C:\\Windows\\System32\\Wbem;C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\;C:\\Program Files (x86)\\sbt\\bin;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\Scripts;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\Library\\bin', 'PROGRAMFILES(X86)': 'C:\\Program Files (x86)', 'WINDOWS_TRACING_FLAGS': '3', 'TK_LIBRARY': 'C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\tcl\\tk8.5', u'__TOREE_SPARK_OPTS__': u'', 'TEMP': 'C:\\Users\\luis\\AppData\\Local\\Temp', 'COMMONPROGRAMFILES(X86)': 'C:\\Program Files (x86)\\Common Files', 'PROCESSOR_ARCHITECTURE': 'AMD64', 'TIX_LIBRARY': 'C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\tcl\\tix8.4.3', 'ALLUSERSPROFILE': 'C:\\ProgramData', 'LOCALAPPDATA': 'C:\\Users\\luis\\AppData\\Local', 'HOMEPATH': '\\Users\\luis', 'JAVA_HOME': 'C:\\Program Files\\java\\jdk1.7.0_76', 'JPY_INTERRUPT_EVENT': '1056', 'PROGRAMW6432': 'C:\\Program Files', 'USERNAME': 'luis', 'LOGONSERVER': '\\\\S8KROGR2', 'SBT_HOME': 'C:\\Program Files (x86)\\sbt\\', 'JPY_PARENT_PID': '1036', 'PROGRAMDATA': 'C:\\ProgramData', u'PYTHONPATH': u'C:\\CDH\\spark\\python:C:\\CDH\\spark\\python\\lib\\py4j-0.8.2.1-src.zip', 'TCL_LIBRARY': 'C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\tcl\\tcl8.5', 'VSEDEFLOGDIR': 'C:\\ProgramData\\McAfee\\DesktopProtection', 'USERDNSDOMAIN': 'HOME.ES', 'SESSIONNAME': 'RDP-Tcp#0', 'PATHEXT': '.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC', u'PYTHON_EXEC': u'python', 'CLIENTNAME': 'laptop2', u'__TOREE_OPTS__': u'', 'FP_NO_HOST_CHECK': 'NO', 'WINDIR': 'C:\\Windows', 'WINDOWS_TRACING_LOGFILE': 'C:\\BVTBin\\Tests\\installpackage\\csilogfile.log', 'HOMEDRIVE': 'C:', 'SYSTEMDRIVE': 'C:', 'COMSPEC': 'C:\\Windows\\system32\\cmd.exe', 'NUMBER_OF_PROCESSORS': '2', 'APPDATA': 'C:\\Users\\luis\\AppData\\Roaming', 'PROCESSOR_LEVEL': '6', 'COMMONPROGRAMW6432':    'C:\\Program Files\\Common Files', 'OS': 'Windows_NT', 'PUBLIC': 'C:\\Users\\Public', 'IPY_INTERRUPT_EVENT': '1056', 'USERPROFILE': 'C:\\Users\\luis'}}

[E 10:45:56.744 NotebookApp] Unhandled error in API request
    Traceback (most recent call last):
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\notebook\base\handlers.py", line 457, in wrapper
        result = yield gen.maybe_future(method(self, *args, **kwargs))
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1008, in run
        value = future.result()
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\concurrent.py", line 232, in result
        raise_exc_info(self._exc_info)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1014, in run
        yielded = self.gen.throw(*exc_info)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\notebook\services\sessions\handlers.py", line 62, in post
        kernel_id=kernel_id))
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1008, in run
        value = future.result()
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\concurrent.py", line 232, in result
        raise_exc_info(self._exc_info)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1014, in run
        yielded = self.gen.throw(*exc_info)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\notebook\services\sessions\sessionmanager.py", line 79, in create_session
        kernel_name)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1008, in run
        value = future.result()
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\concurrent.py", line 232, in result
        raise_exc_info(self._exc_info)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1014, in run
        yielded = self.gen.throw(*exc_info)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\notebook\services\sessions\sessionmanager.py", line 92, in start_kernel_for_session
        self.kernel_manager.start_kernel(path=kernel_path, kernel_name=kernel_name)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1008, in run
        value = future.result()
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\concurrent.py", line 232, in result
        raise_exc_info(self._exc_info)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 282, in wrapper
        yielded = next(result)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\notebook\services\kernels\kernelmanager.py", line 87, in start_kernel
        super(MappingKernelManager, self).start_kernel(**kwargs)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\jupyter_client\multikernelmanager.py", line 110, in start_kernel
        km.start_kernel(**kwargs)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\jupyter_client\manager.py", line 243, in start_kernel
        **kw)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\jupyter_client\manager.py", line 189, in _launch_kernel
        return launch_kernel(kernel_cmd, **kw)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\jupyter_client\launcher.py", line 123, in launch_kernel
        proc = Popen(cmd, **kwargs)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\subprocess.py", line 711, in __init__
        errread, errwrite)
      File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\subprocess.py", line 959, in _execute_child
        startupinfo)
    TypeError: environment can only contain strings
[E 10:45:56.766 NotebookApp] {
      "Origin": "http://localhost:8888",
      "Content-Length": "88",
      "Accept-Language": "es-ES,es;q=0.8",
      "Accept-Encoding": "gzip, deflate",
      "Host": "localhost:8888",
      "Accept": "application/json, text/javascript, */*; q=0.01",
      "User-Agent": "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.101 Safari/537.36",
      "Connection": "keep-alive",
      "X-Requested-With": "XMLHttpRequest",
      "Referer": "http://localhost:8888/notebooks/Untitled3.ipynb?kernel_name=apache_toree_scala",
      "Content-Type": "application/json"
    }
[E 10:45:56.796 NotebookApp] 500 POST /api/sessions (::1) 626.00ms referer=http://localhost:8888/notebooks/Untitled3.ipynb?kernel_name=apache_toree_scala

I've run the command in isolation:

C:\\cygwin64\\bin\\mintty.exe -h always /bin/bash -l -e C:\\ProgramData\\jupyter\\kernels\\apache_toree_scala\\bin\\run.sh

And it works. It only fails in the context of the Jupyter server execution.
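For reference, the env dict in the log contains unicode entries (the u'...' values, which are what the kernel.json "env" block becomes after JSON parsing), and on Python 2.7 / Windows subprocess.Popen raises exactly this TypeError when the environment dict contains unicode strings. A minimal repro sketch, with placeholder values rather than the real configuration:

# Repro sketch (Python 2.7 on Windows): an env dict with unicode keys or values makes
# Popen raise "TypeError: environment can only contain strings".
import os
import subprocess

env = os.environ.copy()
env[u'SPARK_HOME'] = u'C:\\CDH\\spark'   # unicode value, like the u'...' entries in the log

# subprocess.Popen(['cmd', '/c', 'echo %SPARK_HOME%'], env=env)   # raises the TypeError above

# Coercing everything back to byte strings sidesteps the error:
env = dict((str(k), str(v)) for k, v in env.items())
subprocess.Popen(['cmd', '/c', 'echo %SPARK_HOME%'], env=env)

The coercion at the end only illustrates what the launcher would have to do; it is not a change you can make from kernel.json alone.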

Has anyone succeeded in running this kernel on a Windows machine?

Answer

I wrote my own (hacky) run.cmd and managed to get it working with Spark 2.2.0 and toree-assembly-0.2.0.dev1-incubating-SNAPSHOT. I posted my solution on the TOREE-399 ticket.

run.cmd is as follows:

@echo off

set PROG_HOME=%~dp0..

if not defined SPARK_HOME (
  echo SPARK_HOME must be set to the location of a Spark distribution!
  exit 1
)

REM disable randomized hash for string in Python 3.3+
set PYTHONHASHSEED=0

REM The SPARK_OPTS values during installation are stored in __TOREE_SPARK_OPTS__. This allows values to be specified during
REM install, but also during runtime. The runtime options take precedence over the install options.

if not defined SPARK_OPTS (
  set SPARK_OPTS=%__TOREE_SPARK_OPTS__%
) else (
  if "%SPARK_OPTS%" == "" (
    set SPARK_OPTS=%__TOREE_SPARK_OPTS__%
  )
)

if not defined TOREE_OPTS (
  set TOREE_OPTS=%__TOREE_OPTS__%
) else (
  if "%TOREE_OPTS%" == "" (
    set TOREE_OPTS=%__TOREE_OPTS__%
  )
)

echo Starting Spark Kernel with SPARK_HOME=%SPARK_HOME%

REM This doesn't work because the classpath doesn't get set properly, unless you hardcode it in SPARK_SUBMIT_OPTS using forward slashes or double backslashes, but then you can't use the SPARK_HOME and PROG_HOME variables.
REM set SPARK_SUBMIT_OPTS=-cp "%SPARK_HOME%\conf\;%SPARK_HOME%\jars\*;%PROG_HOME%\lib\toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar" -Dscala.usejavacp=true
REM set TOREE_COMMAND="%SPARK_HOME%\bin\spark-submit.cmd" %SPARK_OPTS% --class org.apache.toree.Main %PROG_HOME%\lib\toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar %TOREE_OPTS% %*

REM The two important things that we must do differently on Windows are that we must add toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar to the classpath, and we must define the java property scala.usejavacp=true.
set TOREE_COMMAND="%JAVA_HOME%\bin\java" -cp "%SPARK_HOME%\conf\;%SPARK_HOME%\jars\*;%PROG_HOME%\lib\toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar" -Dscala.usejavacp=true -Xmx1g org.apache.spark.deploy.SparkSubmit %SPARK_OPTS% --class org.apache.toree.Main %PROG_HOME%\lib\toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar %TOREE_OPTS% %*

echo.
echo %TOREE_COMMAND%
echo.

%TOREE_COMMAND%

  • The run.cmd file should be placed in C:\ProgramData\jupyter\kernels\apache_toree_scala\bin\
  • Additionally, you need to edit the kernel.json in the folder above that to change run.sh to run.cmd (a sketch of that edit follows this list).
  • If you want to allow installing additional Toree kernels, you should also edit toreeapp.py to change run.sh to run.cmd.
  • I have not tested whether the IF statements work correctly. I suspect they will choke on some parameters, since batch lacks a robust IF statement.
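As one way to make the second bullet concrete, here is a small sketch that rewrites the kernelspec's argv to call run.cmd directly. The path matches the layout from the question; editing the file by hand works just as well:

# Sketch: point the Toree kernelspec at run.cmd instead of the run.sh/bash wrapper.
# The path below matches the question's layout; adjust it to wherever your kernelspec lives.
import json

spec_path = r'C:\ProgramData\jupyter\kernels\apache_toree_scala\kernel.json'

with open(spec_path) as f:
    spec = json.load(f)

spec['argv'] = [
    r'C:\ProgramData\jupyter\kernels\apache_toree_scala\bin\run.cmd',
    '--profile',
    '{connection_file}',
]

with open(spec_path, 'w') as f:
    json.dump(spec, f, indent=2)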