Q: How do I set up PySpark with Python 3 using spark-env.sh.template?

Because I have this issue in my IPython 3 notebook, I guess I have to change "spark-env.sh.template" somehow.

Exception: Python in worker has different version 2.7 than that in driver 3.4, PySpark cannot run with different minor versions
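A quick way to see the mismatch (a sketch assuming both commands are on your PATH; by default the workers fall back to plain python while the notebook driver runs under IPython's Python):

python -c 'import sys; print("worker side:", sys.version_info[:3])'
ipython -c 'import sys; print("driver side:", sys.version_info[:3])'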

answer1:

Spark does not yet work with Python 3. If you wish to use the Python API you will also need a Python interpreter (version 2.6 or newer).

I had the same issue when running IPYTHON=1 ./pyspark.

OK, quick fix:

Edit the pyspark script (e.g. vim pyspark) and change the PYSPARK_DRIVER_PYTHON="ipython" line to

PYSPARK_DRIVER_PYTHON="ipython2"

That's it.

If you want to check where ipython points, type which ipython in a terminal, and I bet it'll be

/Library/Frameworks/Python.framework/Versions/3.4/bin/ipython
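To confirm which interpreter that launcher actually runs, a minimal check (assuming ipython is on your PATH):

head -1 "$(which ipython)"    # the shebang names the interpreter
ipython -c 'import sys; print(sys.version)'    # the Python it actually runs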

**UPDATED**

The latest version of Spark works well with Python 3, so this fix may not be needed on a recent version.

Just set the environment variable:

export PYSPARK_PYTHON=python3

In case you want this change to be permanent, add this line to the pyspark script.
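And to address the spark-env.sh.template from the original question: a minimal sketch, assuming $SPARK_HOME points at your Spark install, that makes the setting permanent for every Spark launch rather than a single shell:

cp "$SPARK_HOME/conf/spark-env.sh.template" "$SPARK_HOME/conf/spark-env.sh"
echo 'export PYSPARK_PYTHON=python3' >> "$SPARK_HOME/conf/spark-env.sh"
echo 'export PYSPARK_DRIVER_PYTHON=python3' >> "$SPARK_HOME/conf/spark-env.sh"

Spark's launch scripts source conf/spark-env.sh, so the driver and the workers then agree on Python 3.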

answer2:

I believe you can specify the two separately, like so:

PYSPARK_PYTHON=/opt/anaconda/bin/ipython
PYSPARK_DRIVER_PYTHON=/opt/anaconda/bin/ipython

Based on this other question: Apache Spark: How to use pyspark with Python 3.
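A common variant of the same idea (the Anaconda paths are illustrative): plain Python for the workers and IPython only for the driver, both from the same install so the versions match:

export PYSPARK_PYTHON=/opt/anaconda/bin/python
export PYSPARK_DRIVER_PYTHON=/opt/anaconda/bin/ipython
pyspark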

python  python-3.x  apache-spark  ipython-notebook  pyspark