
Solved: Exception: Python in worker has different version 2.7 than that in driver 3.6


Exception: Python in worker has different version 2.7 than that in driver 3.6, PySpark cannot run with different minor versions.Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.

The core error above was raised when running a PySpark program on an Alibaba Cloud server.

Server environment: CentOS with two Pythons installed, python (Python 2 by default) and python3, i.e., a dual-Python setup.

pyspark==2.1.2 was installed under Python 3. Note that the pyspark version must match the installed Spark version (Spark 2.1.1 here).
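
If you are unsure which versions are in play, the snippet below prints both the installed pyspark package version and the Spark version the driver connects to. A minimal sketch; it launches no Python workers, so it runs even while the mismatch is still present (the app name "version-check" is an arbitrary placeholder):

import pyspark
print(pyspark.__version__)  # version of the installed pyspark package, e.g. 2.1.2

from pyspark import SparkContext

sc = SparkContext("local", "version-check")
print(sc.version)           # version of the underlying Spark installation, e.g. 2.1.1
sc.stop()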

Running the script with python3 xxx.py reports the following error:

[root@way code]# python3 foreach.py
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
20/12/17 15:30:26 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/12/17 15:30:27 WARN Utils: Your hostname, localhost resolves to a loopback address: 127.0.0.1; using 172.16.1.186 instead (on interface eth0)
20/12/17 15:30:27 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/12/17 15:30:30 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/opt/software/spark/python/lib/pyspark.zip/pyspark/worker.py", line 125, in main
    ("%d.%d" % sys.version_info[:2], version))
Exception: Python in worker has different version 2.7 than that in driver 3.6, PySpark cannot run with different minor versions.Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.
        at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:193)
        at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:234)
        at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:152)
        at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:63)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
        at org.apache.spark.scheduler.Task.run(Task.scala:99)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
20/12/17 15:30:30 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/opt/software/spark/python/lib/pyspark.zip/pyspark/worker.py", line 125, in main
    ("%d.%d" % sys.version_info[:2], version))

Solution:

From the error message, the variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON should be pointing at python3, but they fall back to the default python, which is version 2; that Python 2 environment also lacks pyspark and the other required libraries, hence the error.

Use the which python3 command to locate the python3 binary, then set the two variables above in the program to that interpreter, as follows:

from pyspark import SparkContext

# The following three lines are the new content
import os
os.environ["PYSPARK_PYTHON"] = "/usr/bin/python3"
os.environ["PYSPARK_DRIVER_PYTHON"] = "/usr/bin/python3"
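
Note that both assignments should run before the SparkContext is created: in Spark 2.x the context captures PYSPARK_PYTHON when it starts, so setting the variables afterwards is not picked up. If editing each script is inconvenient, the same two variables can instead be exported globally, for example in the shell profile or in Spark's conf/spark-env.sh.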

Save and run again; the program now executes normally.
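
As an end-to-end check, it helps to run a job that actually launches Python workers, because the version comparison in worker.py only fires when a task executes. The following sketch (assuming the same /usr/bin/python3 path as above; the app name is arbitrary) should print matching driver and worker versions once the fix is in place:

import os
# Point both driver and workers at python3 (path found via `which python3`)
os.environ["PYSPARK_PYTHON"] = "/usr/bin/python3"
os.environ["PYSPARK_DRIVER_PYTHON"] = "/usr/bin/python3"

import sys
from pyspark import SparkContext

sc = SparkContext("local", "worker-version-check")
driver_version = "%d.%d" % sys.version_info[:2]
# map() forces a worker process to start; with a mismatch, this line
# is exactly where the exception above would be raised
worker_version = sc.parallelize([0]).map(
    lambda _: "%d.%d" % __import__("sys").version_info[:2]).first()
print("driver:", driver_version, "worker:", worker_version)  # e.g. both 3.6
sc.stop()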

