PySpark raises a "does not exist in the JVM" error when initializing SparkContext
The error is as follows:
----> 1 sc = SparkContext(conf=conf)

/usr/local/lib/python3.6/site-packages/pyspark/context.py in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
    145         try:
    146             self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
--> 147                           conf, jsc, profiler_cls)
    148         except:
    149             # If an error occurs, clean up in order to allow future SparkContext creation:

/usr/local/lib/python3.6/site-packages/pyspark/context.py in _do_init(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, jsc, profiler_cls)
    222         # data via a socket.
    223         # scala's mangled names w/ $ in them require special treatment.
--> 224         self._encryption_enabled = self._jvm.PythonUtils.isEncryptionEnabled(self._jsc)
    225         os.environ["SPARK_AUTH_SOCKET_TIMEOUT"] = \
    226             str(self._PythonAuthSocketTimeout(self._jsc))

/usr/local/lib/python3.6/site-packages/py4j/java_gateway.py in __getattr__(self, name)
   1529         else:
   1530             raise Py4JError(
-> 1531                 "{0}.{1} does not exist in the JVM".format(self._fqn, name))
   1532
   1533     def _get_args(self, args):

Py4JError: org.apache.spark.api.python.PythonUtils.isEncryptionEnabled does not exist in the JVM
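This Py4JError usually indicates a version mismatch: the Python-side pyspark package (for example, one installed with pip) is newer or older than the Spark installation whose JVM it launches, so a method the Python wrapper calls does not exist on the Java side. A quick way to confirm is to compare the two versions; this is a minimal sketch, assuming the spark-submit from your local Spark installation is on PATH:

# Compare the Python-side and JVM-side Spark versions; a mismatch
# is the usual cause of the "does not exist in the JVM" error.
import subprocess
import pyspark

print("Python-side pyspark:", pyspark.__version__)
subprocess.run(["spark-submit", "--version"])  # JVM-side version banner (printed to stderr)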
My fix: add the following to ~/.bashrc so that Python uses the Spark installation's own bundled libraries:
export SPARK_HOME=/usr/local/spark
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.7-src.zip:$PYTHONPATH
export PATH=$SPARK_HOME/bin:$SPARK_HOME/python:$PATH
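After reloading the shell (source ~/.bashrc), SparkContext should come up cleanly. A minimal sanity check, assuming Spark is installed at /usr/local/spark as in the exports above:

# Verify that the Python and JVM sides now agree.
from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("jvm-check").setMaster("local[*]")
sc = SparkContext(conf=conf)  # should no longer raise Py4JError
print(sc.version)             # Spark version reported by the JVM side
sc.stop()

Note that the py4j zip name in PYTHONPATH (py4j-0.10.7-src.zip here) has to match the file actually shipped under $SPARK_HOME/python/lib, which changes between Spark versions.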