apache spark - importing pyspark in python shell
This is a copy of someone else's question on another forum that was never answered, so I thought I would ask it again here, since I have the same problem. (See the linked post.)

I have Spark installed on my machine and I can run Python programs that use the pyspark modules without error when I use ./bin/pyspark as my Python interpreter.

However, when I run the regular Python shell and try to import the pyspark module:

from pyspark import SparkContext

it fails with:

"No module named pyspark"

How can I fix this? Is there an environment variable I need to set to point Python to the pyspark headers/libraries/etc.? If my Spark installation is /spark/, which pyspark paths do I need to include? Or can pyspark programs only be run from the pyspark interpreter?

If it prints an error such as:

ImportError: No module named py4j.java_gateway

please add $SPARK_HOME/python/build to PYTHONPATH:

export SPARK_HOME=/Users/pinan/apps/spark-1.1.0-bin-hadoop2.4
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/build:$PYTHONPATH
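For completeness, here is a minimal sketch of doing the same thing from inside a plain Python shell instead of editing the shell profile. It assumes SPARK_HOME is already exported (as in the exports above) and that py4j lives under $SPARK_HOME/python/build, as the answer states; the small test job at the end is only illustrative.

import os
import sys

# Assumes SPARK_HOME is set, e.g. to /Users/pinan/apps/spark-1.1.0-bin-hadoop2.4
spark_home = os.environ.get("SPARK_HOME")
if spark_home is None:
    raise EnvironmentError("SPARK_HOME is not set")

# Put the pyspark package and the bundled py4j build on sys.path
sys.path.insert(0, os.path.join(spark_home, "python"))
sys.path.insert(0, os.path.join(spark_home, "python", "build"))

from pyspark import SparkContext

# Quick sanity check that the import actually works
sc = SparkContext("local", "pyspark-import-test")
print(sc.parallelize([1, 2, 3]).count())  # should print 3
sc.stop()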