apache spark - importing pyspark in python shell


This is a copy of another question on another forum that was never answered, so I thought I would ask it again here, because I have the same problem. (See)

I have Spark installed on my machine and I am able to run Python programs that use the pyspark modules without error when I use ./bin/pyspark as my Python interpreter.

However, when I run the regular Python shell and try to import the pyspark module, I get this error:

  from pyspark import SparkContext

And it says

  "Modules named Parspark"   

How can I fix this? Is there an environment variable I need to set so Python can find the pyspark headers/libraries/etc.? If my Spark installation is in /spark/, which pyspark paths do I need to include? Or can pyspark programs only be run from the pyspark interpreter?

If it prints an error such as:

ImportError: No module named py4j.java_gateway

Please add $SPARK_HOME/python/build to PYTHONPATH:

  export SPARK_HOME=/users/pinan/apps/spark-1.1.0-bin-hadoop2.4
  export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/build:$PYTHONPATH
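
If you cannot (or prefer not to) change the shell environment, the same path setup can be done from inside a regular Python session. The following is only a sketch: it assumes Spark is unpacked at the location used in the exports above and that the bundled py4j code sits under $SPARK_HOME/python/build, so adjust the paths to your own installation.

  import os
  import sys

  # Assumed install location, taken from the exports above; adjust as needed.
  spark_home = "/users/pinan/apps/spark-1.1.0-bin-hadoop2.4"
  os.environ.setdefault("SPARK_HOME", spark_home)

  # Put pyspark and its bundled py4j dependency on the module search path,
  # mirroring what the PYTHONPATH exports do for the shell.
  sys.path.insert(0, os.path.join(spark_home, "python"))
  sys.path.insert(0, os.path.join(spark_home, "python", "build"))

  from pyspark import SparkContext  # should now import without error

  sc = SparkContext("local", "sanity-check")
  print(sc.parallelize([1, 2, 3]).count())  # expect: 3
  sc.stop()

Putting the export lines in your shell profile (e.g. ~/.bashrc) makes them available in every new shell, so the sys.path workaround is only needed when you cannot modify the environment.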
