Cannot run "java" in Spark worker


When I was running a Spark program on the cluster, I found this error in the worker log:

  java.io.IOException: Cannot run program "java" (in directory "/cloud/packages/spark-0.9.0-incubating-bin-hadoop1/work/app-20140424114752-0000/0"): java.io.IOException: error=2, No such file or directory
      at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
      at org.apache.spark.deploy.worker.ExecutorRunner.fetchAndRunExecutor(ExecutorRunner.scala:129)
      at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:59)
  Caused by: java.io.IOException: java.io.IOException: error=2, No such file or directory
      at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
      at java.lang.ProcessImpl.start(ProcessImpl.java:65)
      at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
      ... 2 more

I have set JAVA_HOME ( /cloud/packages/jdk1.6.0_38 ) and SPARK_HOME ( /cloud/packages/spark-0.9.0-incubating-bin-hadoop1 ).

What is the reason for this exception, and how can it be fixed?
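For context, the exception above originates in `java.lang.ProcessBuilder`: when the command name cannot be resolved via the worker process's `PATH`, the launch fails with `error=2` (ENOENT), exactly as in the log. A minimal sketch reproducing the same failure mode (the command name below is a made-up placeholder, assumed not to exist on the machine):

```java
import java.io.IOException;

public class ProcBuilderDemo {
    // Returns true when launching the given command fails with IOException,
    // mirroring the "error=2, No such file or directory" in the Spark log.
    static boolean launchFails(String command) {
        try {
            new ProcessBuilder(command).start();
            return false;
        } catch (IOException e) {
            // On Linux this message contains "error=2, No such file or directory".
            return true;
        }
    }

    public static void main(String[] args) {
        // "no-such-binary-xyz" is a placeholder; any command missing from
        // the worker's PATH triggers the same exception Spark reported.
        System.out.println(launchFails("no-such-binary-xyz"));
    }
}
```

This is why setting `JAVA_HOME` in your own shell is not always enough: the variable has to be visible to the environment the Spark worker daemon itself runs in.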

I faced the same problem on Ubuntu 12.04; setting JAVA_HOME in /etc/environment fixed it.
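The same idea can be applied either system-wide (as in the answer above) or through Spark's own `conf/spark-env.sh`, which the daemons source at startup. A sketch, reusing the paths from the question (adjust them to your install):

```shell
# Option 1: system-wide, as in the answer above (takes effect after re-login):
#   echo 'JAVA_HOME="/cloud/packages/jdk1.6.0_38"' | sudo tee -a /etc/environment

# Option 2: Spark-specific -- put these in conf/spark-env.sh on each worker:
export JAVA_HOME=/cloud/packages/jdk1.6.0_38
# Prepending $JAVA_HOME/bin lets the worker resolve the bare "java" command:
export PATH="$JAVA_HOME/bin:$PATH"
```

Either way, restart the workers afterwards so the daemon process picks up the new environment.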

