
spark-shell error : Service 'sparkDriver' failed after 16 retries!

by 로그인시러 2017. 8. 29.
# To solve the problem:
Step 1. Go to your Spark installation directory.
Step 2. Go into the 'bin' directory and open the file 'load-spark-env.sh'.
Step 3. Add export SPARK_LOCAL_IP="127.0.0.1" to 'load-spark-env.sh'.
It worked for me. I'm using Mac OS X El Capitan (10.11.6).
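
For reference, here is a minimal sketch of what the added line looks like (the rest of the script's contents are omitted, and the path is just an example of a default Spark layout):

```bash
# In $SPARK_HOME/bin/load-spark-env.sh (adjust the path to your install)
# Bind the Spark driver to the loopback address so the 'sparkDriver'
# service stops failing while retrying to bind to an unresolvable hostname.
export SPARK_LOCAL_IP="127.0.0.1"
```

The same environment variable can also be exported in the terminal session before launching spark-shell, which avoids editing the script.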


Source: http://jacob119.blogspot.kr/2016/08/spark-shell-error-service-sparkdriver.html
