Timeline for answer to How to solve "Can't assign requested address: Service 'sparkDriver' failed after 16 retries" when running spark code? by Eldho Shaji
Current License: CC BY-SA 4.0
Post Revisions
4 events
| when | what | by | license | comment |
|---|---|---|---|---|
| Oct 22, 2024 at 13:38 | comment added | Ken Myers | | Doing this in load-spark-env.sh did not work for me but did work when I put it in .zshrc |
| Jan 2, 2024 at 10:16 | comment added | hayj | | Works for me, but the file is under spark/conf. You can use `echo "export SPARK_LOCAL_IP=127.0.0.1" >> $SPARK_HOME/conf/spark-env.sh` after defining SPARK_HOME, if not already done. |
| Feb 24, 2023 at 8:54 | comment added | greatvovan | | This helped in my case. `master("local[*]")` did not. |
| Jul 28, 2020 at 3:49 | history answered | Eldho Shaji | CC BY-SA 4.0 | |
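
The fix discussed in the comments above can be sketched as a short shell snippet. This is a minimal illustration, assuming a local Spark install; the `SPARK_HOME` default below is a placeholder you would replace with your actual Spark directory:

```shell
# Placeholder SPARK_HOME for illustration; point this at your real Spark install.
export SPARK_HOME="${SPARK_HOME:-$(mktemp -d)}"
mkdir -p "$SPARK_HOME/conf"

# Force the Spark driver to bind to the loopback address instead of a
# hostname that may not resolve to a bindable address, which is what
# triggers "Service 'sparkDriver' failed after 16 retries".
echo 'export SPARK_LOCAL_IP=127.0.0.1' >> "$SPARK_HOME/conf/spark-env.sh"
```

As the comments note, the same `export SPARK_LOCAL_IP=127.0.0.1` line can instead go in your shell profile (e.g. `.zshrc`) if editing `spark-env.sh` does not take effect.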