I installed Hadoop and Spark on my Windows 11 machine and added the following paths to my environment variables:

"C:\BigData\spark-3.1.2-bin-hadoop3.2"
"C:\BigData\spark-3.1.2-bin-hadoop3.2\sbin"  
"C:\BigData\spark-3.1.2-bin-hadoop3.2\bin" 
"C:\BigData\hadoop-3.2.2\bin" 

I also installed JDK 1.8 and set `JAVA_HOME`. When I try to run Spark with the command `spark-shell`, I get this error:

"he system cannot find the path specified." 

What is the solution?
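
For reference, a minimal sketch of the intended setup typed into a Command Prompt session, assuming the install locations above, the conventional `JAVA_HOME` / `SPARK_HOME` / `HADOOP_HOME` variable names, and a default JDK 1.8 install folder (permanent values would normally go through System Properties > Environment Variables):

```
REM Sketch for the current Command Prompt session only; paths are assumptions
REM based on the install locations listed above.
set "JAVA_HOME=C:\Program Files\Java\jdk1.8.0_351"
set "SPARK_HOME=C:\BigData\spark-3.1.2-bin-hadoop3.2"
set "HADOOP_HOME=C:\BigData\hadoop-3.2.2"

REM Append the bin/sbin folders to PATH for this session
set "PATH=%PATH%;%JAVA_HOME%\bin;%SPARK_HOME%\bin;%SPARK_HOME%\sbin;%HADOOP_HOME%\bin"

REM Launch the Spark shell
spark-shell
```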

  • Show us the output from `echo %PATH%`. Otherwise, cd into the directory with that command and run it there – OneCricketeer Nov 15 '22 at 13:19
  • C:\Windows\system32>echo %PATH% ";C:\Program Files\Java\jdk1.8.0_351\bin;C:\BigData\hadoop-3.2.2\bin;C:\BigData\spark-3.1.2-bin-hadoop3.2\bin;C:\BigData\spark-3.1.2-bin-hadoop3.2\sbin;C:\BigData\hadoop-3.2.2\sbin" – fatiha chaibi Nov 16 '22 at 12:23
  • Spark / Hadoop doesn't like having spaces in your path. Reinstall Java somewhere else like `C:\Java` or use `C:\PROGRA~1` - https://stackoverflow.com/a/892568/2308683 . Also, you should be able to use Java 11 instead – OneCricketeer Nov 16 '22 at 21:16
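
To illustrate the short-path workaround mentioned in the last comment, here is a hedged sketch: the 8.3 short name of `C:\Program Files` (typically `PROGRA~1`) can be confirmed with `dir /x`, then used for `JAVA_HOME` so the value contains no spaces. The `jdk1.8.0_351` folder name is taken from the `echo %PATH%` output above; verify both names on your own machine.

```
REM Show 8.3 short names of the folders in C:\ ("Program Files" is usually PROGRA~1)
dir /x C:\

REM Point JAVA_HOME at the space-free short path for future sessions
setx JAVA_HOME "C:\PROGRA~1\Java\jdk1.8.0_351"
```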

0 Answers