Hi Sir, @LearningJournal

I have cloned the project and want to execute the .py files using the spark-submit command, but I am facing an error. Can you guide me on how to execute these scripts in a Spark environment?

Command: ./bin/spark-submit /home/ubuntu/Spark-Programming-In-Python-master/01-HelloSpark/HelloSpark.py

The error logs are pasted below for reference:
21/07/16 14:01:56 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Traceback (most recent call last):
  File "/home/ubuntu/anaconda3/lib/python3.8/configparser.py", line 846, in items
    d.update(self._sections[section])
KeyError: 'SPARK_APP_CONFIGS'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ubuntu/Spark-Programming-In-Python-master/01-HelloSpark/HelloSpark.py", line 7, in <module>
    conf = get_spark_app_config()
  File "/home/ubuntu/Spark-Programming-In-Python-master/01-HelloSpark/lib/utils.py", line 25, in get_spark_app_config
    for (key, val) in config.items("SPARK_APP_CONFIGS"):
  File "/home/ubuntu/anaconda3/lib/python3.8/configparser.py", line 849, in items
    raise NoSectionError(section)
configparser.NoSectionError: No section: 'SPARK_APP_CONFIGS'
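For context, a NoSectionError from configparser usually means the parser never actually read the file that contains that section: config.read() silently returns an empty list when the path does not exist, and the later config.items("SPARK_APP_CONFIGS") call then fails. Because spark-submit here is launched from the Spark installation directory, a config file referenced by a relative path inside utils.py would not be found. Below is a minimal sketch of a more defensive get_spark_app_config(), assuming the project keeps its settings in a spark.conf file with a [SPARK_APP_CONFIGS] section next to the script; the file name and location are assumptions for illustration, not something confirmed by the traceback.

```python
import configparser
import os

from pyspark import SparkConf


def get_spark_app_config():
    # Resolve the config file relative to this script rather than the current
    # working directory, so spark-submit can be launched from anywhere.
    # NOTE: the "spark.conf" file name and its location next to this module are
    # assumptions for illustration; adjust them to match the repository layout.
    conf_file = os.path.join(os.path.dirname(os.path.abspath(__file__)), "spark.conf")

    config = configparser.ConfigParser()
    if not config.read(conf_file):
        # config.read() returns the list of files it managed to parse; an empty
        # list means the file was missing, which later shows up as NoSectionError.
        raise FileNotFoundError(f"Spark config file not found: {conf_file}")

    spark_conf = SparkConf()
    # Copy every key/value pair from the [SPARK_APP_CONFIGS] section into SparkConf.
    for key, val in config.items("SPARK_APP_CONFIGS"):
        spark_conf.set(key, val)
    return spark_conf
```

If the original utils.py is left unchanged, another common workaround is to run spark-submit from inside the 01-HelloSpark directory, so that any relatively referenced config file sits in the working directory when the script starts.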