The SparkContext `sc` is not defined when I run pyspark from CMD, and Jupyter shows the same error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'sc' is not defined
I tried:
>>> from pyspark import SparkContext
>>> sc = SparkContext()
But it still shows an error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "c:\spark\python\pyspark\context.py", line 115, in __init__
SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
File "c:\spark\python\pyspark\context.py", line 275, in _ensure_initialized
callsite.function, callsite.file, callsite.linenum))
**ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[*]) created by getOrCreate at c:\spark\bin\..\python\pyspark\shell.py:43**
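From the message it looks like the PySpark shell has already created a SparkContext, so I assume I should reuse that existing context instead of constructing a new one. A minimal sketch of what I think might work, using `SparkContext.getOrCreate()` (which, as I understand it, returns the existing context if one is already running):
>>> from pyspark import SparkContext
>>> # getOrCreate() should return the shell's existing context (app=PySparkShell)
>>> # rather than trying to start a second one, which raised the ValueError above
>>> sc = SparkContext.getOrCreate()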
How can I solve this problem?