Error when running FeatureEngForRecModel.py #30
21/02/23 10:54:55 ERROR TaskSetManager: Task 0 in stage 10.0 failed 1 times; aborting job
21/02/23 10:54:55 WARN TaskSetManager: Lost task 0.0 in stage 11.0 (TID 210, localhost, executor driver): TaskKilled (Stage cancelled)
Traceback (most recent call last):
  File "D:/code/sparrowrecsys/SparrowRecSys-master/RecPySpark/src/com/sparrowrecsys/offline/pyspark/featureeng/FeatureEngForRecModel.py", line 151, in <module>
    samplesWithMovieFeatures = addMovieFeatures(movieSamples, ratingSamplesWithLabel)
  File "D:/code/sparrowrecsys/SparrowRecSys-master/RecPySpark/src/com/sparrowrecsys/offline/pyspark/featureeng/FeatureEngForRecModel.py", line 54, in addMovieFeatures
    samplesWithMovies4.show(5, truncate=False)
  File "D:\ProgramData\Anaconda3\envs\recoenv\lib\site-packages\pyspark\sql\dataframe.py", line 380, in show
    print(self._jdf.showString(n, int(truncate), vertical))
  File "D:\ProgramData\Anaconda3\envs\recoenv\lib\site-packages\py4j\java_gateway.py", line 1257, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "D:\ProgramData\Anaconda3\envs\recoenv\lib\site-packages\pyspark\sql\utils.py", line 63, in deco
    return f(*a, **kw)
  File "D:\ProgramData\Anaconda3\envs\recoenv\lib\site-packages\py4j\protocol.py", line 328, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling o132.showString.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 10.0 failed 1 times, most recent failure: Lost task 0.0 in stage 10.0 (TID 209, localhost, executor driver): org.apache.spark.SparkException: Python worker failed to connect back.
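On Windows, "Python worker failed to connect back" frequently means Spark cannot locate the Python interpreter it should launch for its worker processes. A common workaround is to point the PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON environment variables at the same interpreter that runs the driver script, before the SparkSession is created. This is a hedged sketch of that workaround, not a confirmed fix for this specific repository; the SparkSession settings shown in the comment are illustrative, not taken from FeatureEngForRecModel.py.

```python
import os
import sys

# Point Spark's worker launcher at the interpreter running this script
# (e.g. the "recoenv" conda environment seen in the traceback).
# These must be set before the SparkSession is created.
os.environ['PYSPARK_PYTHON'] = sys.executable
os.environ['PYSPARK_DRIVER_PYTHON'] = sys.executable

# Then build the session as usual, for example:
# from pyspark.sql import SparkSession
# spark = SparkSession.builder \
#     .appName('featureEngineering') \
#     .master('local') \
#     .getOrCreate()
```

If the error persists, it is also worth checking that the PySpark version installed in the environment matches the Spark distribution on the machine, since a mismatch can produce the same worker-connection failure.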