I was trying to execute a Pyspark script and encountered a py4j error.
Can someone help me with this? I have already checked all the version compatibilities.
I am attaching a screenshot of the error.
Operating system used: Ubuntu
Hi,
It looks like the Spark integration is not done, or not done properly. Either you're using a Spark version >3.1.2, or you haven't run the "<datadir>/bin/dssadmin install-spark-integration ..." command.
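As a side note, the version check mentioned above can be sketched in plain Python. This is a hedged illustration only: the `versions_compatible` helper and the `3.1.2` ceiling are taken from the answer's statement that Spark versions above 3.1.2 are not supported, not from any official DSS API.

```python
def versions_compatible(installed: str, supported_max: str = "3.1.2") -> bool:
    """Return True if the installed Spark version does not exceed the
    highest supported version (assumed to be 3.1.2 per the answer above)."""
    def parse(version: str) -> tuple:
        # Split "3.1.2" into a comparable tuple (3, 1, 2)
        return tuple(int(part) for part in version.split("."))
    return parse(installed) <= parse(supported_max)

print(versions_compatible("3.1.2"))  # True: at the supported ceiling
print(versions_compatible("3.2.0"))  # False: newer than supported
```

Running this against the version reported by `spark-submit --version` is a quick way to rule out a version mismatch before re-running the integration command.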
Yes, that solved the error. Thank you so much!