0 votes

While trying to run the sample code provided in the Jupyter Python Spark notebook, I get the error "No module named pyspark.sql":

Do I need to configure something in order to use pyspark?
I'm running the DSS Community edition on an EC2 AMI.

asked by Romain

1 Answer

0 votes
Best answer

You need to set up the DSS / Spark integration.

* For DSS 3.1: https://doc.dataiku.com/dss/3.1/installation/spark.html

* For DSS 4: https://doc.dataiku.com/dss/latest/spark/installation.html
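Once the integration is set up, a quick way to confirm from inside the notebook is to check whether the `pyspark.sql` module is importable on the kernel's path. This helper is purely illustrative (it is not part of DSS or PySpark, and assumes a Python 3 kernel):

```python
import importlib.util


def module_available(name):
    """Return True if the given module can be found on this kernel's path."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. "pyspark") is missing entirely.
        return False


# If this prints False, the Spark integration is not configured for this kernel.
print(module_available("pyspark.sql"))
```

If it prints `False`, rerun the integration steps from the documentation linked above and restart the notebook kernel so the updated environment is picked up.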
answered by