While trying to run the sample code provided in the Jupyter Python Spark notebook, I get the error "no module named pyspark.sql".

Do I need to configure something in order to use pyspark?
I'm running DSS Community Edition on an EC2 AMI.

1 Answer

Best answer

You need to set up the DSS / Spark integration.

* For DSS 3.1: https://doc.dataiku.com/dss/3.1/installation/spark.html

* For DSS 4: https://doc.dataiku.com/dss/latest/spark/installation.html
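Once the integration is set up, a quick sanity check from the same Jupyter notebook can confirm that the module the sample code needs is importable. This is a minimal sketch (not part of the DSS docs): it only tests whether `pyspark.sql`, the module named in the error, can be imported in the notebook's Python environment.

```python
def spark_available():
    """Return True if pyspark.sql (the module reported as missing)
    can be imported, i.e. the DSS / Spark integration is in place."""
    try:
        import pyspark.sql  # noqa: F401 -- import is the whole test
        return True
    except ImportError:
        return False

if spark_available():
    print("Spark integration OK: pyspark.sql is importable")
else:
    print("pyspark.sql not importable -- set up the DSS / Spark integration")
```

If this still prints the failure message after running the setup steps from the docs above, restart the notebook kernel so it picks up the new environment.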