Spark on local machine (where DSS is installed) + Spark on another cluster

UserBird
Dataiker
Is it possible to configure Spark in DSS so that we can choose between running Spark on the DSS machine (locally) and running the Spark job on a Spark installation on another cluster?

Additionally: how do we configure DSS to interact with Spark on another cluster?
1 Reply
AdrienL
Dataiker

DSS only supports one Spark installation (i.e. a directory with the Spark configuration and libraries for a given version of Spark); see here for details.
However, using different Spark configurations, you can have your Spark jobs run either locally (Spark master "local[*]") or remotely (e.g. Spark master "yarn-client" to run on the Hadoop cluster, or spawn a Spark standalone cluster on your cluster and provide its URL as the Spark master).
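To illustrate, here is a minimal Scala sketch (hypothetical object and app names) showing how the master URL alone determines where a job runs. In DSS you would put these same master strings into named Spark configurations in the administration settings rather than into application code; "yarn-client" is the Spark 1.x syntax used in the answer above.

import org.apache.spark.{SparkConf, SparkContext}

object MasterSwitchDemo {
  def main(args: Array[String]): Unit = {
    // Pick the master URL from the first CLI argument; defaults to local mode.
    //   "local[*]"           -> run on the DSS machine, using all local cores
    //   "yarn-client"        -> run on the Hadoop cluster (Spark 1.x client mode)
    //   "spark://host:7077"  -> run on a standalone cluster at that URL
    val master = if (args.nonEmpty) args(0) else "local[*]"

    val conf = new SparkConf()
      .setAppName("master-switch-demo") // hypothetical app name
      .setMaster(master)
    val sc = new SparkContext(conf)

    // Trivial job to confirm the executors are reachable.
    val sum = sc.parallelize(1 to 100).sum()
    println(s"sum = $sum, ran on master = $master")

    sc.stop()
  }
}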
