Is it possible to configure Spark in DSS so that we can choose between running Spark on the DSS machine (the local machine) and running the Spark job on a Spark installation on another cluster?

Additionally: how do we configure DSS to interact with Spark on another cluster?
asked by anonymous

1 Answer


DSS only supports one Spark installation (i.e. one directory with the Spark configuration and libraries for a given version of Spark); see here for details.

However, by defining different Spark configurations, you can have your Spark job run either locally (Spark master "local[*]") or remotely (e.g. Spark master "yarn-client" to run on the Hadoop cluster, or spawn a Spark standalone cluster and provide its URL as the Spark master).
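As a sketch, the named Spark configurations could differ only in their `spark.master` property. The configuration names and the standalone master host/port below are illustrative, not actual DSS defaults:

```
# Configuration "local": run Spark inside the DSS machine
spark.master = local[*]

# Configuration "yarn": submit the job to the Hadoop cluster
spark.master = yarn-client

# Configuration "standalone": use a Spark standalone cluster
# (replace host and port with your cluster's master URL)
spark.master = spark://master-host:7077
```

Selecting the appropriate configuration on a recipe or notebook then determines where the job actually executes.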
