Fig. 1 | Journal of Big Data

From: Leveraging resource management for efficient performance of Apache Spark

Apache Spark architecture. Spark runs each application independently within a cluster. An application is coordinated by the SparkContext in its driver program, which connects to one of several types of cluster manager to allocate resources across applications. Once connected, Spark acquires executors on the cluster's worker nodes: processes that perform computations and store data for the application. Spark then ships the application code passed to the SparkContext out to the executors, and the SparkContext dispatches tasks to the executors for execution
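The flow the caption describes (a driver-side SparkContext negotiating with a cluster manager, acquiring executors, then shipping code and tasks to them) can be sketched as a minimal Spark application in Scala. The app name, the `local[*]` master URL, and the sample data are illustrative assumptions, not taken from the article:

```scala
import org.apache.spark.sql.SparkSession

object ArchitectureSketch {
  def main(args: Array[String]): Unit = {
    // The SparkSession wraps the SparkContext, the driver-side entry point.
    // The master URL names the cluster manager; "local[*]" is a stand-in here,
    // whereas a real deployment would use e.g. "spark://host:7077" or "yarn".
    val spark = SparkSession.builder()
      .appName("ArchitectureSketch")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // The driver serializes these closures and sends them to the executors;
    // each executor runs tasks over its partitions of the data.
    val counts = sc.parallelize(Seq("spark", "driver", "executor", "spark"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .collect() // results are gathered back on the driver

    counts.foreach(println)
    spark.stop()
  }
}
```

The division of labor in the figure maps directly onto this sketch: `getOrCreate()` triggers resource allocation through the cluster manager, the transformations define the code sent to executors, and the action (`collect()`) is what causes the SparkContext to schedule tasks on them.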
