Fig. 1 From: Leveraging resource management for efficient performance of Apache Spark

Apache Spark architecture. Apache Spark runs applications independently on a cluster. Each application is coordinated by the SparkContext in its driver program. The SparkContext connects to one of several types of cluster managers, which allocate resources among applications running on the cluster. Once connected, Spark acquires executors on the cluster's worker nodes, which perform computations and store data for the application. Spark then ships the application code passed to the SparkContext to the executors, and finally the SparkContext sends tasks to the executors to be executed