SparkContext is the main entry point of Spark. It exposes the interfaces for external storage systems such as HDFS and Tachyon.
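
A minimal sketch of using SparkContext as that entry point, assuming the Spark dependency is on the classpath; the master URL and HDFS path below are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object EntryPointExample {
  def main(args: Array[String]): Unit = {
    // Build a configuration and the SparkContext, the driver-side entry point.
    val conf = new SparkConf()
      .setAppName("entry-point-example")
      .setMaster("local[2]")              // placeholder master URL
    val sc = new SparkContext(conf)

    // Through SparkContext we reach external storage such as HDFS;
    // the path below is a placeholder.
    val lines = sc.textFile("hdfs://namenode:9000/data/input.txt")
    println(s"line count: ${lines.count()}")

    sc.stop()
  }
}
```
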
SparkEnv holds all the runtime environment objects for a running Spark instance (either master or worker). Its complete API is listed here.
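
A minimal sketch of inspecting the environment of the current JVM; it assumes a SparkContext has already been created in this process, and the exact set of fields on SparkEnv varies between Spark versions:

```scala
import org.apache.spark.SparkEnv

// Inspect the runtime environment objects of the JVM this code runs in.
def describeEnv(): Unit = {
  val env = SparkEnv.get                              // env of the current JVM
  println(s"executorId   = ${env.executorId}")        // "driver" on the driver side
  println(s"serializer   = ${env.serializer.getClass.getName}")
  println(s"blockManager = ${env.blockManager}")      // memory/disk block storage
  println(s"app name     = ${env.conf.get("spark.app.name")}")
}
```
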
The executor-side SparkEnv is created by CoarseGrainedExecutorBackend (in standalone mode).
The driver-side SparkEnv is created by SparkContext.
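
A small sketch contrasting the two environments. SparkEnv.get returns the env of whichever JVM the code runs in, so the driver sees the env built by SparkContext while task code sees the executor-side env; note that in local mode driver and executors share one JVM, so both print "driver", whereas on a real cluster the executor ids differ:

```scala
import org.apache.spark.{SparkConf, SparkContext, SparkEnv}

object DriverVsExecutorEnv {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("env-demo").setMaster("local[2]")
    val sc = new SparkContext(conf)

    // On the driver, the environment was created by SparkContext itself.
    println(s"driver env executorId = ${SparkEnv.get.executorId}")

    // Inside a task, SparkEnv.get returns the executor-side environment,
    // created when the executor process was started by its backend.
    val executorIds = sc.parallelize(1 to 4, 2)
      .map(_ => SparkEnv.get.executorId)
      .distinct()
      .collect()
    println(s"executor env ids = ${executorIds.mkString(", ")}")

    sc.stop()
  }
}
```
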