Spark 2.0

Historically, Apache Spark has had two core contexts available to the user: the SparkContext, made available as sc, and the SQLContext, made available as sqlContext. These contexts expose a variety of functions and information to the user. The sqlContext provides much of the DataFrame functionality, while the sparkContext focuses more on the Apache Spark engine itself.
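For example, in a Spark 1.x shell the two contexts divide the work roughly like this (a minimal sketch; sc and sqlContext are the variables the shell pre-creates for you):

```scala
// sc is a SparkContext: low-level engine access, e.g. creating RDDs.
val rdd = sc.parallelize(Seq(1, 2, 3))

// sqlContext is a SQLContext: DataFrame and SQL functionality.
val df = sqlContext.range(3)
df.show()
```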

However, in Apache Spark 2.x there is just one entry point: the SparkSession, which combines the functionality of both earlier contexts.
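In a standalone application you create the SparkSession yourself with its builder. A minimal sketch (the application name and local master below are placeholder choices, not required values):

```scala
import org.apache.spark.sql.SparkSession

// Build (or retrieve) the single unified entry point.
val spark = SparkSession.builder()
  .appName("Spark2Example")   // hypothetical application name
  .master("local[*]")         // run locally; on a cluster the launcher sets this
  .getOrCreate()

// DataFrame functionality that previously lived on sqlContext:
val df = spark.range(5)
df.show()

// The underlying SparkContext is still reachable when you need it:
val sc = spark.sparkContext
```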

If you are using the Spark Shell, you will automatically get a SparkSession called spark to accompany the SparkContext called sc.
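A quick way to see that the two shell variables are tied together (a sketch of a spark-shell session):

```scala
// In a Spark 2.x shell, both variables are pre-created:
// spark: org.apache.spark.sql.SparkSession
// sc:    org.apache.spark.SparkContext

// They refer to the same underlying engine:
spark.sparkContext eq sc   // true
```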
