Spark Session

Just as the SparkContext is the entry point for core Spark applications, and the StreamingContext for streaming applications, the SparkSession serves as the entry point for Spark SQL.

If you are using the Spark shell, a SparkSession named spark is created for you automatically, alongside the SparkContext named sc. In your own programs, you must construct it yourself, just as you did in previous chapters.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()

SparkSession is a wrapper around SparkContext and SQLContext; the latter was used directly to construct DataFrames in versions prior to Spark 2.0. The Builder object lets you specify the master, appName, and other configuration options, but the defaults will do.
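As a sketch of that configuration, the Builder calls can be chained before getOrCreate(); the master URL, app name, and config value below are illustrative, not required:

```scala
import org.apache.spark.sql.SparkSession

// Illustrative values; any master URL and app name work the same way.
val spark = SparkSession.builder()
  .master("local[*]")         // run locally, using all available cores
  .appName("SparkSQLExample") // name shown in the Spark UI
  .config("spark.sql.shuffle.partitions", "8") // example config option
  .getOrCreate()

// The wrapped SparkContext remains reachable when you need it:
val sc = spark.sparkContext
```

If a SparkSession already exists (as in the Spark shell), getOrCreate() returns it rather than creating a second one.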
