How to Create SQLContext in Spark Using Scala?
Last Updated :
13 May, 2024
Scala stands for scalable language. It was developed in 2003 by Martin Odersky. It is an object-oriented language that also supports the functional programming approach. Everything in Scala is an object. It is a statically typed language, although unlike other statically typed languages like C, C++, or Java, it doesn't require type annotations everywhere in the code: type verification is done at compile time. Static typing helps build safe systems by default. Smart built-in checks and actionable error messages, combined with thread-safe data structures and collections, prevent many tricky bugs before the program first runs.
This article focuses on discussing steps to create SQLContext in Spark using Scala.
What is SQLContext?
The official definition in the documentation of Spark is:
"The entry point for running relational queries using Spark. Allows the creation of SchemaRDD objects and the execution of SQL queries."
The purpose of SQLContext is to introduce processing of structured data in Spark. Before it, Spark only had RDDs to manipulate data. RDDs are simply a collection of rows (notice the absence of columns) that can be manipulated using lambda functions and other functionalities. SQLContext introduced objects that add a schema (column names and data types) to the data to make it similar to relational databases. This additional information about the data also opens the door to optimizations in data processing.
Looking more at the documentation, it shows that the SQLContext is a class introduced in version 1.0.0 and provides a set of functions that allow creating and manipulating a SchemaRDD object. Here is the list of functions:
- cacheTable
- createParquetFile
- createSchemaRDD
- logicalPlanToSparkQuery
- parquetFile
- registerRDDAsTable
- sparkContext
- sql
- table
- uncacheTable
The APIs revolve around inter-transformation of Parquet files and SchemaRDD objects. A SchemaRDD is an RDD of Row objects that has an associated schema. In addition to standard RDD functions, SchemaRDDs can be used in relational queries, as shown below:
Scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.{SQLContext, SchemaRDD}

// One way to define the schema of an RDD is to create a case class
// with the desired column names and types.
case class Record(key: Int, value: String)

val sc: SparkContext // An existing SparkContext.
val sqlContext = new SQLContext(sc)

// Importing the SQL context gives access to all the SQL functions
// and implicit conversions.
import sqlContext._

val rdd = sc.parallelize((1 to 100).map(i => Record(i, s"val_$i")))

// Any RDD containing case classes can be registered as a table. The schema
// of the table is automatically inferred using Scala reflection.
rdd.registerAsTable("records")

val results: SchemaRDD = sql("SELECT * FROM records")
The above code would not run on the latest versions of Spark because SchemaRDDs are now obsolete.
Currently, SQLContext itself is no longer used directly; instead, SparkSession provides a unified interface over the different contexts such as SQLContext, SparkContext, HiveContext, and others. Inside SparkSession, the SQLContext is still present. Also, instead of SchemaRDDs, Spark now uses Datasets and DataFrames to represent structured data.
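As a rough modern equivalent of the SchemaRDD example above, here is a sketch using SparkSession and a DataFrame (this assumes Spark 2.x or later is on the classpath; the object name is an arbitrary choice for illustration):

```scala
import org.apache.spark.sql.SparkSession

// A case class still defines the schema; it must live at top level
// so Spark can derive an encoder for it.
case class Record(key: Int, value: String)

object DataFrameExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("DataFrameExample")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // A DataFrame of case-class rows replaces the old SchemaRDD; the
    // schema is still inferred via Scala reflection.
    val df = (1 to 100).map(i => Record(i, s"val_$i")).toDF()
    df.createOrReplaceTempView("records")

    // createOrReplaceTempView + spark.sql replace registerAsTable + sql.
    val results = spark.sql("SELECT * FROM records")
    results.show(5)

    spark.stop()
  }
}
```

Note that `createOrReplaceTempView` and `spark.sql` play the roles that `registerAsTable` and `sql` played in the old SQLContext API.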
Creating SQLContext
1. Using SparkContext
We can create a SQLContext from a SparkContext. The constructor is as follows:
public SQLContext(SparkContext sparkContext)
We can create a simple SparkContext object with "master" (the cluster URL) set to "local[*]" (use all cores of the current machine) and "appName" set to "createSQLContext". We can then supply this SparkContext to the SQLContext constructor.
Scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object createSQLContext {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local[*]", "createSQLContext")
    val sqlc = new SQLContext(sc)
    println(sqlc)
  }
}
Output:
The SQLContext object is printed to the console.
Explanation:
As shown above, we have created a new SQLContext object. Although this works, the constructor is deprecated and SQLContext has been replaced by SparkSession. SQLContext is kept in newer versions only for backward compatibility.
2. Using Existing SQLContext Object
We can also use an existing SQLContext object to create a new SQLContext object. Every SQLContext provides a newSession API to create a new object based on the same SparkContext object. The API is as follows:
def newSession(): SQLContext
// Returns a SQLContext as a new session, with separated SQL configurations,
// temporary tables, and registered functions, but sharing the same
// SparkContext, cached data, and other resources.
Below is the Scala program to implement the approach:
Scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object createSQLContext {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local[*]", "createSQLContext")
    val sqlc = new SQLContext(sc)
    val nsqlc = sqlc.newSession()
    println(nsqlc)
  }
}
Output:
The new SQLContext session is printed to the console.
Explanation:
As shown above, we have created a new SQLContext session from an existing SQLContext object. Although this works, SQLContext is deprecated and replaced by SparkSession; it is kept in newer versions only for backward compatibility.
3. Using SparkSession
The latest way (as of version 3.5.0) is to use the SparkSession object. SparkSession is the culmination of the various previous contexts and provides a unified interface for all of them. We can create a SparkSession object using the builder API and then access the SQLContext object from it as follows:
Scala
import org.apache.spark.sql.SparkSession

object createSQLContext {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder()
      .appName("createSQLContext")
      .master("local[*]")
      .getOrCreate()
    println(spark.sqlContext)
  }
}
Output:
The SQLContext object is printed to the console.
Explanation:
As shown, we accessed the SQLContext object from inside the SparkSession object. This is the recommended way to obtain a SQLContext in current versions of Spark.
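To illustrate that the SQLContext obtained from a SparkSession can still run relational queries like the old API, here is a small sketch (the object name, view name, and data are hypothetical, chosen for illustration):

```scala
import org.apache.spark.sql.SparkSession

object QueryViaSQLContext {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("QueryViaSQLContext")
      .master("local[*]")
      .getOrCreate()

    // Build a tiny DataFrame of the numbers 1 through 5.
    val df = spark.range(1, 6).toDF("n")
    df.createOrReplaceTempView("nums")

    // sqlContext.sql delegates to the same SQL engine as spark.sql,
    // so both return identical results for the same query.
    val total = spark.sqlContext.sql("SELECT SUM(n) AS total FROM nums")
    total.show()

    spark.stop()
  }
}
```

Since the SQLContext inside a SparkSession shares the session's state, any temporary view registered through the session is visible to queries issued through `spark.sqlContext` as well.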