How to Check if a DataFrame is Empty in Scala?

Last Updated : 27 Mar, 2024

In this article, we will learn how to check whether a DataFrame is empty in Scala. We can check if a DataFrame is empty by using the isEmpty method or by checking its row count.

Syntax:

val isEmpty = dataframe.isEmpty

OR,

val isEmpty = dataframe.count() == 0

Here's how you can do it:

Example #1: using the isEmpty method

Scala

import org.apache.spark.sql.{DataFrame, SparkSession}

object DataFrameEmptyCheck {
  def main(args: Array[String]): Unit = {
    // Create SparkSession
    val spark = SparkSession.builder()
      .appName("DataFrameEmptyCheck")
      .master("local[*]")
      .getOrCreate()

    // Sample DataFrame (replace this with
    // your actual DataFrame)
    val dataframe: DataFrame = spark.emptyDataFrame

    // Check if DataFrame is empty
    val isEmpty = dataframe.isEmpty

    if (isEmpty) {
      println("DataFrame is empty")
    } else {
      println("DataFrame is not empty")
    }

    // Stop SparkSession
    spark.stop()
  }
}

Output:

DataFrame is empty

Explanation:

- The code creates a SparkSession, which is the entry point to Spark functionality.
- It defines a sample DataFrame using spark.emptyDataFrame, which creates an empty DataFrame. You would typically replace this with your actual DataFrame.
- The code then checks if the DataFrame is empty using the isEmpty method. Since we initialized it as an empty DataFrame, the condition isEmpty evaluates to true.
- If the DataFrame is empty, it prints "DataFrame is empty".
- Finally, the SparkSession is stopped to release resources.

Example #2: using the count function

Scala

import org.apache.spark.sql.{DataFrame, SparkSession}

object DataFrameEmptyCheck {
  def main(args: Array[String]): Unit = {
    // Create SparkSession
    val spark = SparkSession.builder()
      .appName("DataFrameEmptyCheck")
      .master("local[*]")
      .getOrCreate()

    // Sample DataFrame (replace this with
    // your actual DataFrame)
    val dataframe: DataFrame = spark.emptyDataFrame

    // Check if DataFrame is empty
    val isEmpty = dataframe.count() == 0

    if (isEmpty) {
      println("DataFrame is empty")
    } else {
      println("DataFrame is not empty")
    }

    // Stop SparkSession
    spark.stop()
  }
}

Output:

DataFrame is empty

Explanation:

- The code creates a SparkSession with master "local[*]", which means it runs locally using all available CPU cores.
- It defines a sample DataFrame using spark.emptyDataFrame, creating an empty DataFrame with no rows.
- The code then checks if the DataFrame is empty using the count() function, which returns the number of rows in the DataFrame. Since the DataFrame is empty, its count is 0.
- The condition dataframe.count() == 0 therefore evaluates to true, and the program prints "DataFrame is empty".
- Finally, the SparkSession is stopped to release resources.
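A note on choosing between the two approaches: count() scans the whole DataFrame, while isEmpty only needs to determine whether at least one row exists, so isEmpty is usually the cheaper option on large data. The sketch below shows an equivalent check based on head(1), which also works on older Spark versions that predate Dataset.isEmpty. The helper name isDataFrameEmpty and the sample column names are illustrative, not part of any Spark API.

Scala

import org.apache.spark.sql.{DataFrame, SparkSession}

object DataFrameEmptyCheckAlternative {
  // Illustrative helper: returns true if df has no rows.
  // head(1) fetches at most one row, so the DataFrame is
  // never fully counted (unlike count() == 0).
  def isDataFrameEmpty(df: DataFrame): Boolean =
    df.head(1).isEmpty

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("DataFrameEmptyCheckAlternative")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val empty = spark.emptyDataFrame
    val nonEmpty = Seq(("a", 1), ("b", 2)).toDF("key", "value")

    println(isDataFrameEmpty(empty))    // prints: true
    println(isDataFrameEmpty(nonEmpty)) // prints: false

    spark.stop()
  }
}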