How to check if a table exists in Apache Hive using Apache Spark?

When you look for a Hive table, provide the table name in lowercase, because spark.sqlContext.tableNames returns the array of table names in lowercase only.

Spark 2.0 or higher

import org.apache.spark.sql.SparkSession

// Create SparkSession object with Hive support enabled
val spark = SparkSession
  .builder()
  .appName("Check table")
  .enableHiveSupport()
  .getOrCreate()

// Select the database where you will search for the table - lowercase
spark.sqlContext.sql("use bigdata_etl")

// Check whether the table exists - lowercase
spark.sqlContext.tableNames.contains("schemas")
res4: Boolean = true

// With uppercase
spark.sqlContext.tableNames.contains("Schemas")
res4: Boolean = false

Spark 1.6 to 2.0

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Get HiveContext from SparkContext
val sparkConf = new SparkConf().setAppName("Check table")
val sc = new SparkContext(sparkConf)
val hiveContext = new HiveContext(sc)
hiveContext.sql("use bigdata_etl")

// Check whether the table exists - lowercase
hiveContext.tableNames.contains("schemas")
res4: Boolean = true

// With uppercase
hiveContext.tableNames.contains("Schemas")
res4: Boolean = false

If the table exists, you will get "true"; otherwise "false".

If you enjoyed this post, please add a comment below or share it on Facebook, Twitter, LinkedIn, or another social media site.
Thanks in advance!


Side note: the spark.sqlContext.tableNames array contains lowercase names only. Make sure the table name is all lowercase; otherwise the result will be false.

spark.sqlContext.tableNames.contains("schemas") // true
spark.sqlContext.tableNames.contains("Schemas") // false
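Because the array holds lowercase names only, one safe pattern is to lowercase the requested name before the lookup. Here is a minimal, Spark-free sketch of that check: the tableNames array is a hardcoded stand-in for the value returned by spark.sqlContext.tableNames, and tableExists is a hypothetical helper, not part of the Spark API.

```scala
// Hardcoded stand-in for spark.sqlContext.tableNames,
// which contains lowercase names only
val tableNames: Array[String] = Array("schemas", "users")

// Hypothetical helper: normalise the requested name to lowercase
// before the lookup, so the check works regardless of how the
// caller spells the table name
def tableExists(name: String): Boolean =
  tableNames.contains(name.toLowerCase)

println(tableExists("schemas")) // true
println(tableExists("Schemas")) // true
println(tableExists("missing")) // false
```

With this helper, both "schemas" and "Schemas" resolve to the same table, which avoids the false negative shown above.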