How to check if table exists in Apache Hive using Apache Spark?

When you look up a Hive table, provide the table name in lowercase, because spark.sqlContext.tableNames returns the array of table names in lowercase only.

Spark 2.0 or higher

// Create SparkSession object with Hive support enabled
import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("Check table")
  .enableHiveSupport()
  .getOrCreate()

// Select the database in which to look for the table - name in lowercase
spark.sqlContext.sql("use bigdata_etl")
spark.sqlContext.tableNames.contains("schemas")
res4: Boolean = true

// With Uppercase
spark.sqlContext.tableNames.contains("Schemas")
res4: Boolean = false
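
In Spark 2.0+ you can also check for a table through the Catalog API, which does not require switching the current database first. This is a minimal sketch, assuming the same bigdata_etl database and schemas table as above:

// Alternative sketch: the Catalog API available since Spark 2.0
spark.catalog.tableExists("bigdata_etl", "schemas")
// Boolean = true when the table exists in the given database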

Spark 1.6 to 2.0

// Create a HiveContext from the SparkContext
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sparkConf = new SparkConf().setAppName("Check table")
val sc = new SparkContext(sparkConf)
val hiveContext = new HiveContext(sc)

hiveContext.sql("use bigdata_etl")
hiveContext.tableNames.contains("schemas")

// With Uppercase
hiveContext.tableNames.contains("Schemas")
res4: Boolean = false
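
In Spark 1.6 you can also pass the database name directly to tableNames instead of issuing "use" first. A minimal sketch, assuming the same database and table names:

// Alternative sketch: look up the table in a given database without "use"
hiveContext.tableNames("bigdata_etl").contains("schemas")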

If the table exists you will get "true", otherwise "false".
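
A typical use of that Boolean is to guard a DDL statement, for example creating the table only when it does not exist yet. A sketch for Spark 2.0+; the column list here is purely an illustrative assumption:

// Create the table only if the check returned false (hypothetical schema)
if (!spark.sqlContext.tableNames.contains("schemas")) {
  spark.sql("CREATE TABLE bigdata_etl.schemas (id INT, name STRING)")
}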

If you enjoyed this post, please add a comment below or share it on Facebook, Twitter, LinkedIn or another social media site.
Thanks in advance!
