[SOLVED] How To Check Spark Version (PySpark Jupyter Notebook)? – These 2 Simple Methods Will Help You!

In this post I will show you how to check the Spark version using the CLI and using PySpark code in a Jupyter notebook. When we create an application that will run on a cluster, we first need to know which Spark version the cluster uses so that our dependencies are compatible.


Spark Free Tutorials

This post is a part of the Spark Free Tutorial series. Check out the rest of the Spark tutorials, which you can find in the sidebar on the right side of this page! Stay tuned!


How To Check Spark Version Using CLI?

To check the Spark version you can use the Command Line Interface (CLI).

To do this, log in to the cluster edge node, for instance, and execute one of the following commands on Linux:

$ spark-submit --version
$ spark-shell --version


Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.3.0
      /_/
                        
Type --help for more information.
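
If you want to capture this information from a script instead of reading it in a terminal, you can run the same command programmatically. Below is a minimal sketch in Python (assuming spark-submit is on your PATH); note that many Spark distributions print the version banner to stderr, so we capture both streams:

import subprocess

# Run "spark-submit --version" and capture its output.
result = subprocess.run(
    ["spark-submit", "--version"],
    capture_output=True,
    text=True,
)

# The version banner lands on stderr or stdout depending on the distribution.
print(result.stderr or result.stdout)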

How To Check PySpark Version Using CLI?

To check the PySpark version, just run the pyspark client from the CLI using the following command:

$ pyspark --version


Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.3.0
      /_/
                        
Type --help for more information.
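
If PySpark is installed as a Python package (for example via pip), you can also read the version directly in Python without launching any Spark process. This is just a quick alternative; pyspark.__version__ reports the version of the installed package, which should match what pyspark --version prints:

import pyspark

# Version of the locally installed PySpark package, e.g. "3.3.0"
print(pyspark.__version__)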

Check Spark Version In Jupyter Notebook

You can check the PySpark version from a Jupyter notebook as well. Just create a new notebook and run the following snippet of code:

import pyspark
from pyspark.sql import SparkSession

# Create SparkSession
spark = SparkSession.builder.master("local[*]") \
                    .appName('BigData-ETL.com') \
                    .getOrCreate()

print(f'PySpark version {spark.version} is running...')

When you run the above code, you will get a response like the one in the picture below:

PySpark – Jupyter Notebook – Check Spark Version
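
As a side note, once the SparkSession from the snippet above exists, you can read the same information from its underlying SparkContext. Both properties below are part of the standard PySpark API:

# Reuses the "spark" session created in the snippet above.
# The underlying SparkContext exposes the same Spark version string.
print(spark.sparkContext.version)

# The Python version PySpark is running against, e.g. "3.8"
print(spark.sparkContext.pythonVer)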

Code On Gitlab

You can find the code from this post on my Gitlab!

Summary

Now you know how to check the Spark and PySpark versions and how to use this information to provide the correct dependencies when creating applications that will run on the cluster.

PySpark Official Site

If you are more interested in PySpark, you should follow the official PySpark (Spark) website, which provides up-to-date information about Spark features.

Could You Please Share This Post? 
I appreciate It And Thank YOU! :)
Have A Nice Day!

