Spark failed to connect to the MetaStore Server

The problem

You may encounter errors like the following when running a Spark script or application:

hive.metastore  - Failed to connect to the MetaStore Server...
hive.ql.metadata.Hive  - Failed to register all functions.
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1662)
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: Peer indicated failure: Failure to initialize security context
	at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
The key line here is "Failure to initialize security context": on a secured cluster this typically means the Thrift client could not establish a Kerberos security context (for example, because of a missing or expired ticket), so every MetaStore URI fails.

Solutions

If you do not need the MetaStore server, there are two ways to disable it by switching Spark SQL to the in-memory catalog. Please note that this requires Spark 2.x or later.

The first way (via spark2-submit parameters)

spark2-submit \
  ...
  --conf spark.your.other.parameters=your.other.values \
  --conf spark.sql.catalogImplementation=in-memory \
  ...

The second way (via SparkConf object)

Java example:

SparkConf conf = new SparkConf()
  .setAppName("your-app-name")
  .set("your.other.spark.parameters", "your.other.spark.values")
  .set("spark.sql.catalogImplementation", "in-memory");

Scala example:

val conf: SparkConf = new SparkConf()
  .setAppName("your-app-name")
  .set("your.other.spark.parameters", "your.other.spark.values")
  .set("spark.sql.catalogImplementation", "in-memory")
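In Spark 2.x the usual entry point is SparkSession rather than a raw SparkConf, and the same setting can be applied on the builder. A minimal sketch (the application name is illustrative, and this assumes you do not call enableHiveSupport(), which would re-enable the Hive catalog):

```scala
import org.apache.spark.sql.SparkSession

// Build a session that uses the in-memory catalog instead of the Hive MetaStore.
val spark = SparkSession.builder()
  .appName("your-app-name") // illustrative name
  .config("spark.sql.catalogImplementation", "in-memory")
  .getOrCreate()

// Confirm which catalog implementation is active.
println(spark.conf.get("spark.sql.catalogImplementation")) // prints "in-memory"
```

With the in-memory catalog, tables and databases exist only for the lifetime of the session, which is exactly what you want when your job does not touch Hive at all.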

If you still have any questions, feel free to ask me in the comments under this article, or write to me at promark33@gmail.com.

If I saved your day, you can support me :)
