Suppose you have code like this in Scala:
import org.apache.spark.sql.{RuntimeConfig, SparkSession}
val spark: SparkSession = SparkSession.builder().appName("YourApp").getOrCreate()
val conf: RuntimeConfig = spark.conf
val yourParameter: String = conf.get("spark.driver.your_parameter")
print(yourParameter)
And you run it like this:
spark2-shell -i /path/to/script/test.scala --conf spark.driver.your_parameter="value"
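As a side note, conf.get throws an exception when the key has not been set, so if you cannot guarantee that the parameter is always passed, you can read it through getOption and fall back to a default. This is just a minimal sketch; the fallback value "default" is a placeholder, not something from the original setup:
import org.apache.spark.sql.{RuntimeConfig, SparkSession}
val spark: SparkSession = SparkSession.builder().appName("YourApp").getOrCreate()
val conf: RuntimeConfig = spark.conf
// getOption returns None instead of throwing when the key is missing
val yourParameter: String = conf.getOption("spark.driver.your_parameter").getOrElse("default")
print(yourParameter)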
Now you want to put this code in a Scala object so that IntelliJ IDEA doesn't highlight it in red. At first, you will probably write something like this:
import org.apache.spark.sql.{RuntimeConfig, SparkSession}
val spark: SparkSession = SparkSession.builder().appName("YourApp").getOrCreate()
val conf: RuntimeConfig = spark.conf
object MyObject {
  def main(): Unit = {
    val yourParameter: String = conf.get("spark.driver.your_parameter")
    print(yourParameter)
  }
}
But when you run the script, nothing happens: spark2-shell just opens. For spark2-shell to actually run this code, and to close once the code has finished, you need to add a sys.exit call inside MyObject and also call its main method outside of it:
import org.apache.spark.sql.{RuntimeConfig, SparkSession}
val spark: SparkSession = SparkSession.builder().appName("YourApp").getOrCreate()
val conf: RuntimeConfig = spark.conf
object MyObject {
  def main(): Unit = {
    val yourParameter: String = conf.get("spark.driver.your_parameter")
    print(yourParameter)
    sys.exit // stop the shell once the work is done
  }
}
MyObject.main()
IntelliJ IDEA will highlight the last line, but without it spark2-shell will not understand that it needs to run the code from MyObject.
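With the call to MyObject.main() in place, the script can be run with the same command as before, and the shell exits once main finishes (the script path and parameter value are placeholders):
spark2-shell -i /path/to/script/test.scala --conf spark.driver.your_parameter="value"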
If you still have any questions, feel free to ask me in the comments under this article, or write me on promark33@gmail.com.