The PySpark configurations are:
- `alias`: the alias used to refer to the SparkSession within the modules. The default is `spark`.
- `loglevel`: the log level to use for the console. The default value is `WARN`. Acceptable values can be found in the Spark documentation.
- Any `key, value` pair that can be passed to the `config` method of the SparkSession builder (for example, `spark.executor.memory`).
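To illustrate the third option, here is a small, hypothetical set of `key, value` pairs; the keys are standard Spark configuration properties, and the dict itself is only an example of what such a configuration might contain:

```python
# Hypothetical examples of key/value pairs that the SparkSession builder's
# config method accepts; any valid Spark property can be supplied this way.
spark_config = {
    "spark.executor.memory": "4g",          # memory per executor
    "spark.sql.shuffle.partitions": "200",  # partitions for shuffles
    "spark.driver.maxResultSize": "2g",     # cap on collected results
}

for key, value in spark_config.items():
    print(f"{key} = {value}")
```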
Under the hood, prism takes care of parsing the configuration variables, constructing a SparkSession instance, and storing that instance in an attribute named `alias` within the `hooks` object. Users can access the SparkSession directly via `hooks.{alias}`.
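The alias-to-attribute mapping can be sketched as follows. This is not prism's actual implementation; `build_hooks` and the string placeholder for the session are hypothetical, used only to show how a configured alias becomes an attribute on the `hooks` object:

```python
from types import SimpleNamespace

# Hypothetical sketch: map the "alias" config key onto an attribute of a
# hooks-like object, so the session is reachable as hooks.{alias}.
config = {
    "alias": "spark",    # default alias
    "loglevel": "WARN",  # default console log level
}

def build_hooks(cfg, session):
    """Store `session` on a fresh hooks object under the configured alias."""
    alias = cfg.get("alias", "spark")
    hooks = SimpleNamespace()
    setattr(hooks, alias, session)
    return hooks

# A plain string stands in for a real SparkSession in this sketch.
hooks = build_hooks(config, session="fake-spark-session")
print(hooks.spark)
```

Because the attribute name comes from the configuration, changing `alias` to, say, `my_spark` would make the session available as `hooks.my_spark` instead.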