spark-submit
Usage
Usage: prism spark-submit [OPTIONS]
Execute your Prism project as a PySpark job.
Examples:
• prism spark-submit
• prism spark-submit -m module01.py -m module02.py
• prism spark-submit -m module01 --all-downstream
• prism spark-submit -v VAR1=VALUE1 -v VAR2=VALUE2
• prism spark-submit --context '{"hi": 1}'
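Because `--context` must be valid JSON, shell quoting matters: the JSON's double quotes have to survive the shell. A minimal sketch of building the payload with printf so the quoting stays intact (the run_date and env keys are hypothetical examples, not flags Prism defines):

```shell
# Build the --context payload with printf so the double quotes
# required by JSON are preserved through shell expansion.
CONTEXT=$(printf '{"run_date": "%s", "env": "%s"}' "2024-01-01" "prod")
echo "$CONTEXT"

# The resulting invocation would look like (commented out here):
# prism spark-submit --context "$CONTEXT"
```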
╭─ Options ─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ --module -m TEXT Modules to execute. You can specify multiple modules as follows: -m │
│ <your_first_module> -m <your_second_module>. │
│ --all-downstream Execute all tasks downstream of modules specified with --module. │
│ --all-upstream Execute all tasks upstream of modules specified with --module. │
│ --full-refresh Run tasks from scratch (even those already considered done). │
│ --log-level -l [info|warn|error|debug] Set the log level. │
│ --full-tb Show the full traceback when an error occurs. │
│ --vars -v TEXT Variables as key-value pairs. These overwrite variables in prism_project.py. All │
│ values are interpreted as strings. │
│ --context TEXT Context as a dictionary. Must be valid JSON. These overwrite variables in │
│ prism_project.py. │
│ --help Show this message and exit. │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
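The practical difference between `--vars` and `--context` is typing: `--vars` values always arrive as strings, while `--context` preserves JSON types. A minimal sketch of that distinction (the THRESHOLD variable name is hypothetical, chosen only for illustration):

```python
import json

# -v THRESHOLD=10 arrives as the string "10" (all --vars values are strings).
vars_value = "10"

# --context '{"THRESHOLD": 10}' is parsed as JSON, so 10 stays an integer.
context = json.loads('{"THRESHOLD": 10}')

print(type(vars_value).__name__)            # str
print(type(context["THRESHOLD"]).__name__)  # int
```

Use `--context` when a downstream task needs a number, boolean, or nested structure rather than a string.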