Coverage for src/spark_etl/job_submitters/abstract_job_submitter.py : 100%

class AbstractJobSubmitter:
    def __init__(self, config):
        self.config = config

    def run(self, deployment_location, options={}, args={}):
        """
        Run an application.

        Parameters
        ----------
        deployment_location: str
            The location where the application build is deployed.
        options: dict
            Vendor-specific runtime options interpreted by the job submitter;
            not passed to the application.
        args: dict
            Runtime arguments passed to the application.

        Returns
        -------
        dict
            A dict with two keys:
            state: "SUCCEEDED" if the job succeeded, otherwise "FAILED".
            run_id: A string uniquely identifying the run.
        """
        raise NotImplementedError()
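
For illustration, a minimal concrete subclass might satisfy the run() contract as sketched below. The LocalJobSubmitter name, the spark-submit invocation, and the assumption that the entry point is main.py are all hypothetical for this example and are not part of spark_etl's actual API.

# Hypothetical sketch of a concrete submitter, assuming only the contract
# documented above: run() returns {"state": "SUCCEEDED" | "FAILED",
# "run_id": <unique str>}. The class name, command layout, and entry-point
# path are illustrative assumptions, not spark_etl's real implementation.
import subprocess
import uuid


class LocalJobSubmitter(AbstractJobSubmitter):
    def run(self, deployment_location, options={}, args={}):
        run_id = str(uuid.uuid4())  # unique ID for this run

        # Build a spark-submit command; runtime args are forwarded to the
        # application, while options stay with the submitter.
        cmd = ["spark-submit", f"{deployment_location}/main.py"]
        for key, value in args.items():
            cmd.extend([f"--{key}", str(value)])

        completed = subprocess.run(cmd)
        state = "SUCCEEDED" if completed.returncode == 0 else "FAILED"
        return {"state": state, "run_id": run_id}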