This example explains the distributed execution model of HyperFlow. It demonstrates how HyperFlow communicates with remote job executors using Redis. It is also useful for testing the implementation of the HyperFlow job executor.
The distributed execution architecture consists of:
- Master components:
  - HyperFlow engine - executes the workflow graph; for each workflow task it invokes the Job invoker function
  - Job invoker - a JavaScript function which creates jobs on a (remote) infrastructure to execute workflow tasks
  - Redis server - used for communication between the HyperFlow engine and job executors on remote workers
- Worker components:
  - HyperFlow job executor - receives the job command from the HyperFlow engine and spawns the application software
  - Application software - the software that actually performs the workflow tasks
In this example:
- The workflow has two tasks (see `workflow.json`): one that executes `job.js`, the other simply runs `ls -l`. Note that the commands to be executed are specified in `workflow.json` (a purely illustrative sketch of such a file is shown after this list).
- The engine invokes the function `submitRemoteJob` (Job invoker) from `functions.js`. This function simulates submission of jobs by starting the HyperFlow job executor and communicating with it via Redis to run jobs. `../../../hyperflow-job-executor/handler.js` represents a remote job executor which is passed two parameters: `taskId` and `redis_url`. The executor gets a `jobMessage` from HyperFlow, executes the command in a separate process, and then sends back a notification that the job has completed; `taskId` is used to construct the appropriate Redis keys for this communication.
- On the HyperFlow engine side, the Job invoker can use two functions (provided by HyperFlow): `context.sendMsgToJob` to send a message to the job executor, and `context.jobResult` to wait for the notification from the executor. These functions return a `Promise`, so the async/await syntax can be used as shown in the example (a rough sketch also appears after this list).
- The parameter to the `context.jobResult` function is a timeout in seconds (0 denotes infinity). One can use a retry library, such as `promise-retry`, to implement an exponential retry strategy (see the sketch after this list).
- The Job invoker also gets the Redis URL in `context.redis_url`, which can be passed to the remote job executors.
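A purely illustrative sketch of what such a `workflow.json` could look like is shown below. The process names, the empty signal wiring, and the exact schema are not taken from this example; the authoritative file is `workflow.json` in this directory.

```json
{
  "name": "RemoteJobsSketch",
  "processes": [
    {
      "name": "runJobJs",
      "function": "submitRemoteJob",
      "config": { "executor": { "executable": "node", "args": ["job.js"] } },
      "ins": [], "outs": []
    },
    {
      "name": "listFiles",
      "function": "submitRemoteJob",
      "config": { "executor": { "executable": "ls", "args": ["-l"] } },
      "ins": [], "outs": []
    }
  ],
  "signals": [],
  "ins": [],
  "outs": []
}
```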
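A rough sketch of a Job invoker along the lines described above is shown below. It is not the actual `functions.js` from this example: the function signature `(ins, outs, context, cb)`, the `context.taskId` field, and the shape of the job message are assumptions made for illustration.

```js
// Sketch of a Job invoker: spawns the job executor locally (as if it were a
// remote worker) and talks to it through Redis using the context API.
const { spawn } = require('child_process');

async function submitRemoteJob(ins, outs, context, cb) {
  // Start the job executor; it is passed the task id and the Redis URL.
  spawn('node',
        ['../../../hyperflow-job-executor/handler.js', context.taskId, context.redis_url],
        { stdio: 'inherit' });

  // Illustrative job message; the real command comes from workflow.json.
  const jobMessage = JSON.stringify({ executable: 'ls', args: ['-l'] });

  try {
    await context.sendMsgToJob(jobMessage);    // send the job command to the executor
    const result = await context.jobResult(0); // wait for the completion notification (0 = no timeout)
    console.log('Job finished:', result);
    cb(null, outs);
  } catch (err) {
    cb(err);
  }
}

exports.submitRemoteJob = submitRemoteJob;
```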
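For the exponential retry strategy mentioned above, a minimal sketch using `promise-retry` could look as follows; the 30-second timeout and the retry settings are arbitrary illustrative values.

```js
const promiseRetry = require('promise-retry');

// Inside the async Job invoker, instead of `await context.jobResult(0)`:
const result = await promiseRetry(
  (retry) => context.jobResult(30).catch(retry), // wait up to 30 s per attempt, then retry
  { retries: 5, factor: 2 }                      // exponential backoff between attempts
);
```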
To run the workflow, execute the following commands:
- First, clone the HyperFlow engine and the HyperFlow job executor, and install their dependencies:
  ```
  git clone https://github.com/hyperflow
  git clone https://github.com/hyperflow-wms/hyperflow-job-executor
  cd hyperflow; npm install
  cd ../hyperflow-job-executor; npm install
  ```
- Start the Redis server:
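  For a quick local test this can simply be a default local instance (assuming Redis is installed):

  ```
  redis-server
  ```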
- To run the workflow:
  ```
  cd ../hyperflow/examples/RemoteJobs
  npm install   # needed only once
  hflow run .
  ```