Creating and Managing Workflows
Use the Apache Hue UI to create workflows, and use the scheduler list to create bundles, coordinators, or workflows. For Spark jobs, use the Spark action. The following task configures the Spark action widget with the details of the Spark job and its location in HDFS. Use the options list in the widget to pass any extra Spark parameters to the Spark job. For HA clusters, you must also provide the Ranger-related XML information.
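The entries in the options list are typically passed to the underlying Spark launcher in the same form as spark-submit options (for example, --executor-memory 4g). As a rough, hypothetical sketch of what such parameters control, the same settings can also be applied from inside a PySpark script through standard Spark configuration properties; the values below are placeholders, not recommended defaults.

    # Hypothetical sketch: tuning that could otherwise be supplied as extra Spark
    # parameters, expressed as standard Spark configuration properties in PySpark.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("hue-spark-action-example")      # name shown in the YARN and Spark UIs
        .config("spark.executor.memory", "4g")    # roughly equivalent to --executor-memory 4g
        .config("spark.executor.instances", "2")  # roughly equivalent to --num-executors 2
        .getOrCreate()
    )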
- Sign in to Hue.
- Create a script file and upload it to Hue, for example a minimal PySpark script like the sketch after this procedure.
- In the leftmost navigation menu, click Scheduler.
- Click Workflow, and then click My Workflow to create a workflow.
- Drag the Spark program icon to the Drop your action here area to add the Spark action.
- Select the JAR file or Python file from the Jar/py name dropdown.
- Select the workflow from the FILES dropdown.
- To connect to Hive in an HA environment, click the gear icon, and then click Credentials.
- Select hcat.
- Click the save icon.
- Select the workflow from the folder structure, and then click the submit icon.
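For reference, the script file uploaded at the start of this procedure and selected from the Jar/py name dropdown could be a minimal PySpark application such as the following hypothetical sketch. The input and output paths are placeholders; adapt them to your data in HDFS.

    from pyspark.sql import SparkSession

    def main():
        # The application name appears in the YARN and Spark UIs when Oozie launches the job.
        spark = SparkSession.builder.appName("hue-workflow-example").getOrCreate()

        # Read a text file from HDFS; the path is a placeholder.
        lines = spark.read.text("/tmp/example_input.txt")

        # A trivial transformation: count how often each distinct line occurs.
        counts = lines.groupBy("value").count()

        # Write the result back to HDFS; the output path is a placeholder.
        counts.write.mode("overwrite").csv("/tmp/example_output")

        spark.stop()

    if __name__ == "__main__":
        main()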