We provide examples of triggering flows with the following actions:
- Tar and Transfer: A flow that creates a tar archive (using Globus Compute) and transfers the resulting tar file to the destination collection.
- Transfer: A single-action flow that transfers data.
- Transfer and Compute: A flow that transfers data and invokes a Globus Compute function on the destination collection to process the transferred files.
- Transfer, Compute and Share: A flow that transfers data, invokes a Globus Compute function on the destination collection to process the transferred files, transfers the processed files to another collection, and sets permissions for sharing the data.
- Transfer and Publish: A flow that transfers data, sets permissions for accessing the data, and ingests metadata (both fully accessible and restricted) into a Globus Search index.
- Transfer and Share: A flow that transfers data and sets permissions for sharing the data.
Each folder contains three files:
- definition.json - the flow definition
- schema.json - the flow input schema
- trigger_*.py - a Python script that will trigger the flow
The examples require the `globus_sdk` and `watchdog` packages. They can be installed by creating a Python virtual environment and running:

```
pip install -r requirements.txt
```
You can deploy each flow by running:

```
./deploy_flow.py --flowdef FLOW_DEFINITION_FILE --schema FLOW_INPUT_SCHEMA_FILE --title FLOW_TITLE
```

For example, to deploy the transfer-and-share flow, run:

```
./deploy_flow.py --flowdef transfer_share/definition.json --schema transfer_share/schema.json --title "My Transfer and Share Flow Example"
```
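For reference, flow deployment boils down to a single flow-creation call against the Flows service. A minimal sketch of that operation with the Globus SDK is below; the access-token placeholder and helper names are illustrative assumptions, not `deploy_flow.py`'s actual code:

```python
import json


def load_flow_files(flowdef_path: str, schema_path: str) -> tuple:
    """Read the flow definition and input schema JSON files from disk."""
    with open(flowdef_path) as f:
        definition = json.load(f)
    with open(schema_path) as f:
        input_schema = json.load(f)
    return definition, input_schema


def deploy_flow(flowdef_path: str, schema_path: str, title: str) -> str:
    """Create the flow via the Flows service and return the new flow's ID."""
    import globus_sdk  # imported lazily; requires the globus-sdk package

    definition, input_schema = load_flow_files(flowdef_path, schema_path)
    flows = globus_sdk.FlowsClient(
        authorizer=globus_sdk.AccessTokenAuthorizer("REPLACE_WITH_ACCESS_TOKEN")
    )
    new_flow = flows.create_flow(title, definition, input_schema)
    return new_flow["id"]
```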
A separate watcher script is provided for triggering each flow. The trigger script must be modified before running by defining values for the flow ID, collection ID, and other variables (these vary by script). Look for placeholders like `REPLACE_WITH_...` and provide values for each before running the trigger script. All trigger scripts are run by specifying two arguments:
- `--watchdir` specifies the directory path to watch
- `--patterns` specifies the file suffix pattern(s) to watch for (this can be a list of multiple suffixes, separated by spaces)
For example, to trigger the transfer-and-share flow when a file with suffix `.done` is created in directory `/my/experiment/data`, run:

```
./trigger_transfer_share_flow.py --watchdir /my/experiment/data --patterns .done
```
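When a matching file appears, a trigger script ultimately starts a flow run. A hedged sketch of that step using the Globus SDK follows; the IDs, token, and input-body keys are placeholder assumptions, and the real body must match each flow's input schema:

```python
FLOW_ID = "REPLACE_WITH_FLOW_ID"  # UUID of the deployed flow


def build_flow_input(src_path: str, dest_path: str) -> dict:
    """Assemble a flow input body; the real keys must match the flow's schema."""
    return {
        "source": {"id": "REPLACE_WITH_SOURCE_COLLECTION_ID", "path": src_path},
        "destination": {"id": "REPLACE_WITH_DESTINATION_COLLECTION_ID", "path": dest_path},
    }


def run_flow(src_path: str, dest_path: str) -> str:
    """Start a run of the deployed flow and return the run ID."""
    import globus_sdk  # imported lazily; requires the globus-sdk package

    flow = globus_sdk.SpecificFlowClient(
        FLOW_ID,
        authorizer=globus_sdk.AccessTokenAuthorizer("REPLACE_WITH_ACCESS_TOKEN"),
    )
    run = flow.run_flow(body=build_flow_input(src_path, dest_path), label="triggered run")
    return run["run_id"]
```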
The trigger logic can be modified by editing the `Handler` class in `watch.py`. By default, the trigger logic will run the flow every time a file is created that ends with one of the suffixes specified in `--patterns`.
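For orientation, the sketch below shows what this kind of event handler typically looks like with `watchdog`-style callbacks. It is loosely modeled on the behavior described above, not the repository's exact `Handler` code; the class shape and callback names are illustrative assumptions:

```python
def matches_patterns(path: str, patterns: list) -> bool:
    """Return True if the file path ends with one of the watched suffixes."""
    return any(path.endswith(suffix) for suffix in patterns)


class Handler:
    """Reacts to filesystem events and triggers the flow on matching files."""

    def __init__(self, patterns, trigger):
        self.patterns = patterns  # suffixes from --patterns, e.g. [".done"]
        self.trigger = trigger    # callable that starts the flow run

    def on_created(self, event):
        # watchdog invokes on_created with a FileSystemEvent whose
        # src_path attribute is the path of the newly created file.
        if not event.is_directory and matches_patterns(event.src_path, self.patterns):
            self.trigger(event.src_path)
```

Customizing the trigger logic (e.g. reacting to modifications instead of creations, or requiring several files before firing) amounts to changing which events the handler acts on and what condition it checks before calling the trigger.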
A deployed flow may be updated by running:

```
./deploy_flow.py --flowid FLOW_ID --flowdef UPDATED_FLOW_DEFINITION_FILE --schema UPDATED_FLOW_INPUT_SCHEMA_FILE --title UPDATED_FLOW_TITLE
```

Note: This is just a convenience extension for these examples, and is limited to updating only the flow/schema definition and/or flow title; refer to the Globus Flows API reference for the full-featured `PUT` operation.
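An update like this can also be issued directly from Python. The sketch below collects only the fields being changed and passes them to the Flows service via the Globus SDK; the token placeholder and helper names are illustrative assumptions, not the script's actual implementation:

```python
import json


def load_update_fields(flowdef_path=None, schema_path=None, title=None) -> dict:
    """Collect only the fields being updated, mirroring deploy_flow.py's options."""
    fields = {}
    if flowdef_path:
        with open(flowdef_path) as f:
            fields["definition"] = json.load(f)
    if schema_path:
        with open(schema_path) as f:
            fields["input_schema"] = json.load(f)
    if title:
        fields["title"] = title
    return fields


def update_deployed_flow(flow_id: str, **options) -> None:
    """Send the changed fields to the Flows service for an existing flow."""
    import globus_sdk  # imported lazily; requires the globus-sdk package

    flows = globus_sdk.FlowsClient(
        authorizer=globus_sdk.AccessTokenAuthorizer("REPLACE_WITH_ACCESS_TOKEN")
    )
    flows.update_flow(flow_id, **load_update_fields(**options))
```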
Some of the flows include a compute action that uses the Globus Compute service. In order for these flows to succeed you must first create a Globus Compute endpoint and ensure that the compute endpoint has a Python environment with any required packages already installed. You must also register the function to run in the compute step with the Globus Compute service; we provide two scripts for deploying the required Globus Compute functions:
- compute_function.py - contains a simple image manipulation function that creates thumbnail images for any `.png` and `.jpg` files transferred by the flow. The compute endpoint environment for running this function must include the `Pillow` package. This function is used in the transfer_compute and transfer_compute_share flows.
- tar_function.py - contains a function for archiving all files in the specified directory to a tar file, which will then be transferred by the flow. This function uses the `tarfile` module, which is part of the Python standard library and therefore available in any compute endpoint environment. This function is used in the tar_transfer flow.
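To illustrate the registration step, here is a hedged sketch in the spirit of `tar_function.py`: a self-contained archiving function plus a registration helper. The function body and names are illustrative stand-ins, not the repository's exact code; note that the imports live inside the function because Globus Compute serializes it and executes it on the remote endpoint:

```python
def make_tar(input_dir: str, tar_path: str) -> str:
    """Archive all files in input_dir into a tar file and return its path."""
    # Imports are inside the function so the serialized function is
    # self-contained when it runs on the Globus Compute endpoint.
    import pathlib
    import tarfile

    with tarfile.open(tar_path, "w") as tar:
        for path in sorted(pathlib.Path(input_dir).iterdir()):
            if path.is_file():
                tar.add(path, arcname=path.name)
    return tar_path


def register(fn) -> str:
    """Register fn with the Globus Compute service and return its function ID."""
    from globus_compute_sdk import Client  # requires the globus-compute-sdk package

    return Client().register_function(fn)
```

The returned function ID is what the flow's compute action step references, so it (along with the compute endpoint ID) is one of the values to fill into the corresponding trigger script.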