Data Transfer Guide: EOS ↔ PFS
This guide explains how to transfer data from EOS to PFS or from PFS to EOS using Argo Workflows.
You can use these workflows to schedule or automate periodic data movement jobs between your object storage buckets and parallel filesystem volumes.
Prerequisites
Before running any workflow:
- You must have an existing EOS bucket. To create an EOS bucket, follow this link.
- You must have a PFS filesystem created and mounted to an instance. To create a PFS, follow this link.
- Ensure you have your access credentials:
  - bucket_access_key
  - bucket_secret_key

You can view the Bucket Name in the Dataset's Overview tab.
Transfer Data: PFS → EOS
This workflow uploads files from your PFS to your EOS bucket.
Required Parameters
| Parameter | Description |
|---|---|
| source_fs_name | Name of the source PFS filesystem |
| source_path | Path inside the filesystem to upload |
| destination_bucket_name | EOS bucket name |
| destination_path | Destination folder inside the bucket |
| bucket_endpoint_url | EOS endpoint URL (a default is included) |
| bucket_access_key | Your EOS access key |
| bucket_secret_key | Your EOS secret key |
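Conceptually, the workflow walks source_path and uploads each file to an object key under destination_path. The mapping between the two can be sketched like this (a simplified illustration of the idea, not the workflow's actual code):

```python
import posixpath

def destination_key(source_path: str, file_path: str, destination_path: str) -> str:
    """Map a file under source_path to its object key under destination_path.

    For example, the file /mnt/pfs/data/run1/a.csv with source_path
    /mnt/pfs/data and destination_path backups/ maps to the object key
    backups/run1/a.csv.
    """
    # Keep the file's position relative to source_path...
    rel = posixpath.relpath(file_path, start=source_path)
    # ...and re-root it under the destination folder in the bucket.
    return posixpath.join(destination_path.strip("/"), rel)
```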
Create Pipeline
Download fs-to-eos.yaml.

Now create a pipeline using the fs-to-eos.yaml file. For more details about creating a pipeline, follow this link.
Enter the correct parameters in the YAML file, such as source_fs_name, source_path, and others. You may also fill them in manually while creating a run for this pipeline.
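If you prefer to pre-fill the parameters in the file, Argo Workflows conventionally declares them under spec.arguments.parameters. A sketch of what that section of fs-to-eos.yaml might look like (parameter names are taken from the table above; the values are examples, and the exact layout of the downloaded file may differ):

```yaml
spec:
  arguments:
    parameters:
      - name: source_fs_name
        value: my-pfs                 # example filesystem name
      - name: source_path
        value: /data/exports          # example path
      - name: destination_bucket_name
        value: my-eos-bucket          # example bucket name
      - name: destination_path
        value: exports/
      - name: bucket_access_key
        value: "<your-access-key>"
      - name: bucket_secret_key
        value: "<your-secret-key>"
```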

Now create a run for the pipeline. While creating the run, provide the correct parameters; any parameter you leave unset falls back to the value already specified in the YAML file.
For more details about creating a run for the pipeline, follow this link

Once the run completes successfully, the run logs will confirm the transfer.

Transfer Data: EOS → PFS
This workflow downloads data from your EOS bucket into your PFS filesystem.
Required Parameters
| Parameter | Description |
|---|---|
| source_bucket_name | EOS bucket name |
| source_path | Folder/object prefix in the bucket to download |
| destination_fs_name | Target PFS filesystem name |
| destination_path | Folder inside the filesystem to store the data |
| bucket_endpoint_url | EOS endpoint URL |
| bucket_access_key | Your EOS access key |
| bucket_secret_key | Your EOS secret key |
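In this direction, each object under source_path maps to a local file under destination_path. A simplified sketch of that mapping (illustration only, not the workflow's actual code):

```python
import posixpath

def local_path(object_key: str, source_path: str, destination_path: str) -> str:
    """Map an object key under source_path to a file path under destination_path.

    For example, the object exports/run1/a.csv with source_path exports/
    and destination_path /mnt/pfs/restore maps to the local file
    /mnt/pfs/restore/run1/a.csv.
    """
    # Keep the object's position relative to the bucket prefix...
    rel = posixpath.relpath(object_key, start=source_path.strip("/"))
    # ...and re-root it under the destination folder on the filesystem.
    return posixpath.join(destination_path, rel)
```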
Create Pipeline
Download eos-to-fs.yaml.

Now create a pipeline using the eos-to-fs.yaml file. For more details about creating a pipeline, follow this link.
Enter the correct parameters in the YAML file, such as source_bucket_name, source_path, and others. You may also fill them in manually while creating a run for this pipeline.

Now create a run for the pipeline. While creating the run, provide the correct parameters; any parameter you leave unset falls back to the value already specified in the YAML file.
For more details about creating a run for the pipeline, follow this link

Once the run completes successfully, the run logs will confirm the transfer.
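After the run finishes, you can sanity-check what landed on the filesystem by listing destination_path from any instance where the PFS is mounted. A minimal sketch (the function is hypothetical, and the path you pass it is whatever you used for destination_path):

```python
from pathlib import Path

def count_downloaded_files(destination_path: str) -> int:
    """Count regular files under destination_path, recursively."""
    return sum(1 for p in Path(destination_path).rglob("*") if p.is_file())

# Example usage:
# print(count_downloaded_files("/mnt/pfs/restore"))
```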
