## Introduction
This guide walks you through setting up environment variables for your account and using those variables with blocks that access external storage services, such as Google Cloud Storage (GCS) or Amazon Web Services (AWS), in workflows. The following steps use a workflow consisting of the Global Seeps block and the Export Data (Vector) block.
## Setting an Environment Variable
When using blocks such as Import Data or Export Data to import or export data to and from an external storage system, you must first set up your external storage credentials as environment variables.
- Go to your workspace. You can find the workspace tab in the drop-down menu that appears when you click on your workspace name in the upper-right corner of the console.
- Navigate to Environments and click on Create environment.
- Enter your environment name. This can be any name that helps you distinguish these credentials from other environment variables you may set up in the future. For example, you can call it GCS or AWS for your GCS or AWS credentials, respectively.
- The Export Data and Import Data blocks access either GCS or AWS, which require different inputs for the environment variable Key. Copy and paste your credentials into the Value field and save. For more information, see Import Data (GeoTIFF) and Export Data (Vector).
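For reference, GCS credentials typically take the form of a service account key in JSON format, while AWS credentials consist of an access key ID and secret access key. Below is a minimal, redacted sketch of the kind of service account key you might paste into the Value field; all values are placeholders, and the exact format the blocks expect is described in the block documentation linked above.

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "PLACEHOLDER_KEY_ID",
  "private_key": "-----BEGIN PRIVATE KEY-----\nPLACEHOLDER\n-----END PRIVATE KEY-----\n",
  "client_email": "exporter@your-project-id.iam.gserviceaccount.com",
  "client_id": "PLACEHOLDER_CLIENT_ID"
}
```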
## Using Environment Variables in a Workflow
- Go back to your Projects and find your workflow. Here is a workflow consisting of the Global Seeps and Export Data (Vector) blocks:
- The Environment tab will be visible immediately after you enter your environment variables, as done in the previous steps. Select your GCS or AWS credentials from the dropdown menu for the Export Data (Vector) block. By doing so, the block can now access your GCS bucket using the credentials you provided.
Use your environment variables only for the blocks that need them. In this example, there is no need to set environment variables for the Global Seeps block because it does not use your GCS or AWS buckets to access data.
- Click on Update & Configure job to move on to job configuration. Select your AOI for the Global Seeps block, and in the Export Data (Vector) portion of the job parameters, set the following parameters (see the example after the table):
| Parameter | Tips |
| --- | --- |
| cloud_provider | "gcs" or "aws" |
| bucket_name | Name of the GCS or AWS bucket. If the bucket has subdirectories within it, do not write the full path here; use the prefix parameter below to specify these subdirectories. |
| prefix | Names of the subdirectories contained in the main bucket specified by bucket_name. Separate subdirectories with /. |
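For example, to export into a hypothetical bucket path my-bucket/projects/exports, the bucket name and the subdirectories would be split across the two parameters like this:

```json
{
  "cloud_provider": "gcs",
  "bucket_name": "my-bucket",
  "prefix": "projects/exports"
}
```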
In this tutorial, we want to export the data from the Global Seeps block directly to a GCS bucket called `blocks-e2e-testing`, with the data exported into the subfolder `input` located within the folder `e2e_export_data`. The full path to the end directory is therefore `blocks-e2e-testing/e2e_export_data/input`.
Our job parameters will look as follows:
```json
{
  "airbus-globalseeps:1": {
    "intersects": {
      "type": "Polygon",
      "coordinates": [
        [
          [-119.880243, 34.412154],
          [-119.880535, 34.401514],
          [-119.868056, 34.401675],
          [-119.868571, 34.411871],
          [-119.880243, 34.412154]
        ]
      ]
    },
    "object_type": ["Scenes", "Ships_Rigs", "Slick_Points", "Slick_Outlines"]
  },
  "up42-exportdata-vector:1": {
    "prefix": "e2e_export_data/input",
    "bucket_name": "blocks-e2e-testing",
    "cloud_provider": "gcs"
  }
}
```
- Run the job as normal and check your GCS or AWS bucket to make sure that the files have been exported.
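If you prefer to verify from a script rather than the cloud console, one quick check is to list the objects under the export prefix. Here is a minimal sketch using the google-cloud-storage Python client, assuming a GCS bucket and that your service account key is available locally (for an AWS bucket, boto3 offers an equivalent listing call):

```python
# Minimal sketch: list the exported files in the GCS bucket.
# Assumes GOOGLE_APPLICATION_CREDENTIALS points at your service account key.
from google.cloud import storage

client = storage.Client()

# List everything under the prefix used in the job parameters above.
for blob in client.list_blobs("blocks-e2e-testing", prefix="e2e_export_data/input"):
    print(blob.name)
```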