
How to create environment variables

How to set up environment variables for use in workflows


Introduction

This guide walks you through setting up environment variables in your account and using them with blocks that access external storage services such as Google Cloud Storage (GCS) or Amazon Web Services (AWS) in workflows. The following steps use a workflow consisting of the Global Seeps block and the Export Vector block.

Setting an environment variable

When using blocks like the Import Data or Export Data blocks to import or export data to or from an external storage system, you must first set up your external storage credentials as environment variables.

  1. Go to your workspace. Open the drop-down menu by clicking on your workspace name in the upper-right corner of the console and select the Workspace tab.

[Image: Click on the Workspace from the drop-down list in the upper-right corner of the console]

  2. Navigate to Environments and click on Create environment.


  3. Enter your environment name. This can be any name that helps you distinguish these credentials from other environment variables you may set up in the future. For example, you can call it GCS or AWS for your GCS or AWS credentials, respectively.

[Image: Enter a unique environment name for the credentials you will create]

  4. The Export Data and Import Data blocks access either GCS or AWS, which require different strings for the environment variable Key. Refer to the block reference articles for these blocks to find out which key string to use. Copy and paste your credentials into the Value field and save. For GCS, a sketch of what these credentials typically look like follows the screenshot below.

[Image: Enter credentials and save]
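For GCS, the credentials you paste into the Value field are typically the contents of a service account key file downloaded from your GCP project. The sketch below shows the general shape of such a key; all values are placeholders, and the exact Key name expected by the block is listed in the block reference articles:

    {
        "type": "service_account",
        "project_id": "my-gcp-project",
        "private_key_id": "0123456789abcdef",
        "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
        "client_email": "exporter@my-gcp-project.iam.gserviceaccount.com",
        "client_id": "123456789012345678901",
        "token_uri": "https://oauth2.googleapis.com/token"
    }

For AWS, the Value is typically an access key ID and secret access key pair generated in the AWS IAM console.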

Using environment variables in a workflow

  1. Go back to your Projects and find your workflow. Here is a workflow consisting of the Global Seeps and Export Vector blocks:

[Image: Environment variables can now be applied to a block]

  2. The Environment tab becomes visible as soon as you have entered your environment variables, as done in the previous steps. Select your GCS or AWS credentials from the drop-down menu for the Export Vector block. The Export Vector block can then access your GCS bucket using the credentials you provided.

Apply environment variables only to the blocks that need them. In this example, there is no need to set environment variables for the Global Seeps block because it does not use your GCS or AWS buckets to access data.

[Image: Choose the environment variable for the appropriate block]

  3. Click on Update & Configure job to move on to job configuration. Select your AOI for the Global Seeps block and, in the Export Vector block portion of the job parameters, tweak the following parameters:

     cloud_provider: "gcs" or "aws"
     bucket_name: name of the GCS or AWS bucket. If the bucket has subdirectories within it, do not write the full path here; use the prefix parameter below to specify them.
     prefix: names of subdirectories contained in the main bucket specified by bucket_name. Separate subdirectories with /.

In this tutorial, we export the data from the Global Seeps block directly to a GCS bucket called blocks-e2e-testing, into the subfolder input located within the folder e2e_export_data. The full path of the target directory is therefore:

blocks-e2e-testing/e2e_export_data/input

Our job parameters will look like so:


    {
        "airbus-globalseeps:1": {
            "intersects": {
                "type": "Polygon",
                "coordinates": [
                    [
                        [-119.880243, 34.412154],
                        [-119.880535, 34.401514],
                        [-119.868056, 34.401675],
                        [-119.868571, 34.411871],
                        [-119.880243, 34.412154]
                    ]
                ]
            },
            "object_type": ["Scenes", "Ships_Rigs", "Slick_Points", "Slick_Outlines"]
        },
        "up42-exportdata-vector:1": {
            "prefix": "e2e_export_data/input",
            "bucket_name": "blocks-e2e-testing",
            "cloud_provider": "gcs"
        }
    }
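
If you export to AWS instead, the same block parameters apply; only cloud_provider and the bucket name change. The bucket name below is a hypothetical example:

    "up42-exportdata-vector:1": {
        "prefix": "e2e_export_data/input",
        "bucket_name": "my-s3-export-bucket",
        "cloud_provider": "aws"
    }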
  4. Run the job as normal and check your GCS or AWS bucket to make sure that the files have been exported. A minimal programmatic check for GCS is sketched below.
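
If you prefer to verify the export from code rather than in the cloud console, here is a minimal sketch using the google-cloud-storage Python client. It assumes the same service account key used for the environment variable is saved locally as key.json (a hypothetical path):

    # Minimal sketch: list the files exported to the GCS bucket.
    # Assumes the google-cloud-storage package is installed and that
    # key.json is the service account key used in the environment variable.
    from google.cloud import storage

    client = storage.Client.from_service_account_json("key.json")

    # List everything under the prefix used in the job parameters.
    for blob in client.list_blobs("blocks-e2e-testing", prefix="e2e_export_data/input"):
        print(blob.name, blob.size)

If the job succeeded, the loop prints the exported vector files along with their sizes in bytes.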