
Getting started

Get familiar with the processing platform.


Main Concepts

In this section, you will learn how to apply analytics to geospatial datasets and build custom data and processing blocks: how to create projects and workflows, add data and processing blocks, configure the job parameters of your workflows, and run jobs to generate outputs.

Projects

Projects are intended for storing workflows and their corresponding job runs. A project contains the project credentials needed to generate the authorization token, which allows you to make API calls or use the Python SDK.
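As a rough sketch of how project credentials turn into an authorization request, the snippet below builds an HTTP Basic auth header from a project ID and API key. The credential names and the header shape are illustrative assumptions; check the platform reference for the actual token endpoint and request format.

```python
import base64


def build_auth_header(project_id: str, project_api_key: str) -> dict:
    """Build the HTTP Basic auth header used when requesting an access token.

    The credential names are assumptions for illustration only; consult
    your platform's API reference for the real token endpoint.
    """
    raw = f"{project_id}:{project_api_key}".encode("utf-8")
    return {"Authorization": "Basic " + base64.b64encode(raw).decode("ascii")}


# Hypothetical credentials, purely for demonstration.
headers = build_auth_header("my-project-id", "my-api-key")
```

The resulting header would accompany the token request; the token itself is then sent with subsequent API or SDK calls.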

A project also contains the settings for the threshold limits. These limits cap the values you can use in API, Python SDK or console operations: the size of your area of interest, the number of returned images and the number of concurrent jobs.
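The limit check described above can be sketched as a small validation function. The default limit values and parameter names here are assumptions for illustration, not the platform's actual thresholds.

```python
def check_job_limits(aoi_area_km2: float, image_count: int, running_jobs: int,
                     max_aoi_km2: float = 100.0, max_images: int = 20,
                     max_concurrent_jobs: int = 5) -> list:
    """Return a list of threshold violations for a planned job.

    The default limits are hypothetical placeholders; a project's real
    limits come from its settings.
    """
    violations = []
    if aoi_area_km2 > max_aoi_km2:
        violations.append(f"AOI of {aoi_area_km2} km2 exceeds {max_aoi_km2} km2")
    if image_count > max_images:
        violations.append(f"{image_count} images exceeds limit of {max_images}")
    if running_jobs >= max_concurrent_jobs:
        violations.append(f"{running_jobs} jobs already running (max {max_concurrent_jobs})")
    return violations


# An oversized area of interest trips the first check.
problems = check_job_limits(aoi_area_km2=150.0, image_count=5, running_jobs=2)
```

An empty list means the planned job stays within all three limits.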

Learn more

For more information: Projects.

Workflows

A workflow is a Directed Acyclic Graph (DAG) of data and processing blocks. The workflow defines the order in which each operation associated with a certain block is performed. A workflow cannot be empty: it always starts with a data block, followed by one or more processing blocks.
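The ordering rules above can be expressed as a validation sketch. For simplicity it treats the workflow as a linear chain (the common case of the DAG), represented as an ordered list of `(kind, name)` tuples; this representation is an assumption for illustration, not the platform's data model.

```python
def validate_workflow(blocks: list) -> bool:
    """Validate a linear workflow chain.

    `blocks` is an ordered list of (kind, name) tuples where kind is
    "data" or "processing". A valid workflow is non-empty, starts with
    a data block, and continues with processing blocks only.
    """
    if not blocks:
        raise ValueError("A workflow cannot be empty")
    kind, name = blocks[0]
    if kind != "data":
        raise ValueError(f"Workflow must start with a data block, got '{name}'")
    for kind, name in blocks[1:]:
        if kind != "processing":
            raise ValueError(f"Block '{name}' must be a processing block")
    return True


# A minimal valid chain: one data block followed by one processing block.
ok = validate_workflow([("data", "optical-imagery"), ("processing", "ndvi")])
```

Block names here are hypothetical; any data source and processing step would follow the same pattern.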

Learn more

For more information: Workflows.

Blocks

A block is a unit of the workflow and acts as an operator for data retrieval or processing algorithms. Blocks are divided into two types:

  1. Data block: operator for downloading/streaming a data source. The data block is always the first operator of a workflow and it can be followed by one or more processing blocks.
  2. Processing block: operator for processing the previous data source or processing block output. The order of the blocks is defined by the block capabilities, which specify the parameter values required for two blocks to be combined: spectral or spatial resolution, file format, bit depth, etc. A data or processing block can be used in multiple workflows.

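One way to picture the capability matching described above: a block combination is valid when every capability the downstream block requires is provided by the upstream block's output. The capability keys used here (file format, bit depth) are illustrative assumptions, not the platform's capability schema.

```python
def blocks_compatible(producer_output: dict, consumer_input: dict) -> bool:
    """Check whether one block's output capabilities satisfy the next
    block's input requirements.

    Every key the consumer requires must be present in the producer's
    output with a matching value. Capability names are hypothetical.
    """
    return all(producer_output.get(key) == value
               for key, value in consumer_input.items())


# A data block emitting 16-bit GeoTIFF satisfies a block that only
# requires GeoTIFF input.
data_out = {"format": "GeoTIFF", "bit_depth": 16}
proc_in = {"format": "GeoTIFF"}
compatible = blocks_compatible(data_out, proc_in)
```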
Learn more

For more information: Blocks.

Jobs

A job is a unique instance of a workflow that runs in order to generate the outputs corresponding to each block, as defined by the job parameters. A workflow can have one or more job runs. For instance, if you want to retrieve data for multiple areas of interest or dates, you can reconfigure the job parameters of the same workflow and trigger a new job run each time.
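The multi-run pattern above can be sketched by building one job-parameter set per area of interest, each of which would trigger its own job run. The parameter names (`aoi`, `time`, `limit`) are illustrative assumptions; the actual job-parameter schema comes from the blocks in your workflow.

```python
def job_params_for_aois(aois: list, time_range: str, limit: int = 1) -> list:
    """Build one job-parameter dict per area of interest.

    Parameter names are hypothetical placeholders; each dict would be
    used to trigger a separate job run of the same workflow.
    """
    return [{"aoi": aoi, "time": time_range, "limit": limit} for aoi in aois]


# Two hypothetical areas of interest, each getting its own job run.
params = job_params_for_aois(
    aois=[
        {"type": "Point", "coordinates": [13.40, 52.52]},
        {"type": "Point", "coordinates": [2.35, 48.85]},
    ],
    time_range="2021-01-01T00:00:00/2021-03-31T23:59:59",
)
```

Each entry in `params` corresponds to one job run of the same workflow.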

Learn more

For more information: Jobs.