Pipelines are a core feature in Microtica and the most crucial part of every software delivery automation. Pipelines let you define the process your source code goes through from your local machine to production.

A pipeline is a composition of multiple steps executed in sequence and/or in parallel. Each step performs a specific action, such as:

  • Compile and package the code
  • Run unit and integration tests
  • Run code quality checks
  • Build and push Docker images
  • Deploy services on Kubernetes

How Microtica pipelines work

Unlike traditional solutions, where you need to spin up and manage dedicated VMs to execute pipeline actions, Microtica takes a cloud-native approach to software delivery automation, using Docker as the runtime for pipeline step execution.

With Microtica pipelines you don't have to worry about maintaining complex infrastructure for your automation. Because Docker serves as the runtime, you can define flexible pipelines that combine different frameworks and tools in a single pipeline.

In one step you can use the node image as a runtime environment to compile your Node.js application, and in the next step you can use the hashicorp/terraform image, which contains a pre-installed Terraform CLI, to perform Terraform operations.

[Diagram: each pipeline step runs in its own Docker container]

Each step spins up a new Docker container using the image specified for that step.

The container lives until all actions within the step are completed; it is then stopped and removed, and everything stored inside the container is no longer available.

Microtica provides a pipeline shared state that preserves data between steps throughout the pipeline execution.

Pipeline shared state

To preserve state between steps, Microtica uses Docker volumes.

Sharing state between steps is useful when, for example, you clone the source code from Git in one step and compile and test it in another.

The shared state can be accessed from any step within a pipeline at the /microtica/shared path. Anything written to this folder by one step is available to all other steps in that pipeline.

[Diagram: shared state passed between two pipeline steps via /microtica/shared]

The diagram shows that if the first step stores a file named index.js, the same file will be available to the second step at the /microtica/shared/index.js path.

Pipeline artifacts

Artifacts persist step output even after the step has completed. Pipeline artifacts are typically used to store deployment packages.

[Diagram: pipeline artifacts stored after step completion]

Each step can define one or more artifact packages in microtica.yaml.


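The snippet below is only an illustrative sketch of such a spec; the field names are assumptions for illustration, not the verbatim Microtica schema (see the Microtica pipeline reference for the exact syntax):

```yaml
# Illustrative sketch only — field names are assumptions,
# not the verbatim microtica.yaml schema.
steps:
  Build:
    image: node:18
    commands:
      - npm ci && npm run build
    artifacts:
      primary: /dst   # package everything under /dst as artifact "primary"
```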
Such a spec tells Microtica to package everything within the /dst folder and store it as an artifact named primary. The stored artifact can then be used for deployment or downloaded from the Microtica GUI.

Learn more about artifacts and the different output configurations in Pipeline Artifacts.


Updated 07 Jun 2022