Pipelines
Pipelines are a core feature of Microtica and the most crucial part of every software delivery automation. Using pipelines, you define the process your source code goes through on its way from your local machine to production.
A pipeline is a composition of multiple steps executed in sequence or in parallel. Each step performs a specific action, such as:
- Compile and package the code
- Run unit and integration tests
- Run code quality checks
- Build and push Docker images
- Deploy services on Kubernetes
Unlike traditional solutions, where you need to spin up and manage dedicated VMs to execute pipeline actions, Microtica takes a cloud-native approach and uses Docker as the runtime for pipeline step execution.
With Microtica pipelines you don't have to worry about maintaining complex infrastructure for your automation. Because Docker is the runtime, you can define more flexible pipelines that combine different frameworks and tools within a single pipeline.
In one step you can use the `node` image as the runtime environment to compile your Node.js application, while in the next step you can use the `hashicorp/terraform` image, which comes with the Terraform CLI pre-installed, to perform Terraform operations.
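For illustration, a two-step pipeline that mixes runtimes might look roughly like the sketch below. This is only a hedged sketch: the `steps`, `image`, and `commands` keys follow common CI-spec conventions and are assumptions here, so consult the Microtica pipeline specification for the exact schema.

```yaml
# Hedged sketch only: the "steps", "image", and "commands" keys are assumptions
# modeled on common CI specs, not the authoritative Microtica schema.
steps:
  BuildApp:
    image: node                    # Node.js runtime to compile the application
    commands:
      - npm install
      - npm run build
  ProvisionInfra:
    image: hashicorp/terraform     # ships with the Terraform CLI pre-installed
    commands:
      - terraform init
      - terraform plan
```

Each step runs in its own container created from the listed image, which is what makes mixing Node.js and Terraform tooling in one pipeline straightforward.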
Each step spins up a new Docker container using the image specified for that step.
The Docker container lives until all actions within the step are completed; after that, the container is killed and deleted, and everything in the container's memory will no longer be available.
Microtica provides a Pipeline shared state that preserves state between steps throughout the pipeline execution. Under the hood, Microtica uses Docker volumes to achieve this. Sharing state between steps is useful when, for example, you clone the source code from Git in one step and then compile and test that code in another.
The shared state can be accessed from any step within a pipeline at the `/microtica/shared` path. Anything written to this folder by any step will be available to all other steps in that pipeline.
For example, if the first step stores a file named `index.js` in the shared folder, the same file will be available to the second step at `/microtica/shared/index.js`.
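As a rough illustration of how the shared state could be used, the sketch below has the first step write a file into `/microtica/shared` and the second step read it back. The step and key names are assumptions, not the authoritative spec; only the `/microtica/shared` path comes from the description above.

```yaml
# Hedged sketch only: step names and keys are assumptions; /microtica/shared
# is the shared-state location described above.
steps:
  BuildApp:
    image: node
    commands:
      - npm install && npm run build
      # Anything written here outlives this step's container,
      # because the shared volume is mounted into every step.
      - cp dist/index.js /microtica/shared/index.js
  RunChecks:
    image: node
    commands:
      # The file written by the previous step is available at the same path.
      - node /microtica/shared/index.js
```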
Artifacts persist even after the step that produced them has completed. Pipeline artifacts are typically used to store deployment packages.
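An artifact is declared in the pipeline spec. As a hedged sketch of what such a declaration might look like (the `artifacts`, `name`, and `paths` keys are assumptions; see Pipeline Artifacts for the real configuration format):

```yaml
# Hedged sketch only: the "artifacts" keys below are assumptions; refer to the
# Pipeline Artifacts documentation for the exact configuration format.
steps:
  Build:
    image: node
    commands:
      - npm install
      - npm run build        # assume the build writes its output into /dst
    artifacts:
      - name: primary
        paths:
          - /dst
```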
With a spec like this, we are telling Microtica to package everything within the `/dst` folder and store it as an artifact named `primary`. The stored artifact can then be used for deployment or downloaded from the Microtica GUI.
Learn more about artifacts and different output configurations from Pipeline Artifacts.