\ No newline at end of file
diff --git a/assets/operator-guide/sonar-externalsecret-password.png b/assets/operator-guide/sonar-externalsecret-password.png
new file mode 100644
index 000000000..90fc0f6f5
Binary files /dev/null and b/assets/operator-guide/sonar-externalsecret-password.png differ
diff --git a/assets/operator-guide/sonar-secret-password.png b/assets/operator-guide/sonar-secret-password.png
new file mode 100644
index 000000000..10d0a70ef
Binary files /dev/null and b/assets/operator-guide/sonar-secret-password.png differ
diff --git a/developer-guide/edp-workflow/index.html b/developer-guide/edp-workflow/index.html
index 2b46fc714..8af8affe7 100644
--- a/developer-guide/edp-workflow/index.html
+++ b/developer-guide/edp-workflow/index.html
@@ -1,4 +1,4 @@
- EDP Project Rules. Working Process - EPAM Delivery Platform
This page contains the details on the project rules and working process for the EDP team and contributors. Explore the main points about working with Gerrit, following the main commit flow, as well as the details about commit types and messages below.
Before starting the development, please check the project rules:
It is highly recommended to become familiar with the Gerrit flow. For details, please refer to the Gerrit official documentation and pay attention to the main points:
a. Voting in Gerrit
b. Resolution of Merge Conflict
c. Comments resolution
d. One Jira task should have one Merge Request (MR). If there are many changes within one MR, add the next patch set to the open MR by selecting the Amend commit check box.
Only the Assignee is responsible for the MR merge and Jira task status.
With EDP, the main workflow is based on getting a Jira task and creating a Merge Request according to the rules described below.
Workflow
Get a Jira task → implement and verify the results yourself → create a Merge Request (MR) → send it for review → resolve comments/add changes, ask colleagues for the final review → track the MR merge → verify the results yourself → change the status in the Jira ticket to CODE COMPLETE or RESOLVED → share the necessary links with a QA specialist in the QA Verification channel → the QA specialist closes the Jira task after verification → the Jira task should be CLOSED.
Commit Flow
Get Jira task. Please be aware of the following points:
a. Every task has a reporter who can provide more details in case something is not clear.
b. The responsible person for the task and code implementation is the assignee who tracks the following:
actual Jira task status
time logging
adding comments and attaching the necessary files
adding a link in the comments that refers to the merged MR (optional if not related to many repositories)
code review and the final merge
MS Teams chats - pinging other colleagues, answering questions, etc.
verification by a QA specialist
bug fixing
c. Pay attention to the task Status, which differs between entities; the workflow will help you see the whole task processing:
d. There are several entities that are used on the EDP project: Story, Improvement, Task, Bug.
Implement the feature, improvement, or fix, and check the results on your own. If it is impossible to check the results of your work before the merge, verify everything later.
Create a Merge Request; for details, please refer to the Code Review Process.
When committing, use the pattern: [EPMDEDP-JIRA Task Number]: commit type: Commit message.
a. [EPMDEDP] is the default part;
b. JIRA Task Number - the number of your Jira task;
c. commit type:
feat: (new feature for the user, not a new feature for a build script)
fix: (bug fix for the user, not a fix to a build script)
docs: (changes to the documentation)
style: (formatting, missing semicolons, etc.; no production code change)
refactor: (refactoring production code, e.g. renaming a variable)
test: (adding missing tests, refactoring tests; no production code change)
chore: (updating grunt tasks, etc.; no production code change)
!: (added to other commit types to mark breaking changes) For example:
[EPMDEDP-0000]: feat!: Job provisioner is responsible for the formation of Jenkinsfile
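To make the flow above concrete, here is a minimal sketch of committing and pushing a change through Gerrit; the task number and message are hypothetical, and the remote and branch names assume a typical Gerrit setup:
    # Commit using the [EPMDEDP-XXXX]: type: message pattern (hypothetical task number)
    git commit -m "[EPMDEDP-1234]: fix: Handle missing configuration value"
    git push origin HEAD:refs/for/master
    # To add the next patch set to an open change instead of opening a new MR,
    # amend the existing commit (keeping its Change-Id) and push again
    git commit --amend
    git push origin HEAD:refs/for/master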
+ Workspace Setup Manual - EPAM Delivery Platform
This page is intended for developers and shares the details on how to set up the local environment and start coding in the Go language for the EPAM Delivery Platform.
We recommend using GoLand and enabling the Kubernetes plugin. Before installing plugins, make sure to save your work, because the IDE may require restarting.
To set up the cloned operator, follow the three steps below:
Configure the Go Build option. Open the folder in GoLand, click the button, and select the Go Build option:
Fill in the variables in the Configuration tab:
In the Files field, indicate the path to the main.go file;
In the Working directory field, indicate the path to the operator;
In the Environment field, specify the namespace to watch by setting the WATCH_NAMESPACE variable. It should equal default, but it can be any other value if required by the cluster specifications.
In the Environment field, also specify the platform type by setting PLATFORM_TYPE. It should equal either kubernetes or openshift.
Check cluster connectivity and variables. Local development implies working within local Kubernetes clusters. Kind (Kubernetes in Docker) is recommended, so set up this or another environment before running the code.
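A minimal sketch of preparing such an environment with kind and exporting the variables described above (the cluster name is arbitrary; kind, kubectl, and Docker are assumed to be installed):
    # Create a local cluster and verify kubectl can reach it
    kind create cluster --name edp-dev
    kubectl cluster-info --context kind-edp-dev
    # Export the variables from the Go Build configuration above
    export WATCH_NAMESPACE=default
    export PLATFORM_TYPE=kubernetes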
Testing and linting must be run before every single commit, with no exceptions. The instructions for the commands below are written here.
It is mandatory to run test and lint to make sure the code passes the tests and meets the acceptance criteria. Most operators are covered by tests, so just run them by issuing the commands "make test" and "make lint":
make test
The command "make test" should give output similar to the following:
make lint
The command "make lint" should give output similar to the following:
The commands below are especially essential when making changes to the API. The code is unsatisfactory if these commands fail.
Generate documentation in the .md file format so the developer can read it:
make api-docs
The command "make api-docs" should give output similar to the following:
There are also manifests within the operator that generate the zz_generated.deepcopy.go file in the /api/v1 directory. This file is necessary for the platform to work, but it is time-consuming to fill in by yourself, so there is a mechanism that does it automatically. Update it using the following command and check that the result looks correct:
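The exact target name depends on the operator's Makefile; in operator-sdk scaffolded projects, the conventional sketch is:
    # Regenerate zz_generated.deepcopy.go via controller-gen
    make generate
    # Review the regenerated file before committing
    git diff api/v1/zz_generated.deepcopy.go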
This section defines the necessary steps to start developing the EDP documentation in the MkDocs framework. The framework is a static site generator with documentation written in Markdown. All the docs are configured with a YAML configuration file.
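For orientation, a minimal sketch of the usual MkDocs workflow (the stock mkdocs package is assumed; the project may pin its own dependencies):
    pip install mkdocs   # or install the project's pinned requirements
    mkdocs serve         # live preview at http://127.0.0.1:8000
    mkdocs build         # render the static site from mkdocs.yml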
+ Basic Concepts - EPAM Delivery Platform
Consult the EDP Glossary section for definitions mentioned on this page and EDP Toolset for a full list of tools used with the Platform. The table below contains a full list of features provided by EDP.
Features - Description
Cloud Agnostic - EDP runs on a Kubernetes cluster, so any public cloud provider that offers Kubernetes can be used. Kubernetes clusters deployed on-premises work as well.
CI/CD for Microservices - EDP is initially designed to support CI/CD for Microservices running as containerized applications inside a Kubernetes cluster. EDP also supports CI for: Terraform Modules, Open Policy Rules, and workflows for Java (8, 11, 17), JavaScript (React, Vue, Angular, Express, Antora), C# (.NET 6.0), Python (FastAPI, Flask, 3.8), Go (Beego, Operator SDK).
Version Control System (VCS) - EDP installs Gerrit as the default Source Code Management (SCM) tool. EDP also supports GitHub and GitLab integration. EDP provides a separate Git repository per Codebase and doesn't work with a Monorepo. However, EDP does support customization and runs the helm-lint and dockerfile-lint steps using the Monorepo approach.
Artifacts Versioning - EDP supports two approaches to artifact versioning: default (BRANCH-[TECH_STACK_VERSION]-BUILD_ID) and EDP (MAJOR.MINOR.PATCH-BUILD_ID), which is SemVer. Custom versioning can be created by implementing the get-version stage.
Application Library - EDP provides baseline codebase templates for Microservices and Libraries within the Create strategy while onboarding a new Codebase.
Stages Library - Each EDP Pipeline consists of pre-defined steps (stages). Consult the library documentation for more details.
CI Pipelines - EDP provides CI Pipelines for first-class citizens: Applications (Microservices) based on Java (8, 11, 17), JavaScript (React, Vue, Angular, Express, Antora), C# (.NET 6.0), Python (FastAPI, Flask, 3.8), Go (Beego, Operator SDK); Libraries based on Java (8, 11, 17), JavaScript (React, Vue, Angular, Express), Python (FastAPI, Flask, 3.8), Groovy Pipeline (Codenarc), Terraform, Rego (OPA), Container (Docker), Helm (Pipeline), C# (.NET 6.0); Autotests based on Java 8, Java 11, Java 17.
CD Pipelines - EDP provides capabilities to design CD Pipelines (in Admin Console) for Microservices and defines the logic of artifact flow (promotion) from environment to environment. Artifact promotion is performed automatically (Autotests), manually (User Approval), or by combining both approaches.
Autotests - EDP provides a CI pipeline for autotests implemented in Java. Autotests can be used as Quality Gates in CD Pipelines.
Custom Pipeline Library - EDP can be extended by introducing a Custom Pipeline Library.
Dynamic Environments - Each EDP CD Pipeline creates/destroys an environment upon user request.
\ No newline at end of file
diff --git a/getting-started/index.html b/getting-started/index.html
index b93fef781..f371177e5 100644
--- a/getting-started/index.html
+++ b/getting-started/index.html
@@ -1,4 +1,4 @@
- Quick Start - EPAM Delivery Platform
To install EDP with the necessary parameters, please refer to the Install EDP section of the Operator Guide. Mind the parameters in the EDP installation chart; for details, please refer to the values.yaml file.
Find below the example of the installation command:
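A hedged sketch of such a command, assuming the epamedp Helm chart repository and an edp namespace (adjust the release name, namespace, and values file to your setup):
    helm repo add epamedp https://epam.github.io/edp-helm-charts/stable
    helm repo update
    helm install edp epamedp/edp-install \
      --namespace edp --create-namespace \
      --values values.yaml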
+ Glossary - EPAM Delivery Platform
Get familiar with the definitions and context for the most useful EDP terms presented in the table below.
Terms
Details
EDP Component - an item used in the CI/CD process
EDP Headlamp UI - an EDP component that helps to manage, set up, and control the business entities.
Artifactory - an EDP component that stores all the binary artifacts. NOTE: Nexus is used as a possible implementation of a repository.
CI/CD Server - an EDP component that launches pipelines that perform the build, QA, and deployment code logic. NOTE: Jenkins is used as a possible implementation of a CI/CD server.
Code Review tool - an EDP component that facilitates collaboration on the changes in the codebase. NOTE: Gerrit is used as a possible implementation of a code review tool.
Identity Server - an authentication server providing a common way to verify requests to all of the applications. NOTE: Keycloak is used as a possible implementation of an identity server.
Security Realm Tenant - a realm in the identity server (e.g. Keycloak) where all users' accounts and their access permissions are managed. The realm is unique for the identity server instance.
Static Code Analyzer - an EDP component that continuously inspects code quality before the necessary changes appear in a master branch. NOTE: SonarQube is used as a possible implementation of a static code analyzer.
VCS (Version Control System) - a replication of the Gerrit repository that displays all the changes made by developers. NOTE: GitHub and GitLab are used as the possible implementation of a repository with the version control system.
EDP Business Entity - a part of the CI/CD process (the integration, delivery, and deployment of any codebase changes)
Application - a codebase type that is built as the binary artifact and deployable unit with the code that is stored in VCS. As a result, the application becomes a container and can be deployed in an environment.
Autotests - a codebase type that inspects a product (e.g. an application set) on a stage. Autotests are not deployed to any container and are launched from the respective code stage.
CD Pipeline (Continuous Delivery Pipeline) - an EDP business entity that describes the whole delivery process of the selected application set via the respective stages. The main idea of the CD pipeline is to promote the application version between the stages by applying the sequential verification (i.e. the second stage will be available if the verification on the first stage is successfully completed). NOTE: The CD pipeline can include the essential set of applications with its specific stages as well.
CD Pipeline Stage - an EDP business entity that is presented as the logical gate required for the application set inspection. Every stage has one OpenShift project where the selected application set is deployed. All stages are sequential and promote applications one-by-one.
Codebase - an EDP business entity that possesses a code.
Codebase Branch - an EDP business entity that represents a specific version in a Git branch. Every codebase branch has a Codebase Docker Stream entity.
Codebase Docker Stream - a deployable component that leads to the application build and shows that the last build was verified on the specific stage. Every CD pipeline stage accepts a set of Codebase Docker Streams (CDS) as input and output. SAMPLE: if application1 has a master branch, the input CDS will be named [app name]-[pipeline name]-[stage name]-[master], and the output after passing the DEV stage will be as follows: [app name]-[pipeline name]-[stage name]-[dev]-[verified].
Library - a codebase type that is built as a binary artifact, i.e. it's stored in the Artifactory and can be uploaded by other applications, autotests, or libraries.
Quality Gate - an EDP business entity that represents the minimum acceptable results after the testing. Every stage has a quality gate that should be passed to promote the application. The stage quality gate can be a manual approval from a QA specialist OR a successful autotest launch.
Quality Gate Type - this value defines the trigger type that promotes artifacts (images) to the next environment in the CD Pipeline. There are manual and automatic types of quality gates. The manual type means that the promoting process should be confirmed in Jenkins. The automatic type promotes the images automatically in case there are no errors in the Allure Report. NOTE: If any of the test types is not passed, the CD pipeline will fail.
Trigger Type - a value that defines a trigger type used for the CD pipeline triggering. There are manual and automatic types of triggering. The manual type means that the CD pipeline should be triggered manually. The automatic type triggers the CD pipeline automatically as soon as the Codebase Docker Stream was changed.
EDP CI/CD Pipelines Framework - a library that allows extending the Jenkins pipelines and stages to develop an application. Pipelines are presented as a shared library that can be connected in Jenkins. The library is connected using the Git repository link (a public repository that is supported by EDP) on GitHub.
Allure Report - a tool that presents test results in one brief report in a clear form.
Automated Tests - different types of automated tests that can be run on the environment for a specific stage.
Build Pipeline - a Jenkins pipeline that builds a corresponding codebase branch in the Codebase.
Build Stage - a stage that takes place after the code has been submitted/merged to the repository of the main branch (the pull request from the feature branch is merged to the main one, the Patch set is submitted in Gerrit).
Code Review Pipeline - a Jenkins pipeline that inspects the code candidate in the Code Review tool.
Code Review Stage - a stage where code is reviewed before it goes to the main branch repository of the version control system (the commit to the feature branch is pushed, the Patch set is created in Gerrit).
Deploy Pipeline - a Jenkins pipeline that is responsible for the CD Pipeline Stage deployment with the full set of applications and autotests.
Deployment Stage - a part of the Continuous Delivery where artifacts are being deployed to environments.
EDP CI/CD Pipelines - an orchestrator for stages that is responsible for the common technical events, e.g. initialization, in Jenkins pipeline. The set of stages for the pipeline is defined as an input JSON file for the respective Jenkins job. NOTE: There is the ability to create the necessary realization of the library pipeline on your own as well.
EDP CI/CD Stages - a repository that is launched in the Jenkins pipeline. Every stage is presented as an individual Groovy file in a corresponding repository. Such single responsibility realization allows rewriting of one essential stage without changing the whole pipeline.
Environment - a part of the stage where the application, built and packed into an image, is deployed for further testing. It's possible to deploy several applications to several environments (Team and Integration environments) within one stage.
Integration Environment - an environment type that is always deployed as soon as the new application version is built in order to launch the integration test and promote images to the next stages. The Integration Environment can be triggered manually or in case a new image appears in the Docker registry.
Jenkinsfile - a text file that keeps the definition of a Jenkins Pipeline and is checked into source control. Every Job has its Jenkinsfile that is stored in the specific application repository and in Jenkins as plain text.
Jenkins Node - a machine that is a part of the Jenkins environment that is capable of executing a pipeline.
Jenkins Pipeline - a user-defined model of a CD pipeline. The pipeline code defines the entire build process.
Jenkins Stage - a part of the whole CI/CD process that the source code should pass in order to be released and deployed on production.
Team Environment - an environment type that can be deployed at any time by a manual trigger of the Deploy pipeline, where a team or developers can check out their applications. NOTE: Promotion from this kind of environment is prohibited; it is intended only for local testing.
OpenShift / Kubernetes (K8S)
ConfigMap - a resource that stores configuration data and processes the strings that do not contain sensitive information.
Docker Container - a lightweight, standalone, and executable package.
Docker Registry - a store for the Docker containers that are created for the application after the Build pipeline run.
OpenShift Web Console - a web console that enables users to view, manage, and change OpenShift / K8S resources using a browser.
Operator Framework - a deployable unit in OpenShift that is responsible for one or a set of resources and manages their life cycle (adding, displaying, and provisioning).
Path - a route component that helps to find a specified path (e.g. /api) at once and skip the others.
Pod - the smallest deployable unit of the large microservice application that is responsible for the application launch. The pod is presented as one launched Docker container. Once the Docker container is built, it is kept in the Docker Registry and then saved as a Pod in the OpenShift project. NOTE: The Deployment Config is responsible for the Pod push, restart, and stop processes.
PV (Persistent Volume) - a cluster resource that captures the details of the storage implementation and has an independent lifecycle of any individual pod.
PVC (Persistent Volume Claim) - a user request for storage that can request specific size and access mode. PV resources are consumed by PVCs.
Route - a resource in OpenShift that allows external access to the deployed application.
Secret - an object that stores and manages all the sensitive information (e.g. passwords, tokens, and SSH keys).
Service - a connection point to the Pod that is responsible for the network. A specific Service is connected to a specific Pod using labels and redirects all the requests to the Pod.
Site - a route component (link name) that is created from the indicated application name and automatically applies the project name and a wildcard DNS record.
\ No newline at end of file
diff --git a/headlamp-user-guide/add-application/index.html b/headlamp-user-guide/add-application/index.html
index e430bb15b..31c85fabc 100644
--- a/headlamp-user-guide/add-application/index.html
+++ b/headlamp-user-guide/add-application/index.html
@@ -1 +1 @@
- Add Application - EPAM Delivery Platform
Headlamp allows you to create, clone, or import an application and add it to the environment. The application can also be deployed in Gerrit (if the Clone or Create strategy is used) with the Code Review and Build pipelines built in Jenkins/Tekton.
To add an application, navigate to the Components section on the navigation bar and click Create (the plus sign icon in the lower-right corner of the screen). Once clicked, the Create new component dialog will appear; select Application and choose one of the strategies, which are described later on this page. You can create an Application in YAML or via the two-step menu in the dialog.
Follow the instructions below to fill in the fields of the Codebase Info menu:
In the Create new component menu, select Application:
Select the necessary configuration strategy. There are three configuration strategies:
Create from template – creates a project from a template in accordance with an application language, a build tool, and a framework. This strategy is recommended for projects that start developing their applications from scratch.
Import project - allows configuring a replication from the Git server. While importing the existing repository, select the Git server from the drop-down list and define the relative path to the repository, such as /epmd-edp/examples/basic/edp-auto-tests-simple-example.
Clone project – clones the indicated repository into EPAM Delivery Platform. While cloning the existing repository, it is required to fill in the Repository URL field as well:
In our example, we will use the Create from template strategy:
Select the Git server from the drop-down list and define the relative path to the repository, such as /epmd-edp/examples/basic/edp-auto-tests-simple-example.
Type the name of the application in the Component name field by entering at least two characters and using lower-case letters, numbers, and inner dashes.
Type the application description.
To create an application with an empty repository in Gerrit, select the Empty project check box.
Select any of the supported application languages with their providers in the Application Code Language field:
Java – selecting a specific Java version (8, 11, 17 are available).
JavaScript - selecting JavaScript allows using React, Vue, Angular, Express, Next.js and Antora frameworks.
Python - selecting Python allows using the Python v.3.8, FastAPI, Flask frameworks.
Go - selecting Go allows using the Beego, Gin and Operator SDK frameworks.
C# - selecting C# allows using the .Net v.3.1 and .Net v.6.0 frameworks.
Helm - selecting Helm allows using the Helm framework.
Other - selecting Other allows extending the default code languages when creating a codebase with the clone/import strategy. To add another code language, inspect the Add Other Code Language section.
Note
The Create from template strategy does not allow customizing the default code language set.
Select the necessary language version/framework depending on the Application Code Language field.
Choose the necessary build tool in the Build Tool field:
Java - selecting Java allows using the Gradle or Maven tool.
JavaScript - selecting JavaScript allows using the NPM tool.
C# - selecting C# allows using the .Net tool.
Python - selecting Python allows using the Python tool.
Go - selecting Go allows using the Go tool.
Helm - selecting Helm allows using the Helm tool.
Note
The Select Build Tool field offers the default tools and can be changed in accordance with the selected code language.
Note
Tekton pipelines offer built-in support for Java Maven multi-module projects. These pipelines are capable of recognizing Java deployable modules based on the information in the pom.xml file and performing the relevant deployment actions. It's important to note that although the Dockerfile is typically located in the root directory, Kaniko, the tool used for building container images, uses the target folder within the deployable module's context. For a clear illustration of a multi-module project structure, please refer to this example on GitHub, which showcases a commonly used structure for Java Maven multi-module projects.
The Advanced Settings menu should look similar to the picture below:
Follow the instructions below to fill in the fields of the Advanced Setting menu:
a. Specify the name of the Default branch where you want the development to be performed.
Note
The default branch cannot be deleted. For the Clone project and Import project strategies: if you want to use the existing branch, enter its name into this field.
b. Select the necessary codebase versioning type:
default - using the default versioning type, in order to specify the version of the current artifacts, images, and tags in the Version Control System, a developer should navigate to the corresponding file and change the version manually.
edp - using the edp versioning type, a developer indicates the version number that will be used for all the artifacts stored in the artifactory: binaries, pom.xml, metadata, etc. The version stored in the repository (e.g. pom.xml) will not be affected or used. Using this versioning overrides any version stored in the repository files without changing the actual file.
When selecting the edp versioning type, the extra field will appear:
Type the version number from which you want the artifacts to be versioned.
Note
The Start Version From field should be filled out in compliance with the semantic versioning rules, e.g. 1.2.3 or 10.10.10. Please refer to the Semantic Versioning page for details.
c. Specify the pattern to validate a commit message. Use a regular expression to indicate the pattern that is followed on the project to validate a commit message in the code review pipeline. An example of the pattern: ^\[PROJECT_NAME-\d{4}\]:.*$.
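To sanity-check a message against this pattern locally, a hedged sketch (EPMDEDP stands in for the project key here):
    # Exits 0 when the message matches the validation pattern
    echo "[EPMDEDP-1234]: fix: correct a typo" | grep -E '^\[EPMDEDP-[0-9]{4}\]:.*$'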
d. Select the Integrate with Jira Server check box in case it is required to connect Jira tickets with the commits and have a respective label in the Fix Version field.
Note
To adjust the Jira integration functionality, first apply the necessary changes described on the Adjust Jira Integration page, and set up the Adjust VCS Integration With Jira. Pay attention that the Jira integration feature is not available when using the GitLab CI tool.
e. In the Jira Server field, select the Jira server.
f. Specify the pattern to find a Jira ticket number in a commit message. Based on this pattern, the value from EDP will be displayed in Jira. Combine several variables to obtain the desired value.
Note
The GitLab CI tool is available only with the Import strategy and makes the Jira integration feature unavailable.
g. In the Mapping field name section, specify the names of the Jira fields that should be filled in with attributes from EDP:
Select the name of the field in a Jira ticket from the Mapping field name drop-down menu. The available fields are the following: Fix Version/s, Component/s and Labels.
Click the Add button to add the mapping field name.
Enter Jira pattern for the field name:
For the Fix Version/s field, select the EDP_VERSION variable that represents an EDP upgrade version, as in 2.7.0-SNAPSHOT. Combine variables to make the value more informative. For example, the pattern EDP_VERSION-EDP_COMPONENT will be displayed as 2.7.0-SNAPSHOT-nexus-operator in Jira.
For the Component/s field, select the EDP_COMPONENT variable that defines the name of the existing repository. For example, nexus-operator.
For the Labels field, select the EDP_GITTAG variable that defines a tag assigned to the commit in GitHub. For example, build/2.7.0-SNAPSHOT.59.
Click the bin icon to remove the Jira field name.
h. Click the Apply button to add the application to the Applications list.
Note
After the application is completely added, inspect the Application Overview part.
Note
Since EDP v3.3.0, the CI tool field has been hidden. Now Headlamp automatically defines the CI tool depending on which one is deployed with EDP. If both Jenkins and Tekton are deployed, Headlamp chooses Tekton by default. To define the CI tool manually, operate with the spec.ciTool parameters.
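A hedged sketch of setting this field directly on a Codebase resource (the resource name and namespace are hypothetical; spec.ciTool is the parameter mentioned above):
    # Set the CI tool explicitly instead of relying on auto-detection
    kubectl -n edp patch codebase my-app --type=merge -p '{"spec":{"ciTool":"tekton"}}'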
Headlamp enables you to clone or import an autotest, add it to the environment with its subsequent deployment in Gerrit (in case the Clone strategy is used) and the building of the Code Review pipeline in Jenkins/Tekton, as well as to use it for work with an application under development. It is also possible to use autotests as quality gates in a newly created CD pipeline.
Info
Please refer to the Add Application section for the details on how to add an application codebase type. For the details on how to use autotests as quality gates, please refer to the Stages Menu section of the Add CD Pipeline documentation.
To add an autotest, navigate to the Components section on the navigation bar and click Create (the plus sign icon in the lower-right corner of the screen). Once clicked, the Create new component dialog will appear; select Autotest and choose one of the strategies, which are described later on this page. You can create an autotest in YAML or via the two-step menu in the dialog.
There are two available strategies: clone and import.
The Create new component menu should look like the picture below:
In the Repository onboarding strategy field, select the necessary configuration strategy:
Clone project – clones the indicated repository into EPAM Delivery Platform. While cloning the existing repository, it is required to fill in the Repository URL field as well.
Import project - allows configuring a replication from the Git server. While importing the existing repository, select the Git server from the drop-down list and define the relative path to the repository, such as /epmd-edp/examples/basic/edp-auto-tests-simple-example.
In our example, we will use the Clone project strategy:
While cloning the existing repository, it is required to fill in the Repository URL field.
Select the Git server from the drop-down list and define the relative path to the repository, such as /epmd-edp/examples/basic/edp-auto-tests-simple-example.
Select the Repository credentials check box if you clone a private repository, and fill in the repository login and password/access token.
Fill in the Component name field by entering at least two characters and using lower-case letters, numbers, and inner dashes.
Type the necessary description in the Description field.
In the Autotest code language field, select the Java code language with its framework (specify Java 8 or Java 11 to be used) and get the default Maven build tool, OR add another code language. Selecting Other allows extending the default code languages and getting the necessary build tool; for details, inspect the Add Other Code Language section.
Note
Using the Create strategy does not allow customizing the default code language set.
Select the Java framework if Java is selected above.
The Build Tool field offers the default Maven tool, Gradle, or another build tool, in accordance with the selected code language.
All the autotest reports will be created in the Allure framework that is available in the Autotest Report Framework field by default.
Click the Proceed button to switch to the next menu.
The Advanced Settings menu should look like the picture below:
a. Specify the name of the default branch where you want the development to be performed.
Note
The default branch cannot be deleted.
b. Select the necessary codebase versioning type:
default: Using the default versioning type, in order to specify the version of the current artifacts, images, and tags in the Version Control System, a developer should navigate to the corresponding file and change the version manually.
edp: Using the edp versioning type, a developer indicates the version number from which all the artifacts will be versioned and, as a result, automatically registered in the corresponding file (e.g. pom.xml).
When selecting the edp versioning type, an extra field will appear:
Type the version number from which you want the artifacts to be versioned.
Note
The Start Version From field must be filled out in compliance with the semantic versioning rules, e.g. 1.2.3 or 10.10.10. Please refer to the Semantic Versioning page for details.
c. Specify the pattern to validate a commit message. Use a regular expression to indicate the pattern that is followed on the project to validate a commit message in the code review pipeline. An example of the pattern: ^\[PROJECT_NAME-\d{4}\]:.*$
d. Select the Integrate with Jira Server check box in case it is required to connect Jira tickets with the commits and have a respective label in the Fix Version field.
Note
To adjust the Jira integration functionality, first apply the necessary changes described on the Adjust Jira Integration and Adjust VCS Integration With Jira pages. Pay attention that the Jira integration feature is not available when using the GitLab CI tool.
e. As soon as the Jira server is set, select it in the Jira Server field.
f. Specify the pattern to find a Jira ticket number in a commit message. Based on this pattern, the value from EDP will be displayed in Jira.
g. In the Advanced Mapping section, specify the names of the Jira fields that should be filled in with attributes from EDP:
Select the name of the field in a Jira ticket. The available fields are the following: Fix Version/s, Component/s and Labels.
Click the Add button to add the mapping field name.
Enter the Jira pattern for the field name:
For the Fix Version/s field, select the EDP_VERSION variable that represents an EDP upgrade version, as in 2.7.0-SNAPSHOT. Combine variables to make the value more informative. For example, the pattern EDP_VERSION-EDP_COMPONENT will be displayed as 2.7.0-SNAPSHOT-nexus-operator in Jira.
For the Component/s field, select the EDP_COMPONENT variable that defines the name of the existing repository. For example, nexus-operator.
For the Labels field, select the EDP_GITTAG variable that defines a tag assigned to the commit in GitHub. For example, build/2.7.0-SNAPSHOT.59.
Click the bin icon to remove the Jira field name.
h. Click the Apply button to add the autotest to the Autotests list.
Note
Once the autotest is added, inspect it in the Autotest Overview part.
Note
Since EDP v3.3.0, the CI tool field has been hidden. Headlamp now automatically defines the CI tool depending on which one is deployed with EDP; if both Jenkins and Tekton are deployed, Headlamp chooses Tekton by default. To define the CI tool manually, set the spec.ciTool parameter.
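As with applications, the dialog boils down to a Codebase custom resource, only with the autotest type and a report framework. A minimal, hedged sketch reusing the assumptions from the application example above (names and field spellings are assumptions):

apiVersion: v2.edp.epam.com/v1
kind: Codebase
metadata:
  name: autotests-example       # hypothetical name
spec:
  type: autotest                # distinguishes an autotest codebase from applications and libraries
  strategy: clone
  repository:
    url: https://github.com/example/edp-auto-tests-simple-example.git   # hypothetical repository
  defaultBranch: master
  lang: Java
  framework: java11
  buildTool: maven
  testReportFramework: allure   # the only report framework offered by default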
\ No newline at end of file
diff --git a/headlamp-user-guide/add-cd-pipeline/index.html b/headlamp-user-guide/add-cd-pipeline/index.html
index 989b21604..5df75ed4a 100644
--- a/headlamp-user-guide/add-cd-pipeline/index.html
+++ b/headlamp-user-guide/add-cd-pipeline/index.html
@@ -1 +1 @@
- Add CD Pipeline - EPAM Delivery Platform
Headlamp provides the ability to deploy an environment on your own and specify the essential components.
Navigate to the CD Pipelines section on the navigation bar and click Create (the plus sign icon in the lower-right corner of the screen). Once clicked, the Create CD Pipeline dialog will appear.
The creation of the CD pipeline becomes available as soon as an application is created, provisioned in a branch, and supplied with the necessary entities for the environment. You can create the CD pipeline in YAML or via the three-step menu in the dialog.
The Pipeline tab of the Create CD Pipeline menu should look like the picture below:
Type the name of the pipeline in the Pipeline Name field by entering at least two characters and using lower-case letters, numbers, and inner dashes.
Note
The namespace created by the CD pipeline has the following pattern combination: [edp namespace]-[cd pipeline name]-[stage name]. Please be aware that the namespace length should not exceed 63 symbols.
Select the deployment type from the drop-down list:
Container - the pipeline will be deployed in a Docker container;
Custom - this mode allows deploying non-container applications and customizing the Init stage of the CD pipeline.
Click the Proceed button to switch to the next menu.
The Applications tab of the Create CD Pipeline menu should look like the picture below:
Select the necessary application from the drop-down menu.
Select the plus sign icon near the selected application to specify the necessary codebase Docker branch for the application (the output for the branch and other stages from other CD pipelines).
Select the application branch from the drop-down menu.
Select the Promote in pipeline check box in order to transfer the application from one stage to another by the specified codebase Docker branch. If the Promote in pipeline check box is not selected, the same codebase Docker stream will be deployed regardless of the stage, i.e. the codebase Docker stream input selected for the pipeline will always be used.
Note
The newly created CD pipeline has the following pattern combination: [pipeline name]-[branch name]. If there is another deployed CD pipeline stage with the respective codebase Docker stream (= image stream as an OpenShift term), the pattern combination will be as follows: [pipeline name]-[stage name]-[application name]-[verified].
Click the Proceed button to switch to the next menu.
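Under the hood, the selections made on this tab are stored in a CDPipeline custom resource managed by the cd-pipeline-operator. A hedged sketch with hypothetical names; the apiVersion and field spellings are assumptions and may vary between EDP versions:

apiVersion: v2.edp.epam.com/v1
kind: CDPipeline
metadata:
  name: mypipeline
spec:
  name: mypipeline
  deploymentType: container     # or "custom" for non-container applications
  applications:
    - myapp
  inputDockerStreams:
    - myapp-main                # codebase image stream, pattern: [codebase]-[branch]
  applicationsToPromote:
    - myapp                     # listed here when "Promote in pipeline" is selected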
Click the plus sign icon in the Stages menu and fill in the necessary fields in the Adding Stage window:
a. Type the stage name;
Note
The namespace created by the CD pipeline has the following pattern combination: [cluster name]-[cd pipeline name]-[stage name]. Please be aware that the namespace length should not exceed 63 symbols.
b. Enter the description for this stage;
c. Select the trigger type. The key benefit of the automatic deploy feature is to keep environments up-to-date. The available trigger types are Manual and Auto. When the Auto trigger type is chosen, the CD pipeline will initiate automatically once the image is built. Manual implies that the user has to deploy manually by clicking the Deploy button in the CD Pipeline menu. Please refer to the Architecture Scheme of CD Pipeline Operator page for additional details.
Note
In the Tekton deploy scenario, automatic deploy will start working only after the first manual deploy.
d. Select the job provisioner. In case of working with non-container-based applications, there is an option to use a custom job provisioner. Please refer to the Manage Jenkins CD Job Provision page for details.
e. Select the groovy-pipeline library;
f. Select the branch;
g. Add an unlimited number of quality gates by clicking a corresponding plus sign icon and remove them as well by clicking the recycle bin icon;
h. Type the step name, which will be displayed in Jenkins/Tekton, for every quality gate;
i. Select the quality gate type:
Manual - means that the promoting process should be confirmed in Jenkins/Tekton manually;
Autotests - means that the promoting process should be confirmed by the successful passing of the autotests.
In the additional fields, select the previously created autotest name (j) and specify its branch for the autotest that will be launched on the current stage (k).
Note
Execution sequence. The image promotion and execution of the pipelines depend on the sequence in which the environments are added.
l. Click the Apply button to display the stage in the Stages menu.
Edit the stage by clicking its name and applying changes, and remove the added stage by clicking the recycle bin icon next to its name.
Click the Apply button to start the provisioning of the pipeline. After the CD pipeline is added, the new project with the stage name will be created in OpenShift.
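Each stage added here becomes a Stage custom resource referencing its CD pipeline. A hedged sketch continuing the hypothetical names above; the qualityGates structure is assumed from the cd-pipeline-operator CRD and may differ between EDP versions:

apiVersion: v2.edp.epam.com/v1
kind: Stage
metadata:
  name: mypipeline-qa           # conventionally [pipeline name]-[stage name]
spec:
  cdPipeline: mypipeline
  name: qa
  description: QA stage with a smoke-test quality gate
  order: 0                      # defines the execution sequence of stages
  triggerType: Manual           # Auto deploys as soon as an image is built
  qualityGates:
    - qualityGateType: autotests    # or "manual"
      stepName: smoke
      autotestName: autotests-example   # the autotest codebase created earlier
      branchName: master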
As soon as the CD pipeline is provisioned and added to the CD Pipelines list, there is an ability to:
Create another application by clicking the plus sign icon in the lower-right corner of the screen and performing the same steps as described in the Add CD Pipeline section.
Open CD pipeline data by clicking its link name. Once clicked, the following blocks will be displayed:
General Info - displays common information about the CD pipeline, such as name and deployment type.
Applications - displays the CD pipeline applications to promote.
Stages - displays the CD pipeline stages and stage metadata (by selecting the information icon near the stage name); allows adding, editing and deleting stages, as well as deploying or uninstalling image stream versions of the related applications for a stage.
Metadata - displays the CD pipeline name, namespace, creation date, finalizers, generation, resource version, and UID. Open this block by selecting the information icon near the options icon next to the CD pipeline name.
Edit the CD pipeline by selecting the options icon next to its name in the CD Pipelines list, and then selecting Edit. For details, see the Edit Existing CD Pipeline section.
Delete the added CD pipeline by selecting the options icon next to its name in the CD Pipelines list, and then selecting Delete.
Info
In OpenShift, if the deployment fails with the ImagePullBackOff error, delete the POD.
Sort the existing CD pipelines in a table by clicking the sorting icons in the table header. When sorting by name, the CD pipelines will be displayed in alphabetical order. You can also sort the CD pipelines by their status.
Search for the necessary CD pipeline by namespace, or by entering the corresponding name, language, or build tool into the Filter tool.
Select a number of CD pipelines displayed per page (15, 25 or 50 rows) and navigate between pages if the number of CD pipelines exceeds the capacity of a single page.
Edit the CD pipeline directly from the CD Pipelines overview page or when viewing the CD Pipeline data:
Select Edit in the options icon menu next to the CD pipeline name:
Apply the necessary changes (edit the list of applications for deployment, application branches, and promotion in the pipeline). Add new extra stages by clicking the plus sign icon and filling in the application branch and promotion in the pipeline.
In order to create a new stage for the existing CD pipeline, follow the steps below:
Navigate to the Stages block by clicking the CD pipeline name link in the CD Pipelines list.
Select Create to open the Create stage dialog.
Click Edit YAML in the upper-right corner of the Create stage dialog to open the YAML editor and add a stage. Otherwise, fill in the required fields in the dialog. Please see the Stages Menu section for details.
You cannot remove the last stage, as the CD pipeline does not exist without stages.
In order to delete a stage for the existing CD pipeline, follow the steps below:
Navigate to the Stages block by clicking the CD pipeline name link in the CD Pipelines list.
Select the options icon related to the necessary stage and then select Delete. After the confirmation, the CD stage is deleted with all its components: database record, Jenkins/Tekton pipeline, and cluster namespace.
To view the CD pipeline stage data for the existing CD pipeline, follow the steps below:
Navigate to the Stages block by clicking the CD pipeline name link in the CD Pipelines list.
Select the expand icon near the stage name. The following blocks will be displayed:
Applications - displays the status of the applications related to the stage and allows deploying the applications. Application health and sync statuses are returned from the Argo CD tool.
General Info - displays the stage status, CD pipeline, description, job provisioning, order, trigger type, and source.
Quality Gates - displays the stage quality gate type, step name, autotest name, and branch name.
Navigate to the Applications block of the stage and select an application. Select the image stream version from the drop-down list and click Deploy. The application will be deployed in the Argo CD tool as well.
To update or uninstall the application, select Update or Uninstall.
After this, the application will be updated or uninstalled in the Argo CD tool as well.
Note
In a nutshell, the Update button updates your image version in the Helm chart, whereas the Uninstall button deletes the Helm chart from the namespace where the pipeline is deployed.
\ No newline at end of file
diff --git a/headlamp-user-guide/add-git-server/index.html b/headlamp-user-guide/add-git-server/index.html
index d0dadeee4..b1faf52e8 100644
--- a/headlamp-user-guide/add-git-server/index.html
+++ b/headlamp-user-guide/add-git-server/index.html
@@ -1 +1 @@
- Add Git Server - EPAM Delivery Platform
This article describes how to add a Git Server when deploying EDP with Jenkins. When deploying EDP with Tekton, Git Server is created automatically.
Add Git servers to use the Import strategy for Jenkins and Tekton when creating an application, autotest or library in EDP Headlamp (Codebase Info step of the Create Application/Autotest/Library dialog). Enabling the Import strategy is a prerequisite to integrate EDP with GitLab or GitHub.
Note
The GitServer Custom Resource can also be created manually. See step 3 for the Jenkins import strategy in the Integrate GitHub/GitLab in Jenkins article.
To add a Git server, navigate to the Git servers section on the navigation bar and click Create (the plus sign icon in the lower-right corner of the screen). Once clicked, the Create Git server dialog will appear. You can create a Git server in YAML or via the three-step menu in the dialog.
Private SSH key - enter a private SSH key for Git integration. To generate this key, follow the instructions of the step 1 for Jenkins in the Integrate GitHub/GitLab in Jenkins article.
Access token - enter an access token for Git integration. To generate this token, go to your GitLab/GitHub account settings and create a personal access token with the scopes required for repository access.
Click the Apply button to add the Git server to the Git servers list. As a result, the Git Server object and the corresponding secret for further integration will be created.
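For reference, the dialog produces a GitServer custom resource plus a secret holding the credentials. A minimal, hedged sketch; the apiVersion, the secret name, and the exact field spellings are assumptions based on the GitServer CRD and may differ in your EDP version:

apiVersion: v2.edp.epam.com/v1
kind: GitServer
metadata:
  name: github
spec:
  gitHost: github.com
  gitUser: git
  sshPort: 22
  httpsPort: 443
  nameSshKeySecret: github-sshkey   # hypothetical secret holding the private SSH key/token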
Headlamp allows creating, cloning and importing an infrastructure. Its purpose is to create resources in a cloud provider.
To add an infrastructure, navigate to the Components section on the navigation bar and click Create (the plus sign icon in the lower-right corner of the screen). Once clicked, the Create new component dialog will appear; select Infrastructure and choose one of the strategies described later on this page. You can create an infrastructure in YAML or via the two-step menu in the dialog.
Follow the instructions below to fill in the fields of the Codebase Info menu:
In the Create new component menu, select Infrastructure:
Select the necessary configuration strategy:
Create from template – creates a project from a template in accordance with an infrastructure language, a build tool, and a framework.
Import project - allows configuring a replication from the Git server. While importing the existing repository, select the Git server from the drop-down list and define the relative path to the repository, such as /epmd-edp/examples/basic/edp-auto-tests-simple-example.
Clone project – clones the indicated repository into EPAM Delivery Platform. While cloning the existing repository, it is required to fill in the Repository URL field as well:
In our example, we will use the Create from template strategy:
Select the Git server from the drop-down list and define the Git repo relative path to the repository, such as /epmd-edp/examples/basic/edp-auto-tests-simple-example.
Type the name of the infrastructure in the Component name field by entering at least two characters and by using the lower-case letters, numbers and inner dashes.
Write the description in the Description field.
To create an infrastructure with an empty repository in Gerrit, select the Empty project check box.
Select any of the supported code languages with their providers in the Infrastructure Code Language field. So far, only HCL is supported.
Note
The Create from template strategy does not allow customizing the default code language set.
Select the necessary language version/framework depending on the Infrastructure code language field. So far, only AWS is supported.
Choose the necessary build tool in the Build Tool field. So far, only Terraform is supported.
Note
The Build Tool field offers the default tools and can be changed in accordance with the selected code language.
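The resulting custom resource mirrors the application and autotest sketches earlier on this page; only the type and the language/tool combination change. A hedged sketch with a hypothetical name and assumed field values:

apiVersion: v2.edp.epam.com/v1
kind: Codebase
metadata:
  name: my-infra             # hypothetical name
spec:
  type: infrastructure
  strategy: create           # the Create from template strategy
  defaultBranch: master
  lang: hcl                  # so far, only HCL is supported
  framework: aws             # so far, only AWS is supported
  buildTool: terraform       # so far, only Terraform is supported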
The Advanced Settings menu should look similar to the picture below:
Follow the instructions below to fill in the fields of the Advanced Settings menu:
a. Specify the name of the Default branch where you want the development to be performed.
Note
The default branch cannot be deleted. For the Clone project and Import project strategies: if you want to use the existing branch, enter its name into this field.
b. Select the necessary codebase versioning type:
default - using the default versioning type, in order to specify the version of the current artifacts, images, and tags in the Version Control System, a developer should navigate to the corresponding file and change the version manually.
edp - using the edp versioning type, a developer indicates the version number that will be used for all the artifacts stored in the artifactory: binaries, pom.xml, metadata, etc. The version stored in the repository (e.g. pom.xml) will not be affected or used; this versioning type overrides any version stored in the repository files without changing the actual file.
When selecting the edp versioning type, an extra field will appear:
Type the version number from which you want the artifacts to be versioned.
Note
The Start Version From field should be filled out in compliance with the semantic versioning rules, e.g. 1.2.3 or 10.10.10. Please refer to the Semantic Versioning page for details.
c. Specify the pattern to validate a commit message. Use a regular expression to indicate the pattern that is followed on the project to validate a commit message in the code review pipeline. An example of the pattern: ^\[PROJECT_NAME-\d{4}\]:.*$
d. Select the Integrate with Jira Server check box in case it is required to connect Jira tickets with the commits and have a respective label in the Fix Version field.
Note
To adjust the Jira integration functionality, first apply the necessary changes described on the Adjust Jira Integration page, and set up the VCS integration as described on the Adjust VCS Integration With Jira page. Pay attention that the Jira integration feature is not available when using the GitLab CI tool.
e. In the Jira Server field, select the Jira server.
f. Specify the pattern to find a Jira ticket number in a commit message. Based on this pattern, the value from EDP will be displayed in Jira. Combine several variables to obtain the desired value.
Note
The GitLab CI tool is available only with the Import strategy and makes the Jira integration feature unavailable.
g. In the Mapping field name section, specify the names of the Jira fields that should be filled in with attributes from EDP:
Select the name of the field in a Jira ticket from the Mapping field name drop-down menu. The available fields are the following: Fix Version/s, Component/s and Labels.
Click the Add button to add the mapping field name.
Enter the Jira pattern for the field name:
For the Fix Version/s field, select the EDP_VERSION variable that represents an EDP upgrade version, as in 2.7.0-SNAPSHOT. Combine variables to make the value more informative. For example, the pattern EDP_VERSION-EDP_COMPONENT will be displayed as 2.7.0-SNAPSHOT-nexus-operator in Jira.
For the Component/s field, select the EDP_COMPONENT variable that defines the name of the existing repository. For example, nexus-operator.
For the Labels field, select the EDP_GITTAG variable that defines a tag assigned to the commit in GitHub. For example, build/2.7.0-SNAPSHOT.59.
Click the bin icon to remove the Jira field name.
h. Click the Apply button to add the infrastructure to the Infrastructures list.
Note
Once the infrastructure is added, inspect it in the Infrastructure Overview part.
Note
Since EDP v3.3.0, the CI tool field has been hidden. Headlamp now automatically defines the CI tool depending on which one is deployed with EDP; if both Jenkins and Tekton are deployed, Headlamp chooses Tekton by default. To define the CI tool manually, set the spec.ciTool parameter.
Headlamp helps create, clone and import a library and add it to the environment. A library can also be deployed in Gerrit (if the Clone or Create strategy is used) with the Code Review and Build pipelines built in Jenkins/Tekton.
To add a library, navigate to the Components section on the navigation bar and click Create (the plus sign icon in the lower-right corner of the screen). Once clicked, the Create new component dialog will appear; select Library and choose one of the strategies described later on this page. You can create a library in YAML or via the two-step menu in the dialog.
The Create new component menu should look like the following:
In the Create new component menu, select the necessary configuration strategy. The choice will define the parameters you will need to specify:
Create from template – creates a project on the pattern in accordance with a library language, a build tool, and a framework.
Import project - allows configuring a replication from the Git server. While importing the existing repository, select the Git server from the drop-down list and define the relative path to the repository, such as /epmd-edp/examples/basic/edp-auto-tests-simple-example.
Clone project – clones the indicated repository into EPAM Delivery Platform. While cloning the existing repository, it is required to fill in the Repository URL field as well:
In our example, we will use the Create from template strategy:
While importing the existing repository, select the Git server from the drop-down list and define the relative path to the repository, such as /epmd-edp/examples/basic/edp-auto-tests-simple-example.
Type the name of the library in the Component name field by entering at least two characters and using lower-case letters, numbers and inner dashes.
Type the library description.
To create a library with an empty repository in Gerrit, select the Empty project check box. The empty repository option is available only for the Create from template strategy.
Select any of the supported code languages with its framework in the Library code language field:
Java – allows selecting the specific Java version available.
JavaScript - selecting JavaScript allows using the NPM tool.
Python - selecting Python allows using Python v.3.8, FastAPI, and Flask.
Groovy-pipeline - selecting Groovy-pipeline allows customizing the stages logic. For details, please refer to the Customize CD Pipeline page.
Terraform - selecting Terraform allows using different Terraform versions via the Terraform version manager (tfenv). EDP supports all actions available in Terraform, thus providing the ability to modify the virtual infrastructure and launch some checks with the help of linters. For details, please refer to the Use Terraform Library in EDP page.
Rego - this option allows using Rego code language with an Open Policy Agent (OPA) Library. For details, please refer to the Use Open Policy Agent page.
Container - this option allows using the Kaniko tool for building the container images from a Dockerfile. For details, please refer to the CI Pipeline for Container page.
Helm - this option allows using the chart testing lint pipeline for Helm charts, or using a Helm chart as a set of other Helm charts organized according to the example.
C# - selecting C# allows using .Net v.3.1 and .Net v.6.0.
Other - selecting Other allows extending the default code languages when creating a codebase with the Clone/Import strategy. To add another code language, inspect the Add Other Code Language page.
Note
The Create strategy does not allow customizing the default code language set.
Select the necessary language version/framework depending on the Library code language field.
The Build Tool field offers the default tools and can be changed in accordance with the selected code language.
Click the Proceed button to switch to the next menu.
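Again, the dialog boils down to a Codebase custom resource, this time with the library type; the emptyProject flag is assumed to correspond to the Empty project check box. A hedged sketch with a hypothetical name, following the same assumptions as the earlier sketches:

apiVersion: v2.edp.epam.com/v1
kind: Codebase
metadata:
  name: my-library           # hypothetical name
spec:
  type: library
  strategy: create           # Create from template
  emptyProject: false        # true creates an empty repository in Gerrit
  defaultBranch: master
  lang: Java
  framework: java11
  buildTool: maven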
The Advanced Settings menu should look like the picture below:
a. Specify the name of the default branch where you want the development to be performed.
Note
The default branch cannot be deleted.
b. Select the necessary codebase versioning type:
default: Using the default versioning type, in order to specify the version of the current artifacts, images, and tags in the Version Control System, a developer should navigate to the corresponding file and change the version manually.
edp: Using the edp versioning type, a developer indicates the version number from which all the artifacts will be versioned and, as a result, automatically registered in the corresponding file (e.g. pom.xml).
When selecting the edp versioning type, an extra field will appear:
Type the version number from which you want the artifacts to be versioned.
Note
The Start Version From field should be filled out in compliance with the semantic versioning rules, e.g. 1.2.3 or 10.10.10. Please refer to the Semantic Versioning page for details.
c. Specify the pattern to validate a commit message. Use a regular expression to indicate the pattern that is followed on the project to validate a commit message in the code review pipeline. An example of the pattern: ^\[PROJECT_NAME-\d{4}\]:.*$
d. Select the Integrate with Jira server check box in case it is required to connect Jira tickets with the commits and have a respective label in the Fix Version field.
Note
To adjust the Jira integration functionality, first apply the necessary changes described on the Adjust Jira Integration and Adjust VCS Integration With Jira pages. Pay attention that the Jira integration feature is not available when using the GitLab CI tool.
e. As soon as the Jira server is set, select it in the Jira Server field.
f. Specify the pattern to find a Jira ticket number in a commit message. Based on this pattern, the value from EDP will be displayed in Jira.
g. In the Advanced Mapping section, specify the names of the Jira fields that should be filled in with attributes from EDP:
Select the name of the field in a Jira ticket. The available fields are the following: Fix Version/s, Component/s and Labels.
Click the Add button to add the mapping field name.
Enter the Jira pattern for the field name:
For the Fix Version/s field, select the EDP_VERSION variable that represents an EDP upgrade version, as in 2.7.0-SNAPSHOT. Combine variables to make the value more informative. For example, the pattern EDP_VERSION-EDP_COMPONENT will be displayed as 2.7.0-SNAPSHOT-nexus-operator in Jira.
For the Component/s field, select the EDP_COMPONENT variable that defines the name of the existing repository. For example, nexus-operator.
For the Labels field, select the EDP_GITTAG variable that defines a tag assigned to the commit in GitHub. For example, build/2.7.0-SNAPSHOT.59.
Click the bin icon to remove the Jira field name.
h. Click the Apply button to add the library to the Libraries list.
Note
Once the library is added, inspect it in the Library Overview part.
Note
Since EDP v3.3.0, the CI tool field has been hidden. Headlamp now automatically defines the CI tool depending on which one is deployed with EDP; if both Jenkins and Tekton are deployed, Headlamp chooses Tekton by default. To define the CI tool manually, set the spec.ciTool parameter.
Headlamp helps to create, clone and import a library and add it to the environment. It can also be deployed in Gerrit (if the Clone or Create strategy is used) with the Code Review and Build pipelines built in Jenkins/Tekton.
To add a library, navigate to the Components section on the navigation bar and click Create (the plus sign icon in the lower-right corner of the screen). Once clicked, the Create new component dialog will appear, then select Library and choose one of the strategies which will be described later in this page. You can create a library in YAML or via the two-step menu in the dialog.
The Create new component menu should look like the following:
In the Create new component menu, select the necessary configuration strategy. The choice will define the parameters you will need to specify:
Create from template – creates a project on the pattern in accordance with a library language, a build tool, and a framework.
Import project - allows configuring a replication from the Git server. While importing the existing repository, select the Git server from the drop-down list and define the relative path to the repository, such as /epmd-edp/examples/basic/edp-auto-tests-simple-example.
Clone project – clones the indicated repository into EPAM Delivery Platform. While cloning the existing repository, it is required to fill in the Repository URL field as well:
In our example, we will use the Create from template strategy:
While importing the existing repository, select the Git server from the drop-down list and define the relative path to the repository, such as /epmd-edp/examples/basic/edp-auto-tests-simple-example
Type the name of the library in the Component name field by entering at least two characters and by using the lower-case letters, numbers and inner dashes.
Type the library description.
To create a library with an empty repository in Gerrit, select the Empty project check box. The empty repository option is available only for the Create from template strategy.
Select any of the supported code languages with its framework in the Library code language field:
Java – selecting specific Java version available.
JavaScript - selecting JavaScript allows using the NPM tool.
Python - selecting Python allows using the Python v.3.8, FastAPI, Flask.
Groovy-pipeline - selecting Groovy-pipeline allows having the ability to customize a stages logic. For details, please refer to the Customize CD Pipeline page.
Terraform - selecting Terraform allows using the Terraform different versions via the Terraform version manager (tfenv). EDP supports all actions available in Terraform, thus providing the ability to modify the virtual infrastructure and launch some checks with the help of linters. For details, please refer to the Use Terraform Library in EDP page.
Rego - this option allows using Rego code language with an Open Policy Agent (OPA) Library. For details, please refer to the Use Open Policy Agent page.
Container - this option allows using the Kaniko tool for building the container images from a Dockerfile. For details, please refer to the CI Pipeline for Container page.
Helm - this option allows using the chart testing lint (Pipeline) for Helm charts or using Helm chart as a set of other Helm charts organized according to the example.
C# - selecting C# allows using .Net v.3.1 and .Net v.6.0.
Other - selecting Other allows extending the default code languages when creating a codebase with the Clone/Import strategy. To add another code language, inspect the Add Other Code Language page.
Note
The Create strategy does not allow to customize the default code language set.
Select necessary Language version/framework depending on the Library code language field.
The Select Build Tool field disposes of the default tools and can be changed in accordance with the selected code language.
Click the Proceed button to switch to the next menu.
The Advanced Settings menu should look like the picture below:
a. Specify the name of the default branch where you want the development to be performed.
Note
The default branch cannot be deleted.
b. Select the necessary codebase versioning type:
default: With the default versioning type, to specify the version of the current artifacts, images, and tags in the Version Control System, a developer should navigate to the corresponding file and change the version manually.
edp: Using the edp versioning type, a developer indicates the version number from which all the artifacts will be versioned and, as a result, automatically registered in the corresponding file (e.g. pom.xml).
When selecting the edp versioning type, an extra field will appear:
Type the version number from which you want the artifacts to be versioned.
Note
The Start Version From field should be filled out in compliance with the semantic versioning rules, e.g. 1.2.3 or 10.10.10. Please refer to the Semantic Versioning page for details.
c. Specify the pattern to validate a commit message. Use a regular expression to indicate the pattern that is followed on the project to validate a commit message in the code review pipeline. An example of the pattern: ^\[PROJECT_NAME-\d{4}\]:.*$
d. Select the Integrate with Jira server check box in case it is required to connect Jira tickets with the commits and have a respective label in the Fix Version field.
Note
To adjust the Jira integration functionality, first apply the necessary changes described on the Adjust Jira Integration page, and Adjust VCS Integration With Jira. Pay attention that the Jira integration feature is not available when using the GitLab CI tool.
e. As soon as the Jira server is set, select it in the Jira Server field.
f. Specify the pattern to find a Jira ticket number in a commit message. Based on this pattern, the value from EDP will be displayed in Jira.
g. In the Advanced Mapping section, specify the names of the Jira fields that should be filled in with attributes from EDP:
Select the name of the field in a Jira ticket. The available fields are the following: Fix Version/s, Component/s and Labels.
Click the Add button to add the mapping field name.
Enter Jira pattern for the field name:
For the Fix Version/s field, select the EDP_VERSION variable that represents an EDP upgrade version, as in 2.7.0-SNAPSHOT. Combine variables to make the value more informative. For example, the pattern EDP_VERSION-EDP_COMPONENT will be displayed as 2.7.0-SNAPSHOT-nexus-operator in Jira.
For the Component/s field, select the EDP_COMPONENT variable that defines the name of the existing repository. For example, nexus-operator.
For the Labels field, select the EDP_GITTAG variable that defines a tag assigned to the commit in Git. For example, build/2.7.0-SNAPSHOT.59.
Click the bin icon to remove the Jira field name.
h. Click the Apply button to add the library to the Libraries list.
Note
After the library is added, inspect the Library Overview section.
Note
Since EDP v3.3.0, the CI tool field has been hidden. Now Headlamp automatically defines the CI tool depending on which one is deployed with EDP. If both Jenkins and Tekton are deployed, Headlamp chooses Tekton by default. To define the CI tool manually, use the spec.ciTool parameter.
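For reference, the Edit YAML option in the Create new component dialog exposes the underlying Codebase custom resource. Below is a minimal, illustrative sketch of such a resource for a library created from a template; the apiVersion, field names, and all values are assumptions based on the EDP codebase-operator and may differ between EDP versions:
View: codebase.yaml (illustrative sketch)
apiVersion: v2.edp.epam.com/v1
kind: Codebase
metadata:
  name: my-java-library               # hypothetical component name
  namespace: edp                      # hypothetical EDP namespace
spec:
  type: library
  strategy: create                    # or: clone, import
  lang: Java
  framework: java11
  buildTool: maven
  defaultBranch: master
  emptyProject: false                 # the Empty project check box
  gitServer: gerrit                   # hypothetical Git server resource name
  versioning:
    type: edp
    startFrom: 1.2.3                  # the Start Version From field; edp type only
  commitMessagePattern: '^\[PROJECT_NAME-\d{4}\]:.*$'
  jiraServer: jira                    # hypothetical JiraServer resource name
  jiraIssueMetadataPayload: '{"components":"EDP_COMPONENT","fixVersions":"EDP_VERSION-EDP_COMPONENT","labels":"EDP_GITTAG"}'
  ciTool: tekton                      # optional since EDP v3.3.0; auto-detected otherwise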
With the built-in Marketplace, users can easily create a new application by clicking several buttons. This page contains detailed guidelines on how to create a new component with the help of the Marketplace feature.
This section describes how to use quality gates in EDP and how to customize the quality gate for the CD pipeline with the selected build version of the promoted application between stages.
A quality gate pipeline is a regular Tekton pipeline with a specific label: app.edp.epam.com/pipelinetype: deploy. To add and apply the quality gate to your pipelines, follow the steps below:
1. To use the Tekton pipeline as a quality gate pipeline, add this label to the pipelines:
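A minimal sketch of where the label goes on a Tekton Pipeline; the pipeline, namespace, and task names below are placeholders:
View: pipeline.yaml (illustrative sketch)
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: my-quality-gate-pipeline            # placeholder pipeline name
  namespace: edp                             # placeholder namespace
  labels:
    app.edp.epam.com/pipelinetype: deploy    # marks the pipeline as a quality gate
spec:
  tasks:
    - name: run-autotests                    # placeholder task
      taskRef:
        name: my-autotest-task               # hypothetical Task name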
As soon as the application is successfully provisioned, the following will be created:
Code Review and Build pipelines in Jenkins/Tekton for this application. The Build pipeline will be triggered automatically if at least one environment is already added.
A new project in Gerrit or another VCS.
SonarQube integration will be available after the Build pipeline in Jenkins/Tekton is passed.
Nexus Repository Manager will be available after the Build pipeline in Jenkins/Tekton is passed as well.
The added application will be listed in the Applications list allowing you to do the following:
Application status - displays the application status. It can be red or green depending on whether Headlamp managed to connect to the Git Server with the specified credentials.
Application name (clickable) - displays the application name set during its creation.
Open documentation - opens the documentation that leads to this page.
Enable filtering - enables filtering by name and by the namespace in which this custom resource is located.
Create new application - displays the Create new component menu.
Edit application - edit the application by selecting the options icon next to its name in the applications list, and then selecting Edit. For details see the Edit Existing Application section.
Delete application - remove application by selecting the options icon next to its name in the applications list, and then selecting Delete.
Note
The application that is used in a CD pipeline cannot be removed.
There are also options to sort the applications:
Sort the existing applications in a table by clicking the sorting icons in the table header. Sort the applications alphabetically by their name, language, build tool, framework, and CI tool. You can also sort the applications by their status: Created, Failed, or In progress.
Select a number of applications displayed per page (15, 25 or 50 rows) and navigate between pages if the number of applications exceeds the capacity of a single page:
EDP Headlamp provides the ability to enable, disable or edit the Jira Integration functionality for applications.
To edit an application directly from the Applications overview page or when viewing the application data:
Select Edit in the options icon menu:
The Edit Application dialog opens.
To enable Jira integration, in the Edit Application dialog do the following:
a. Mark the Integrate with Jira server check box and fill in the necessary fields. Please see steps d-h of the Add Application page.
b. Select the Apply button to apply the changes.
c. (Optional) Enable the commit validation mechanism by navigating to Jenkins/Tekton and adding the commit-validate stage to the Code Review pipeline to have your commit messages validated.
To disable Jira integration, in the Edit Application dialog do the following:
a. Unmark the Integrate with Jira server check box.
b. Select the Apply button to apply the changes.
c. (Optional) Disable the commit validation mechanism by navigating to Jenkins/Tekton and removing the commit-validate stage from the Code Review pipeline.
To create, edit and delete application branches, please refer to the Manage Branches page.
As soon as the autotest is successfully provisioned, the following will be created:
Code Review and Build pipelines in Jenkins/Tekton for this autotest. The Build pipeline will be triggered automatically if at least one environment is already added.
A new project in Gerrit or another VCS.
SonarQube integration will be available after the Build pipeline in Jenkins/Tekton is passed.
Nexus Repository Manager will be available after the Build pipeline in Jenkins/Tekton is passed as well.
Info
To navigate quickly to OpenShift, Jenkins/Tekton, Gerrit, SonarQube, Nexus, and other resources, click the Overview section on the navigation bar and hit the necessary link.
The added autotest will be listed in the Autotests list allowing you to do the following:
Autotest status - displays the autotest status. It can be red or green depending on whether Headlamp managed to connect to the Git Server with the specified credentials.
Autotest name (clickable) - displays the autotest name set during its creation.
Open documentation - opens the documentation that leads to this page.
Enable filtering - enables filtering by name and by the namespace in which this custom resource is located.
Create new autotest - displays the Create new component menu.
Edit autotest - edit the autotest by selecting the options icon next to its name in the autotests list, and then selecting Edit. For details see the Edit Existing Autotest section.
Delete autotest - remove autotest with the corresponding database and Jenkins/Tekton pipelines by selecting the options icon next to its name in the Autotests list, and then selecting Delete:
Note
The autotest that is used in a CD pipeline cannot be removed.
There are also options to sort the autotests:
Sort the existing autotests in a table by clicking the sorting icons in the table header. Sort the autotests alphabetically by their name, language, build tool, framework, and CI tool. You can also sort the autotests by their status: Created, Failed, or In progress.
Select a number of autotests displayed per page (15, 25 or 50 rows) and navigate between pages if the number of autotests exceeds the capacity of a single page.
EDP Headlamp provides the ability to enable, disable or edit the Jira Integration functionality for autotests.
To edit an autotest directly from the Autotests overview page or when viewing the autotest data:
Select Edit in the options icon menu:
The Edit Autotest dialog opens.
To enable Jira integration, in the Edit Autotest dialog do the following:
a. Mark the Integrate with Jira server check box and fill in the necessary fields. Please see steps d-h on the Add Autotests page.
b. Select the Apply button to apply the changes.
c. Navigate to Jenkins/Tekton and add the create-jira-issue-metadata stage in the Build pipeline. Also add the commit-validate stage in the Code Review pipeline.
Note
Pay attention that the Jira integration feature is not available when using the GitLab CI tool.
To disable Jira integration, in the Edit Autotest dialog do the following:
a. Unmark the Integrate with Jira server check box.
b. Select the Apply button to apply the changes.
c. Navigate to Jenkins/Tekton and remove the create-jira-issue-metadata stage in the Build pipeline. Also remove the commit-validate stage in the Code Review pipeline.
As a result, the necessary changes will be applied.
To create, edit and delete application branches, please refer to the Manage Branches page.
In order to add an autotest as a quality gate to a newly added CD pipeline, do the following:
Create a CD pipeline with the necessary parameters. Please refer to the Add CD Pipeline section for the details.
In the Stages menu, select the Autotest quality gate type. This means that the promotion process must be confirmed by the successful passing of the autotests.
In the additional fields, select the previously created autotest name and specify its branch.
After filling in all the necessary fields, click the Create button to start the provisioning of the pipeline. After the CD pipeline is added, the new namespace containing the stage name will be created in Kubernetes (in OpenShift, a new project will be created) with the following name pattern: [cluster name]-[cd pipeline name]-[stage name].
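For reference, the quality gate configured in the Stages menu is stored in the Stage custom resource. Below is a minimal, illustrative sketch; the apiVersion, field names, and all names are assumptions based on the EDP cd-pipeline-operator and may differ between EDP versions:
View: stage.yaml (illustrative sketch)
apiVersion: v2.edp.epam.com/v1
kind: Stage
metadata:
  name: mypipeline-qa                 # hypothetical resource name
  namespace: edp                      # hypothetical EDP namespace
spec:
  cdPipeline: mypipeline              # hypothetical CD pipeline name
  name: qa                            # stage name
  qualityGates:
    - qualityGateType: autotests      # the Autotest quality gate type
      stepName: autotest-step         # hypothetical step name
      autotestName: my-autotest       # the previously created autotest
      branchName: master              # the autotest branch to run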
There is an ability to run the autotests locally using an IDE (Integrated Development Environment, such as IntelliJ IDEA, NetBeans, etc.). To launch the autotest project for local verification, perform the following steps:
Clone the project to the local machine.
Open the project in the IDE and find the run.json file to copy out the necessary command value.
Paste the copied command value into the Command line field and run it with the necessary values and namespace.
As a result, all the launched tests will be executed.
This section describes the subsequent possible actions that can be performed with the newly added or existing clusters.
In a nutshell, a cluster in Headlamp is a Kubernetes secret that stores the credentials and endpoint needed to connect to another cluster. Adding new clusters allows users to deploy applications across several clusters, thus improving the flexibility of your infrastructure.
The added cluster will be listed in the clusters list allowing you to do the following:
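As an illustration only, such a secret might look as follows; the secret name, namespace, and the exact data keys Headlamp expects are assumptions and should be verified against your EDP version:
View: cluster-secret.yaml (illustrative sketch)
apiVersion: v1
kind: Secret
metadata:
  name: external-cluster              # hypothetical secret name
  namespace: edp                      # hypothetical EDP namespace
type: Opaque
stringData:
  config: |                           # kubeconfig-style endpoint and credentials (assumed key name)
    apiVersion: v1
    kind: Config
    clusters:
    - name: external-cluster
      cluster:
        server: https://api.example.com:6443   # endpoint of the target cluster
    users:
    - name: admin
      user:
        token: <bearer-token>                  # placeholder credentials
    contexts:
    - name: external-cluster
      context:
        cluster: external-cluster
        user: admin
    current-context: external-cluster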
Git Server is a custom resource that is required for using the import strategy when creating a new component, whether it is an application, library, autotest or infrastructure.
Under the hood, Git server in Headlamp is a Kubernetes secret that stores credentials to the remote Git server.
The added Git Server will be listed in the Git Servers list, allowing you to do the following:
Git Server status - displays the Git Server status. It can be red or green depending on whether Headlamp managed to connect to the Git Server with the specified credentials.
Git Server name - displays the Git Server name set during the Git Server creation.
Open documentation - opens the documentation that leads to this page.
Enable filtering - enables filtering by the Git Server name and the namespace in which this custom resource is located.
Create new Git Server - displays the Create Git Server menu.
Note
Git Server can't be deleted via the Headlamp UI. Use the kubectl delete GitServer <Git_server_name> -n <edp-project> command to delete the GitServer custom resource.
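For reference, the GitServer custom resource itself might look like the sketch below; the apiVersion, field names, and all values are assumptions based on the EDP codebase-operator and may differ between EDP versions:
View: gitserver.yaml (illustrative sketch)
apiVersion: v2.edp.epam.com/v1
kind: GitServer
metadata:
  name: github                        # hypothetical Git server name
  namespace: edp                      # hypothetical EDP namespace
spec:
  gitHost: github.com                 # remote Git server host
  gitUser: git                        # SSH user
  sshPort: 22
  httpsPort: 443
  nameSshKeySecret: git-server-ssh-key    # hypothetical secret holding the SSH private key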
The EDP Headlamp user guide is intended for developers and provides details on working with EDP Headlamp, different codebase types, and EDP CI/CD flow.
Headlamp is a central management tool in the EDP ecosystem that provides the ability to define pipelines, project resources, and new technologies in a simple way. Headlamp enables you to manage business entities:
Create such codebase types as Applications, Libraries, Autotests and Infrastructures;
Create/Update CD Pipelines;
Add external Git servers and Clusters.
Navigation bar – consists of the following sections: Overview, Marketplace, Components, CD Pipelines, and Configuration.
Top panel bar – contains documentation link, notifications, Headlamp settings, and cluster settings, such as default and allowed namespaces.
Main links – displays the corresponding links to the major adjusted toolset, to the management tool and to the OpenShift cluster.
Filters – used for searching and filtering the namespaces.
Headlamp is a complete tool allowing you to manage and control the codebases (applications, autotests, libraries and infrastructures) added to the environment as well as to create a CD pipeline.
Inspect the main features available in Headlamp by following the corresponding link:
As soon as the infrastructure is successfully provisioned, the following will be created:
Code Review and Build pipelines in Jenkins/Tekton for this infrastructure. The Build pipeline will be triggered automatically if at least one environment is already added.
A new project in Gerrit or another VCS.
SonarQube integration will be available after the Build pipeline in Jenkins/Tekton is passed.
Nexus Repository Manager will be available after the Build pipeline in Jenkins/Tekton is passed as well.
The added infrastructure will be listed in the Infrastructures list, allowing you to do the following:
Infrastructure status - displays the infrastructure status. It can be red or green depending on whether Headlamp managed to connect to the Git Server with the specified credentials.
Infrastructure name (clickable) - displays the infrastructure name set during its creation.
Open documentation - opens the documentation that leads to this page.
Enable filtering - enables filtering by name and by the namespace in which this custom resource is located.
Create new infrastructure - displays the Create new component menu.
Edit infrastructure - edit the infrastructure by selecting the options icon next to its name in the infrastructures list, and then selecting Edit. For details, see the Edit Existing Infrastructure section.
Delete infrastructure - remove infrastructure by selecting the options icon next to its name in the infrastructures list, and then selecting Delete.
There are also options to sort the infrastructures:
Sort the existing infrastructures in a table by clicking the sorting icons in the table header. Sort the infrastructures alphabetically by their name, language, build tool, framework, and CI tool. You can also sort the infrastructures by their status: Created, Failed, or In progress.
Select a number of infrastructures displayed per page (15, 25 or 50 rows) and navigate between pages if the number of infrastructures exceeds the capacity of a single page.
EDP Headlamp provides the ability to enable, disable or edit the Jira Integration functionality for infrastructures.
To edit an infrastructure directly from the infrastructures overview page or when viewing the infrastructure data:
Select Edit in the options icon menu:
The Edit Infrastructure dialog opens.
To enable Jira integration, in the Edit Infrastructure dialog do the following:
a. Mark the Integrate with Jira server check box and fill in the necessary fields. Please see steps d-h on the Add Infrastructure page.
b. Select the Apply button to apply the changes.
c. Navigate to Jenkins/Tekton and add the create-jira-issue-metadata stage in the Build pipeline. Also add the commit-validate stage in the Code Review pipeline.
To disable Jira integration, in the Edit Infrastructure dialog do the following:
a. Unmark the Integrate with Jira server check box.
b. Select the Apply button to apply the changes.
c. Navigate to Jenkins/Tekton and remove the create-jira-issue-metadata stage in the Build pipeline. Also remove the commit-validate stage in the Code Review pipeline.
To create, edit and delete infrastructure branches, please refer to the Manage Branches page.
As soon as the library is successfully provisioned, the following will be created:
Code Review and Build pipelines in Jenkins/Tekton for this library. The Build pipeline will be triggered automatically if at least one environment is already added.
A new project in Gerrit or another VCS.
SonarQube integration will be available after the Build pipeline in Jenkins/Tekton is passed.
Nexus Repository Manager will be available after the Build pipeline in Jenkins/Tekton is passed as well.
Info
To navigate quickly to OpenShift, Jenkins/Tekton, Gerrit, SonarQube, Nexus, and other resources, click the Overview section on the navigation bar and hit the necessary link.
The added library will be listed in the Libraries list, allowing you to do the following:
Create another library by clicking the plus sign icon in the lower-right corner of the screen and performing the same steps as described on the Add Library page.
Open library data by clicking its link name. Once clicked, the following blocks will be displayed:
Library status - displays the library status. It can be red or green depending on whether Headlamp managed to connect to the Git Server with the specified credentials.
Library name (clickable) - displays the library name set during its creation.
Open documentation - opens the documentation that leads to this page.
Enable filtering - enables filtering by name and by the namespace in which this custom resource is located.
Create new library - displays the Create new component menu.
Edit library - edit the library by selecting the options icon next to its name in the libraries list, and then selecting Edit. For details see the Edit Existing Library section.
Delete Library - remove library with the corresponding database and Jenkins/Tekton pipelines by selecting the options icon next to its name in the libraries list, and then selecting Delete.
Note
The library that is used in a CD pipeline cannot be removed.
There are also options to sort the libraries:
Sort the existing libraries in a table by clicking the sorting icons in the table header. Sort the libraries alphabetically by their name, language, build tool, framework, and CI tool. You can also sort the libraries by their status: Created, Failed, or In progress.
Select a number of libraries displayed per page (15, 25 or 50 rows) and navigate between pages if the number of libraries exceeds the capacity of a single page.
EDP Headlamp provides the ability to enable, disable or edit the Jira Integration functionality for libraries.
To edit a library directly from the Libraries overview page or when viewing the library data:
Select Edit in the options icon menu:
The Edit Library dialog opens.
To enable Jira integration, in the Edit Library dialog do the following:
a. Mark the Integrate with Jira server check box and fill in the necessary fields. Please see steps d-h on the Add Library page.
b. Select the Apply button to apply the changes.
c. Navigate to Jenkins/Tekton and add the create-jira-issue-metadata stage in the Build pipeline. Also add the commit-validate stage in the Code Review pipeline.
To disable Jira integration, in the Edit Library dialog do the following:
a. Unmark the Integrate with Jira server check box.
b. Select the Apply button to apply the changes.
c. Navigate to Jenkins/Tekton and remove the create-jira-issue-metadata stage in the Build pipeline. Also remove the commit-validate stage in the Code Review pipeline.
As a result, the necessary changes will be applied.
To create, edit and delete library branches, please refer to the Manage Branches page.
When working with libraries, pay attention when specifying the branch name: the branch name is involved in the formation of the library version, so it must comply with the semantic versioning rules for the library.
When adding a component, the default branch is the master branch. In order to add a new branch, follow the steps below:
Navigate to the Branches block by clicking the component name link in the Components list.
Select the options icon related to the necessary branch and then select Create:
Click Edit YAML in the upper-right corner of the dialog to open the YAML editor and add a branch. Otherwise, fill in the required fields in the dialog:
a. Release Branch - select the Release Branch check box if you need to create a release branch.
b. Branch name - type the branch name. Pay attention that this field remains static if you create a release branch. For the Clone and Import strategies: if you want to use the existing branch, enter its name into this field.
c. From Commit Hash - paste the commit hash from which the branch will be created. For the Clone and Import strategies: Note that if the From Commit Hash field is empty, the latest commit from the branch name will be used.
d. Branch version - enter the necessary branch version for the artifact. The Release Candidate (RC) postfix is concatenated to the branch version number.
e. Default branch version - type the branch version that will be used in a master branch after the release creation. The Snapshot postfix is concatenated to the master branch version number.
f. Click the Apply button and wait until the new branch is added to the list.
Info
The steps above describe adding a new branch in the context of the edp versioning type.
The default component repository is cloned and changed to the new indicated version before the build, i.e. the new indicated version will not be committed to the repository; thus, the existing repository will keep the default version.
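Under the hood, each branch is represented by a CodebaseBranch custom resource. Below is a minimal, illustrative sketch for a release branch under the edp versioning type; the apiVersion, field names, and all names are assumptions based on the EDP codebase-operator and may differ between EDP versions:
View: codebasebranch.yaml (illustrative sketch)
apiVersion: v2.edp.epam.com/v1
kind: CodebaseBranch
metadata:
  name: my-java-library-release-1.2   # hypothetical resource name
  namespace: edp                      # hypothetical EDP namespace
spec:
  codebaseName: my-java-library       # the component the branch belongs to
  branchName: release/1.2             # hypothetical branch name
  fromCommit: ""                      # empty: branch from the latest commit
  release: true                       # marks the branch as a release branch
  version: 1.2.0-RC                   # branch version with the RC postfix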
The EDP Marketplace offers a range of Templates, predefined tools and settings for creating software. These Templates speed up development, minimize errors, and ensure consistency. A key EDP Marketplace feature is customization. Organizations can create and share their own Templates, finely tuned to their needs. Each Template serves as a tailored blueprint of tools and settings.
These tailored Templates include preset CI/CD pipelines, automating your development workflows. From initial integration to final deployment, these processes are efficiently managed. Whether for new applications or existing ones, these templates enhance processes, save time, and ensure consistency.
To see the Marketplace section, navigate to the Main menu -> EDP -> Marketplace. The general view of the Marketplace section is described below:
Marketplace templates - all the components marketplace can offer;
Template properties - the item summary that shows the type, category, language, framework, build tool and maturity;
Enable/disable filters - allows users to search by the item name or the namespace it is available in;
Change view - allows switching from the listed view to the tiled one and vice versa. See the screenshot below for details.
It is also possible to switch to the tiled view instead of the listed one:
To view the details of a marketplace item, simply click on its name:
The details window shows supplemental information, such as the item's author, keywords, release version, and the link to the repository it is located in. The window also contains the Create from template button that allows users to create a component from the chosen template. The procedure of creating new components is described on the Add Component via Marketplace page.
\ No newline at end of file
diff --git a/index.html b/index.html
index d87b82fc3..3988ea3a6 100644
--- a/index.html
+++ b/index.html
@@ -148,4 +148,4 @@
width: 33.3%;
}
}
Build your delivery rocket
Boost your delivery with the development culture based on the modern CI/CD stack, golden path and self-service capabilities of the EPAM Delivery Platform (EDP).
Every Jenkins agent is based on epamedp/edp-jenkins-base-agent. Check DockerHub for the latest version. Use it to create a new agent (or update an old one). See the example with the Dockerfile of the gradle-java11-agent below:
View: Dockerfile
# Copyright 2021 EPAM Systems.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
diff --git a/operator-guide/add-other-code-language/index.html b/operator-guide/add-other-code-language/index.html
index d8588ce66..e4dec6604 100644
--- a/operator-guide/add-other-code-language/index.html
+++ b/operator-guide/add-other-code-language/index.html
@@ -1,4 +1,4 @@
- Add Other Code Language - EPAM Delivery Platform
Open the new repository and create a directory with the /src/com/epam/edp/customStages/impl/ci/impl/stageName/ name in the library repository, for example: /src/com/epam/edp/customStages/impl/ci/impl/security/. After that, add a Groovy file with another name to the same stages catalog, for example: CustomSAST.groovy.
Argo CD Integration - EPAM Delivery Platform
EDP uses Jenkins Pipeline as a part of the Continuous Delivery/Continuous Deployment implementation. Another approach is to use the Argo CD tool as an alternative to Jenkins. Argo CD follows GitOps best practices, uses a Kubernetes-native approach for deployment management, and has a rich UI and the required RBAC capabilities.
Both approaches can be deployed with High Availability (HA) or Non High Availability (non HA) installation manifests.
EDP uses the HA deployment with cluster-admin permissions to minimize cluster resource consumption by sharing a single Argo CD instance across multiple EDP Tenants. Please follow the installation instructions to deploy Argo CD.
Argo CD is deployed in a separate argocd namespace.
Argo CD uses a cluster-admin role for managing cluster-scope resources.
The control-plane application is created using the App of Apps approach, and its code is managed by the control-plane members.
The control-plane is used to onboard new Argo CD Tenants (Argo CD Projects - AppProject).
The EDP Tenant Member manages Argo CD Applications using kind: Application in the edpTenant namespace.
The App Of Apps approach is used to manage the EDP Tenants. Inspect the edp-grub repository structure that is used to provide the EDP Tenants for the Argo CD Projects:
edp-grub
├── LICENSE
├── README.md
├── apps          ### All Argo CD Applications are stored here
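For reference, an EDP Tenant Member would describe a deployment with a standard Argo CD Application manifest. Below is a minimal, illustrative sketch; the project, repository, and namespace values are placeholders:
View: application.yaml (illustrative sketch)
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: my-app                        # placeholder application name
  namespace: edp-tenant               # the EDP Tenant namespace (placeholder)
spec:
  project: edp-tenant                 # the AppProject onboarded for the EDP Tenant (placeholder)
  source:
    repoURL: https://github.com/example/deployments.git   # placeholder repository
    targetRevision: main
    path: apps/my-app
  destination:
    server: https://kubernetes.default.svc
    namespace: edp-tenant             # placeholder target namespace
  syncPolicy:
    automated: {}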
diff --git a/operator-guide/configure-keycloak-oidc-eks/index.html b/operator-guide/configure-keycloak-oidc-eks/index.html
index e8f1a383d..975ad23a5 100644
--- a/operator-guide/configure-keycloak-oidc-eks/index.html
+++ b/operator-guide/configure-keycloak-oidc-eks/index.html
@@ -1,4 +1,4 @@
- EKS OIDC With Keycloak - EPAM Delivery Platform
To follow the instruction, check the following prerequisites:
terraform 0.14.10
hashicorp/aws = 4.8.0
mrparkers/keycloak >= 3.0.0
hashicorp/kubernetes ~> 2.9.0
kubectl = 1.22
kubelogin >= v1.25.1
Ensure that Keycloak has network availability for AWS (not in a private network).
Note
To connect OIDC with a cluster, install and configure the kubelogin plugin. For Windows, it is recommended to download the kubelogin as a binary and add it to your PATH.
The solution includes three types of resources - AWS (EKS), Keycloak, and Kubernetes. The Keycloak resources shown on the left remain unchanged after creation, which allows associating a claim with a user group membership. Other resources can be created, deleted, or changed if needed. The most crucial Kubernetes permission resources are RoleBindings and ClusterRoles/Roles. Roles represent a set of permissions; in turn, RoleBindings map a Kubernetes Role to representative Keycloak groups, so a group member gets just the appropriate permissions.
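As an example, a RoleBinding that grants a Keycloak group read-only access to a namespace could look like this; the binding, group, and namespace names are placeholders, and the group name must match the group membership claim delivered via OIDC:
View: rolebinding.yaml (illustrative sketch)
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: keycloak-developers-view      # placeholder binding name
  namespace: edp                      # placeholder namespace
subjects:
- kind: Group
  name: developers                    # Keycloak group from the OIDC groups claim (placeholder)
  apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: ClusterRole
  name: view                          # built-in read-only ClusterRole
  apiGroup: rbac.authorization.k8s.io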
Harbor serves as a tool for storing images and artifacts. This documentation contains instructions on how to create a project in Harbor and set up a robot account for interacting with the registry from CI pipelines.
Harbor integration with Tekton enables the centralized storage of container images within the cluster, eliminating the need for external services. By leveraging Harbor as the container registry, users can manage and store their automation results and reports in one place.
The process of creating new projects is the following:
Log in to the Harbor console using your credentials.
Navigate to the Projects menu, click the New Project button:
On the New Project menu, enter a project name that matches your EDP namespace in the Project Name field. Keep other fields as default and click OK to continue:
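Once the project and robot account exist, CI pipelines usually authenticate to the registry with a dockerconfigjson secret. Below is a minimal, illustrative sketch; the secret name, registry host, and robot account credentials are placeholders, and the exact secret name EDP expects may differ:
View: harbor-registry-secret.yaml (illustrative sketch)
apiVersion: v1
kind: Secret
metadata:
  name: regcred                       # placeholder secret name
  namespace: edp                      # placeholder EDP namespace
type: kubernetes.io/dockerconfigjson
stringData:
  .dockerconfigjson: |
    {
      "auths": {
        "harbor.example.com": {
          "username": "robot$edp+ci",
          "password": "<robot-account-token>",
          "auth": "<base64 of username:password>"
        }
      }
    }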
It is highly recommended to delete all the resources created via the Headlamp UI first. These can be:
Applications;
Libraries;
Autotests;
Infrastructures;
CD Pipelines.
We recommend deleting them via the Headlamp UI, although it is also possible to delete all the Headlamp resources using the kubectl delete command.
Delete application namespaces. They should be called according to the <edp-project>-<cd-pipeline>-<stage-name> pattern.
Uninstall EDP the same way it was installed.
Run the script that deletes the rest of the custom resources: