[Build] Provide task to build all publicly available artifacts
We have suggested using `./gradlew assemble` in our README to build Elasticsearch artifacts. This
causes problems because it also builds our Wolfi image, which is based on a Wolfi base image that is not publicly available.
To work around this, we introduce an `assemblePublicArtifacts` task and reference it in our docs
for third-party users who build Elasticsearch from source.
This also reduces what gets built beyond the distribution, such as test fixtures that are not part
of the distribution.
breskeby committed Nov 7, 2024
1 parent b8fc7d6 commit 2baad8d
Showing 2 changed files with 75 additions and 35 deletions.
56 changes: 28 additions & 28 deletions README.asciidoc
@@ -4,7 +4,7 @@ Elasticsearch is a distributed search and analytics engine, scalable data store

Use cases enabled by Elasticsearch include:

* https://www.elastic.co/search-labs/blog/articles/retrieval-augmented-generation-rag[Retrieval Augmented Generation (RAG)]
* https://www.elastic.co/search-labs/blog/categories/vector-search[Vector search]
* Full-text search
* Logs
@@ -17,7 +17,7 @@ Use cases enabled by Elasticsearch include:
To learn more about Elasticsearch's features and capabilities, see our
https://www.elastic.co/products/elasticsearch[product page].

For information on https://www.elastic.co/search-labs/blog/categories/ml-research[machine learning innovations] and the latest https://www.elastic.co/search-labs/blog/categories/lucene[Lucene contributions from Elastic], see https://www.elastic.co/search-labs[Search Labs].

[[get-started]]
== Get started
@@ -27,20 +27,20 @@ https://www.elastic.co/cloud/as-a-service[Elasticsearch Service on Elastic
Cloud].

If you prefer to install and manage Elasticsearch yourself, you can download
the latest version from
https://www.elastic.co/downloads/elasticsearch[elastic.co/downloads/elasticsearch].

=== Run Elasticsearch locally

////
IMPORTANT: This content is replicated in the Elasticsearch repo. See `run-elasticsearch-locally.asciidoc`.
Ensure both files are in sync.
https://github.com/elastic/start-local is the source of truth.
////

[WARNING]
====
DO NOT USE THESE INSTRUCTIONS FOR PRODUCTION DEPLOYMENTS.
This setup is intended for local development and testing only.
@@ -93,20 +93,20 @@ Use this key to connect to Elasticsearch with a https://www.elastic.co/guide/en/
From the `elastic-start-local` folder, check the connection to Elasticsearch using `curl`:

[source,sh]
----
source .env
curl $ES_LOCAL_URL -H "Authorization: ApiKey ${ES_LOCAL_API_KEY}"
----
// NOTCONSOLE

=== Send requests to Elasticsearch

You send data and other requests to Elasticsearch through REST APIs.
You can interact with Elasticsearch using any client that sends HTTP requests,
such as the https://www.elastic.co/guide/en/elasticsearch/client/index.html[Elasticsearch
language clients] and https://curl.se[curl].

==== Using curl

Here's an example curl command to create a new Elasticsearch index, using basic auth:
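A minimal sketch, assuming a cluster listening on `http://localhost:9200` with security enabled and the built-in `elastic` user's password exported as `ELASTIC_PASSWORD` (both illustrative assumptions, not the repository's exact snippet):

[source,sh]
----
# Create an index named "my-index" using basic auth.
# ELASTIC_PASSWORD and the index name are placeholder values.
curl -u "elastic:${ELASTIC_PASSWORD}" -X PUT "http://localhost:9200/my-index"
----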

@@ -149,19 +149,19 @@ print(client.info())

==== Using the Dev Tools Console

Kibana's developer console provides an easy way to experiment and test requests.
To access the console, open Kibana, then go to **Management** > **Dev Tools**.

**Add data**

You index data into Elasticsearch by sending JSON objects (documents) through the REST APIs.
Whether you have structured or unstructured text, numerical data, or geospatial data,
Elasticsearch efficiently stores and indexes it in a way that supports fast searches.

For timestamped data such as logs and metrics, you typically add documents to a
data stream made up of multiple auto-generated backing indices.
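For illustration, a request along these lines appends a document to a data stream, assuming an index template matching the `logs-*-*` pattern exists (recent Elasticsearch versions ship a built-in one); the data stream name here is a made-up example:

----
POST logs-myapp-default/_doc
{
  "@timestamp": "2024-11-07T12:00:00Z",
  "message": "example log line"
}
----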

To add a single document to an index, submit an HTTP POST request that targets the index.

----
POST /customer/_doc/1
@@ -171,19 +171,19 @@
}
----

This request automatically creates the `customer` index if it doesn't exist,
adds a new document that has an ID of 1, and
stores and indexes the `firstname` and `lastname` fields.

The new document is available immediately from any node in the cluster.
You can retrieve it with a GET request that specifies its document ID:

----
GET /customer/_doc/1
----

To add multiple documents in one request, use the `_bulk` API.
Bulk data must be newline-delimited JSON (NDJSON).
Each line must end in a newline character (`\n`), including the last line.

----
@@ -200,15 +200,15 @@ PUT customer/_bulk

**Search**

Indexed documents are available for search in near real-time.
The following search matches all customers with a first name of _Jennifer_
in the `customer` index.

----
GET customer/_search
{
"query" : {
"match" : { "firstname": "Jennifer" }
"match" : { "firstname": "Jennifer" }
}
}
----
@@ -223,9 +223,9 @@ data streams, or index aliases.

. Go to **Management > Stack Management > Kibana > Data Views**.
. Select **Create data view**.
. Enter a name for the data view and a pattern that matches one or more indices,
such as _customer_.
. Select **Save data view to Kibana**.

To start exploring, go to **Analytics > Discover**.

@@ -256,7 +256,7 @@ To build a distribution for another platform, run the related command:

To build distributions for all supported platforms, run:
----
./gradlew assemblePublicArtifacts
----

Distributions are output to `distribution/archives`.
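If you only need a distribution for the platform you are building on, the build also offers a shortcut task; the exact task name can vary by branch, but something like the following is typically enough:

----
./gradlew localDistro
----
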
@@ -281,7 +281,7 @@ The https://github.com/elastic/elasticsearch-labs[`elasticsearch-labs`] repo con
[[contribute]]
== Contribute

For contribution guidelines, see xref:CONTRIBUTING.md[CONTRIBUTING].

[[questions]]
== Questions? Problems? Suggestions?
54 changes: 47 additions & 7 deletions build.gradle
@@ -13,14 +13,13 @@ import com.avast.gradle.dockercompose.tasks.ComposePull
import com.fasterxml.jackson.databind.JsonNode
import com.fasterxml.jackson.databind.ObjectMapper

import org.elasticsearch.gradle.DistributionDownloadPlugin
import org.elasticsearch.gradle.Version
import org.elasticsearch.gradle.internal.BaseInternalPluginBuildPlugin
import org.elasticsearch.gradle.internal.ResolveAllDependencies
import org.elasticsearch.gradle.internal.info.BuildParams
import org.elasticsearch.gradle.util.GradleUtils
import org.gradle.plugins.ide.eclipse.model.AccessRule
import org.gradle.plugins.ide.eclipse.model.ProjectDependency

import java.nio.file.Files

@@ -89,7 +88,7 @@ class ListExpansion {

// Filters out intermediate patch releases to reduce the load of CI testing
def filterIntermediatePatches = { List<Version> versions ->
versions.groupBy {"${it.major}.${it.minor}"}.values().collect {it.max()}
versions.groupBy { "${it.major}.${it.minor}" }.values().collect { it.max() }
}

tasks.register("updateCIBwcVersions") {
@@ -101,7 +100,10 @@ tasks.register("updateCIBwcVersions") {
}
}

def writeBuildkitePipeline = { String outputFilePath,
String pipelineTemplatePath,
List<ListExpansion> listExpansions,
List<StepExpansion> stepExpansions = [] ->
def outputFile = file(outputFilePath)
def pipelineTemplate = file(pipelineTemplatePath)

@@ -132,7 +134,12 @@ tasks.register("updateCIBwcVersions") {
// Writes a Buildkite pipeline from a template, and replaces $BWC_STEPS with a list of steps, one for each version
// Useful when you need to configure more versions than are allowed in a matrix configuration
def expandBwcSteps = { String outputFilePath, String pipelineTemplatePath, String stepTemplatePath, List<Version> versions ->
writeBuildkitePipeline(
outputFilePath,
pipelineTemplatePath,
[],
[new StepExpansion(templatePath: stepTemplatePath, versions: versions, variable: "BWC_STEPS")]
)
}

doLast {
@@ -150,7 +157,11 @@ tasks.register("updateCIBwcVersions") {
new ListExpansion(versions: filterIntermediatePatches(BuildParams.bwcVersions.unreleasedIndexCompatible), variable: "BWC_LIST"),
],
[
new StepExpansion(templatePath: ".buildkite/pipelines/periodic.bwc.template.yml", versions: filterIntermediatePatches(BuildParams.bwcVersions.indexCompatible), variable: "BWC_STEPS"),
new StepExpansion(
templatePath: ".buildkite/pipelines/periodic.bwc.template.yml",
versions: filterIntermediatePatches(BuildParams.bwcVersions.indexCompatible),
variable: "BWC_STEPS"
),
]
)

@@ -302,7 +313,7 @@ allprojects {
if (project.path.startsWith(":x-pack:")) {
if (project.path.contains("security") || project.path.contains(":ml")) {
tasks.register('checkPart4') { dependsOn 'check' }
} else if (project.path == ":x-pack:plugin" || project.path.contains("ql") || project.path.contains("smoke-test")) {
} else if (project.path == ":x-pack:plugin" || project.path.contains("ql") || project.path.contains("smoke-test")) {
tasks.register('checkPart3') { dependsOn 'check' }
} else if (project.path.contains("multi-node")) {
tasks.register('checkPart5') { dependsOn 'check' }
@@ -453,6 +464,35 @@ tasks.register("buildReleaseArtifacts").configure {
.findAll { it != null }
}

/**
* This task allows third-party open source users to build all artifacts
* that are built from publicly available sources.
* We used to suggest running `./gradlew assemble`, but this leads to:
* - building artifacts that are not required by such users (e.g. internally used test fixtures)
* - failures when building docker images based on non-public base images (e.g. wolfi)
*/
tasks.register("assemblePublicArtifacts").configure {
group = 'build'
description = 'Builds all artifacts, including public docker images, that are built from publicly available sources'

dependsOn allprojects.findAll {
it.path.startsWith(':distribution:docker') == false
&& it.path.startsWith(':ml-cpp') == false
&& it.path.startsWith(':distribution:bwc') == false
&& it.path.startsWith(':test:fixture') == false
}
.collect { GradleUtils.findByName(it.tasks, 'assemble') }
.findAll { it != null }

dependsOn ":distribution:docker:buildDockerImage"
dependsOn ":distribution:docker:buildAarch64DockerImage"
dependsOn ":distribution:docker:buildAarch64IronBankDockerImage"
dependsOn ":distribution:docker:buildIronBankDockerImage"
dependsOn ":distribution:docker:buildAarch64UbiDockerImage"
dependsOn ":distribution:docker:buildUbiDockerImage"

}

tasks.register("spotlessApply").configure {
dependsOn gradle.includedBuild('build-tools').task(':spotlessApply')
dependsOn gradle.includedBuild('build-tools').task(':reaper:spotlessApply')
