Merge branch 'opensearch-project:main' into conc-test-changes
kasundra07 authored Nov 8, 2023
2 parents b759c33 + 0a9dfec commit de91c97
Showing 109 changed files with 4,048 additions and 228 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/lucene-snapshots.yml
@@ -50,7 +50,7 @@ jobs:
run: ./gradlew publishJarsPublicationToMavenLocal -Pversion.suffix=snapshot-${{ steps.version.outputs.REVISION }}

- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v2
uses: aws-actions/configure-aws-credentials@v4
with:
role-to-assume: ${{ secrets.LUCENE_SNAPSHOTS_ROLE }}
aws-region: us-west-2
2 changes: 1 addition & 1 deletion .github/workflows/publish-maven-snapshots.yml
@@ -26,7 +26,7 @@ jobs:
java-version: 17

- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v2
uses: aws-actions/configure-aws-credentials@v4
with:
role-to-assume: ${{ secrets.PUBLISH_SNAPSHOTS_ROLE }}
aws-region: us-east-1
10 changes: 8 additions & 2 deletions CHANGELOG.md
@@ -63,6 +63,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Return 409 Conflict HTTP status instead of 503 on failure to concurrently execute snapshots ([#8986](https://github.com/opensearch-project/OpenSearch/pull/5855))
- Add task completion count in search backpressure stats API ([#10028](https://github.com/opensearch-project/OpenSearch/pull/10028/))
- Performance improvement for Datetime field caching ([#4558](https://github.com/opensearch-project/OpenSearch/issues/4558))
- Deprecate CamelCase `PathHierarchy` tokenizer name in favor of lowercase `path_hierarchy` ([#10894](https://github.com/opensearch-project/OpenSearch/pull/10894))


### Deprecated
@@ -98,8 +99,9 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Introduce ConcurrentQueryProfiler to profile query using concurrent segment search path and support concurrency during rewrite and create weight ([#10352](https://github.com/opensearch-project/OpenSearch/pull/10352))
- Update the indexRandom function to create more segments for concurrent search tests ([#10247](https://github.com/opensearch-project/OpenSearch/pull/10247))
- [Remote cluster state] Make index and global metadata upload timeout dynamic cluster settings ([#10814](https://github.com/opensearch-project/OpenSearch/pull/10814))
- Added cluster setting `cluster.restrict.index.replication_type` to restrict setting the index-level replication type ([#10866](https://github.com/opensearch-project/OpenSearch/pull/10866))
- Add cluster state stats ([#10670](https://github.com/opensearch-project/OpenSearch/pull/10670))
- Add SLF4J license header to LoggerMessageFormat.java ([#11069](https://github.com/opensearch-project/OpenSearch/pull/11069))
- [Streaming Indexing] Introduce new experimental server HTTP transport based on Netty 4 and Project Reactor (Reactor Netty) ([#9672](https://github.com/opensearch-project/OpenSearch/pull/9672))

### Dependencies
- Bump `com.google.api.grpc:proto-google-common-protos` from 2.10.0 to 2.25.1 ([#10208](https://github.com/opensearch-project/OpenSearch/pull/10208), [#10298](https://github.com/opensearch-project/OpenSearch/pull/10298))
@@ -113,6 +115,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Bump `com.google.http-client:google-http-client` from 1.43.2 to 1.43.3 ([#10635](https://github.com/opensearch-project/OpenSearch/pull/10635))
- Bump `com.squareup.okio:okio` from 3.5.0 to 3.6.0 ([#10637](https://github.com/opensearch-project/OpenSearch/pull/10637))
- Bump `org.apache.logging.log4j:log4j-core` from 2.20.0 to 2.21.1 ([#10858](https://github.com/opensearch-project/OpenSearch/pull/10858), [#11000](https://github.com/opensearch-project/OpenSearch/pull/11000))
- Bump `aws-actions/configure-aws-credentials` from 2 to 4 ([#10504](https://github.com/opensearch-project/OpenSearch/pull/10504))

### Changed
- Mute the query profile IT with concurrent execution ([#9840](https://github.com/opensearch-project/OpenSearch/pull/9840))
@@ -125,6 +128,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Add instrumentation for indexing in transport bulk action and transport shard bulk action. ([#10273](https://github.com/opensearch-project/OpenSearch/pull/10273))
- [BUG] Disable sort optimization for HALF_FLOAT ([#10999](https://github.com/opensearch-project/OpenSearch/pull/10999))
- Performance improvement for MultiTerm Queries on Keyword fields ([#7057](https://github.com/opensearch-project/OpenSearch/issues/7057))
- Disable concurrent aggs for Diversified Sampler and Sampler aggs ([#11087](https://github.com/opensearch-project/OpenSearch/issues/11087))

### Deprecated

@@ -137,8 +141,10 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Fix Segment Replication ShardLockObtainFailedException bug during index corruption ([#10370](https://github.com/opensearch-project/OpenSearch/pull/10370))
- Fix some test methods in SimulatePipelineRequestParsingTests that never run and fix test failure ([#10496](https://github.com/opensearch-project/OpenSearch/pull/10496))
- Fix passing wrong parameter when calling newConfigurationException() in DotExpanderProcessor ([#10737](https://github.com/opensearch-project/OpenSearch/pull/10737))
- Fix SuggestSearch.testSkipDuplicates by forcing refresh when indexing its test documents ([#11068](https://github.com/opensearch-project/OpenSearch/pull/11068))
- Add version condition while adding geoshape doc values to the index, to ensure backward compatibility ([#11095](https://github.com/opensearch-project/OpenSearch/pull/11095))

### Security

[Unreleased 3.0]: https://github.com/opensearch-project/OpenSearch/compare/2.x...HEAD
[Unreleased 2.x]: https://github.com/opensearch-project/OpenSearch/compare/2.12...2.x
4 changes: 4 additions & 0 deletions buildSrc/version.properties
@@ -29,6 +29,10 @@ jna = 5.13.0
netty = 4.1.100.Final
joda = 2.12.2

# project reactor
reactor_netty = 1.1.12
reactor = 3.5.11

# client dependencies
httpclient5 = 5.2.1
httpcore5 = 5.2.2
LoggerMessageFormat.java (org.opensearch.core.common.logging)
@@ -30,6 +30,13 @@
* GitHub history for details.
*/

/*
* This code is based on code from SLF4J 1.5.11
* Copyright (c) 2004-2007 QOS.ch
* All rights reserved.
* SPDX-License-Identifier: MIT
*/

package org.opensearch.core.common.logging;

import java.util.HashSet;
CommonAnalysisModulePlugin.java
@@ -394,7 +394,17 @@ public Map<String, AnalysisProvider<TokenizerFactory>> getTokenizers() {
// TODO deprecate and remove in API
tokenizers.put("lowercase", XLowerCaseTokenizerFactory::new);
tokenizers.put("path_hierarchy", PathHierarchyTokenizerFactory::new);
tokenizers.put("PathHierarchy", PathHierarchyTokenizerFactory::new);
tokenizers.put("PathHierarchy", (IndexSettings indexSettings, Environment environment, String name, Settings settings) -> {
// TODO Remove "PathHierarchy" tokenizer name in 4.0 and throw exception
if (indexSettings.getIndexVersionCreated().onOrAfter(Version.V_3_0_0)) {
deprecationLogger.deprecate(
"PathHierarchy_tokenizer_deprecation",
"The [PathHierarchy] tokenizer name is deprecated and will be removed in a future version. "
+ "Please change the tokenizer name to [path_hierarchy] instead."
);
}
return new PathHierarchyTokenizerFactory(indexSettings, environment, name, settings);
});
tokenizers.put("pattern", PatternTokenizerFactory::new);
tokenizers.put("uax_url_email", UAX29URLEmailTokenizerFactory::new);
tokenizers.put("whitespace", WhitespaceTokenizerFactory::new);
@@ -662,8 +672,17 @@ public List<PreConfiguredTokenizer> getPreConfiguredTokenizers() {
}
return new EdgeNGramTokenizer(NGramTokenizer.DEFAULT_MIN_NGRAM_SIZE, NGramTokenizer.DEFAULT_MAX_NGRAM_SIZE);
}));
tokenizers.add(PreConfiguredTokenizer.singleton("PathHierarchy", PathHierarchyTokenizer::new));

tokenizers.add(PreConfiguredTokenizer.openSearchVersion("PathHierarchy", (version) -> {
// TODO Remove "PathHierarchy" tokenizer name in 4.0 and throw exception
if (version.onOrAfter(Version.V_3_0_0)) {
deprecationLogger.deprecate(
"PathHierarchy_tokenizer_deprecation",
"The [PathHierarchy] tokenizer name is deprecated and will be removed in a future version. "
+ "Please change the tokenizer name to [path_hierarchy] instead."
);
}
return new PathHierarchyTokenizer();
}));
return tokenizers;
}
}
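For reference, a minimal sketch of the user-facing migration implied by the deprecation above, built with the same index-settings keys used in the PathHierarchyTokenizerFactoryTests shown below; the analyzer name and settings values are illustrative, not part of this change:

import org.opensearch.Version;
import org.opensearch.cluster.metadata.IndexMetadata;
import org.opensearch.common.settings.Settings;

// Sketch: define a custom analyzer with the non-deprecated tokenizer name.
// Only the "path_hierarchy"/"PathHierarchy" names come from this change; the rest is illustrative.
Settings indexSettings = Settings.builder()
    .put(IndexMetadata.SETTING_VERSION_CREATED, Version.CURRENT)
    .put("index.analysis.analyzer.my_analyzer.tokenizer", "path_hierarchy") // instead of the deprecated "PathHierarchy"
    .build();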
PathHierarchyTokenizerFactoryTests.java
@@ -35,16 +35,61 @@
import com.carrotsearch.randomizedtesting.generators.RandomPicks;

import org.apache.lucene.analysis.Tokenizer;
import org.opensearch.Version;
import org.opensearch.cluster.metadata.IndexMetadata;
import org.opensearch.common.settings.Settings;
import org.opensearch.core.index.Index;
import org.opensearch.env.Environment;
import org.opensearch.env.TestEnvironment;
import org.opensearch.index.IndexSettings;
import org.opensearch.index.analysis.IndexAnalyzers;
import org.opensearch.index.analysis.NamedAnalyzer;
import org.opensearch.indices.analysis.AnalysisModule;
import org.opensearch.test.IndexSettingsModule;
import org.opensearch.test.OpenSearchTokenStreamTestCase;
import org.opensearch.test.VersionUtils;

import java.io.IOException;
import java.io.StringReader;
import java.util.Collections;

public class PathHierarchyTokenizerFactoryTests extends OpenSearchTokenStreamTestCase {

private IndexAnalyzers buildAnalyzers(Version version, String tokenizer) throws IOException {
Settings settings = Settings.builder().put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString()).build();
Settings indexSettings = Settings.builder()
.put(IndexMetadata.SETTING_VERSION_CREATED, version)
.put("index.analysis.analyzer.my_analyzer.tokenizer", tokenizer)
.build();
IndexSettings idxSettings = IndexSettingsModule.newIndexSettings("index", indexSettings);
return new AnalysisModule(TestEnvironment.newEnvironment(settings), Collections.singletonList(new CommonAnalysisModulePlugin()))
.getAnalysisRegistry()
.build(idxSettings);
}

/**
* Test that the deprecated "PathHierarchy" tokenizer name is still available via {@link CommonAnalysisModulePlugin} starting in 3.x.
*/
public void testPreConfiguredTokenizer() throws IOException {

{
try (
IndexAnalyzers indexAnalyzers = buildAnalyzers(
VersionUtils.randomVersionBetween(random(), Version.V_3_0_0, Version.CURRENT),
"PathHierarchy"
)
) {
NamedAnalyzer analyzer = indexAnalyzers.get("my_analyzer");
assertNotNull(analyzer);
assertTokenStreamContents(analyzer.tokenStream("dummy", "/a/b/c"), new String[] { "/a", "/a/b", "/a/b/c" });
// Once LUCENE-12750 is fixed we can use the following testing method instead.
// Similar testing approach has been used for deprecation of (Edge)NGrams tokenizers as well.
// assertAnalyzesTo(analyzer, "/a/b/c", new String[] { "/a", "/a/b", "/a/b/c" });

}
}
}

public void testDefaults() throws IOException {
final Index index = new Index("test", "_na_");
final Settings indexSettings = newAnalysisSettingsBuilder().build();
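The test above verifies the token stream produced under the deprecated name; if the deprecation warning itself needs to be asserted, a hedged sketch of the usual approach in OpenSearch test cases follows (assumption: assertWarnings from the OpenSearchTestCase hierarchy is applicable here; it is not part of this diff):

// Hypothetical follow-up assertion after building the analyzer with the deprecated
// "PathHierarchy" name on an index created with version >= 3.0.0.
assertWarnings(
    "The [PathHierarchy] tokenizer name is deprecated and will be removed in a future version. "
        + "Please change the tokenizer name to [path_hierarchy] instead."
);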
@@ -298,6 +298,9 @@

---
"path_hierarchy":
- skip:
features: "allowed_warnings"

- do:
indices.analyze:
body:
@@ -312,6 +315,8 @@
- match: { detail.tokenizer.tokens.2.token: a/b/c }

- do:
allowed_warnings:
- 'The [PathHierarchy] tokenizer name is deprecated and will be removed in a future version. Please change the tokenizer name to [path_hierarchy] instead.'
indices.analyze:
body:
text: "a/b/c"
@@ -337,11 +342,13 @@
- match: { detail.tokenizer.tokens.2.token: a/b/c }

- do:
allowed_warnings:
- 'The [PathHierarchy] tokenizer name is deprecated and will be removed in a future version. Please change the tokenizer name to [path_hierarchy] instead.'
indices.analyze:
body:
text: "a/b/c"
explain: true
tokenizer: PathHierarchy
- length: { detail.tokenizer.tokens: 3 }
- match: { detail.tokenizer.name: PathHierarchy }
- match: { detail.tokenizer.tokens.0.token: a }
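Outside the YAML test harness, the same deprecation reaches clients as an HTTP Warning response header. A hedged sketch with the low-level Java REST client (the client instance and variable names are assumptions; the request body mirrors the YAML test above):

import java.util.List;
import org.opensearch.client.Request;
import org.opensearch.client.Response;
import org.opensearch.client.RestClient;

// Sketch: call _analyze with the deprecated tokenizer name and inspect the returned warnings.
Request request = new Request("GET", "/_analyze");
request.setJsonEntity("{\"text\": \"a/b/c\", \"tokenizer\": \"PathHierarchy\"}");
Response response = restClient.performRequest(request); // restClient: an already-built RestClient
List<String> warnings = response.getWarnings();          // expected to include the PathHierarchy deprecation message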
@@ -87,7 +87,7 @@ protected boolean forbidPrivateIndexSettings() {
*/
protected void prepareGeoShapeIndexForAggregations(final Random random) throws Exception {
expectedDocsCountForGeoShapes = new HashMap<>();
final Settings settings = Settings.builder().put(IndexMetadata.SETTING_VERSION_CREATED, version).build();
final Settings settings = Settings.builder().put(IndexMetadata.SETTING_VERSION_CREATED, Version.CURRENT).build();
final List<IndexRequestBuilder> geoshapes = new ArrayList<>();
assertAcked(prepareCreate(GEO_SHAPE_INDEX_NAME).setSettings(settings).setMapping(GEO_SHAPE_FIELD_NAME, "type" + "=geo_shape"));
boolean isShapeIntersectingBB = false;
@@ -136,7 +136,7 @@ protected void prepareSingleValueGeoPointIndex(final Random random) throws Excep
expectedDocCountsForSingleGeoPoint = new HashMap<>();
createIndex("idx_unmapped");
final Settings settings = Settings.builder()
.put(IndexMetadata.SETTING_VERSION_CREATED, version)
.put(IndexMetadata.SETTING_VERSION_CREATED, Version.CURRENT)
.put("index.number_of_shards", 4)
.put("index.number_of_replicas", 0)
.build();
@@ -160,7 +160,7 @@ protected void prepareSingleValueGeoPointIndex(final Random random) throws Excep

protected void prepareMultiValuedGeoPointIndex(final Random random) throws Exception {
multiValuedExpectedDocCountsGeoPoint = new HashMap<>();
final Settings settings = Settings.builder().put(IndexMetadata.SETTING_VERSION_CREATED, version).build();
final Settings settings = Settings.builder().put(IndexMetadata.SETTING_VERSION_CREATED, Version.CURRENT).build();
final List<IndexRequestBuilder> cities = new ArrayList<>();
assertAcked(
prepareCreate("multi_valued_idx").setSettings(settings)
Netty4HttpServerTransport.java
@@ -116,6 +116,9 @@
import static org.opensearch.http.HttpTransportSettings.SETTING_HTTP_TCP_SEND_BUFFER_SIZE;
import static org.opensearch.http.HttpTransportSettings.SETTING_PIPELINING_MAX_EVENTS;

/**
* The HTTP transport implementation based on Netty 4.
*/
public class Netty4HttpServerTransport extends AbstractHttpServerTransport {
private static final Logger logger = LogManager.getLogger(Netty4HttpServerTransport.class);

@@ -184,6 +187,17 @@ public class Netty4HttpServerTransport extends AbstractHttpServerTransport {
private volatile ServerBootstrap serverBootstrap;
private volatile SharedGroupFactory.SharedGroup sharedGroup;

/**
* Creates a new HTTP transport implementation based on Netty 4
* @param settings settings
* @param networkService network service
* @param bigArrays big array allocator
* @param threadPool thread pool instance
* @param xContentRegistry XContent registry instance
* @param dispatcher dispatcher instance
* @param clusterSettings cluster settings
* @param sharedGroupFactory shared group factory
*/
public Netty4HttpServerTransport(
Settings settings,
NetworkService networkService,

This file was deleted.

21 changes: 0 additions & 21 deletions plugins/discovery-ec2/licenses/reactive-streams-LICENSE.txt

This file was deleted.

