Support elasticsearch rollover #68

Open · pavolloffay opened this issue Jul 24, 2019 · 11 comments
@pavolloffay (Member)

Support Elasticsearch rollover indices: we need to read data from the jaeger-span-read alias and not from daily indices.

The write could still go to daily indices, since we do not support rollover aliases for the dependency index at the moment.

@frittentheke (Contributor)

I just ran into this issue after successfully configuring all of Jaeger to use ES rollover.

Using rollover (and also ILM) is much more sensible when dealing with the ever-changing amounts of data that tracing may (or may not) produce. It would be great if the Spark job worked with it.

@frittentheke (Contributor)

There is also the Flink-based implementation, which currently lacks ES support altogether (jaegertracing/jaeger-analytics-flink#7).

What is the intended way forward for creating the dependencies data?

@pavolloffay (Member, Author)

There hasn't been much work on the Flink job, and there are no plans to support ES in the Flink project; the whole project hasn't been "productized".

It makes sense to add rollover support here. However, it will need some changes to how the data are loaded, since we cannot read all the data from the read alias. Maybe we could use the timestamps from spans instead.

@frittentheke (Contributor) commented Mar 11, 2020

@pavolloffay while peeking into the code, I expected that the only change required would be to directly address the read and write aliases in the run method when the env var ES_USE_ALIASES is set, instead of calling the indexDate method (see https://github.com/jaegertracing/spark-dependencies/blob/master/jaeger-spark-dependencies-elasticsearch/src/main/java/io/jaegertracing/spark/dependencies/elastic/ElasticsearchDependenciesJob.java#L203).
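A minimal sketch of that idea, assuming a flag derived from ES_USE_ALIASES; the indexDate and run methods are loosely modeled on ElasticsearchDependenciesJob, not its exact signatures:

```java
// Hypothetical sketch: choose aliases vs. daily indices based on ES_USE_ALIASES.
// indexDate() and run() loosely mirror the real job, but may differ in signature.
boolean useAliases = Boolean.parseBoolean(
    System.getenv().getOrDefault("ES_USE_ALIASES", "false"));

String spanIndex;
String depIndex;
if (useAliases) {
  // Rollover setups read spans via the read alias and write via write aliases.
  spanIndex = "jaeger-span-read";
  depIndex = "jaeger-dependencies-write";
} else {
  // Default behavior: date-suffixed daily indices, e.g. jaeger-span-2020-03-11.
  spanIndex = indexDate("jaeger-span");
  depIndex = indexDate("jaeger-dependencies");
}
run(spanIndex, depIndex);
```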

@pavolloffay (Member, Author)

The read alias might point to multiple indices covering an extended period of time, e.g. a week or two. We cannot load that much data into memory.

Also, the dependencies reader in Jaeger expects derived dependencies for the current day (or the previous one).

@frittentheke (Contributor) commented Mar 11, 2020

@pavolloffay I see, my bad. Thank you for taking the time to think about this issue.

In any case, a limit on the number of docs requested from ES makes sense. Even a single day's index could contain up to 2 billion docs (the ES / Lucene per-shard limit).

How about simply applying a filter on startTimeMillis when querying jaeger-span-read? Certainly the same would need to be done in the UI when showing dependencies from the jaeger-dependencies-read alias, but ES is very good at range queries on date fields.

Potentially, using the terms lookup query built into ES could speed things up even more (https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-terms-query.html#query-dsl-terms-lookup), though this would need to be done in chunks of 65k terms.
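For illustration, a rough sketch of such a startTimeMillis range filter, assuming the job keeps using elasticsearch-hadoop's JavaEsSpark; the ES host, Spark master, and the date are placeholders, and the actual span field mapping may differ:

```java
import java.time.LocalDate;
import java.time.ZoneOffset;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

public class RolloverReadSketch {
  public static void main(String[] args) {
    SparkConf conf = new SparkConf()
        .setAppName("rollover-read-sketch")
        .setMaster("local[*]")             // placeholder for a real cluster
        .set("es.nodes", "elasticsearch")  // assumed ES host
        .set("es.port", "9200");
    try (JavaSparkContext sc = new JavaSparkContext(conf)) {
      // Midnight-to-midnight UTC window for the day being processed.
      long dayStart = LocalDate.parse("2020-03-11")
          .atStartOfDay(ZoneOffset.UTC).toInstant().toEpochMilli();
      long dayEnd = dayStart + 24L * 60 * 60 * 1000;

      // Plain ES query DSL: a range filter on the span timestamp, so only
      // one day's spans are pulled from the (multi-index) read alias.
      String query = "{\"query\":{\"range\":{\"startTimeMillis\":{"
          + "\"gte\":" + dayStart + ",\"lt\":" + dayEnd + "}}}}";

      // esJsonRDD yields (docId, docJson) pairs for every matching span.
      JavaPairRDD<String, String> spans =
          JavaEsSpark.esJsonRDD(sc, "jaeger-span-read", query);
      System.out.println("spans in window: " + spans.count());
    }
  }
}
```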

@pavolloffay (Member, Author)

This is exactly what I proposed above: use the span timestamp in the query and then store the data into the daily dependency index like we do now. That does not require any changes on the Jaeger side.
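A hedged sketch of the write side under this proposal, assuming elasticsearch-hadoop's saveToEs and a dependency-link RDD of plain maps; the real job's types and index naming may differ:

```java
import java.util.Map;
import org.apache.spark.api.java.JavaRDD;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

class DependencyWriteSketch {
  // Sketch: the read side filters the span alias by timestamp (see above),
  // while the write side keeps targeting the date-suffixed daily index that
  // the Jaeger dependencies reader already expects.
  static void storeDependencies(JavaRDD<Map<String, Object>> dependencies,
                                String day) { // e.g. "2020-03-11"
    String depIndex = "jaeger-dependencies-" + day;
    // Each map is indexed as one document in the daily dependency index.
    JavaEsSpark.saveToEs(dependencies, depIndex);
  }
}
```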

@frittentheke (Contributor)

Yeah, that sounds like the simplest approach. The only downside of a daily dependency index is that it is quite wasteful in the number of shards, but that's probably not a real issue when we're only talking about a few days' worth of indices.

@lgenasi-gocity

@frittentheke I know this is an old issue now, but I tried setting ES_USE_ALIASES: true and it is still not using the aliases. Am I missing something obvious?

   jaeger-jaeger-operator-jaeger-spark-dependencies:
    Image:      jaegertracing/spark-dependencies
    Port:       <none>
    Host Port:  <none>
    Environment:
      STORAGE:         elasticsearch
      ES_NODES:        http://elasticsearch-master.elastic-system:9200
      ES_USE_ALIASES:  true
22/12/04 08:38:30 INFO ElasticsearchDependenciesJob: Running Dependencies job for 2022-12-04T00:00Z, reading from jaeger-span-2022-12-04 index, result storing to jaeger-dependencies-2022-12-04
22/12/04 08:38:31 INFO ElasticsearchDependenciesJob: Done, 0 dependency objects created

It still seems to be attempting to read an index with a date suffix:
reading from jaeger-span-2022-12-04 index

The container image ID being used seems to be the latest on Docker Hub:
docker.io/jaegertracing/spark-dependencies@sha256:08dca989f4c7de0af8940ab3466e9fcc69e4c159ddb23be28ffab378ea66e03b

Any help understanding what's going on would be much appreciated, thanks.

@frittentheke (Contributor)

> The container image ID being used seems to be the latest on Docker Hub:
> docker.io/jaegertracing/spark-dependencies@sha256:08dca989f4c7de0af8940ab3466e9fcc69e4c159ddb23be28ffab378ea66e03b
>
> Any help understanding what's going on would be much appreciated, thanks.

Sorry @lgenasi-gocity for never responding.
I honestly don't know if there was a container release after my PR was merged.

@albertteoh ?

@sergeykad

@frittentheke The latest release is at ghcr.io/jaegertracing/spark-dependencies/spark-dependencies, according to the README.

It worked for me with the following configuration.

spark:
  enabled: true
  image:
    registry: ghcr.io
    repository: jaegertracing/spark-dependencies/spark-dependencies
  extraEnv:
    - name: ES_USE_ALIASES
      value: "true"

The only issue is that it is configured differently from the rest of the Jaeger services, but it's not critical.
