Derivatives in continuous queries don't work #6401
Comments
I realized I had skipped adding the "resample" option, but it still appears to be using the grouping period to generate the time range.
I've tried all permutations of `RESAMPLE [EVERY <interval>] [FOR <interval>]`, and the time range is still only the grouping range.
I believe this is related to #6395.
This is a duplicate of #3247. Thank you for the bug report. I'm going to close this since we have another ticket open for this issue.
Bug report
Continuous queries containing derivatives appear to run but no data is ever generated
System info: InfluxDB 0.12.1, CentOS 6.7 64-bit VM on a dual-Xeon server with gobs of memory
Steps to reproduce:
Expected behavior:
One would expect to see derivatives of the data that is being sampled.
Actual behavior:
Instead the records all have unset values.
Additional info:
The time period of the query is the same length as the interval of the derivative, and therefore doesn't allow two points to be sampled and compared. Extending the time period helps: at 25 seconds a value of 0 is generated. It seems to take a multiple of three sampling periods to capture the derivative.
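A minimal sketch of the kind of continuous query involved, using hypothetical measurement, field, and database names (`cpu`, `value`, `mydb`) that are not from this report. The idea, per the discussion above, is that `RESAMPLE ... FOR` widens each execution's time range beyond the `GROUP BY time()` interval so `derivative()` has more than one point to compare:

```sql
-- Hypothetical names throughout; a sketch, not the reporter's actual query.
-- Without RESAMPLE FOR, each run covers only one 10s group, so derivative()
-- never sees two points. FOR 30s makes each run query the last 30 seconds.
CREATE CONTINUOUS QUERY "cq_cpu_rate" ON "mydb"
RESAMPLE EVERY 10s FOR 30s
BEGIN
  SELECT derivative(mean("value"), 10s) AS "rate"
  INTO "cpu_rate"
  FROM "cpu"
  GROUP BY time(10s)
END
```

The bug being reported is that even with a `FOR` clause the CQ still appears to use only the grouping period as its time range, which matches the three-sampling-periods observation above.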