
Update logs pipeline to separate integration and customer pipelines #332

Merged
10 commits merged into DataDog:master on Oct 15, 2019

Conversation

@tt810 (Contributor) commented Sep 30, 2019

  1. Create a resource to represent the integration pipeline and rename the existing one to customer_pipeline (a config sketch follows below).
  2. Add documentation for the new resource, and add more detail to the errors for pipeline order.
  3. Remove the test cases that have state.
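
For context, a minimal sketch of what configurations for the two separated resources might look like, written in the acceptance-test style this provider uses (HCL embedded in Go string constants). The resource names come from the documentation files touched in this PR; the attribute names (`is_enabled`, `name`, `filter`, `query`) and values are assumptions for illustration and may not match the final schema.

```go
package datadog

// Hypothetical acceptance-test configs illustrating the split described above.
// Resource names come from the docs added in this PR; attribute names and
// values are assumptions and may differ from the actual schema.
const testIntegrationPipelineConfig = `
resource "datadog_logs_integration_pipeline" "python" {
	# Integration pipelines are created by Datadog itself; the assumption here
	# is that Terraform only toggles them on or off.
	is_enabled = true
}
`

const testCustomerPipelineConfig = `
resource "datadog_logs_customer_pipeline" "sample" {
	name       = "sample customer pipeline"
	is_enabled = true
	filter {
		query = "source:foo"
	}
}
`
```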

@ghost added the size/XS label on Sep 30, 2019
@ghost added the size/L and documentation labels and removed the size/XS label on Oct 3, 2019
@tt810 changed the title from "Update test cases for pipeline" to "Update logs pipeline to separate integration and customer pipelines" on Oct 3, 2019
@ghost added the size/XL label and removed the size/L label on Oct 3, 2019
@zippolyte (Contributor) left a comment

Thanks for the PR. Left a few inline comments, mostly about wording.

datadog/import_datadog_logs_pipeline_test.go (outdated, resolved review thread)
website/docs/r/logs_customer_pipeline.html.markdown (outdated, resolved review thread)
website/docs/r/logs_integration_pipeline.html.markdown (2 outdated, resolved review threads)
@tt810 (Contributor, Author) commented Oct 7, 2019

Hi @zippolyte, thanks for the review. I have addressed your comments.

@zippolyte (Contributor) left a comment

Just missed one :)
I'll also let @nmuesch do another pass at it, since I'm not too familiar yet with Terraform development.

datadog/import_datadog_logs_pipeline_test.go (outdated, resolved review thread)
@tt810 (Contributor, Author) commented Oct 8, 2019

@zippolyte thanks for the review, I addressed the last comment. Let me know if we are all good, @nmuesch! Thank you both!

@nmuesch (Contributor) left a comment

Overall looks good. I left a few small notes.

datadog/resource_datadog_logs_integration_pipeline.go (outdated, resolved review thread)

Quoted diff context:
}
return false, err
}
return ddPipeline.GetIsReadOnly(), nil

Contributor:

If the given ID exists but it's a custom pipeline, we return false? Should we instead print a message that this is a custom pipeline and should be created via the integration pipeline resource?

Contributor Author:

I noticed that Terraform calls this function in other situations as well. If I returned true for a custom pipeline, it would not be accurate.
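
To make the quoted fragment above concrete, here is a rough sketch of how an exists check along these lines might look. It is an illustration, not the PR's actual code: the client calls (`GetLogsPipeline`, `GetIsReadOnly`) are assumed to come from the zorkian go-datadog-api library used by this provider, and the "400 Bad Request" string check is an assumption about how a missing pipeline surfaces.

```go
package datadog

import (
	"strings"

	"github.com/hashicorp/terraform/helper/schema"
	datadog "github.com/zorkian/go-datadog-api"
)

// Sketch only: an exists check for the integration pipeline resource.
func resourceDatadogLogsIntegrationPipelineExists(d *schema.ResourceData, meta interface{}) (bool, error) {
	client := meta.(*datadog.Client)

	ddPipeline, err := client.GetLogsPipeline(d.Id())
	if err != nil {
		// Assumption: the API reports an unknown pipeline ID as a 400 error,
		// which is translated into "does not exist" rather than a hard failure.
		if strings.Contains(err.Error(), "400 Bad Request") {
			return false, nil
		}
		return false, err
	}
	// Integration pipelines are read-only. A writable (custom) pipeline with
	// this ID is reported as "not existing" for this resource because, as
	// noted above, Terraform also calls this check in other situations where
	// returning true for a custom pipeline would be inaccurate.
	return ddPipeline.GetIsReadOnly(), nil
}
```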

datadog/resource_datadog_logs_pipeline_order_test.go (4 outdated, resolved review threads)

Quoted diff context (troubleshooting paragraph in the docs):

…`Unprocessable Entity` with an error message like `Cannot map pipelines to existing ones`, most likely the pipelines are incompatible with the ones declared in your `datadog_logs_pipeline_order` resource. In this case, make sure that no pipeline gets added or deleted via any other method (for example, through an API call or from the Datadog Logs configuration UI).

Contributor:

Is there anything we can do to print a more helpful message to the user? Maybe even just printing the list of pipelines and asking users to confirm they have everything in that list defined in their config?

Contributor Author:

(That was what we discussed last time.) I find it is not very straightforward to do this on the API side. Do you think it's enough to just document it for now?

Contributor:

Ahh ok. Can/do we provide a helpful error message in the code when this error is returned?

Contributor Author:

I updated the method to make an extra GET call that fetches the current order from the API and prints it along with the order from the resource. What do you think? (It's an extra API call, but it is probably worth it, since the user would likely need to do that anyway.) A sketch of that approach follows below.
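
A rough sketch of that idea (not the actual diff): on an `Unprocessable Entity` response, fetch the current order with an extra GET and include both lists in the returned error. The client names (`UpdateLogsPipelineList`, `GetLogsPipelineList`, `PipelineIds`) are assumptions about the zorkian go-datadog-api library, and the error-string check is illustrative.

```go
package datadog

import (
	"fmt"
	"strings"

	datadog "github.com/zorkian/go-datadog-api"
)

// Sketch only: push the order declared in the Terraform resource and, if the
// API rejects it as unprocessable, fetch the order Datadog currently knows
// about so both lists can be shown side by side in the error message.
func updateLogsPipelineOrder(client *datadog.Client, tfOrder []string) error {
	if _, err := client.UpdateLogsPipelineList(&datadog.LogsPipelineList{PipelineIds: tfOrder}); err != nil {
		if strings.Contains(err.Error(), "422 Unprocessable Entity") {
			// Extra GET so the user can compare the order known to Datadog
			// with the order declared in the Terraform resource.
			ddList, getErr := client.GetLogsPipelineList()
			if getErr != nil {
				return fmt.Errorf("error updating logs pipeline order: %s", err.Error())
			}
			return fmt.Errorf(
				"cannot map pipelines to existing ones\n existing pipelines: %v\n pipeline order from the resource: %v",
				ddList.PipelineIds, tfOrder)
		}
		return fmt.Errorf("error updating logs pipeline order: %s", err.Error())
	}
	return nil
}
```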

Contributor Author:

I also removed this troubleshooting paragraph.

@tt810 (Contributor, Author) commented Oct 14, 2019

Hi @zippolyte, thanks for approving it. Would you mind merging it, as I don't have that permission?

@zippolyte merged commit f714728 into DataDog:master on Oct 15, 2019
3 participants