Minor Documentation Fixes: TaskID for Example Custom Flow; Comment on Homepage; More documentation for components (#1243)
Conversation
Codecov Report

Additional details and impacted files

@@ Coverage Diff @@
## develop #1243 +/- ##
===========================================
- Coverage 85.24% 77.49% -7.75%
===========================================
Files 38 38
Lines 5008 5008
===========================================
- Hits 4269 3881 -388
- Misses 739 1127 +388
... and 15 files with indirect coverage changes

☔ View full report in Codecov by Sentry.
Suggested change: componets → components
We could add tests that verify that our examples are not crashing:

```python
def test_run_custom_flow_example():
    example_code = __import__("examples.30_extended.custom_flow_")
```

This would increase the code coverage and would allow us to see if our examples are actually working.
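A slightly fuller version of that idea could look like the sketch below. This is only a hedged illustration: the module path is taken from the snippet above, and the test name and module list are hypothetical.

```python
import importlib

import pytest

# Hypothetical list of example modules to smoke-test. Importing a module runs
# its top-level code, so a clean import means the example did not crash.
EXAMPLE_MODULES = [
    "examples.30_extended.custom_flow_",
]


@pytest.mark.parametrize("module_name", EXAMPLE_MODULES)
def test_example_runs_without_crashing(module_name):
    importlib.import_module(module_name)
```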
We might want to connect this to #1070 and resolve both at the same time.
I agree that this closes #1241 and #1229. I am not sure if it documents #1231 sufficiently.

Re testing the examples: the examples are executed by the workflows that build and deploy the documentation. Therefore, we automatically check them, but they do not contribute to the test coverage.

Re #1070: I don't think we can resolve this here, as we also look into tasks, and they cannot be addressed by name.
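To illustrate that last point about #1070, here is a minimal sketch, assuming the current openml-python API; the dataset name and task ID are only illustrative.

```python
import openml

# Datasets can be looked up by name (or by numeric ID) ...
dataset = openml.datasets.get_dataset("credit-g")

# ... but tasks are only addressable by their numeric ID, so examples that use
# tasks have to reference a concrete TaskID on the respective server.
task = openml.tasks.get_task(31)  # illustrative ID, not guaranteed on every server

print(dataset.name, task.task_id)
```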
Happy to merge if/when unit tests pass.
@mfeurer the usual tests seem to pass. If I have the time this week, I will try to fix all tutorials that fail while building the docs and include those fixes in this PR. I think that fits the current scope of the PR.
@mfeurer I think this is ready for merging now. Or do you require any other changes?
Examples are tested; I don't really see a point in calculating coverage metrics on the example code itself. As far as I am aware, this is also not something we do. The difference in code coverage is more likely due to changes in the test server state (and thus different error/code paths). If there are parts of the code covered only by examples and not by unit tests, then the right way to correct that would be to add unit tests.
Closes #1241
Moreover, we might want to add or rework the process for the examples if the IDs can change, maybe in relation to #1227 (see the sketch below).
Also closes #1229.

Also related to #1231, if we think this documentation suffices for the use case.
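Regarding the point about IDs possibly changing: one option would be to keep the server-dependent IDs in a single, clearly marked place in each example. Below is a minimal sketch, assuming openml-python's existing helpers for switching to the test server; the concrete task ID is only a placeholder.

```python
import openml

# Run the example against the test server, as the existing examples do.
openml.config.start_using_configuration_for_example()

# Keep server-dependent identifiers in one clearly marked place so they are
# easy to update if the test-server state changes; 119 is a placeholder.
TASK_ID = 119

task = openml.tasks.get_task(TASK_ID)
print(task)

# Switch back to the production configuration afterwards.
openml.config.stop_using_configuration_for_example()
```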