
Add timeseries page to dashboard #575

Merged: 14 commits into main, Jun 5, 2023

Conversation

@stefsmeets (Contributor) commented Apr 25, 2023

This PR adds a page for timeseries to the dashboard.

Closes #568

Todo

  • Add caching
  • What is the file type/format for data uploads?
  • Update lime interface
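
On the caching point above: a minimal sketch of the memoization pattern, using `functools.lru_cache` purely for illustration. In the dashboard itself, Streamlit's `@st.cache_data` decorator plays the same role; the `load_timeseries` name and the counter are hypothetical.

```python
from functools import lru_cache

calls = 0

@lru_cache(maxsize=32)
def load_timeseries(n: int) -> tuple:
    """Stand-in for an expensive data load; the result is cached per argument."""
    global calls
    calls += 1
    return tuple(range(n))

first = load_timeseries(5)
second = load_timeseries(5)  # served from the cache; the function body ran only once
```

With the decorator in place, repeated page reloads with the same input skip the expensive load entirely.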

@stefsmeets stefsmeets marked this pull request as ready for review May 23, 2023 14:23
@stefsmeets stefsmeets requested a review from geek-yang May 24, 2023 08:05
@stefsmeets (Contributor, Author)

Hi @geek-yang @cwmeijer , is either of you interested in reviewing this PR?

@stefsmeets stefsmeets requested a review from cwmeijer May 24, 2023 08:06
@geek-yang (Member) left a comment

I went over the changes roughly and most of them look pretty nice to me (ps: I don't have much experience with streamlit, but the code itself looks pretty and very well organized. It is very readable; I could follow the code and figure out what most of the functions are used for. A nice lesson learned for me about streamlit 😄).

Just some small comments to address before merging it to the main branch, but I can give you a go-ahead already.

I spent more time testing the dashboard, especially the timeseries panel. It looks pretty awesome, and the added cache functionality also works perfectly for me. Thanks a lot for the fabulous work 👍!

Review comments on dianna/methods/lime_timeseries.py (resolved)
@geek-yang (Member)

I have a small concern about the SonarCloud check. If we merge it as it is for now, the coverage will drop dramatically. Since we don't want to add unit tests for each function in the Streamlit app (and it makes no sense to do so at this point), shall we just skip the dashboard folder for the SonarCloud check?

@geek-yang (Member)

See this doc: https://docs.sonarcloud.io/advanced-setup/analysis-scope/
Under the section "File exclusion and inclusion", it seems we can exclude files by tweaking the "analysis scope".
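
A minimal sketch of what that could look like in a sonar-project.properties file; the dashboard/ path, and the choice between a coverage-only exclusion and a full analysis exclusion, are assumptions about this repository's layout:

```
# sonar-project.properties
# Exclude the dashboard folder from coverage calculation only,
# while still analyzing it for bugs and code smells:
sonar.coverage.exclusions=dashboard/**/*

# Or, to exclude it from analysis entirely:
# sonar.exclusions=dashboard/**/*
```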

@elboyran (Contributor) commented Jun 1, 2023 via email

@stefsmeets (Contributor, Author) commented Jun 5, 2023

> I got a small concern about sonarcloud check. If we merge it as what it is for now, the coverage will drop dramatically. Since we don't want to add unit tests for each function in streamlit (also it makes no sense to do so at this point). Shall we just skip dashboard folder only for sonarcloud check?

Much of this code is still being tested, just not taken into account for coverage.

The issue is that the code is run in a subprocess, which cannot be measured automatically for coverage. This would be the way to go about it: https://coverage.readthedocs.io/en/stable/subprocess.html#configuring-python-for-sub-process-measurement
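
For reference, the linked doc's recipe boils down to three pieces. A sketch, with file locations assumed rather than taken from this repository:

```
# .coveragerc -- each process writes its own data file, combined later
[run]
parallel = true

# sitecustomize.py (or a .pth file) on the subprocess's sys.path:
#     import coverage
#     coverage.process_startup()

# Then point subprocesses at the config before running the tests,
# and combine the per-process data files afterwards:
#     export COVERAGE_PROCESS_START=$PWD/.coveragerc
#     pytest
#     coverage combine && coverage report
```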

@geek-yang (Member)

> I got a small concern about sonarcloud check. If we merge it as what it is for now, the coverage will drop dramatically. Since we don't want to add unit tests for each function in streamlit (also it makes no sense to do so at this point). Shall we just skip dashboard folder only for sonarcloud check?
>
> Much of this code is still being tested, just not taken into account for coverage.
>
> The issue is that the code is run in a subprocess, which cannot be measured automatically for coverage. This would be the way to go about it: https://coverage.readthedocs.io/en/stable/subprocess.html#configuring-python-for-sub-process-measurement

I saw the tests with Playwright but didn't realize they already cover most of the code; it just wasn't reflected by SonarCloud. Thanks for clarifying. The solution you propose looks good to me. Let's give it a try then.

@stefsmeets (Contributor, Author)

> I got a small concern about sonarcloud check. If we merge it as what it is for now, the coverage will drop dramatically. Since we don't want to add unit tests for each function in streamlit (also it makes no sense to do so at this point). Shall we just skip dashboard folder only for sonarcloud check?
>
> Much of this code is still being tested, just not taken into account for coverage.
> The issue is that the code is run in a subprocess, which cannot be measured automatically for coverage. This would be the way to go about it: https://coverage.readthedocs.io/en/stable/subprocess.html#configuring-python-for-sub-process-measurement
>
> I saw the tests with playwright but didn't realize that it already covers most of the stuff, while they were merely not reflected by sonarcloud. Thanks for clarifying it. The solution you propose looks good to me. Let's give a try then.

Let's follow up later then. I spent the last hour fighting this, but I can't get it to work at the moment.

@stefsmeets stefsmeets merged commit eb07f2a into main Jun 5, 2023
@stefsmeets stefsmeets deleted the timeseries-dashboard branch June 5, 2023 09:36
Successfully merging this pull request may close these issues.

Add timeseries functionality to dashboard