
refactor(ChartDataCommand): separate loading query_context from cache into different module #17405

Merged · 1 commit · Nov 12, 2021

Conversation

@ofekisr (Contributor) commented on Nov 11, 2021

Background

While working on #16991 we wanted to cover the new functionality with focused, accurate unit tests.
All the chart-data flows and their components are too tightly coupled to Superset internals, so it is impossible to write unit tests for them.
The flows are not testable, many components violate the very important Single Responsibility Principle (SRP), and the code has become messy.

So I started refactoring it in #17344, but the change grew large and was hard to review, so I decided to split it into small PRs that are easier to follow.

This is the third PR in that sequence.
The next PR is #17407.

PR description

The responsibility of ChartDataCommand is to retrieve data for the given query-context parameters.
All it needs is a query_context injected into it so it can fulfill that task. Loading the query_context from the cache is a separate responsibility, and therefore should live in a separate module to satisfy the SRP.
The module that creates the ChartDataCommand and injects its dependencies should be the one in charge of calling the cache loader, as sketched below.
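
A minimal, hypothetical sketch of the intended split. The class and function bodies below are simplified illustrations, not the exact Superset API; the real code lives in superset/charts/data/query_context_cache_loader.py and superset/charts/commands/data.py.

```python
# Hypothetical sketch of the separation of concerns described above.
from typing import Any, Dict


class CacheLoadError(Exception):
    """Raised when no query_context is found for the given cache key."""


class QueryContextCacheLoader:
    """Single responsibility: load a serialized query_context from the cache."""

    def __init__(self, cache: Dict[str, Dict[str, Any]]) -> None:
        # `cache` stands in for Superset's cache backend in this sketch.
        self._cache = cache

    def load(self, cache_key: str) -> Dict[str, Any]:
        cached = self._cache.get(cache_key)
        if cached is None:
            raise CacheLoadError(f"No cached query_context for key {cache_key!r}")
        return cached


class ChartDataCommand:
    """Single responsibility: produce chart data from an injected query_context."""

    def __init__(self, query_context: Dict[str, Any]) -> None:
        self._query_context = query_context

    def run(self) -> Dict[str, Any]:
        # In Superset this would delegate to the query context's payload logic;
        # here we just echo the injected context to keep the sketch self-contained.
        return {"result": self._query_context}


# The API layer (the module that *creates* the command) is in charge of
# calling the cache loader and injecting the loaded query_context.
def get_data_from_cache(cache_key: str, cache: Dict[str, Dict[str, Any]]) -> Dict[str, Any]:
    query_context = QueryContextCacheLoader(cache).load(cache_key)
    return ChartDataCommand(query_context).run()
```

With this split, ChartDataCommand can be unit-tested with a plain query_context object and no cache at all, while QueryContextCacheLoader can be tested against a fake cache in isolation.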

More small changes:
add a "_" prefix to method names that are supposed to be private, and remove unused code.

Test plans

There are no logic changes; the cache is loaded exactly as it was inside the command, so no new tests are required.

Previous PRs

  1. refactor(ChartData): move ChartDataResult enums to common #17399
  2. refactor(ChartData): move chart_data_apis from ChartRestApi to ChartDataRestApi #17400

codecov bot commented Nov 11, 2021

Codecov Report

Merging #17405 (c1931b5) into master (28944f5) will increase coverage by 0.01%.
The diff coverage is 95.83%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master   #17405      +/-   ##
==========================================
+ Coverage   76.94%   76.96%   +0.01%     
==========================================
  Files        1040     1041       +1     
  Lines       56074    56060      -14     
  Branches     7735     7735              
==========================================
  Hits        43146    43146              
+ Misses      12670    12656      -14     
  Partials      258      258              
Flag Coverage Δ
hive 81.51% <95.83%> (+0.04%) ⬆️
mysql 81.93% <95.83%> (+0.04%) ⬆️
postgres 81.94% <95.83%> (+0.04%) ⬆️
python 82.29% <95.83%> (+0.04%) ⬆️
sqlite 81.62% <95.83%> (+0.04%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
superset/charts/commands/data.py 97.95% <ø> (+1.59%) ⬆️
superset/charts/data/query_context_cache_loader.py 90.00% <90.00%> (ø)
superset/charts/api.py 85.22% <100.00%> (+4.17%) ⬆️
superset/charts/data/api.py 88.48% <100.00%> (+0.25%) ⬆️
...frontend/src/dashboard/components/Header/index.jsx 68.30% <0.00%> (-0.55%) ⬇️


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 28944f5...c1931b5.

@amitmiran137 (Member) left a comment


LGTM!

Labels
🏷️ bot · size/L · 🚢 1.5.0
Projects
None yet
Development


3 participants