diff --git a/.circleci/config.yml b/.circleci/config.yml
index e4fe2263..e63fed44 100644
--- a/.circleci/config.yml
+++ b/.circleci/config.yml
@@ -5,7 +5,7 @@ defaults: &defaults
     CIRCLE_ARTIFACTS: /tmp/circleci-artifacts
     CIRCLE_TEST_REPORTS: /tmp/circleci-test-results
     CODECOV_TOKEN: b0d35139-0a75-427a-907b-2c78a762f8f0
-    VERSION: 1.7.12
+    VERSION: 1.7.13
     PANDOC_RELEASES_URL: https://github.com/jgm/pandoc/releases
 steps:
 - checkout
diff --git a/CHANGES.md b/CHANGES.md
index 1ed621a5..17038601 100644
--- a/CHANGES.md
+++ b/CHANGES.md
@@ -1,5 +1,16 @@
 ## Changelog
+### 1.7.13 (2020-3-7)
+ * New data storage mechanisms available: Redis, Shelve
+ * [#100](https://github.com/man-group/dtale/issues/100), turned off data limits on charts by using WebGL
+ * [#99](https://github.com/man-group/dtale/issues/99), graceful handling of issues calculating min/max information for the Describe popup
+ * [#91](https://github.com/man-group/dtale/issues/91), reshaping of data through usage of aggregations, pivots or transposes
+ * Export chart to HTML
+ * Export chart data to CSV
+ * Offline chart display for use within notebooks
+ * Removal of data from the Instances popup
+ * Updated styling of charts to fit full window dimensions
+
 ### 1.7.12 (2020-3-1)
 * added syntax highlighting to code exports with react-syntax-highlighting
 * added arctic integration test
diff --git a/README.md b/README.md
index 5bf30b1b..f174baa3 100644
--- a/README.md
+++ b/README.md
@@ -21,7 +21,11 @@ D-Tale was the product of a SAS to Python conversion. What was originally a per
 ## In The News
 - [Man Institute](https://www.man.com/maninstitute/d-tale) (warning: contains deprecated functionality)
 - [Python Bytes](https://pythonbytes.fm/episodes/show/169/jupyter-notebooks-natively-on-your-ipad)
+
+
+## Tutorials
 - [Pip Install Python YouTube Channel](https://m.youtube.com/watch?v=0RihZNdQc7k&feature=youtu.be)
+ - [machine_learning_2019](https://www.youtube.com/watch?v=-egtEUVBy9c)

 ## Contents

@@ -36,9 +40,9 @@ D-Tale was the product of a SAS to Python conversion.
What was originally a per
 - [Dimensions/Main Menu](#dimensionsmain-menu)
 - [Selecting/Deselecting Columns](#selectingdeselecting-columns)
 - [Main Menu Functions](#main-menu-functions)
-  - [Describe](#describe), [Filter](#filter), [Charts](#charts), [Correlations](#correlations), [Heat Map](#heat-map), [Instances](#instances), [Code Exports](#code-exports), [About](#about), [Resize](#resize), [Shutdown](#shutdown)
+  - [Describe](#describe), [Filter](#filter), [Building Columns](#building-columns), [Reshape](#reshape), [Charts](#charts), [Coverage (Deprecated)](#coverage-deprecated), [Correlations](#correlations), [Heat Map](#heat-map), [Instances](#instances), [Code Exports](#code-exports), [About](#about), [Resize](#resize), [Shutdown](#shutdown)
 - [Column Menu Functions](#column-menu-functions)
-  - [Moving Columns](#moving-columns), [Hiding Columns](#hiding-columns), [Building Columns](#building-columns), [Lock](#lock), [Unlock](#unlock), [Sorting](#sorting), [Formats](#formats), [Column Analysis](#column-analysis)
+  - [Moving Columns](#moving-columns), [Hiding Columns](#hiding-columns), [Lock](#lock), [Unlock](#unlock), [Sorting](#sorting), [Formats](#formats), [Column Analysis](#column-analysis)
 - [Menu Functions within a Jupyter Notebook](#menu-functions-within-a-jupyter-notebook)
 - [For Developers](#for-developers)
   - [Cloning](#cloning)
@@ -202,7 +206,7 @@ Base CLI options (run `dtale --help` to see all options available)
|`--open-browser`|flag to automatically open up your server's default browser to your D-Tale instance|
|`--force`|flag to force D-Tale to try and kill any pre-existing process at the port you've specified so it can use it|

-Loading data from [**arctic** (high performance datastore for pandas dataframes)](https://github.com/man-group/arctic)
+Loading data from [**arctic** (high performance datastore for pandas dataframes)](https://github.com/man-group/arctic) (this requires either installing **arctic** or **dtale[arctic]**)
```bash
dtale --arctic-host mongodb://localhost:27027 --arctic-library jdoe.my_lib --arctic-node my_node --arctic-start 20130101 --arctic-end 20161231
```
@@ -283,6 +287,13 @@ Here's how you would use this loader:
 DTALE_CLI_LOADERS=./path_to_loaders bash -c 'dtale --testdata-rows 10 --testdata-columns 5'
 ```
+### Accessing CLI Loaders in Notebook or Console
+I am pleased to announce that all CLI loaders are now available within notebooks & consoles.
Here are some examples:
+- `dtale.show_csv(path='test.csv', parse_dates=['date'])`
+- `dtale.show_json(path='http://json-endpoint', parse_dates=['date'])`
+- `dtale.show_json(path='test.json', parse_dates=['date'])`
+- `dtale.show_arctic(host='host', library='library', node='node', start_date='20200101', end_date='20200101')`
+
 ## UI
 Once you have kicked off your D-Tale session, please copy & paste the link on the last line of output in your browser
![](https://raw.githubusercontent.com/aschonfeld/dtale-media/master/images/Browser1.png)
@@ -335,6 +346,24 @@ And here is how you would pass that context variable to D-Tale: `dtale.show(df,
 FYI: For python 3 users, there is now support for filtering on column names with special characters in them (EX: 'a.b') :metal:

+#### Building Columns
+
+[![](http://img.youtube.com/vi/G6wNS9-lG04/0.jpg)](http://www.youtube.com/watch?v=G6wNS9-lG04 "Build Columns in D-Tale")
+
+This video shows you how to build the following:
+ - Numeric: adding/subtracting two columns or columns with static values
+ - Bins: bucketing values using pandas cut & qcut as well as assigning custom labels
+ - Dates: retrieving date properties (hour, weekday, month...) as well as conversions (month end)
+
+#### Reshape
+
+This is very powerful functionality which allows users to create new datasets from the currently loaded data. The operations currently available are (sketched in pandas below):
+- **Aggregation**: consolidate data by running different aggregations on columns by a specific index
+- **Pivot**: a simple wrapper around [pandas.DataFrame.pivot](https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.pivot.html) and [pandas.pivot_table](https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.pivot_table.html)
+- **Transpose**: transpose your data on an index (be careful: dataframes can get very wide if your index has many unique values)
+
+*Tutorials: coming soon*
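+
+As a rough illustration only (this is not the exact code D-Tale runs, and the dataframe here is made up), the three operations map to pandas along these lines:
+```python
+import pandas as pd
+
+df = pd.DataFrame({
+    'security_id': [1, 1, 2, 2],
+    'date': pd.to_datetime(['2020-01-01', '2020-01-02'] * 2),
+    'bid': [100.0, 101.0, 200.0, 201.0],
+})
+
+# Aggregation: run aggregations on columns grouped by a specific index
+agg = df.groupby('security_id').aggregate({'bid': ['mean', 'max']})
+
+# Pivot: thin wrapper around pandas.DataFrame.pivot / pandas.pivot_table
+pivot = df.pivot_table(index='date', columns='security_id', values='bid', aggfunc='mean')
+
+# Transpose: index the data, then flip rows & columns
+transposed = df.set_index(['date', 'security_id']).T
+```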
+
 #### Charts

Build custom charts based off your data (powered by [plotly/dash](https://github.com/plotly/dash)).
@@ -376,7 +405,44 @@ With a bar chart that only has a single Y-Axis you have the ability to sort the
|:------:|:------:|
|![](https://raw.githubusercontent.com/aschonfeld/dtale-media/master/images/charts/bar_presort.png)|![](https://raw.githubusercontent.com/aschonfeld/dtale-media/master/images/charts/bar_postsort.png)|

-This is a very powerful feature with many more features that could be offered (linked subplots, different statistical aggregations, etc...) so please submit issues :)
+**Popup Charts**
+
+Viewing multiple charts at once and want to separate one out into its own window or simply move one off to the side so you can work on building another for comparison? Well now you can by clicking the "Popup" button :smile:
+
+**Copy Link**
+
+Want to send what you're looking at to someone else? Simply click the "Copy Link" button and it will copy a pre-populated chart URL to your clipboard. As long as your D-Tale process is still running when that link is opened you will see your original chart.
+
+**Exporting Charts**
+
+You can now export your dash charts (with the exception of Wordclouds) to static HTML files which can be emailed to others or saved down to be viewed at a later time. The best part is that all of the javascript for plotly is embedded in these files so the nice zooming, panning, etc. is still available! :boom:
+
+**Exporting CSV**
+
+I've been asked about being able to export the data that is contained within your chart to a CSV for further analysis in tools like Excel. This button makes that possible.
+
+**Offline Charts**
+
+Want to run D-Tale in a jupyter notebook and build a chart that will still be displayed even after your D-Tale process has shut down? Now you can! Here's an example code snippet showing how to use it:
+```python
+import dtale
+
+def test_data():
+    import random
+    import pandas as pd
+    import numpy as np
+
+    df = pd.DataFrame([
+        dict(x=i, y=i % 2)
+        for i in range(30)
+    ])
+    rand_data = pd.DataFrame(np.random.randn(len(df), 5), columns=['z{}'.format(j) for j in range(5)])
+    return pd.concat([df, rand_data], axis=1)
+
+d = dtale.show(test_data())
+d.offline_chart(chart_type='bar', x='x', y='z3', agg='sum')
+```
+*Tutorial: coming soon*

**Disclaimer: Long Running Chart Requests**

If you choose to build a chart that requires a lot of computational resources then it will take some time to run.
@@ -388,6 +454,18 @@
If you miss the legacy (non-plotly/dash) charts, not to worry! They are still available from the link in the upper-right corner, but only for a limited time... Here is the documentation for those: [Legacy Charts](https://github.com/man-group/dtale/blob/master/docs/LEGACY_CHARTS.md)

+**Your Feedback is Valuable**
+
+This is a very powerful feature with many more features that could be offered (linked subplots, different statistical aggregations, etc...) so please submit issues :)
+
+#### Coverage (Deprecated)
+
+If you have watched the video within the [Man Institute](https://www.man.com/maninstitute/d-tale) blog post you'll notice that there is a "Coverage" popup. This was deprecated with the creation of the "Charts" page. You can create the same coverage chart in that video by choosing the following options in the "Charts" page (or see the snippet below):
+- Type: **Line**
+- X: **date**
+- Y: **security_id**
+- Aggregation: **Count** or **Unique Count**
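+
+If you'd rather reproduce that chart programmatically, something along these lines using the offline chart helper shown above should give roughly the same result (hypothetical usage; 'date' & 'security_id' are just the column names from that video):
+```python
+import dtale
+
+d = dtale.show(df)  # df is a dataframe containing 'date' & 'security_id' columns
+d.offline_chart(chart_type='line', x='date', y='security_id', agg='count')
+```
+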
 #### Correlations
Shows a pearson correlation matrix of all numeric columns against all other numeric columns
- By default, it will show a grid of pearson correlations (filtering available by using drop-down, see 2nd table of screenshots)
@@ -421,7 +499,6 @@ Turn off Heat Map by clicking menu option again
![](https://raw.githubusercontent.com/aschonfeld/dtale-media/master/images/Heatmap_toggle.png)

 #### Code Exports
-
 *Code Exports* are small snippets of code representing the current state of the grid you're viewing including things like:
 - columns built
 - filtering
@@ -496,16 +573,6 @@ All column movements are saved on the server so refreshing your browser won't lo
 All column movements are saved on the server so refreshing your browser won't lose them :ok_hand:

-#### Building Columns
-
-[![](http://img.youtube.com/vi/G6wNS9-lG04/0.jpg)](http://www.youtube.com/watch?v=G6wNS9-lG04 "Build Columns in D-Tale")
-
-This video shows you how to build the following:
- - Numeric: adding/subtracting two columns or columns with static values
- - Bins: bucketing values using pandas cut & qcut as well as assigning custom labels
- - Dates: retrieving date properties (hour, weekday, month...) as well as conversions (month end)
-
-
 #### Lock

Adds your column to "locked" columns
- "locked" means that if you scroll horizontally these columns will stay pinned to the right-hand side
diff --git a/docker/2_7/Dockerfile b/docker/2_7/Dockerfile
index 56e5813e..0e29f84a 100644
--- a/docker/2_7/Dockerfile
+++ b/docker/2_7/Dockerfile
@@ -44,4 +44,4 @@ WORKDIR /app
 RUN set -eux \
  ; . /root/.bashrc \
- ; easy_install dtale-1.7.12-py2.7.egg
+ ; easy_install dtale-1.7.13-py2.7.egg
diff --git a/docker/3_6/Dockerfile b/docker/3_6/Dockerfile
index 0142419d..05978a51 100644
--- a/docker/3_6/Dockerfile
+++ b/docker/3_6/Dockerfile
@@ -44,4 +44,4 @@ WORKDIR /app
 RUN set -eux \
  ; . /root/.bashrc \
- ; easy_install dtale-1.7.12-py3.7.egg
+ ; easy_install dtale-1.7.13-py3.7.egg
diff --git a/docs/JUPYTERHUB_KUBERNETES.md b/docs/JUPYTERHUB_KUBERNETES.md
index 4e25c6c2..fd596385 100644
--- a/docs/JUPYTERHUB_KUBERNETES.md
+++ b/docs/JUPYTERHUB_KUBERNETES.md
@@ -11,6 +11,18 @@ StatefulSet, whatever)
- for `k8s Ingress` take a look at what they list [here](https://kubernetes.io/docs/concepts/services-networking/ingress/)
  - you must configure it ahead of time for which port you want D-Tale to run on
  - FYI: we went with traefik.io (think of it as a cluster-wide Nginx reverse proxy)
+- make sure when setting up multiple target ports to use different values for the notebook and D-Tale `port`. Here's an example (this is supported by k8s as `multi-port services`)
+```
+ports:
+  - name: http-notebook
+    port: 80
+    protocol: TCP
+    targetPort: 8888
+  - name: http-notebook-dtale
+    port: 40000
+    protocol: TCP
+    targetPort: 40000
+```

*DISCLAIMER: the Service and Ingress here are created by our own code-tweaked version of JupyterHub*

@@ -35,3 +47,5 @@ these environment variables:
 this will kill the previous instance running at 40000 (`DTALE_MIN_PORT`) and replace it with this instance

*hopefully this scenario won't get hit very often, it hasn't for us*
+**Sample Issue Threads**
+- [Failed to connect D-Tale process in hosted notebook](https://github.com/man-group/dtale/issues/95)
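+
+If you need D-Tale to land on the exact port exposed by the Service above, you can pin its port range before calling `dtale.show` inside the notebook. A rough sketch — `DTALE_MIN_PORT` is referenced above, while `DTALE_MAX_PORT` is assumed here to be its upper-bound counterpart:
+```python
+import os
+
+# restrict D-Tale to port 40000 so it matches the Service's targetPort
+os.environ['DTALE_MIN_PORT'] = '40000'
+os.environ['DTALE_MAX_PORT'] = '40000'
+
+import dtale
+d = dtale.show(df)  # df is whatever dataframe you're working with
+```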
diff --git a/docs/source/conf.py b/docs/source/conf.py
index f3c3f9d4..4f41c0cb 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -64,9 +64,9 @@
 # built documents.
 #
 # The short X.Y version.
-version = u'1.7.12'
+version = u'1.7.13'
 # The full version, including alpha/beta/rc tags.
-release = u'1.7.12'
+release = u'1.7.13'

 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.
diff --git a/dtale/__init__.py b/dtale/__init__.py
index 52b5e5b7..a5b5149e 100644
--- a/dtale/__init__.py
+++ b/dtale/__init__.py
@@ -3,7 +3,7 @@
 dtale = Blueprint('dtale', __name__, url_prefix='/dtale')

 # flake8: NOQA
-from dtale.app import show, get_instance, instances  # isort:skip
+from dtale.app import show, get_instance, instances, offline_chart  # isort:skip
 from dtale.cli.loaders import LOADERS  # isort:skip

 for loader_name, loader in LOADERS.items():
diff --git a/dtale/app.py b/dtale/app.py
index 248897ac..651b0d99 100644
--- a/dtale/app.py
+++ b/dtale/app.py
@@ -549,3 +549,52 @@ def get_instance(data_id):
         if global_state.get_data(data_id_str) is not None:
             return DtaleData(data_id_str, build_url(ACTIVE_PORT, ACTIVE_HOST))
     return None
+
+
+def offline_chart(df, chart_type=None, query=None, x=None, y=None, z=None, group=None, agg=None, window=None,
+                  rolling_comp=None, barmode=None, barsort=None, filepath=None, **kwargs):
+    """
+    Builds the HTML for a plotly chart figure to be saved to a file or output to a jupyter notebook
+
+    :param df: dataframe to build the chart from
+    :type df: :class:`pandas:pandas.DataFrame`
+    :param chart_type: type of chart, possible options are line|bar|pie|scatter|3d_scatter|surface|heatmap
+    :type chart_type: str
+    :param query: pandas dataframe query string
+    :type query: str, optional
+    :param x: column to use for the X-Axis
+    :type x: str
+    :param y: columns to use for the Y-Axes
+    :type y: list of str
+    :param z: column to use for the Z-Axis
+    :type z: str, optional
+    :param group: column(s) to use for grouping
+    :type group: list of str or str, optional
+    :param agg: specific aggregation that can be applied to y or z axes. Possible values are: count, first, last, mean,
+                median, min, max, std, var, mad, prod, sum. This is included in the label of the axis it is being
+                applied to.
+    :type agg: str, optional
+    :param window: number of days to include in rolling aggregations
+    :type window: int, optional
+    :param rolling_comp: computation to use in rolling aggregations
+    :type rolling_comp: str, optional
+    :param barmode: mode to use for bar chart display.
possible values are stack|group(default)|overlay|relative + :type barmode: str, optional + :param barsort: axis name to sort the bars in a bar chart by (default is the 'x', but other options are any of + columns names used in the 'y' parameter + :type barsort: str, optional + :param filepath: location to save HTML output + :type filepath: str, optional + :param kwargs: optional keyword arguments, here in case invalid arguments are passed to this function + :type kwargs: dict + :return: possible outcomes are: + - if run within a jupyter notebook and no 'filepath' is specified it will print the resulting HTML + within a cell in your notebook + - if 'filepath' is specified it will save the chart to the path specified + - otherwise it will return the HTML output as a string + """ + instance = startup(url=None, data=df, data_id=999) + output = instance.offline_chart(chart_type=chart_type, query=query, x=x, y=y, z=z, group=group, agg=agg, + window=window, rolling_comp=rolling_comp, barmode=barmode, barsort=barsort, + filepath=filepath) + global_state.cleanup() + return output diff --git a/dtale/charts/utils.py b/dtale/charts/utils.py index d9e29f83..8c6d2338 100644 --- a/dtale/charts/utils.py +++ b/dtale/charts/utils.py @@ -91,6 +91,45 @@ def _handler(col_def): return _handler +def group_filter_handler(col_def, group_val, group_classifier): + col_def_segs = col_def.split('|') + if len(col_def_segs) > 1: + col, freq = col_def_segs + if freq == 'WD': + return '{}.dt.dayofweek == {}'.format(col, group_val) + elif freq == 'H2': + return '{}.dt.hour == {}'.format(col, group_val) + elif freq == 'H': + ts_val = pd.Timestamp(group_val) + return "{col}.dt.date == '{day}' and {col}.dt.hour == {hour}".format( + col=col, day=ts_val.strftime('%Y%m%d'), hour=ts_val.hour + ) + elif freq == 'D': + ts_val = pd.Timestamp(group_val) + return "{col}.dt.date == '{day}'".format(col=col, day=ts_val.strftime('%Y%m%d')) + elif freq == 'W': + ts_val = pd.Timestamp(group_val) + return "{col}.dt.year == {year} and {col}.dt.week == {week}".format( + col=col, year=ts_val.year, week=ts_val.week + ) + elif freq == 'M': + ts_val = pd.Timestamp(group_val) + return "{col}.dt.year == {year} and {col}.dt.month == {month}".format( + col=col, year=ts_val.year, month=ts_val.month + ) + elif freq == 'Q': + ts_val = pd.Timestamp(group_val) + return "{col}.dt.year == {year} and {col}.dt.quarter == {quarter}".format( + col=col, year=ts_val.year, quarter=ts_val.quarter + ) + elif freq == 'Y': + ts_val = pd.Timestamp(group_val) + return "{col}.dt.year == {year}".format(col=col, year=ts_val.year) + if group_classifier in ['I', 'F']: + return '{col} == {val}'.format(col=col_def, val=group_val) + return "{col} == '{val}'".format(col=col_def, val=group_val) + + def retrieve_chart_data(df, x, y, z, group=None): """ Retrieves data from a dataframe for x, y, z & group inputs complete with date frequency @@ -110,7 +149,7 @@ def retrieve_chart_data(df, x, y, z, group=None): :rtype: :class:`pandas:pandas.DataFrame` """ freq_handler = date_freq_handler(df) - cols = [x] + make_list(y) + [z] + make_list(group) + cols = [x] + make_list(y) + make_list(z) + make_list(group) all_code = [] all_data = [] for col in cols: @@ -141,7 +180,7 @@ def check_all_nan(df, cols=None): LIMIT_MSG = 'Dataset exceeds {} records, cannot render. Please apply filter...' 
-def check_exceptions(df, allow_duplicates, data_limit=15000, limit_msg=LIMIT_MSG): +def check_exceptions(df, allow_duplicates, unlimited_data=False, data_limit=15000, limit_msg=LIMIT_MSG): """ Checker function to test the output of any chart aggregations to see if it is one of the following: - too large to be rendered by web client @@ -161,7 +200,7 @@ def check_exceptions(df, allow_duplicates, data_limit=15000, limit_msg=LIMIT_MSG if not allow_duplicates and any(df.duplicated()): raise Exception( '{} contains duplicates, please specify group or additional filtering'.format(', '.join(df.columns))) - if len(df) > data_limit: + if not unlimited_data and len(df) > data_limit: raise Exception(limit_msg.format(data_limit)) @@ -222,7 +261,8 @@ def build_agg_data(df, x, y, inputs, agg, z=None): ] -def build_chart(raw_data, x, y, group_col=None, agg=None, allow_duplicates=False, **kwargs): +def build_chart(raw_data, x, y, group_col=None, agg=None, allow_duplicates=False, return_raw=False, + unlimited_data=False, **kwargs): """ Helper function to return data for 'chart-data' & 'correlations-ts' endpoints. Will return a dictionary of dictionaries (one for each series) which contain the data for the x & y axes of the chart as well as the minimum & @@ -250,9 +290,7 @@ def build_chart(raw_data, x, y, group_col=None, agg=None, allow_duplicates=False x_col = str('x') y_cols = make_list(y) z_col = kwargs.get('z') - z_cols = [] - if z_col is not None: - z_cols = [z_col] + z_cols = make_list(z_col) if group_col is not None and len(group_col): data = data.sort_values(group_col + [x]) code.append("chart_data = chart_data.sort_values(['{cols}'])".format(cols="', '".join(group_col + [x]))) @@ -274,6 +312,8 @@ def build_chart(raw_data, x, y, group_col=None, agg=None, allow_duplicates=False raise Exception(msg) data = data.dropna() + if return_raw: + return data.rename(columns={x_col: x}) code.append("chart_data = chart_data.dropna()") data_f, range_f = build_formatters(data) ret_data = dict( @@ -286,11 +326,13 @@ def build_chart(raw_data, x, y, group_col=None, agg=None, allow_duplicates=False group_fmt_overrides = {'I': lambda v, as_string: json_int(v, as_string=as_string, fmt='{}')} group_fmts = {c: find_dtype_formatter(dtypes[c], overrides=group_fmt_overrides) for c in group_col} for group_val, grp in data.groupby(group_col): - group_val = '/'.join([ - group_fmts[gc](gv, as_string=True) for gv, gc in zip(make_list(group_val), group_col) - ]) - ret_data['data'][group_val] = data_f.format_lists(grp) - ret_data['dtypes'] = {c: classify_type(dtype) for c, dtype in dtypes.items()} + + def _group_filter(): + for gv, gc in zip(make_list(group_val), group_col): + classifier = classify_type(dtypes[gc]) + yield group_filter_handler(gc, group_fmts[gc](gv, as_string=True), classifier) + group_filter = ' and '.join(list(_group_filter())) + ret_data['data'][group_filter] = data_f.format_lists(grp) return ret_data, code sort_cols = [x] + (y_cols if len(z_cols) else []) data = data.sort_values(sort_cols) @@ -303,10 +345,12 @@ def build_chart(raw_data, x, y, group_col=None, agg=None, allow_duplicates=False data, agg_code = build_agg_data(data, x_col, y_cols, kwargs, agg, z=z_col) code += agg_code data = data.dropna() + if return_raw: + return data.rename(columns={x_col: x}) code.append("chart_data = chart_data.dropna()") dupe_cols = [x_col] + (y_cols if len(z_cols) else []) - check_exceptions(data[dupe_cols].rename(columns={'x': x}), allow_duplicates, + check_exceptions(data[dupe_cols].rename(columns={'x': x}), 
allow_duplicates, unlimited_data=unlimited_data, data_limit=40000 if len(z_cols) else 15000) data_f, range_f = build_formatters(data) ret_data = dict( diff --git a/dtale/dash_application/charts.py b/dtale/dash_application/charts.py index ae596de8..4e3fddf5 100644 --- a/dtale/dash_application/charts.py +++ b/dtale/dash_application/charts.py @@ -2,17 +2,20 @@ import math import traceback import urllib +from logging import getLogger import dash_core_components as dcc import dash_html_components as html import numpy as np import pandas as pd import plotly.graph_objs as go -from six import PY3 +from plotly.io import write_html +from six import PY3, string_types import dtale.dash_application.components as dash_components import dtale.global_state as global_state from dtale.charts.utils import YAXIS_CHARTS, ZAXIS_CHARTS, build_agg_data +from dtale.charts.utils import build_chart as build_chart_data from dtale.charts.utils import build_formatters as chart_formatters from dtale.charts.utils import (check_all_nan, check_exceptions, retrieve_chart_data, valid_chart, @@ -22,7 +25,96 @@ from dtale.utils import (build_code_export, classify_type, dict_merge, divide_chunks, flatten_lists, get_dtypes, make_list, run_query) -from dtale.views import build_chart as build_chart_data + +if PY3: + from io import StringIO +else: + from StringIO import StringIO + + +logger = getLogger(__name__) + + +def get_url_parser(): + """ + Returns URL parser based on whether Python 2 or 3 is being used. + """ + if PY3: + return urllib.parse.parse_qsl + else: + try: + return urllib.parse_qsl + except BaseException: + from urlparse import parse_qsl + return parse_qsl + + +def chart_url_params(search): + """ + Builds chart parameters by parsing the query string from main URL + + :param search: URL querystring + :param search: str + :return: dictionary of parsed querystring key/values + :rtype: dict + """ + if not search: + return {} + if isinstance(search, string_types): + params = dict(get_url_parser()(search.lstrip('?'))) + else: + params = search + for gp in ['y', 'group', 'yaxis']: + if gp in params: + params[gp] = json.loads(params[gp]) + params['cpg'] = 'true' == params.get('cpg') + if 'window' in params: + params['window'] = int(params['window']) + if 'group_filter' in params: + params['query'] = ' and '.join([params[p] for p in ['query', 'group_filter'] if params.get(p) is not None]) + del params['group_filter'] + params['cpg'] = False + return params + + +def url_encode_func(): + return urllib.parse.urlencode if PY3 else urllib.urlencode + + +def chart_url_querystring(params, data=None, group_filter=None): + base_props = ['chart_type', 'query', 'x', 'z', 'agg', 'window', 'rolling_comp', 'barmode', 'barsort'] + final_params = {k: params[k] for k in base_props if params.get(k) is not None} + final_params['cpg'] = 'true' if params.get('cpg') is True else 'false' + for gp in ['y', 'group']: + list_param = [val for val in params.get(gp) or [] if val is not None] + if len(list_param): + final_params[gp] = json.dumps(list_param) + + if final_params['chart_type'] in YAXIS_CHARTS: + params_yaxis = {} + for y, range in (params.get('yaxis') or {}).items(): + if y not in ((data or {}).get('min') or {}): + continue + if not (range['min'], range['max']) == (data['min'][y], data['max'][y]): + params_yaxis[y] = range + if len(params_yaxis): + final_params['yaxis'] = json.dumps(params_yaxis) + + if group_filter is not None: + group_val, y_val = (group_filter.get(p) for p in ['group', 'y']) + if group_val: + final_params['group_filter'] 
= group_val + if y_val: + final_params['y'] = json.dumps([y_val]) + return url_encode_func()(final_params) + + +def graph_wrapper(**kwargs): + curr_style = kwargs.pop('style', None) or {} + return dcc.Graph( + style=dict_merge({'height': '100%'}, curr_style), + **kwargs + ) def build_axes(data_id, x, axis_inputs, mins, maxs, z=None, agg=None): @@ -107,17 +199,11 @@ def build_spaced_ticks(ticktext): :rtype: dict """ size = len(ticktext) - tickvals = list(range(size)) if size <= 30: - return {'tickmode': 'array', 'tickvals': tickvals, 'ticktext': ticktext} - spaced_ticks, spaced_text = [tickvals[0]], [ticktext[0]] + return {'tickmode': 'auto', 'nticks': size} factor = int(math.ceil(size / 28.0)) - for i in range(factor, size - 1, factor): - spaced_ticks.append(tickvals[i]) - spaced_text.append(ticktext[i]) - spaced_ticks.append(tickvals[-1]) - spaced_text.append(ticktext[-1]) - return {'tickmode': 'array', 'tickvals': spaced_ticks, 'ticktext': spaced_text} + nticks = len(range(factor, size - 1, factor)) + 2 + return {'tickmode': 'auto', 'nticks': nticks} def chart_wrapper(data_id, data, url_params=None): @@ -134,49 +220,44 @@ def chart_wrapper(data_id, data, url_params=None): if url_params is None: return lambda chart: chart - url_params_func = urllib.parse.urlencode if PY3 else urllib.urlencode - base_props = ['chart_type', 'query', 'x', 'z', 'agg', 'window', 'rolling_comp', 'barmode', 'barsort'] - params = {k: url_params[k] for k in base_props if url_params.get(k) is not None} - params['cpg'] = 'true' if url_params.get('cpg') is True else 'false' - for gp in ['y', 'group']: - group_param = [val for val in url_params.get(gp) or [] if val is not None] - if len(group_param): - params[gp] = json.dumps(group_param) - if params['chart_type'] in YAXIS_CHARTS: - params_yaxis = {} - for y, range in (url_params.get('yaxis') or {}).items(): - if y not in data['min']: - continue - if not (range['min'], range['max']) == (data['min'][y], data['max'][y]): - params_yaxis[y] = range - if len(params_yaxis): - params['yaxis'] = json.dumps(params_yaxis) - - popup_link = html.A( - [html.I(className='far fa-window-restore mr-4'), html.Span('Popup Chart')], - href='/charts/{}?{}'.format(data_id, url_params_func(params)), - target='_blank', - className='mr-5' - ) - copy_link = html.Div( - [html.A( - [html.I(className='ico-link mr-4'), html.Span('Copy Link')], - href='/charts/{}?{}'.format(data_id, url_params_func(params)), + def _chart_wrapper(chart, group_filter=None): + querystring = chart_url_querystring(url_params, data=data, group_filter=group_filter) + popup_link = html.A( + [html.I(className='far fa-window-restore mr-4'), html.Span('Popup Chart')], + href='/charts/{}?{}'.format(data_id, querystring), target='_blank', - className='mr-5 copy-link-btn' - ), html.Div('Copied to clipboard', className="hoverable__content copy-tt-bottom") - ], - className='hoverable-click' - ) - code_snippet = html.A( - [html.I(className='ico-code mr-4'), html.Span('Code Export')], - href='#', - className='code-snippet-btn', - ) - links = html.Div([popup_link, copy_link, code_snippet], style={'position': 'absolute', 'zIndex': 5}) - - def _chart_wrapper(chart): - return html.Div([links, chart], style={'position': 'relative'}) + className='mr-5' + ) + copy_link = html.Div( + [html.A( + [html.I(className='ico-link mr-4'), html.Span('Copy Link')], + href='/charts/{}?{}'.format(data_id, querystring), + target='_blank', + className='mr-5 copy-link-btn' + ), html.Div('Copied to clipboard', className="hoverable__content copy-tt-bottom") 
+ ], + className='hoverable-click' + ) + code_snippet = html.A( + [html.I(className='ico-code mr-4'), html.Span('Code Export')], + href='#', + className='code-snippet-btn mr-5', + ) + export_html_link = html.A( + [html.I(className='fas fa-file-code mr-4'), html.Span('Export Chart')], + href='/dtale/chart-export/{}?{}'.format(data_id, querystring), + className='export-chart-btn mr-5' + ) + export_csv_link = html.A( + [html.I(className='fas fa-file-csv mr-4'), html.Span('Export CSV')], + href='/dtale/chart-csv-export/{}?{}'.format(data_id, querystring), + className='export-chart-btn' + ) + links = html.Div( + [popup_link, copy_link, code_snippet, export_html_link, export_csv_link], + style={'position': 'absolute', 'zIndex': 5} + ) + return html.Div([links, chart], style={'position': 'relative', 'height': '100%'}) return _chart_wrapper @@ -270,8 +351,13 @@ def cpg_chunker(charts, columns=2): """ if len(charts) == 1: return charts + + def _formatter(chart): + chart.style.pop('height', None) + return html.Div(chart, className='col-md-6') + return [ - html.Div([html.Div(c, className='col-md-6') for c in chunk], className='row') + html.Div([_formatter(c) for c in chunk], className='row') for chunk in divide_chunks(charts, columns) ] @@ -304,7 +390,7 @@ def scatter_builder(data, x, y, axes_builder, wrapper, group=None, z=None, agg=N def layout(axes): if z is not None: - return {'height': 700, 'margin': {'l': 0, 'r': 0, 'b': 0}, + return {'margin': {'l': 0, 'r': 0, 'b': 0}, 'scene': dict_merge(axes, dict(aspectmode='data'))} return axes @@ -314,9 +400,9 @@ def marker(series): 'showscale': True, 'colorbar': {'thickness': 15, 'len': 0.5, 'x': 0.8, 'y': 0.6}} return {'size': 15, 'line': {'width': 0.5, 'color': 'white'}} - scatter_func = go.Scatter3d if z is not None else go.Scatter + scatter_func = go.Scatter3d if z is not None else go.Scattergl return [ - wrapper(dcc.Graph( + wrapper(graph_wrapper( id='scatter-{}-{}'.format(group or 'all', y2), figure={'data': [ scatter_func(**dict_merge( @@ -325,7 +411,7 @@ def marker(series): ) for series_key, d in data['data'].items() if y2 in d and (group is None or group == series_key) ], 'layout': build_layout(dict_merge(build_title(x, y2, group, z=z, agg=agg), layout(axes_builder([y2]))))} - )) + ), group_filter=dict_merge(dict(y=y2), {} if group is None else dict(group=group))) for y2 in y ] @@ -364,16 +450,16 @@ def surface_builder(data, x, y, z, axes_builder, wrapper, agg=None): z_data = df.values axes = axes_builder([y[0]]) - layout = {'height': 700, 'autosize': True, 'margin': {'l': 0, 'r': 0, 'b': 0}, 'scene': dict_merge(axes, scene)} + layout = {'autosize': True, 'margin': {'l': 0, 'r': 0, 'b': 0}, 'scene': dict_merge(axes, scene)} return [ - wrapper(dcc.Graph( + wrapper(graph_wrapper( id='surface-{}'.format(y2), figure={'data': [ go.Surface(x=x_vals, y=y_vals, z=z_data, opacity=0.8, name='all', colorscale='YlGnBu', colorbar={'title': layout['scene']['zaxis']['title'], 'thickness': 15, 'len': 0.5, 'x': 0.8, 'y': 0.6}) ], 'layout': build_layout(dict_merge(build_title(x, y2, z=z, agg=agg), layout))} - )) + ), group_filter=dict(y=y2)) for y2 in y ] @@ -447,7 +533,7 @@ def bar_builder(data, x, y, axes_builder, wrapper, cpg=False, barmode='group', b if cpg: charts = [ - wrapper(dcc.Graph( + wrapper(graph_wrapper( id='bar-{}-graph'.format(series_key), figure={ 'data': [ @@ -463,7 +549,7 @@ def bar_builder(data, x, y, axes_builder, wrapper, cpg=False, barmode='group', b build_title(x, y, series_key, agg=kwargs.get('agg')), axes, dict(barmode=barmode or 
'group')
                    ))
                }
-            ))
+            ), group_filter=dict(group=series_key))
            for series_key, series in data['data'].items()
        ]
        return cpg_chunker(charts)
@@ -483,7 +569,7 @@
    if barmode == 'group' and len(y or []) > 1:
        data_cfgs = list(build_grouped_bars_with_multi_yaxis(data_cfgs, y))
-    return wrapper(dcc.Graph(
+    return wrapper(graph_wrapper(
        id='bar-graph',
        figure={'data': data_cfgs, 'layout': build_layout(
            dict_merge(build_title(x, y, agg=kwargs.get('agg')), axes, dict(barmode=barmode or 'group')))}
@@ -514,15 +600,23 @@ def line_builder(data, x, y, axes_builder, wrapper, cpg=False, **inputs):
    axes = axes_builder(y)
    name_builder = build_series_name(y, cpg)
-    line_cfg = {'mode': 'lines', 'line': {'shape': 'spline', 'smoothing': 0.3}}
+
+    def line_func(s):
+        # large series render via WebGL (Scattergl) for performance
+        return go.Scattergl if len(s['x']) > 15000 else go.Scatter
+
+    def line_cfg(s):
+        if len(s['x']) > 15000:
+            # spline smoothing is too expensive for large series
+            return {'mode': 'lines', 'line': {'shape': 'linear'}}
+        return {'mode': 'lines', 'line': {'shape': 'spline', 'smoothing': 0.3}}
+
    if cpg:
        charts = [
-            wrapper(dcc.Graph(
+            wrapper(graph_wrapper(
                id='line-{}-graph'.format(series_key),
                figure={
                    'data': [
-                        go.Scatter(**dict_merge(
-                            line_cfg,
+                        line_func(series)(**dict_merge(
+                            line_cfg(series),
                            {'x': series['x'], 'y': series[y2]},
                            name_builder(y2, series_key),
                            {} if i == 1 else {'yaxis': 'y{}'.format(i)}
@@ -531,15 +625,15 @@
                    ],
                    'layout': build_layout(dict_merge(build_title(x, y, group=series_key, agg=inputs.get('agg')), axes))
                }
-            ))
+            ), group_filter=dict(group=series_key))
            for series_key, series in data['data'].items()
        ]
        return cpg_chunker(charts)
    data_cfgs = flatten_lists([
        [
-            go.Scatter(**dict_merge(
-                line_cfg,
+            line_func(series)(**dict_merge(
+                line_cfg(series),
                {'x': series['x'], 'y': series[y2]},
                name_builder(y2, series_key),
                {} if i == 1 else {'yaxis': 'y{}'.format(i)}
@@ -548,13 +642,13 @@
        ]
        for series_key, series in data['data'].items()
    ])
-    return wrapper(dcc.Graph(
+    return wrapper(graph_wrapper(
        id='line-graph',
        figure={'data': data_cfgs, 'layout': build_layout(dict_merge(build_title(x, y, agg=inputs.get('agg')), axes))}
    ))


-def pie_builder(data, x, y, wrapper, **inputs):
+def pie_builder(data, x, y, wrapper, export=False, **inputs):
    """
    Builder function for :plotly:`plotly.graph_objects.Pie `
@@ -582,15 +676,19 @@ def build_pies():
        if y_val < 0:
            negative_values.append('{} ({})'.format(x_val, y_val))
-        chart = wrapper(dcc.Graph(
+        layout = build_layout(build_title(x, y2, group=series_key, agg=inputs.get('agg')))
+        if len(series['x']) > 5:
+            layout.pop('legend', None)
+            layout['showlegend'] = False
+        chart = wrapper(graph_wrapper(
            id='pie-{}-graph'.format(series_key),
            figure={
                'data': [go.Pie(**dict_merge(
                    dict(labels=series['x'], values=series[y2]),
                    name_builder(y2, series_key)
                ))],
-                'layout': build_layout(build_title(x, y2, group=series_key, agg=inputs.get('agg')))
+                'layout': layout
            }
-        ))
+        ), group_filter=dict_merge(dict(y=y2), {} if series_key == 'all' else dict(group=series_key)))
        if len(negative_values):
            error_title = (
                'The following negative values could not be represented within the {}Pie chart'
@@ -600,16 +698,21 @@
                html.Span(error_title),
                html.Div(html.Pre(', '.join(negative_values)), className='traceback')
            ], className='dtale-alert alert alert-danger')
-            yield html.Div(
-                [html.Div(error_div, className='col-md-12'),
html.Div(chart, className='col-md-12')], - className='row' - ) + if export: + yield chart + else: + yield html.Div( + [html.Div(error_div, className='col-md-12'), html.Div(chart, className='col-md-12 h-100')], + className='row', + ) else: yield chart + if export: + return next(build_pies()) return cpg_chunker(list(build_pies())) -def heatmap_builder(data_id, **inputs): +def heatmap_builder(data_id, export=False, **inputs): """ Builder function for :plotly:`plotly.graph_objects.Heatmap ` @@ -630,7 +733,7 @@ def heatmap_builder(data_id, **inputs): return None, None raw_data = global_state.get_data(data_id) wrapper = chart_wrapper(data_id, raw_data, inputs) - hm_kwargs = dict(hoverongaps=False, colorscale='Greens', showscale=True, hoverinfo='x+y+z') + hm_kwargs = dict(colorscale='Greens', showscale=True, hoverinfo='x+y+z') # hoverongaps=False, x, y, z, agg = (inputs.get(p) for p in ['x', 'y', 'z', 'agg']) y = y[0] data, code = retrieve_chart_data(raw_data, x, y, z) @@ -672,8 +775,9 @@ def heatmap_builder(data_id, **inputs): code += agg_code if not len(data): raise Exception('No data returned for this computation!') - check_exceptions(data[dupe_cols], agg != 'corr', data_limit=40000, - limit_msg='Heatmap exceeds {} cells, cannot render. Please apply filter...') + # check_exceptions(data[dupe_cols], agg != 'corr', data_limit=40000, + # limit_msg='Heatmap exceeds {} cells, cannot render. Please apply filter...') + check_exceptions(data[dupe_cols], agg != 'corr', unlimited_data=True) dtypes = {c: classify_type(dtype) for c, dtype in get_dtypes(data).items()} data_f, _ = chart_formatters(data) data = data_f.format_df(data) @@ -698,28 +802,28 @@ def heatmap_builder(data_id, **inputs): if dtypes.get(y) == 'I': y_axis['tickformat'] = '.0f' - hovertemplate = ''.join([x_title, ': %{customdata[0]}
<br>', y_title, ': %{customdata[1]}<br>
', z_title, - ': %{z}']) - hm_kwargs = dict_merge(hm_kwargs, dict(z=heat_data, colorbar={'title': z_title}, - hoverinfo='x+y+z', hovertemplate=hovertemplate, - customdata=[[[xd, yd] for xd in x_data] for yd in y_data])) - return wrapper(dcc.Graph( + hm_kwargs = dict_merge(hm_kwargs, dict(x=x_data, y=y_data, z=heat_data, colorbar={'title': z_title}, + hoverinfo='x+y+z')) + chart = graph_wrapper( id='heatmap-graph-{}'.format(y), - style={'margin-right': 'auto', 'margin-left': 'auto', 'height': 600}, + style={'margin-right': 'auto', 'margin-left': 'auto'}, figure=dict( - data=[go.Heatmap(**hm_kwargs)], + data=[go.Heatmapgl(**hm_kwargs)], layout=build_layout(dict_merge( dict(xaxis=x_axis, yaxis=y_axis, xaxis_zeroline=False, yaxis_zeroline=False), build_title(x, y, z=z, agg=agg) )) ) - )), code + ) + if export: + return chart + return wrapper(chart), code except BaseException as e: return build_error(str(e), str(traceback.format_exc())), code def build_figure_data(data_id, chart_type=None, query=None, x=None, y=None, z=None, group=None, agg=None, window=None, - rolling_comp=None, **kwargs): + rolling_comp=None, return_raw=False, **kwargs): """ Builds chart figure data for loading into dash:`dash_core_components.Graph ` components @@ -744,7 +848,7 @@ def build_figure_data(data_id, chart_type=None, query=None, x=None, y=None, z=No :type window: int, optional :param rolling_comp: computation to use in rolling aggregations :type rolling_comp: str, optional - :param kwargs: optional keyword arguments, here in case invalid arguements are passed to this function + :param kwargs: optional keyword arguments, here in case invalid arguments are passed to this function :type kwargs: dict :return: dictionary of series data, min/max ranges of columns used in chart :rtype: dict @@ -766,12 +870,62 @@ def build_figure_data(data_id, chart_type=None, query=None, x=None, y=None, z=No if chart_type in ZAXIS_CHARTS: chart_kwargs['z'] = z del chart_kwargs['group_col'] - data, chart_code = build_chart_data(data, x, y, **chart_kwargs) + data, chart_code = build_chart_data(data, x, y, unlimited_data=True, **chart_kwargs) return data, code + chart_code except BaseException as e: return dict(error=str(e), traceback=str(traceback.format_exc())), code +def build_raw_figure_data(data_id, chart_type=None, query=None, x=None, y=None, z=None, group=None, agg=None, + window=None, rolling_comp=None, **kwargs): + """ + Returns a :class:`pandas:pandas.DataFrame` of data used within chart configuration + + :param data_id: integer string identifier for a D-Tale process's data + :type data_id: str + :param chart_type: type of chart (line, bar, pie, scatter...) + :type chart_type: str + :param query: pandas dataframe query string + :type query: str, optional + :param x: column to use for the X-Axis + :type x: str + :param y: columns to use for the Y-Axes + :type y: list of str + :param z: column to use for the Z-Axis + :type z: str, optional + :param group: column(s) to use for grouping + :type group: list of str or str, optional + :param agg: specific aggregation that can be applied to y or z axes. Possible values are: count, first, last mean, + median, min, max, std, var, mad, prod, sum. This is included in label of axis it is being applied to. 
+ :type agg: str, optional + :param window: number of days to include in rolling aggregations + :type window: int, optional + :param rolling_comp: computation to use in rolling aggregations + :type rolling_comp: str, optional + :param kwargs: optional keyword arguments, here in case invalid arguments are passed to this function + :type kwargs: dict + :return: dataframe of all data used in chart + :rtype: :class:`pandas:pandas.DataFrame` + """ + if not valid_chart(**dict(x=x, y=y, z=z, chart_type=chart_type, agg=agg, window=window, + rolling_comp=rolling_comp)): + raise ValueError('invalid chart configuration: {}'.format( + dict(x=x, y=y, z=z, chart_type=chart_type, agg=agg, window=window, rolling_comp=rolling_comp) + )) + + data = run_query( + global_state.get_data(data_id), + query, + global_state.get_context_variables(data_id) + ) + chart_kwargs = dict(group_col=group, agg=agg, allow_duplicates=chart_type == 'scatter', rolling_win=window, + rolling_comp=rolling_comp) + if chart_type in ZAXIS_CHARTS: + chart_kwargs['z'] = z + del chart_kwargs['group_col'] + return build_chart_data(data, x, y, unlimited_data=True, return_raw=True, **chart_kwargs) + + def build_chart(data_id=None, **inputs): """ Factory method that forks off into the different chart building methods (heatmaps are handled separately) @@ -796,8 +950,8 @@ def build_chart(data_id=None, **inputs): code = None try: if inputs.get('chart_type') == 'heatmap': - data, code = heatmap_builder(data_id, **inputs) - return data, None, code + chart, code = heatmap_builder(data_id, **inputs) + return chart, None, code data, code = build_figure_data(data_id, **inputs) if data is None: @@ -810,23 +964,26 @@ def build_chart(data_id=None, **inputs): range_data = dict(min=data['min'], max=data['max']) axis_inputs = inputs.get('yaxis', {}) chart_builder = chart_wrapper(data_id, data, inputs) - chart_type, x, y, z, agg = (inputs.get(p) for p in ['chart_type', 'x', 'y', 'z', 'agg']) + chart_type, x, y, z, agg, group = (inputs.get(p) for p in ['chart_type', 'x', 'y', 'z', 'agg', 'group']) z = z if chart_type in ZAXIS_CHARTS else None chart_inputs = {k: v for k, v in inputs.items() if k not in ['chart_type', 'x', 'y', 'z', 'group']} if chart_type == 'wordcloud': return ( - chart_builder(dash_components.Wordcloud(id='wc', data=data, y=y, group=inputs.get('group'))), + chart_builder(dash_components.Wordcloud(id='wc', data=data, y=y, group=group)), range_data, code ) + if chart_type == 'pie': + return pie_builder(data, x, y, chart_builder, **chart_inputs), range_data, code + axes_builder = build_axes(data_id, x, axis_inputs, data['min'], data['max'], z=z, agg=agg) if chart_type == 'scatter': if inputs['cpg']: scatter_charts = flatten_lists([ - scatter_builder(data, x, y, axes_builder, chart_builder, group=group, agg=agg) - for group in data['data'] + scatter_builder(data, x, y, axes_builder, chart_builder, group=subgroup, agg=agg) + for subgroup in data['data'] ]) else: scatter_charts = scatter_builder(data, x, y, axes_builder, chart_builder, agg=agg) @@ -844,8 +1001,112 @@ def build_chart(data_id=None, **inputs): if chart_type == 'line': return line_builder(data, x, y, axes_builder, chart_builder, **chart_inputs), range_data, code - if chart_type == 'pie': - return pie_builder(data, x, y, chart_builder, **chart_inputs), range_data, code raise NotImplementedError('chart type: {}'.format(chart_type)) except BaseException as e: return build_error(str(e), str(traceback.format_exc())), None, code + + +def build_raw_chart(data_id=None, **inputs): + """ + 
Factory method that forks off into the different chart building methods + - heatmap + - line + - bar + - scatter + - pie + - 3D scatter + - surface + + :param data_id: identifier of data to build axis configurations against + :type data_id: str + :param inputs: Optional keyword arguments containing the following information: + - x: column to be used as x-axis of chart + - y: column to be used as y-axis of chart + - z: column to use for the Z-Axis + - agg: points to a specific function that can be applied to :func: pandas.core.groupby.DataFrameGroupBy + :return: plotly chart object(s) + :rtype: type of (:dash:`dash_core_components.Graph `, dict) + """ + + def clean_output(output): + if isinstance(output, list): + output = output[0] + if isinstance(output, dcc.Graph): + output = output.figure + return output + + def chart_builder_passthru(chart, group_filter=None): + return chart + + def _raw_chart_builder(): + if inputs.get('chart_type') == 'heatmap': + chart = heatmap_builder(data_id, **inputs) + return chart + + data, _ = build_figure_data(data_id, **inputs) + if data is None: + return None + + if 'error' in data: + logger.error(data['traceback']) + return None + + chart_type, x, y, z, agg = (inputs.get(p) for p in ['chart_type', 'x', 'y', 'z', 'agg']) + z = z if chart_type in ZAXIS_CHARTS else None + + axis_inputs = inputs.get('yaxis', {}) + chart_builder = chart_builder_passthru # we'll ignore wrapper functionality for raw charts + chart_inputs = {k: v for k, v in inputs.items() if k not in ['chart_type', 'x', 'y', 'z', 'group']} + + if chart_type == 'pie': + return pie_builder(data, x, y, chart_builder, **chart_inputs) + + axes_builder = build_axes(data_id, x, axis_inputs, data['min'], data['max'], z=z, agg=agg) + if chart_type == 'scatter': + return scatter_builder(data, x, y, axes_builder, chart_builder, agg=agg) + + if chart_type == '3d_scatter': + return scatter_builder(data, x, y, axes_builder, chart_builder, z=z, agg=agg) + + if chart_type == 'surface': + return surface_builder(data, x, y, z, axes_builder, chart_builder, agg=agg) + + if chart_type == 'bar': + return bar_builder(data, x, y, axes_builder, chart_builder, **chart_inputs) + + return line_builder(data, x, y, axes_builder, chart_builder, **chart_inputs) + return clean_output(_raw_chart_builder()) + + +def export_chart(data_id, params): + chart = build_raw_chart(data_id, export=True, **params) + post_script_css = '\n'.join([ + "var css = document.createElement('style');", + "css.type = 'text/css';", + ( + "css.appendChild(document.createTextNode('div.modebar > div.modebar-group:last-child," + "div.modebar > div.modebar-group:first-child { display: none; }'));" + ), + 'document.getElementsByTagName("head")[0].appendChild(css);' + ]) + html_buffer = StringIO() + write_html(chart, file=html_buffer, include_plotlyjs=True, auto_open=False, post_script=post_script_css) + html_buffer.seek(0) + return html_buffer + + +def export_chart_data(data_id, params): + data = build_raw_figure_data(data_id, **params) + if PY3: + from io import BytesIO + proxy = StringIO() + data.to_csv(proxy, encoding='utf-8', index=False) + csv_buffer = BytesIO() + csv_buffer.write(proxy.getvalue().encode('utf-8')) + proxy.close() + else: + csv_buffer = StringIO() + data.to_csv(csv_buffer, encoding='utf-8', index=False) + + csv_buffer.seek(0) + return csv_buffer diff --git a/dtale/dash_application/layout.py b/dtale/dash_application/layout.py index cfb10972..81bd9128 100644 --- a/dtale/dash_application/layout.py +++ b/dtale/dash_application/layout.py @@ 
-45,13 +45,6 @@ def base_layout(github_fork, **kwargs):
@@ -453,6 +446,6 @@ def charts_layout(df, settings, **inputs): ], className='row pt-3 pb-5 charts-filters' ), - dcc.Loading(html.Div(id='chart-content'), type='circle'), + dcc.Loading(html.Div(id='chart-content', style={'height': '70vh'}), type='circle'), dcc.Textarea(id="copy-text", style=dict(position='absolute', left='-110%')) ], className='charts-body') diff --git a/dtale/dash_application/views.py b/dtale/dash_application/views.py index 5ecb7560..64d5ae79 100644 --- a/dtale/dash_application/views.py +++ b/dtale/dash_application/views.py @@ -1,5 +1,3 @@ -import json -import urllib from logging import getLogger import dash @@ -7,11 +5,10 @@ import dash_html_components as html from dash.dependencies import Input, Output, State from dash.exceptions import PreventUpdate -from six import PY3 import dtale.global_state as global_state from dtale.charts.utils import YAXIS_CHARTS, ZAXIS_CHARTS -from dtale.dash_application.charts import build_chart +from dtale.dash_application.charts import build_chart, chart_url_params from dtale.dash_application.layout import (bar_input_style, base_layout, build_input_options, charts_layout, show_chart_per_group, @@ -69,41 +66,6 @@ def add_dash(server): return dash_app.server -def get_url_parser(): - """ - Returns URL parser based on whether Python 2 or 3 is being used. - """ - if PY3: - return urllib.parse.parse_qsl - else: - try: - return urllib.parse_qsl - except BaseException: - from urlparse import parse_qsl - return parse_qsl - - -def chart_url_params(search): - """ - Builds chart parameters by parsing the query string from main URL - - :param search: URL querystring - :param search: str - :return: dictionary of parsed querystring key/values - :rtype: dict - """ - if not search: - return {} - params = dict(get_url_parser()(search.lstrip('?'))) - for gp in ['y', 'group', 'yaxis']: - if gp in params: - params[gp] = json.loads(params[gp]) - params['cpg'] = 'true' == params.get('cpg') - if 'window' in params: - params['window'] = int(params['window']) - return params - - def get_data_id(pathname): """ Parses data ID from query path (ex: 'foo/bar/1' => '1') diff --git a/dtale/data_reshapers.py b/dtale/data_reshapers.py new file mode 100644 index 00000000..bbb58405 --- /dev/null +++ b/dtale/data_reshapers.py @@ -0,0 +1,144 @@ +import pandas as pd + +import dtale.global_state as global_state +from dtale.utils import run_query + + +def flatten_columns(df): + return [' '.join([str(c) for c in col]).strip() for col in df.columns.values] + + +class DataReshaper(object): + + def __init__(self, data_id, shape_type, cfg): + self.data_id = data_id + if shape_type == 'pivot': + self.builder = PivotBuilder(cfg) + elif shape_type == 'aggregate': + self.builder = AggregateBuilder(cfg) + elif shape_type == 'transpose': + self.builder = TransposeBuilder(cfg) + else: + raise NotImplementedError('{} data re-shaper not implemented yet!'.format(shape_type)) + + def reshape(self): + data = run_query( + global_state.get_data(self.data_id), + (global_state.get_settings(self.data_id) or {}).get('query'), + global_state.get_context_variables(self.data_id) + ) + return self.builder.reshape(data) + + def build_code(self): + return self.builder.build_code() + + +class PivotBuilder(object): + + def __init__(self, cfg): + self.cfg = cfg + + def reshape(self, data): + index, columns, values, aggfunc = (self.cfg.get(p) for p in ['index', 'columns', 'values', 'aggfunc']) + if aggfunc is not None or len(values) > 1: + pivot_data = pd.pivot_table(data, values=values, index=index, 
columns=columns, aggfunc=aggfunc)
+            if len(values) > 1:
+                pivot_data.columns = flatten_columns(pivot_data)
+            elif len(values) == 1:
+                pivot_data.columns = pivot_data.columns.droplevel(0)
+        else:
+            pivot_data = data.pivot(index=index, columns=columns, values=values[0])
+        pivot_data = pivot_data.rename_axis(None, axis=1)
+        return pivot_data
+
+    def build_code(self):
+        index, columns, values, aggfunc = (self.cfg.get(p) for p in ['index', 'columns', 'values', 'aggfunc'])
+        code = []
+        if aggfunc is not None or len(values) > 1:
+            code.append("df = pd.pivot_table(df, index='{}', columns='{}', values=['{}'], aggfunc='{}')".format(
+                index, columns, "', '".join(values), aggfunc
+            ))
+            if len(values) > 1:
+                code.append(
+                    "df.columns = [' '.join([str(c) for c in col]).strip() for col in df.columns.values]"
+                )
+            elif len(values) == 1:
+                code.append("df.columns = df.columns.droplevel(0)")
+        else:
+            code.append("df = df.pivot(index='{index}', columns='{columns}', values='{values}')".format(
+                index=index, columns=columns, values=values[0]
+            ))
+        code.append('df = df.rename_axis(None, axis=1)')
+        return '\n'.join(code)
+
+
+class AggregateBuilder(object):
+
+    def __init__(self, cfg):
+        self.cfg = cfg
+
+    def reshape(self, data):
+        index, agg = (self.cfg.get(p) for p in ['index', 'agg'])
+        agg_data = data.groupby(index)
+        agg_type, func, cols = (agg.get(p) for p in ['type', 'func', 'cols'])
+        if agg_type == 'func':
+            if cols:
+                agg_data = agg_data[cols]
+            return getattr(agg_data, func)()
+        agg_data = agg_data.aggregate(cols)
+        agg_data.columns = flatten_columns(agg_data)
+        return agg_data
+
+    def build_code(self):
+        index, agg = (self.cfg.get(p) for p in ['index', 'agg'])
+        index = "', '".join(index)
+        agg_type, func, cols = (agg.get(p) for p in ['type', 'func', 'cols'])
+        if agg_type == 'func':
+            if cols is not None:
+                return "df = df.groupby(['{index}'])['{columns}'].{agg}()".format(
+                    index=index, columns="', '".join(cols), agg=func
+                )
+            return "df = df.groupby(['{index}']).{agg}()".format(index=index, agg=func)
+        code = [
+            "df = df.groupby(['{index}']).aggregate(".format(index=index) + "{",
+            ',\n'.join("\t'{col}': ['{aggs}']".format(col=col, aggs="', '".join(aggs)) for col, aggs in cols.items()),
+            "})",
+            "df.columns = [' '.join([str(c) for c in col]).strip() for col in df.columns.values]"
+        ]
+        return '\n'.join(code)
+
+
+class TransposeBuilder(object):
+
+    def __init__(self, cfg):
+        self.cfg = cfg
+
+    def reshape(self, data):
+        index, columns = (self.cfg.get(p) for p in ['index', 'columns'])
+        t_data = data.set_index(index)
+        if any(t_data.index.duplicated()):
+            raise Exception('Transposed data contains duplicates, please specify additional index or filtering')
+        if columns is not None:
+            t_data = t_data[columns]
+        t_data = t_data.T
+        if len(index) > 1:
+            t_data.columns = flatten_columns(t_data)
+        t_data = t_data.rename_axis(None, axis=1)
+        return t_data
+
+    def build_code(self):
+        index, columns = (self.cfg.get(p) for p in ['index', 'columns'])
+
+        code = []
+        if columns is not None:
+            code.append("df = df.set_index(['{}'])[['{}']].T".format("', '".join(index), "', '".join(columns)))
+        else:
+            code.append("df = df.set_index(['{}']).T".format("', '".join(index)))
+        if len(index) > 1:
+            code.append(
+                "df.columns = [' '.join([str(c) for c in col]).strip() for col in df.columns.values]"
+            )
+        code.append('df = df.rename_axis(None, axis=1)')
+        return '\n'.join(code)
diff --git a/dtale/static/css/main.css b/dtale/static/css/main.css
index 
168942ad..0b611f1b 100644 --- a/dtale/static/css/main.css +++ b/dtale/static/css/main.css @@ -4551,7 +4551,8 @@ button.close { } } -div.build-modal > div.modal-lg { +div.build-modal > div.modal-lg, +div.reshape-modal > div.modal-lg { min-width: 720px; } @@ -4559,7 +4560,8 @@ div.build-modal > div.modal-lg { .modal-lg { max-width: 800px; } - div.build-modal > div.modal-lg { + div.build-modal > div.modal-lg, + div.reshape-modal > div.modal-lg { max-width: 720px; } div.histogram-modal > div.modal-lg, @@ -10028,6 +10030,11 @@ select.form-control:focus, min-width: 8em; } +.modal-footer .bouncer { + height: 20px; + width: 20px; +} + @media (min-width: 1200px) { .modal-lg { max-width: 85em; @@ -10471,7 +10478,8 @@ div#popup-content > div.modal-body > div.row { margin-right: 0; } -div.container-fluid.build > div#popup-content > div.modal-body { +div.container-fluid.build > div#popup-content > div.modal-body, +div.container-fluid.reshape > div#popup-content > div.modal-body{ height: 260px; } @@ -10479,14 +10487,16 @@ div.container-fluid.describe > div#popup-content > div.modal-body { height: 450px; } -div.container-fluid.code-popup > div#popup-content > div.modal-footer { +div.container-fluid.code-popup > div#popup-content > div.modal-footer, +div.container-fluid.code-export > div#popup-content > div.modal-footer { position: absolute; bottom: 0; width: 100%; } @media (min-height: 330px) { - div.container-fluid.build > div#popup-content > div.modal-footer { + div.container-fluid.build > div#popup-content > div.modal-footer, + div.container-fluid.reshape > div#popup-content > div.modal-footer{ position: absolute; bottom: 0; width: 100%; diff --git a/dtale/utils.py b/dtale/utils.py index af5ebede..66e49926 100644 --- a/dtale/utils.py +++ b/dtale/utils.py @@ -805,15 +805,18 @@ def build_code_export(data_id, imports='import pandas as pd\n\n', query=None): settings = global_state.get_settings(data_id) or {} ctxt_vars = global_state.get_context_variables(data_id) + startup_code = settings.get('startup_code') + startup_code = '# Data Re-shaping\n{}\n\n'.format(startup_code) if startup_code else '' startup_str = ( "# DISCLAIMER: 'df' refers to the data you passed in when calling 'dtale.show'\n\n" '{imports}' + '{startup}' 'if isinstance(df, (pd.DatetimeIndex, pd.MultiIndex)):\n' '\tdf = df.to_frame(index=False)\n\n' '# remove any pre-existing indices for ease of use in the D-Tale code, but this is not required\n' "df = df.reset_index().drop('index', axis=1, errors='ignore')\n" 'df.columns = [str(c) for c in df.columns] # update columns to strings in case they are numbers\n' - ).format(imports=imports) + ).format(imports=imports, startup=startup_code) final_history = [startup_str] + history final_query = query if final_query is None: @@ -827,7 +830,7 @@ def build_code_export(data_id, imports='import pandas as pd\n\n', query=None): "\n# DISCLAIMER: running this line in a different process than the one it originated will produce\n" "# differing results\n" "ctxt_vars = dtale_global_state.get_context_variables('{data_id}')\n\n" - "df = df.query('{query}', local_dict=ctx_vars)\n" + "df = df.query('{query}', local_dict=ctxt_vars)\n" ).format(query=final_query, data_id=data_id)) else: final_history.append("df = df.query('{}')\n".format(final_query)) diff --git a/dtale/views.py b/dtale/views.py index 3232b411..e58dada8 100644 --- a/dtale/views.py +++ b/dtale/views.py @@ -6,7 +6,7 @@ from builtins import map, range, str, zip from logging import getLogger -from flask import json, redirect, render_template, request 
+from flask import json, redirect, render_template, request, send_file import numpy as np import pandas as pd @@ -18,6 +18,10 @@ from dtale.charts.utils import build_chart from dtale.cli.clickutils import retrieve_meta_info_and_version from dtale.column_builders import ColumnBuilder +from dtale.dash_application.charts import (build_raw_chart, chart_url_params, + chart_url_querystring, export_chart, + export_chart_data, url_encode_func) +from dtale.data_reshapers import DataReshaper from dtale.utils import (DuplicateDataError, build_code_export, build_shutdown_url, classify_type, dict_merge, divide_chunks, filter_df_for_grid, find_dtype, @@ -183,7 +187,7 @@ def __repr__(self): return '' return self.main_url() - def _build_iframe(self, route='/dtale/iframe/', params=None, width='100%', height=350): + def _build_iframe(self, route='/dtale/iframe/', params=None, width='100%', height=475): """ Helper function to build an :class:`ipython:IPython.display.IFrame` if that module exists within your environment @@ -205,11 +209,13 @@ def _build_iframe(self, route='/dtale/iframe/', params=None, width='100%', heigh return None iframe_url = '{}{}{}'.format(self._url, route, self._data_id) if params is not None: - formatted_params = ['{}={}'.format(k, ','.join(make_list(params[k]))) for k in sorted(params)] - iframe_url = '{}?{}'.format(iframe_url, '&'.join(formatted_params)) + if isinstance(params, string_types): # has this already been encoded? + iframe_url = '{}?{}'.format(iframe_url, params) + else: + iframe_url = '{}?{}'.format(iframe_url, url_encode_func()(params)) return IFrame(iframe_url, width=width, height=height) - def notebook(self, route='/dtale/iframe/', params=None, width='100%', height=350): + def notebook(self, route='/dtale/iframe/', params=None, width='100%', height=475): """ Helper function which checks to see if :mod:`flask:flask.Flask` process is up and running and then tries to build an :class:`ipython:IPython.display.IFrame` and run :meth:`ipython:IPython.display.display` on it so @@ -242,7 +248,7 @@ def notebook(self, route='/dtale/iframe/', params=None, width='100%', height=350 if self._notebook_handle is None: self._notebook_handle = True - def notebook_correlations(self, col1, col2, width='100%', height=350): + def notebook_correlations(self, col1, col2, width='100%', height=475): """ Helper function to build an `ipython:IPython.display.IFrame` pointing at the correlations popup @@ -258,32 +264,106 @@ def notebook_correlations(self, col1, col2, width='100%', height=350): """ self.notebook('/dtale/popup/correlations/', params=dict(col1=col1, col2=col2), width=width, height=height) - def notebook_charts(self, x, y, group=None, aggregation=None, width='100%', height=350): + def notebook_charts(self, chart_type='line', query=None, x=None, y=None, z=None, group=None, agg=None, window=None, + rolling_comp=None, barmode=None, barsort=None, width='100%', height=800): """ Helper function to build an `ipython:IPython.display.IFrame` pointing at the charts popup - :param x: column to be used as x-axis of chart + :param chart_type: type of chart, possible options are line|bar|pie|scatter|3d_scatter|surface|heatmap + :type chart_type: str + :param query: pandas dataframe query string + :type query: str, optional + :param x: column to use for the X-Axis :type x: str - :param y: column to be used as y-axis of chart - :type y: str - :param group: comma-separated string of columns to group chart data by - :type group: str, optional - :param aggregation: points to a specific function that can be 
applied to - :func: pandas.core.groupby.DataFrameGroupBy. Possible values are: count, first, last mean, - median, min, max, std, var, mad, prod, sum - :type aggregation: str, optional + :param y: columns to use for the Y-Axes + :type y: list of str + :param z: column to use for the Z-Axis + :type z: str, optional + :param group: column(s) to use for grouping + :type group: list of str or str, optional + :param agg: specific aggregation that can be applied to y or z axes. Possible values are: count, first, last, + mean, median, min, max, std, var, mad, prod, sum. This is included in the label of the axis it is being + applied to. + :type agg: str, optional + :param window: number of days to include in rolling aggregations + :type window: int, optional + :param rolling_comp: computation to use in rolling aggregations + :type rolling_comp: str, optional + :param barmode: mode to use for bar chart display. Possible values are stack|group(default)|overlay|relative + :type barmode: str, optional + :param barsort: axis name to sort the bars in a bar chart by (default is the 'x', but other options are any of the + column names used in the 'y' parameter) + :type barsort: str, optional :param width: width of the ipython cell :type width: str or int, optional :param height: height of the ipython cell :type height: str or int, optional :return: :class:`ipython:IPython.display.IFrame` """ - params = dict(x=x, y=y) - if group: - params['group'] = ','.join(make_list(group)) - if aggregation: - params['aggregation'] = aggregation - self.notebook('/dtale/popup/charts/', params=params, width=width, height=height) + params = dict(chart_type=chart_type, query=query, x=x, y=make_list(y), z=z, group=make_list(group), agg=agg, + window=window, rolling_comp=rolling_comp, barmode=barmode, barsort=barsort) + self.notebook(route='/charts/', params=chart_url_querystring(params), width=width, height=height) + + def offline_chart(self, chart_type=None, query=None, x=None, y=None, z=None, group=None, agg=None, window=None, + rolling_comp=None, barmode=None, barsort=None, filepath=None, **kwargs): + """ + Builds the HTML for a plotly chart figure to be saved to a file or output to a jupyter notebook + + :param chart_type: type of chart, possible options are line|bar|pie|scatter|3d_scatter|surface|heatmap + :type chart_type: str + :param query: pandas dataframe query string + :type query: str, optional + :param x: column to use for the X-Axis + :type x: str + :param y: columns to use for the Y-Axes + :type y: list of str + :param z: column to use for the Z-Axis + :type z: str, optional + :param group: column(s) to use for grouping + :type group: list of str or str, optional + :param agg: specific aggregation that can be applied to y or z axes. Possible values are: count, first, last, + mean, median, min, max, std, var, mad, prod, sum. This is included in the label of the axis it is being + applied to. + :type agg: str, optional + :param window: number of days to include in rolling aggregations + :type window: int, optional + :param rolling_comp: computation to use in rolling aggregations + :type rolling_comp: str, optional + :param barmode: mode to use for bar chart display.
Possible values are stack|group(default)|overlay|relative + :type barmode: str, optional + :param barsort: axis name to sort the bars in a bar chart by (default is the 'x', but other options are any of the + column names used in the 'y' parameter) + :type barsort: str, optional + :param filepath: location to save HTML output + :type filepath: str, optional + :param kwargs: optional keyword arguments, here in case invalid arguments are passed to this function + :type kwargs: dict + :return: possible outcomes are: + - if run within a jupyter notebook and no 'filepath' is specified it will print the resulting HTML + within a cell in your notebook + - if 'filepath' is specified it will save the chart to the path specified + - otherwise it will return the HTML output as a string + """ + params = dict(chart_type=chart_type, query=query, x=x, y=make_list(y), z=z, group=make_list(group), agg=agg, + window=window, rolling_comp=rolling_comp, barmode=barmode, barsort=barsort) + + if filepath is None and in_ipython_frontend(): + from plotly.offline import iplot, init_notebook_mode + + init_notebook_mode(connected=True) + chart = build_raw_chart(self._data_id, export=True, **params) + iplot(chart) + return + + html_buffer = export_chart(self._data_id, params) + if filepath is None: + return html_buffer.getvalue() + + if not filepath.endswith('.html'): + filepath = '{}.html'.format(filepath) + + with open(filepath, 'w') as f: + f.write(html_buffer.getvalue()) def adjust_cell_dimensions(self, width='100%', height=350): """ @@ -339,7 +419,12 @@ def build_dtypes_state(data, prev_state=None): :return: a list of dictionaries containing column names, indexes and data types """ prev_dtypes = {c['name']: c for c in prev_state or []} - ranges = data.agg([min, max]).to_dict() + try: + ranges = data.agg(['min', 'max']).to_dict() + except ValueError: + # I've seen this exception emerge when transposing data combines the data types into one column + # and 'agg' is then called on the new data + ranges = {} dtype_f = dtype_formatter(data, get_dtypes(data), ranges, prev_dtypes) return [dtype_f(i, c) for i, c in enumerate(data.columns)] @@ -497,7 +582,7 @@ def view_main(data_id=None): :type data_id: str :return: HTML """ - if data_id is None: + if data_id is None or data_id not in global_state.get_data().keys(): return redirect('/dtale/main/{}'.format(head_data_id())) return _view_main(data_id) @@ -776,6 +861,26 @@ def build_column(data_id): return jsonify(dict(error=str(e), traceback=str(traceback.format_exc()))) +@dtale.route('/reshape/') +def reshape_data(data_id): + from flask import current_app + + try: + output = get_str_arg(request, 'output') + shape_type = get_str_arg(request, 'type') + cfg = json.loads(get_str_arg(request, 'cfg')) + builder = DataReshaper(data_id, shape_type, cfg) + if output == 'new': + instance = startup(current_app.base_url, data=builder.reshape(), ignore_duplicate=True) + else: + instance = startup(current_app.base_url, data=builder.reshape(), data_id=data_id, ignore_duplicate=True) + curr_settings = global_state.get_settings(instance._data_id) + global_state.set_settings(instance._data_id, dict_merge(curr_settings, dict(startup_code=builder.build_code()))) + return jsonify(success=True, url=instance._main_url) + except BaseException as e: + return jsonify(dict(error=str(e), traceback=str(traceback.format_exc()))) + + @dtale.route('/test-filter/') def test_filter(data_id): """ @@ -1365,3 +1470,38 @@ def get_code_export(data_id): return jsonify(code='\n'.join(code), success=True)
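One note on the reshape route above: the `startup_code` it stores is what `build_code_export` in dtale/utils.py, patched earlier in this diff, prepends to every code export under a '# Data Re-shaping' banner. A minimal sketch of the resulting export header, assuming an aggregate reshape produced the current dataframe (the snippet text is illustrative):

```python
# illustrative only: mimics how build_code_export assembles the export header
imports = 'import pandas as pd\n\n'
startup_code = "df = df.groupby(['date']).mean()"  # e.g. an AggregateBuilder.build_code() result
startup_code = '# Data Re-shaping\n{}\n\n'.format(startup_code) if startup_code else ''
startup_str = (
    "# DISCLAIMER: 'df' refers to the data you passed in when calling 'dtale.show'\n\n"
    '{imports}'
    '{startup}'
).format(imports=imports, startup=startup_code)
print(startup_str)
```

except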
BaseException as e: return jsonify(error=str(e), traceback=str(traceback.format_exc())) + + +def build_chart_filename(chart_type, ext='html'): + return '{}_export_{}.{}'.format(chart_type, json_timestamp(pd.Timestamp('now')), ext) + + +@dtale.route('/chart-export/') +def chart_export(data_id): + try: + params = chart_url_params(request.args.to_dict()) + html_buffer = export_chart(data_id, params) + filename = build_chart_filename(params['chart_type']) + return send_file(html_buffer, attachment_filename=filename, as_attachment=True, add_etags=False) + except BaseException as e: + return jsonify(error=str(e), traceback=str(traceback.format_exc())) + + +@dtale.route('/chart-csv-export/') +def chart_csv_export(data_id): + try: + params = chart_url_params(request.args.to_dict()) + csv_buffer = export_chart_data(data_id, params) + filename = build_chart_filename(params['chart_type'], ext='csv') + return send_file(csv_buffer, attachment_filename=filename, as_attachment=True, add_etags=False) + except BaseException as e: + return jsonify(error=str(e), traceback=str(traceback.format_exc())) + + +@dtale.route('/cleanup/') +def run_cleanup(data_id): + try: + global_state.cleanup(data_id) + return jsonify(success=True) + except BaseException as e: + return jsonify(error=str(e), traceback=str(traceback.format_exc())) diff --git a/package.json b/package.json index 79a0417c..3c52b419 100644 --- a/package.json +++ b/package.json @@ -1,6 +1,6 @@ { "name": "dtale", - "version": "1.7.12", + "version": "1.7.13", "description": "Visualizer for Pandas Data Structures", "main": "main.js", "directories": { diff --git a/setup.py b/setup.py index fad99ca1..23ef2681 100644 --- a/setup.py +++ b/setup.py @@ -50,7 +50,7 @@ def run_tests(self): setup( name="dtale", - version="1.7.12", + version="1.7.13", author="MAN Alpha Technology", author_email="ManAlphaTech@man.com", description="Web Client for Visualizing Pandas Objects", diff --git a/static/__tests__/dtale/DataViewer-base-test.jsx b/static/__tests__/dtale/DataViewer-base-test.jsx index 36302d75..6c65d75f 100644 --- a/static/__tests__/dtale/DataViewer-base-test.jsx +++ b/static/__tests__/dtale/DataViewer-base-test.jsx @@ -118,6 +118,7 @@ describe("DataViewer tests", () => { "Describe", "Filter", "Build Column", + "Reshape", "Correlations", "Charts", "Resize", diff --git a/static/__tests__/dtale/reshape/DataViewer-reshape-aggregate-test.jsx b/static/__tests__/dtale/reshape/DataViewer-reshape-aggregate-test.jsx new file mode 100644 index 00000000..364f1355 --- /dev/null +++ b/static/__tests__/dtale/reshape/DataViewer-reshape-aggregate-test.jsx @@ -0,0 +1,291 @@ +import { mount } from "enzyme"; +import React from "react"; +import { Provider } from "react-redux"; +import Select from "react-select"; + +import { RemovableError } from "../../../RemovableError"; +import mockPopsicle from "../../MockPopsicle"; +import * as t from "../../jest-assertions"; +import reduxUtils from "../../redux-test-utils"; +import { buildInnerHTML, clickMainMenuButton, withGlobalJquery } from "../../test-utils"; + +const originalOffsetHeight = Object.getOwnPropertyDescriptor(HTMLElement.prototype, "offsetHeight"); +const originalOffsetWidth = Object.getOwnPropertyDescriptor(HTMLElement.prototype, "offsetWidth"); +const originalInnerWidth = Object.getOwnPropertyDescriptor(HTMLElement.prototype, "innerWidth"); +const originalInnerHeight = Object.getOwnPropertyDescriptor(HTMLElement.prototype, "innerHeight"); + +describe("DataViewer tests", () => { + const { location, open, opener } = window; + + 
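Before the test scaffolding continues, a quick usage sketch for the `offline_chart` API documented above (the dataframe and column names are illustrative, and it assumes a local D-Tale session can be started):

```python
import pandas as pd
import dtale

df = pd.DataFrame(dict(period=[1, 2, 1, 2], security_id=[100000, 100000, 100001, 100001], value=[1.0, 2.0, 3.0, 4.0]))
d = dtale.show(df)

# outside a notebook and with no filepath, the chart's HTML comes back as a string
html = d.offline_chart(chart_type='bar', x='period', y=['value'], agg='sum')

# with a filepath the HTML is written to disk ('.html' is appended if missing)
d.offline_chart(chart_type='bar', x='period', y=['value'], agg='sum', filepath='my_chart')

# inside a jupyter notebook (and with no filepath) the figure renders inline via plotly's iplot
```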
beforeAll(() => { + delete window.location; + delete window.open; + delete window.opener; + window.location = { + reload: jest.fn(), + pathname: "/dtale/iframe/1", + assign: jest.fn(), + }; + window.open = jest.fn(); + window.opener = { code_popup: { code: "test code", title: "Test" } }; + Object.defineProperty(HTMLElement.prototype, "offsetHeight", { + configurable: true, + value: 500, + }); + Object.defineProperty(HTMLElement.prototype, "offsetWidth", { + configurable: true, + value: 800, + }); + Object.defineProperty(window, "innerWidth", { + configurable: true, + value: 1205, + }); + Object.defineProperty(window, "innerHeight", { + configurable: true, + value: 775, + }); + const mockBuildLibs = withGlobalJquery(() => + mockPopsicle.mock(url => { + const { urlFetcher } = require("../../redux-test-utils").default; + return urlFetcher(url); + }) + ); + const mockChartUtils = withGlobalJquery(() => (ctx, cfg) => { + const chartCfg = { ctx, cfg, data: cfg.data, destroyed: false }; + chartCfg.destroy = () => (chartCfg.destroyed = true); + chartCfg.getElementsAtXAxis = _evt => [{ _index: 0 }]; + chartCfg.getElementAtEvent = _evt => [{ _datasetIndex: 0, _index: 0, _chart: { config: cfg, data: cfg.data } }]; + return chartCfg; + }); + jest.mock("popsicle", () => mockBuildLibs); + jest.mock("chart.js", () => mockChartUtils); + jest.mock("chartjs-plugin-zoom", () => ({})); + jest.mock("chartjs-chart-box-and-violin-plot/build/Chart.BoxPlot.js", () => ({})); + }); + + afterAll(() => { + window.location = location; + window.open = open; + window.opener = opener; + Object.defineProperty(HTMLElement.prototype, "offsetHeight", originalOffsetHeight); + Object.defineProperty(HTMLElement.prototype, "offsetWidth", originalOffsetWidth); + Object.defineProperty(window, "innerWidth", originalInnerWidth); + Object.defineProperty(window, "innerHeight", originalInnerHeight); + }); + + test("DataViewer: reshape aggregate 'By Column'", done => { + const { DataViewer } = require("../../../dtale/DataViewer"); + const Reshape = require("../../../popups/reshape/Reshape").ReactReshape; + const { Aggregate } = require("../../../popups/reshape/Aggregate"); + const store = reduxUtils.createDtaleStore(); + buildInnerHTML({ settings: "" }, store); + const result = mount( + + + , + { attachTo: document.getElementById("content") } + ); + setTimeout(() => { + result.update(); + clickMainMenuButton(result, "Reshape"); + setTimeout(() => { + result.update(); + result + .find(Reshape) + .find("div.modal-body") + .find("button") + .first() + .simulate("click"); + t.equal(result.find(Aggregate).length, 1, "should show reshape pivot"); + const aggComp = result.find(Aggregate).first(); + const aggInputs = aggComp.find(Select); + aggInputs + .first() + .instance() + .onChange({ value: "col1" }); + aggInputs + .at(1) + .instance() + .onChange({ value: "col2" }); + aggInputs + .at(2) + .instance() + .onChange({ value: "count" }); + aggComp + .find("i") + .first() + .simulate("click"); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + setTimeout(() => { + result.update(); + t.equal(result.find(Reshape).length, 1, "should hide reshape"); + result + .find(Reshape) + .find("div.modal-body") + .find("button") + .last() + .simulate("click"); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + setTimeout(() => { + result.update(); + t.equal(result.find(Reshape).length, 0, "should hide reshape"); + done(); + }, 400); + }, 400); + }, 400); + }, 
600); + }); + + test("DataViewer: reshape aggregate 'By Function'", done => { + const { DataViewer } = require("../../../dtale/DataViewer"); + const Reshape = require("../../../popups/reshape/Reshape").ReactReshape; + const { Aggregate } = require("../../../popups/reshape/Aggregate"); + const store = reduxUtils.createDtaleStore(); + buildInnerHTML({ settings: "" }, store); + const result = mount( + + + , + { attachTo: document.getElementById("content") } + ); + setTimeout(() => { + result.update(); + clickMainMenuButton(result, "Reshape"); + setTimeout(() => { + result.update(); + result + .find(Reshape) + .find("div.modal-body") + .find("button") + .first() + .simulate("click"); + t.equal(result.find(Aggregate).length, 1, "should show reshape pivot"); + const aggComp = result.find(Aggregate).first(); + const aggInputs = aggComp.find(Select); + aggInputs + .first() + .instance() + .onChange({ value: "col1" }); + aggComp + .find("button") + .last() + .simulate("click"); + aggInputs + .at(1) + .instance() + .onChange({ value: "count" }); + aggInputs + .at(2) + .instance() + .onChange({ value: "col2" }); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + setTimeout(() => { + result + .find(Reshape) + .find("div.modal-body") + .find("button") + .last() + .simulate("click"); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + setTimeout(() => { + result.update(); + t.equal(result.find(Reshape).length, 0, "should hide reshape"); + done(); + }, 400); + }, 400); + }, 400); + }, 600); + }); + + test("DataViewer: reshape aggregate errors", done => { + const { DataViewer } = require("../../../dtale/DataViewer"); + const Reshape = require("../../../popups/reshape/Reshape").ReactReshape; + const { Aggregate } = require("../../../popups/reshape/Aggregate"); + const store = reduxUtils.createDtaleStore(); + buildInnerHTML({ settings: "" }, store); + const result = mount( + + + , + { attachTo: document.getElementById("content") } + ); + setTimeout(() => { + result.update(); + clickMainMenuButton(result, "Reshape"); + setTimeout(() => { + result.update(); + result + .find(Reshape) + .find("div.modal-body") + .find("button") + .first() + .simulate("click"); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + result.update(); + t.equal(result.find(RemovableError).text(), "Missing an index selection!", "should render error"); + const aggComp = result.find(Aggregate).first(); + const aggInputs = aggComp.find(Select); + aggInputs + .first() + .instance() + .onChange({ value: "col1" }); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + t.equal(result.find(RemovableError).text(), "Missing an aggregation selection!", "should render error"); + aggComp + .find("button") + .last() + .simulate("click"); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + t.equal(result.find(RemovableError).text(), "Missing an aggregation selection!", "should render error"); + done(); + }, 400); + }, 600); + }); + + test("DataViewer: reshape aggregation cfg validation", done => { + const { validateAggregateCfg } = require("../../../popups/reshape/Aggregate"); + const cfg = { index: null, agg: null }; + t.equal(validateAggregateCfg(cfg), "Missing an index selection!"); + cfg.index = ["x"]; + cfg.agg = { type: "func" }; + t.equal(validateAggregateCfg(cfg), "Missing an aggregation selection!"); + 
cfg.agg = { type: "col" }; + t.equal(validateAggregateCfg(cfg), "Missing an aggregation selection!"); + done(); + }); +}); diff --git a/static/__tests__/dtale/reshape/DataViewer-reshape-pivot-test.jsx b/static/__tests__/dtale/reshape/DataViewer-reshape-pivot-test.jsx new file mode 100644 index 00000000..106dd74d --- /dev/null +++ b/static/__tests__/dtale/reshape/DataViewer-reshape-pivot-test.jsx @@ -0,0 +1,235 @@ +import { mount } from "enzyme"; +import React from "react"; +import { ModalClose } from "react-modal-bootstrap"; +import { Provider } from "react-redux"; +import Select from "react-select"; + +import { RemovableError } from "../../../RemovableError"; +import mockPopsicle from "../../MockPopsicle"; +import * as t from "../../jest-assertions"; +import reduxUtils from "../../redux-test-utils"; +import { buildInnerHTML, clickMainMenuButton, withGlobalJquery } from "../../test-utils"; + +const originalOffsetHeight = Object.getOwnPropertyDescriptor(HTMLElement.prototype, "offsetHeight"); +const originalOffsetWidth = Object.getOwnPropertyDescriptor(HTMLElement.prototype, "offsetWidth"); +const originalInnerWidth = Object.getOwnPropertyDescriptor(HTMLElement.prototype, "innerWidth"); +const originalInnerHeight = Object.getOwnPropertyDescriptor(HTMLElement.prototype, "innerHeight"); + +describe("DataViewer tests", () => { + const { location, open, opener } = window; + + beforeAll(() => { + delete window.location; + delete window.open; + delete window.opener; + window.location = { + reload: jest.fn(), + pathname: "/dtale/iframe/1", + assign: jest.fn(), + }; + window.open = jest.fn(); + window.opener = { code_popup: { code: "test code", title: "Test" } }; + Object.defineProperty(HTMLElement.prototype, "offsetHeight", { + configurable: true, + value: 500, + }); + Object.defineProperty(HTMLElement.prototype, "offsetWidth", { + configurable: true, + value: 800, + }); + Object.defineProperty(window, "innerWidth", { + configurable: true, + value: 1205, + }); + Object.defineProperty(window, "innerHeight", { + configurable: true, + value: 775, + }); + + const mockBuildLibs = withGlobalJquery(() => + mockPopsicle.mock(url => { + const { urlFetcher } = require("../../redux-test-utils").default; + return urlFetcher(url); + }) + ); + + const mockChartUtils = withGlobalJquery(() => (ctx, cfg) => { + const chartCfg = { ctx, cfg, data: cfg.data, destroyed: false }; + chartCfg.destroy = () => (chartCfg.destroyed = true); + chartCfg.getElementsAtXAxis = _evt => [{ _index: 0 }]; + chartCfg.getElementAtEvent = _evt => [{ _datasetIndex: 0, _index: 0, _chart: { config: cfg, data: cfg.data } }]; + return chartCfg; + }); + + jest.mock("popsicle", () => mockBuildLibs); + jest.mock("chart.js", () => mockChartUtils); + jest.mock("chartjs-plugin-zoom", () => ({})); + jest.mock("chartjs-chart-box-and-violin-plot/build/Chart.BoxPlot.js", () => ({})); + }); + + afterAll(() => { + window.location = location; + window.open = open; + window.opener = opener; + Object.defineProperty(HTMLElement.prototype, "offsetHeight", originalOffsetHeight); + Object.defineProperty(HTMLElement.prototype, "offsetWidth", originalOffsetWidth); + Object.defineProperty(window, "innerWidth", originalInnerWidth); + Object.defineProperty(window, "innerHeight", originalInnerHeight); + }); + + test("DataViewer: reshape pivot", done => { + const { DataViewer } = require("../../../dtale/DataViewer"); + const Reshape = require("../../../popups/reshape/Reshape").ReactReshape; + const { Pivot } = require("../../../popups/reshape/Pivot"); + + const store 
= reduxUtils.createDtaleStore(); + buildInnerHTML({ settings: "" }, store); + const result = mount( + + + , + { attachTo: document.getElementById("content") } + ); + + setTimeout(() => { + result.update(); + clickMainMenuButton(result, "Reshape"); + setTimeout(() => { + result.update(); + t.equal(result.find(Reshape).length, 1, "should show reshape"); + result + .find(ModalClose) + .first() + .simulate("click"); + t.equal(result.find(Reshape).length, 0, "should hide reshape"); + clickMainMenuButton(result, "Reshape"); + setTimeout(() => { + result.update(); + t.equal(result.find(Pivot).length, 1, "should show reshape pivot"); + const pivotComp = result.find(Pivot).first(); + const pivotInputs = pivotComp.find(Select); + pivotInputs + .first() + .instance() + .onChange({ value: "col1" }); + pivotInputs + .at(1) + .instance() + .onChange({ value: "col2" }); + pivotInputs + .at(2) + .instance() + .onChange({ value: "col3" }); + pivotInputs + .last() + .instance() + .onChange({ value: "count" }); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + setTimeout(() => { + result.update(); + t.equal(result.find(Reshape).length, 1, "should hide reshape"); + result + .find(Reshape) + .find("div.modal-body") + .find("button") + .last() + .simulate("click"); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + setTimeout(() => { + result.update(); + t.equal(result.find(Reshape).length, 0, "should hide reshape"); + done(); + }, 400); + }, 400); + }, 400); + }, 400); + }, 600); + }); + + test("DataViewer: reshape pivot errors", done => { + const { DataViewer } = require("../../../dtale/DataViewer"); + const Reshape = require("../../../popups/reshape/Reshape").ReactReshape; + const { Pivot } = require("../../../popups/reshape/Pivot"); + + const store = reduxUtils.createDtaleStore(); + buildInnerHTML({ settings: "" }, store); + const result = mount( + + + , + { attachTo: document.getElementById("content") } + ); + + setTimeout(() => { + result.update(); + clickMainMenuButton(result, "Reshape"); + setTimeout(() => { + result.update(); + t.equal(result.find(Reshape).length, 1, "should show reshape"); + result.update(); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + result.update(); + t.equal(result.find(RemovableError).text(), "Missing an index selection!", "should render error"); + const pivotComp = result.find(Pivot).first(); + const pivotInputs = pivotComp.find(Select); + pivotInputs + .first() + .instance() + .onChange({ value: "col1" }); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + t.equal(result.find(RemovableError).text(), "Missing a columns selection!", "should render error"); + pivotInputs + .at(1) + .instance() + .onChange({ value: "col2" }); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + t.equal(result.find(RemovableError).text(), "Missing a value(s) selection!", "should render error"); + pivotInputs + .at(2) + .instance() + .onChange([{ value: "col3" }]); + pivotInputs + .last() + .instance() + .onChange({ value: "count" }); + done(); + }, 400); + }, 600); + }); + + test("DataViewer: reshape pivot cfg validation", done => { + const { validatePivotCfg } = require("../../../popups/reshape/Pivot"); + const cfg = { index: null, columns: null, values: null }; + t.equal(validatePivotCfg(cfg), "Missing an index selection!"); + cfg.index = "x"; + 
t.equal(validatePivotCfg(cfg), "Missing a columns selection!"); + cfg.columns = "y"; + t.equal(validatePivotCfg(cfg), "Missing a value(s) selection!"); + done(); + }); +}); diff --git a/static/__tests__/dtale/reshape/DataViewer-reshape-transpose-test.jsx b/static/__tests__/dtale/reshape/DataViewer-reshape-transpose-test.jsx new file mode 100644 index 00000000..f1fa4a6a --- /dev/null +++ b/static/__tests__/dtale/reshape/DataViewer-reshape-transpose-test.jsx @@ -0,0 +1,209 @@ +import { mount } from "enzyme"; +import React from "react"; +import { Provider } from "react-redux"; +import Select from "react-select"; + +import { RemovableError } from "../../../RemovableError"; +import mockPopsicle from "../../MockPopsicle"; +import * as t from "../../jest-assertions"; +import reduxUtils from "../../redux-test-utils"; +import { buildInnerHTML, clickMainMenuButton, withGlobalJquery } from "../../test-utils"; + +const originalOffsetHeight = Object.getOwnPropertyDescriptor(HTMLElement.prototype, "offsetHeight"); +const originalOffsetWidth = Object.getOwnPropertyDescriptor(HTMLElement.prototype, "offsetWidth"); +const originalInnerWidth = Object.getOwnPropertyDescriptor(HTMLElement.prototype, "innerWidth"); +const originalInnerHeight = Object.getOwnPropertyDescriptor(HTMLElement.prototype, "innerHeight"); + +describe("DataViewer tests", () => { + const { location, open, opener } = window; + + beforeAll(() => { + delete window.location; + delete window.open; + delete window.opener; + window.location = { + reload: jest.fn(), + pathname: "/dtale/iframe/1", + assign: jest.fn(), + }; + window.open = jest.fn(); + window.opener = { code_popup: { code: "test code", title: "Test" } }; + Object.defineProperty(HTMLElement.prototype, "offsetHeight", { + configurable: true, + value: 500, + }); + Object.defineProperty(HTMLElement.prototype, "offsetWidth", { + configurable: true, + value: 800, + }); + Object.defineProperty(window, "innerWidth", { + configurable: true, + value: 1205, + }); + Object.defineProperty(window, "innerHeight", { + configurable: true, + value: 775, + }); + + const mockBuildLibs = withGlobalJquery(() => + mockPopsicle.mock(url => { + const { urlFetcher } = require("../../redux-test-utils").default; + return urlFetcher(url); + }) + ); + + const mockChartUtils = withGlobalJquery(() => (ctx, cfg) => { + const chartCfg = { ctx, cfg, data: cfg.data, destroyed: false }; + chartCfg.destroy = () => (chartCfg.destroyed = true); + chartCfg.getElementsAtXAxis = _evt => [{ _index: 0 }]; + chartCfg.getElementAtEvent = _evt => [{ _datasetIndex: 0, _index: 0, _chart: { config: cfg, data: cfg.data } }]; + return chartCfg; + }); + + jest.mock("popsicle", () => mockBuildLibs); + jest.mock("chart.js", () => mockChartUtils); + jest.mock("chartjs-plugin-zoom", () => ({})); + jest.mock("chartjs-chart-box-and-violin-plot/build/Chart.BoxPlot.js", () => ({})); + }); + + afterAll(() => { + window.location = location; + window.open = open; + window.opener = opener; + Object.defineProperty(HTMLElement.prototype, "offsetHeight", originalOffsetHeight); + Object.defineProperty(HTMLElement.prototype, "offsetWidth", originalOffsetWidth); + Object.defineProperty(window, "innerWidth", originalInnerWidth); + Object.defineProperty(window, "innerHeight", originalInnerHeight); + }); + + test("DataViewer: reshape transpose", done => { + const { DataViewer } = require("../../../dtale/DataViewer"); + const Reshape = require("../../../popups/reshape/Reshape").ReactReshape; + const { Transpose } = 
require("../../../popups/reshape/Transpose"); + + const store = reduxUtils.createDtaleStore(); + buildInnerHTML({ settings: "" }, store); + const result = mount( + + + , + { attachTo: document.getElementById("content") } + ); + + setTimeout(() => { + result.update(); + clickMainMenuButton(result, "Reshape"); + setTimeout(() => { + result.update(); + result + .find(Reshape) + .find("div.modal-body") + .find("button") + .at(2) + .simulate("click"); + t.equal(result.find(Transpose).length, 1, "should show reshape pivot"); + const transposeComp = result.find(Transpose).first(); + const transposeInputs = transposeComp.find(Select); + transposeInputs + .first() + .instance() + .onChange([{ value: "col1" }]); + transposeInputs + .last() + .instance() + .onChange([{ value: "col2" }]); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + setTimeout(() => { + result.update(); + t.equal(result.find(Reshape).length, 1, "should hide reshape"); + result + .find(Reshape) + .find("div.modal-body") + .find("button") + .last() + .simulate("click"); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + setTimeout(() => { + result.update(); + t.equal(result.find(Reshape).length, 0, "should hide reshape"); + done(); + }, 400); + }, 400); + }, 400); + }, 600); + }); + + test("DataViewer: reshape transpose errors", done => { + const { DataViewer } = require("../../../dtale/DataViewer"); + const Reshape = require("../../../popups/reshape/Reshape").ReactReshape; + const { Transpose } = require("../../../popups/reshape/Transpose"); + + const store = reduxUtils.createDtaleStore(); + buildInnerHTML({ settings: "" }, store); + const result = mount( + + + , + { attachTo: document.getElementById("content") } + ); + + setTimeout(() => { + result.update(); + clickMainMenuButton(result, "Reshape"); + setTimeout(() => { + result.update(); + t.equal(result.find(Reshape).length, 1, "should show reshape"); + result + .find(Reshape) + .find("div.modal-body") + .find("button") + .at(2) + .simulate("click"); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + result.update(); + t.equal(result.find(RemovableError).text(), "Missing an index selection!", "should render error"); + result + .find(Transpose) + .first() + .find(Select) + .first() + .instance() + .onChange({ value: "col1" }); + result + .find("div.modal-footer") + .first() + .find("button") + .first() + .simulate("click"); + setTimeout(() => { + result.update(); + done(); + }, 400); + }, 400); + }, 600); + }); + + test("DataViewer: reshape transpose cfg validation", done => { + const { validateTransposeCfg } = require("../../../popups/reshape/Transpose"); + const cfg = { index: null }; + t.equal(validateTransposeCfg(cfg), "Missing an index selection!"); + cfg.index = ["x"]; + t.equal(validateTransposeCfg(cfg), null); + done(); + }); +}); diff --git a/static/__tests__/iframe/DataViewer-base-test.jsx b/static/__tests__/iframe/DataViewer-base-test.jsx index 836a1624..9c09ec79 100644 --- a/static/__tests__/iframe/DataViewer-base-test.jsx +++ b/static/__tests__/iframe/DataViewer-base-test.jsx @@ -136,8 +136,8 @@ describe("DataViewer iframe tests", () => { .find("ul li span.font-weight-bold") .map(s => s.text()), _.concat( - ["Describe", "Filter", "Build Column", "Correlations", "Charts", "Resize", "Heat Map", "Instances 1"], - ["Code Export", "About", "Refresh", "Open Popup", "Shutdown"] + ["Describe", "Filter", "Build Column", 
"Reshape", "Correlations", "Charts", "Resize", "Heat Map"], + ["Instances 1", "Code Export", "About", "Refresh", "Open Popup", "Shutdown"] ), "Should render default menu options" ); diff --git a/static/__tests__/iframe/DataViewer-within-iframe-test.jsx b/static/__tests__/iframe/DataViewer-within-iframe-test.jsx index c1b5c19a..a9a1e1d0 100644 --- a/static/__tests__/iframe/DataViewer-within-iframe-test.jsx +++ b/static/__tests__/iframe/DataViewer-within-iframe-test.jsx @@ -85,8 +85,8 @@ describe("DataViewer within iframe tests", () => { .find("ul li span.font-weight-bold") .map(s => s.text()), _.concat( - ["Describe", "Filter", "Build Column", "Correlations", "Charts", "Resize", "Heat Map", "Instances 1"], - ["Code Export", "About", "Refresh", "Open Popup", "Shutdown"] + ["Describe", "Filter", "Build Column", "Reshape", "Correlations", "Charts", "Resize", "Heat Map"], + ["Instances 1", "Code Export", "About", "Refresh", "Open Popup", "Shutdown"] ), "Should render default iframe menu options" ); diff --git a/static/__tests__/popups/Instances-test.jsx b/static/__tests__/popups/Instances-test.jsx index 59006be6..18202068 100644 --- a/static/__tests__/popups/Instances-test.jsx +++ b/static/__tests__/popups/Instances-test.jsx @@ -198,7 +198,14 @@ describe("Instances tests", () => { expect(assignSpy).toHaveBeenCalledWith("http://localhost:8080/dtale/main/8083"); assignSpy.mockRestore(); global.window = origWindow; - done(); + result + .find(".ico-remove-circle") + .first() + .simulate("click"); + setTimeout(() => { + result.update(); + done(); + }, 400); }, 200); }, 200); }, 200); diff --git a/static/__tests__/redux-test-utils.jsx b/static/__tests__/redux-test-utils.jsx index 51625bba..a3d11369 100644 --- a/static/__tests__/redux-test-utils.jsx +++ b/static/__tests__/redux-test-utils.jsx @@ -203,11 +203,18 @@ function urlFetcher(url) { if (urlParams.name === "error") { return { error: "error test" }; } + return { success: true, url: "http://localhost:40000/dtale/main/1" }; + } else if (_.startsWith(url, "/dtale/reshape")) { + if (urlParams.index === "error") { + return { error: "error test" }; + } return { success: true }; } else if (_.startsWith(url, "/dtale/context-variables")) { return getDataId(url) === "error" ? { error: "Error loading context variables" } : CONTEXT_VARIABLES; } else if (_.startsWith(url, "/dtale/code-export")) { return { code: "test code" }; + } else if (_.startsWith(url, "/dtale/cleanup")) { + return { success: true }; } return {}; } diff --git a/static/dash/lib/custom.js b/static/dash/lib/custom.js index 2b97fca3..96eeca72 100644 --- a/static/dash/lib/custom.js +++ b/static/dash/lib/custom.js @@ -25,6 +25,11 @@ function copy(e) { .fadeOut(400); } +function exportChart(e, href) { + e.preventDefault(); + window.open(href + "&_id=" + new Date().getTime(), "_blank"); +} + window.onload = function() { $("body").click(function(e) { const target = $(e.target); @@ -32,6 +37,10 @@ window.onload = function() { openCodeSnippet(e); } else if (target.parent().is("a.copy-link-btn")) { copy(e); + } else if (target.is("a.export-chart-btn")) { + exportChart(e, target.attr("href")); + } else if (target.parent().is("a.export-chart-btn")) { + exportChart(e, target.parent().attr("href")); } }); }; diff --git a/static/dtale/DataViewerMenu.jsx b/static/dtale/DataViewerMenu.jsx index be0d3a24..e5f8eb01 100644 --- a/static/dtale/DataViewerMenu.jsx +++ b/static/dtale/DataViewerMenu.jsx @@ -56,6 +56,14 @@ class ReactDataViewerMenu extends React.Component { +
  • + + + +
  • + ); + }} + className="cell" + /> + ); + const cleanup = rowData => e => { + this.cleanup(rowData); + e.stopPropagation(); + }; + cleanupCol = ( + { + if (rowData.data_id === this.props.dataId) { + return null; + } + return ; + }} + className="cell" + /> + ); + } return (
    @@ -172,6 +230,7 @@ class Instances extends React.Component { onRowClick={_rowClick} className="instances" headerClassName="headerCell"> + {cleanupCol} - { - if (rowData.data_id === this.props.dataId) { - return null; - } - return ( - - ); - }} - className="cell" - /> + {previewCol} )} diff --git a/static/popups/Popup.jsx b/static/popups/Popup.jsx index 54acd05a..2bba00d8 100644 --- a/static/popups/Popup.jsx +++ b/static/popups/Popup.jsx @@ -14,6 +14,7 @@ import { Describe } from "./Describe"; import Instances from "./Instances"; import { Charts } from "./charts/Charts"; import { CreateColumn } from "./create/CreateColumn"; +import { Reshape } from "./reshape/Reshape"; class ReactPopup extends React.Component { constructor(props) { @@ -74,6 +75,15 @@ class ReactPopup extends React.Component { ); body = ; break; + case "reshape": + modalTitle = ( + + + {"Reshape Data"} + + ); + body = ; + break; case "about": modalTitle = ( diff --git a/static/popups/create/CreateBins.jsx b/static/popups/create/CreateBins.jsx index d1758dbf..b7044f02 100644 --- a/static/popups/create/CreateBins.jsx +++ b/static/popups/create/CreateBins.jsx @@ -38,12 +38,10 @@ function buildCode({ col, operation, bins, labels }) { return code; } -const BASE_STATE = { col: null, operation: "cut", bins: null, labels: null }; - class CreateBins extends React.Component { constructor(props) { super(props); - this.state = _.assignIn({}, BASE_STATE); + this.state = { col: null, operation: "cut", bins: null, labels: null }; this.updateState = this.updateState.bind(this); } @@ -51,8 +49,7 @@ class CreateBins extends React.Component { const currState = _.assignIn(this.state, state); const cfg = _.pick(currState, ["operation", "bins", "labels"]); cfg.col = _.get(currState, "col.value") || null; - const code = buildCode(currState); - this.setState(currState, () => this.props.updateState({ cfg, code })); + this.setState(currState, () => this.props.updateState({ cfg, code: buildCode(currState) })); } render() { diff --git a/static/popups/create/CreateColumn.jsx b/static/popups/create/CreateColumn.jsx index 214ca985..3c29372c 100644 --- a/static/popups/create/CreateColumn.jsx +++ b/static/popups/create/CreateColumn.jsx @@ -20,6 +20,7 @@ const BASE_STATE = { cfg: null, code: {}, loadingColumns: true, + loadingColumn: false, }; class ReactCreateColumn extends React.Component { @@ -71,15 +72,24 @@ class ReactCreateColumn extends React.Component { this.setState({ error: }); return; } + this.setState({ loadingColumn: true }); const createParams = { name, type, cfg: JSON.stringify(cfg) }; fetchJson(buildURLString(`/dtale/build-column/${this.props.dataId}?`, createParams), data => { if (data.error) { - this.setState({ error: }); - } else if (_.startsWith(window.location.pathname, "/dtale/popup/build")) { - window.opener.location.reload(); - } else { - this.props.chartData.propagateState({ refresh: true }, this.props.onClose); + this.setState({ + error: , + loadingColumn: false, + }); + return; } + this.setState({ loadingColumn: false }, () => { + if (_.startsWith(window.location.pathname, "/dtale/popup/build")) { + window.opener.location.reload(); + window.close(); + } else { + this.props.chartData.propagateState({ refresh: true }, this.props.onClose); + } + }); }); } @@ -168,8 +178,10 @@ class ReactCreateColumn extends React.Component { ,
    {codeMarkup} -
    , ]; diff --git a/static/popups/create/CreateDatetime.jsx b/static/popups/create/CreateDatetime.jsx index 0803543c..06d8fc81 100644 --- a/static/popups/create/CreateDatetime.jsx +++ b/static/popups/create/CreateDatetime.jsx @@ -35,17 +35,15 @@ function buildCode({ col, operation, property, conversion }) { return code; } -const BASE_STATE = { - col: null, - operation: "property", - property: null, - conversion: null, -}; - class CreateDatetime extends React.Component { constructor(props) { super(props); - this.state = _.assignIn({}, BASE_STATE); + this.state = { + col: null, + operation: "property", + property: null, + conversion: null, + }; this.updateState = this.updateState.bind(this); this.renderOperationOptions = this.renderOperationOptions.bind(this); } diff --git a/static/popups/create/CreateNumeric.jsx b/static/popups/create/CreateNumeric.jsx index 9564abd6..7ddf1a0a 100644 --- a/static/popups/create/CreateNumeric.jsx +++ b/static/popups/create/CreateNumeric.jsx @@ -62,16 +62,14 @@ function buildCode({ left, operation, right }) { return code; } -const BASE_STATE = { - left: { type: "col", col: null, val: null }, - operation: null, - right: { type: "col", col: null, val: null }, -}; - class CreateNumeric extends React.Component { constructor(props) { super(props); - this.state = _.assign({}, BASE_STATE); + this.state = { + left: { type: "col", col: null, val: null }, + operation: null, + right: { type: "col", col: null, val: null }, + }; this.updateState = this.updateState.bind(this); this.renderOperand = this.renderOperand.bind(this); } diff --git a/static/popups/reshape/Aggregate.jsx b/static/popups/reshape/Aggregate.jsx new file mode 100644 index 00000000..55ae8766 --- /dev/null +++ b/static/popups/reshape/Aggregate.jsx @@ -0,0 +1,276 @@ +import _ from "lodash"; +import PropTypes from "prop-types"; +import React from "react"; +import Select, { createFilter } from "react-select"; + +import { AGGREGATION_OPTS } from "./Pivot"; + +function validateAggregateCfg(cfg) { + const { index, agg } = cfg; + if (!_.size(index || [])) { + return "Missing an index selection!"; + } + const { type, cols, func } = agg; + if (type === "func" && _.isNil(func)) { + return "Missing an aggregation selection!"; + } else if (type === "col" && !_.size(_.pickBy(cols || {}, v => _.size(v || [])))) { + return "Missing an aggregation selection!"; + } + return null; +} + +function buildCode({ index, columns, agg }) { + if (!_.size(index || []) || _.isNil(agg)) { + return null; + } + const { type, cols, func } = agg; + let code = `df.groupby(['${_.join(_.map(index, "value"), "', '")}'])`; + if (type === "func") { + if (_.isNil(func)) { + return null; + } + if (_.size(columns || [])) { + code = `df.groupby(['${_.join(_.map(index, "value"), "', '")}'])['${_.join(_.map(columns, "value"), "', '")}']`; + } + code += `.${func.value}()`; + } else { + if (_.isNil(cols)) { + return null; + } + code += ".aggregate({"; + code += _.join( + _.map(cols || {}, (aggs, col) => `'${col}': ['${_.join(aggs || [], "', '")}']`), + ", " + ); + code += "})"; + } + return code; +} + +class Aggregate extends React.Component { + constructor(props) { + super(props); + this.state = { + shape: _.clone(props.columns), + index: null, + columns: null, + agg: { type: "col" }, + }; + this.updateState = this.updateState.bind(this); + this.renderSelect = this.renderSelect.bind(this); + this.renderAggregation = this.renderAggregation.bind(this); + } + + updateState(state) { + const currState = _.assignIn(this.state, state); + const cfg = 
_.pick(currState, ["index"]); + const { agg } = currState; + cfg.agg = _.get(currState, "agg.value") || null; + if (_.size(currState.index)) { + cfg.index = _.map(currState.index, "value"); + } else { + cfg.index = null; + } + const aggCfg = { type: agg.type }; + if (agg.type === "col") { + aggCfg.cols = _.pickBy(agg.cols, v => _.size(v || [])); + } else { + aggCfg.func = _.get(agg, "func.value", null); + aggCfg.cols = _.map(this.state.columns || [], "value"); + } + cfg.agg = aggCfg; + this.setState(currState, () => this.props.updateState({ cfg, code: buildCode(currState) })); + } + + renderSelect(prop, otherProps, isMulti = false, ref = null) { + const { shape } = this.state; + let finalOptions = _.map(shape, "name"); + const otherValues = _(this.state) + .pick(otherProps) + .values() + .concat() + .map("value") + .compact() + .value(); + finalOptions = _.reject(finalOptions, otherValues); + const props = { + isMulti, + value: this.state[prop], + onChange: selected => this.updateState({ [prop]: selected }), + }; + if (ref) { + props.ref = r => (this[ref] = r); + delete props.value; + delete props.onChange; + } + return ( + (this._curr_agg_func = r)} + className="Select is-clearable is-searchable Select--single" + classNamePrefix="Select" + options={AGGREGATION_OPTS} + getOptionLabel={_.property("label")} + getOptionValue={_.property("value")} + isClearable + isMulti + filterOption={createFilter({ ignoreAccents: false })} // required for performance reasons! + /> + +
    +
    +
  • , + ], + _.map(_.get(this.state, "agg.cols", {}), (aggs, col) => ( +
    +
    +
    + Col: + + {col} + +
    +
    +
    +
    + Func: + {_.join(aggs, ", ")} + +
    +
    +
    + )) + ); + } else { + input = ( +
    +
    +
    + Func: + ({ value: o }))} + getOptionLabel={_.property("value")} + getOptionValue={_.property("value")} + value={this.state[prop]} + onChange={selected => this.updateState({ [prop]: selected })} + isClearable + filterOption={createFilter({ ignoreAccents: false })} // required for performance reasons! + /> + ); + } + + render() { + return [ +
    + +
    +
    {this.renderSelect("index", ["columns", "values"])}
    +
    +
    , +
    + +
    +
    {this.renderSelect("columns", ["index", "values"])}
    +
    +
    , +
    + +
    +
    {this.renderSelect("values", ["index", "columns"], true)}
    +
    +
    , +
    + +
    +
    + ({ value: o }))} + getOptionLabel={_.property("value")} + getOptionValue={_.property("value")} + value={this.state[prop]} + onChange={selected => this.updateState({ [prop]: selected })} + isClearable + filterOption={createFilter({ ignoreAccents: false })} // required for performance reasons! + /> + ); + } + + render() { + return [ +
    + +
    +
    {this.renderSelect("index", ["columns"], true)}
    +
    +
    , +
    + +
    +
    {this.renderSelect("columns", ["index"], true)}
    +
    +
    , + ]; + } +} +Transpose.displayName = "Transpose"; +Transpose.propTypes = { + updateState: PropTypes.func, + columns: PropTypes.array, +}; + +export { Transpose, validateTransposeCfg, buildCode }; diff --git a/tests/conftest.py b/tests/conftest.py index 9428669b..b34afe85 100644 --- a/tests/conftest.py +++ b/tests/conftest.py @@ -1,10 +1,14 @@ import getpass +import random +import string import unittest as ut import numpy as np import pandas as pd import pytest from arctic import CHUNK_STORE, Arctic +from pandas.tseries.offsets import Day +from past.utils import old_div from six import PY3 @@ -76,6 +80,32 @@ def rolling_data(): return pd.DataFrame(data, index=ii) +@pytest.fixture(scope="module") +def custom_data(request): + rows = request.param.get('rows', 100) + columns = request.param.get('cols', 10) + no_of_dates = request.param.get('dates', 364) + + now = pd.Timestamp(pd.Timestamp('now').date()) + dates = pd.date_range(now - Day(no_of_dates), now) + num_of_securities = max(old_div(rows, len(dates)), 1) # always have at least one security + + def _add_date(date, security_data): + return {k: date if k == 'date' else security_data[k] for k in list(security_data.keys()) + ['date']} + securities = [ + dict(security_id=100000 + sec_id, int_val=random.randint(1, 100000000000), + str_val=random.choice(string.ascii_letters) * 5) + for sec_id in range(num_of_securities) + ] + data = pd.concat([ + pd.DataFrame([_add_date(date, sd) for sd in securities]) + for date in dates + ], ignore_index=True)[['date', 'security_id', 'int_val', 'str_val']] + col_names = ['Col{}'.format(c) for c in range(columns)] + data = pd.concat([data, pd.DataFrame(np.random.randn(len(data), columns), columns=col_names)], axis=1) + return data + + @pytest.fixture(scope="module") def builtin_pkg(): if PY3: diff --git a/tests/dtale/test_charts.py b/tests/dtale/test_charts.py index 9e90e0a5..4aacc61d 100644 --- a/tests/dtale/test_charts.py +++ b/tests/dtale/test_charts.py @@ -18,6 +18,30 @@ def test_date_freq_handler(): assert s[0].dt.strftime('%Y%m%d').values[0] == '20200131' +@pytest.mark.unit +def test_group_filter_handler(): + s = chart_utils.group_filter_handler('date|WD', 1, 'I') + assert s == 'date.dt.dayofweek == 1' + s = chart_utils.group_filter_handler('date|H2', 1, 'I') + assert s == 'date.dt.hour == 1' + s = chart_utils.group_filter_handler('date|H', '20190101', 'D') + assert s == "date.dt.date == '20190101' and date.dt.hour == 0" + s = chart_utils.group_filter_handler('date|D', '20190101', 'D') + assert s == "date.dt.date == '20190101'" + s = chart_utils.group_filter_handler('date|W', '20190101', 'D') + assert s == 'date.dt.year == 2019 and date.dt.week == 1' + s = chart_utils.group_filter_handler('date|M', '20191231', 'D') + assert s == "date.dt.year == 2019 and date.dt.month == 12" + s = chart_utils.group_filter_handler('date|Q', '20191231', 'D') + assert s == "date.dt.year == 2019 and date.dt.quarter == 4" + s = chart_utils.group_filter_handler('date|Y', '20191231', 'D') + assert s == "date.dt.year == 2019" + s = chart_utils.group_filter_handler('foo', 1, 'I') + assert s == "foo == 1" + s = chart_utils.group_filter_handler('foo', 'bar', 'S') + assert s == "foo == 'bar'" + + @pytest.mark.unit def test_build_agg_data(): with pytest.raises(NotImplementedError): diff --git a/tests/dtale/test_dash.py b/tests/dtale/test_dash.py index 9ce7652d..6f06995e 100644 --- a/tests/dtale/test_dash.py +++ b/tests/dtale/test_dash.py @@ -6,10 +6,11 @@ from dtale.app import build_app from dtale.dash_application.charts import 
(build_axes, build_figure_data, - build_spaced_ticks, chart_wrapper) + build_spaced_ticks, + chart_url_params, chart_wrapper, + get_url_parser) from dtale.dash_application.components import Wordcloud from dtale.dash_application.layout import update_label_for_freq -from dtale.dash_application.views import chart_url_params, get_url_parser if PY3: from contextlib import ExitStack @@ -453,8 +454,7 @@ def test_chart_building_bar_and_popup(unittest): {'barmode': 'group', 'legend': {'orientation': 'h', 'y': 1.2}, 'title': {'text': 'b, c by a'}, - 'xaxis': {'tickmode': 'array', 'ticktext': [1, 2, 3], 'tickvals': [0, 1, 2], 'tickformat': '.0f', - 'title': {'text': 'a'}}, + 'xaxis': {'tickmode': 'auto', 'nticks': 3, 'tickformat': '.0f', 'title': {'text': 'a'}}, 'yaxis': {'title': {'text': 'b'}, 'tickformat': '.0f'}, 'yaxis2': {'anchor': 'x', 'overlaying': 'y', 'side': 'right', 'title': {'text': 'c'}, 'tickformat': '.0f'}} @@ -496,6 +496,22 @@ def test_chart_building_line(unittest): resp_data = response.get_json()['response'] assert resp_data['chart-content']['children']['type'] == 'Div' + df = pd.DataFrame([dict(sec_id=i, y=1) for i in range(15500)]) + with app.test_client() as c: + with ExitStack() as stack: + df, _ = views.format_data(df) + stack.enter_context(mock.patch('dtale.global_state.DATA', {c.port: df})) + pathname = path_builder(c.port) + inputs = { + 'chart_type': 'line', 'x': 'sec_id', 'y': ['y'], 'z': None, 'group': None, 'agg': None, + 'window': None, 'rolling_comp': None + } + chart_inputs = {'cpg': False, 'barmode': 'group', 'barsort': None} + params = build_chart_params(pathname, inputs, chart_inputs) + response = c.post('/charts/_dash-update-component', json=params) + resp_data = response.get_json()['response'] + assert 'chart-content' in resp_data + @pytest.mark.unit def test_chart_building_pie(unittest): @@ -564,6 +580,7 @@ def test_chart_building_heatmap(unittest, test_data, rolling_data): params = build_chart_params(pathname, inputs, chart_inputs) response = c.post('/charts/_dash-update-component', json=params) chart_markup = response.get_json()['response']['chart-content']['children']['props']['children'][1] + print(chart_markup) unittest.assertEqual( chart_markup['props']['figure']['layout']['title'], {'text': 'b by a weighted by c'} @@ -851,7 +868,7 @@ def test_chart_wrapper(unittest): def test_build_spaced_ticks(unittest): ticks = range(50) cfg = build_spaced_ticks(ticks) - assert len(cfg['tickvals']) == 26 + assert cfg['nticks'] == 26 @pytest.mark.unit @@ -884,3 +901,24 @@ def test_build_chart_type(): @pytest.mark.unit def test_update_label_for_freq(unittest): unittest.assertEqual(update_label_for_freq(['date|WD', 'date|D', 'foo']), 'date (Weekday), date, foo') + + +@pytest.mark.unit +def test_chart_url_params_w_group_filter(unittest): + from dtale.dash_application.charts import chart_url_params, chart_url_querystring + + querystring = chart_url_querystring(dict(chart_type='bar', x='foo', y=['bar'], group=['baz']), + group_filter=dict(group="baz == 'bizzle'")) + parsed_params = chart_url_params(querystring) + unittest.assertEqual( + parsed_params, + {'chart_type': 'bar', 'x': 'foo', 'cpg': False, 'y': ['bar'], 'group': ['baz'], 'query': "baz == 'bizzle'"} + ) + + +@pytest.mark.unit +def test_build_series_name(): + from dtale.dash_application.charts import build_series_name + + handler = build_series_name(['foo', 'bar'], chart_per_group=False) + assert handler('foo', 'bizz')['name'] == 'bizz/foo' diff --git a/tests/dtale/test_instance.py b/tests/dtale/test_instance.py 
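The round trip asserted in `test_chart_url_params_w_group_filter` above works because list-valued parameters are JSON-encoded into the querystring and decoded on the way back. A stand-alone sketch of that convention (Python 3 shown; `encode_params`/`decode_params` are hypothetical stand-ins, not the dtale internals):

```python
import json
from urllib.parse import parse_qsl, urlencode

def encode_params(params):  # hypothetical stand-in for chart_url_querystring
    present = {k: v for k, v in params.items() if v not in (None, [], '')}
    return urlencode({k: json.dumps(v) if isinstance(v, list) else v for k, v in present.items()})

def decode_params(querystring):  # hypothetical stand-in for chart_url_params
    return {k: json.loads(v) if v.startswith('[') else v for k, v in parse_qsl(querystring)}

qs = encode_params(dict(chart_type='bar', x='foo', y=['bar'], group=['baz']))
assert decode_params(qs) == dict(chart_type='bar', x='foo', y=['bar'], group=['baz'])
```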
index e7d419d0..10b40190 100644 --- a/tests/dtale/test_instance.py +++ b/tests/dtale/test_instance.py @@ -5,6 +5,8 @@ import pytest from six import PY3 +from dtale.dash_application.charts import get_url_parser + if PY3: from contextlib import ExitStack else: @@ -77,16 +79,23 @@ def mock_requests_get(url, verify=True): instance = DtaleData(9999, 'http://localhost:9999') instance.notebook_correlations(col1='col1', col2='col2') mock_iframe.assert_called_once() - assert mock_iframe.call_args[0][0] == 'http://localhost:9999/dtale/popup/correlations/9999?col1=col1&col2=col2' - - instance.notebook_charts('col1', 'col2', group=['col3', 'col4'], aggregation='count') - charts_url = 'http://localhost:9999/dtale/popup/charts/9999?aggregation=count&group=col3,col4&x=col1&y=col2' - assert mock_iframe.call_args[0][0] == charts_url - - instance.notebook_charts('col1', 'col2', aggregation='count') - charts_url = 'http://localhost:9999/dtale/popup/charts/9999?aggregation=count&x=col1&y=col2' - assert mock_iframe.call_args[0][0] == charts_url - instance.notebook_charts('col1', 'col2', group=['col3', 'col4']) - charts_url = 'http://localhost:9999/dtale/popup/charts/9999?group=col3,col4&x=col1&y=col2' - assert mock_iframe.call_args[0][0] == charts_url + url_parser = get_url_parser() + [path, query] = mock_iframe.call_args[0][0].split('?') + assert path == 'http://localhost:9999/dtale/popup/correlations/9999' + assert dict(url_parser(query)) == dict(col1='col1', col2='col2') + + instance.notebook_charts(x='col1', y='col2', group=['col3', 'col4'], agg='count') + [path, query] = mock_iframe.call_args[0][0].split('?') + assert path == 'http://localhost:9999/charts/9999' + assert dict(url_parser(query)) == dict(chart_type='line', agg='count', group='["col3", "col4"]', x='col1', + y='["col2"]', cpg='false') + + instance.notebook_charts(x='col1', y='col2', agg='count') + [_path, query] = mock_iframe.call_args[0][0].split('?') + assert dict(url_parser(query)) == dict(chart_type='line', agg='count', x='col1', y='["col2"]', cpg='false') + + instance.notebook_charts(x='col1', y='col2', group=['col3', 'col4']) + [_path, query] = mock_iframe.call_args[0][0].split('?') + assert dict(url_parser(query)) == dict(chart_type='line', x='col1', y='["col2"]', group='["col3", "col4"]', + cpg='false') diff --git a/tests/dtale/test_offline_chart.py b/tests/dtale/test_offline_chart.py new file mode 100644 index 00000000..49739b00 --- /dev/null +++ b/tests/dtale/test_offline_chart.py @@ -0,0 +1,38 @@ +import mock +import pytest +from six import PY3 + +if PY3: + from contextlib import ExitStack +else: + from contextlib2 import ExitStack + + +@pytest.mark.unit +def test_build_file(test_data, builtin_pkg, unittest): + from dtale import offline_chart + if PY3: + from unittest.mock import mock_open + else: + from mock import mock_open + + with ExitStack() as stack: + stack.enter_context(mock.patch('dtale.views.in_ipython_frontend', mock.Mock(return_value=False))) + stack.enter_context(mock.patch('dtale.views.open', mock_open())) + + output = offline_chart(test_data, chart_type='bar', x='date', y='foo', agg='sum') + assert output is not None + output = offline_chart(test_data, chart_type='bar', x='date', y='foo', agg='sum', filepath='foo') + assert output is None + output = offline_chart(test_data, chart_type='bar', x='date', y='foo', agg='sum', filepath='foo.html') + assert output is None + + +@pytest.mark.unit +def test_build_notebook(test_data, unittest): + from dtale import offline_chart + + with ExitStack() as stack: + 
diff --git a/tests/dtale/test_offline_chart.py b/tests/dtale/test_offline_chart.py
new file mode 100644
index 00000000..49739b00
--- /dev/null
+++ b/tests/dtale/test_offline_chart.py
@@ -0,0 +1,38 @@
+import mock
+import pytest
+from six import PY3
+
+if PY3:
+    from contextlib import ExitStack
+else:
+    from contextlib2 import ExitStack
+
+
+@pytest.mark.unit
+def test_build_file(test_data, builtin_pkg, unittest):
+    from dtale import offline_chart
+    if PY3:
+        from unittest.mock import mock_open
+    else:
+        from mock import mock_open
+
+    with ExitStack() as stack:
+        stack.enter_context(mock.patch('dtale.views.in_ipython_frontend', mock.Mock(return_value=False)))
+        stack.enter_context(mock.patch('dtale.views.open', mock_open()))
+
+        output = offline_chart(test_data, chart_type='bar', x='date', y='foo', agg='sum')
+        assert output is not None
+        output = offline_chart(test_data, chart_type='bar', x='date', y='foo', agg='sum', filepath='foo')
+        assert output is None
+        output = offline_chart(test_data, chart_type='bar', x='date', y='foo', agg='sum', filepath='foo.html')
+        assert output is None
+
+
+@pytest.mark.unit
+def test_build_notebook(test_data, unittest):
+    from dtale import offline_chart
+
+    with ExitStack() as stack:
+        stack.enter_context(mock.patch('dtale.views.in_ipython_frontend', mock.Mock(return_value=True)))
+        output = offline_chart(test_data, chart_type='bar', x='date', y='foo', agg='sum')
+        assert output is None
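These tests pin down the return contract of the new offline chart feature called out in the changelog: markup comes back when no `filepath` is given, and `None` once the chart has been written to disk or rendered inline in a notebook. A hypothetical usage sketch; the return-value behavior is inferred from the assertions above:

```python
# Hypothetical offline_chart usage; behavior inferred from the tests above.
import pandas as pd
import dtale

df = pd.DataFrame(dict(date=pd.date_range('20200101', periods=10), foo=range(10)))

# No filepath: the rendered chart markup is returned for display.
markup = dtale.offline_chart(df, chart_type='bar', x='date', y='foo', agg='sum')
assert markup is not None

# With a filepath: the chart is written to disk and None is returned.
assert dtale.offline_chart(df, chart_type='bar', x='date', y='foo', agg='sum',
                           filepath='my_chart.html') is None
```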
diff --git a/tests/dtale/test_views.py b/tests/dtale/test_views.py
index a4635d10..e20d80e3 100644
--- a/tests/dtale/test_views.py
+++ b/tests/dtale/test_views.py
@@ -197,22 +197,26 @@ def test_update_settings(unittest):
 
     settings = json.dumps(dict(locked=['a', 'b']))
     with app.test_client() as c:
-        with mock.patch(
+        with ExitStack() as stack:
+            stack.enter_context(mock.patch('dtale.global_state.DATA', {c.port: None}))
+            mock_render_template = stack.enter_context(mock.patch(
                 'dtale.views.render_template', mock.Mock(return_value=json.dumps(dict(success=True)))
-        ) as mock_render_template:
-            response = c.get('/dtale/update-settings/1', query_string=dict(settings=settings))
+            ))
+            response = c.get('/dtale/update-settings/{}'.format(c.port), query_string=dict(settings=settings))
             assert response.status_code == 200, 'should return 200 response'
-            c.get('/dtale/main/1')
+            c.get('/dtale/main/{}'.format(c.port))
             _, kwargs = mock_render_template.call_args
             unittest.assertEqual(kwargs['settings'], settings, 'settings should be retrieved')
 
     settings = 'a'
     with app.test_client() as c:
-        response = c.get('/dtale/update-settings/1', query_string=dict(settings=settings))
-        assert response.status_code == 200, 'should return 200 response'
-        response_data = json.loads(response.data)
-        assert 'error' in response_data
+        with ExitStack() as stack:
+            stack.enter_context(mock.patch('dtale.global_state.DATA', {c.port: None}))
+            response = c.get('/dtale/update-settings/{}'.format(c.port), query_string=dict(settings=settings))
+            assert response.status_code == 200, 'should return 200 response'
+            response_data = json.loads(response.data)
+            assert 'error' in response_data
 
 
 @pytest.mark.unit
@@ -457,6 +461,152 @@ def test_build_column_bins(unittest):
     assert dtypes[c.port][-1]['dtype'] == 'string'
 
 
+@pytest.mark.unit
+def test_cleanup_error(unittest):
+    with app.test_client() as c:
+        with ExitStack() as stack:
+            stack.enter_context(mock.patch('dtale.global_state.cleanup', mock.Mock(side_effect=Exception)))
+            resp = c.get('/dtale/cleanup/1')
+            assert 'error' in json.loads(resp.data)
+
+
+@pytest.mark.unit
+@pytest.mark.parametrize('custom_data', [dict(rows=1000, cols=3)], indirect=True)
+def test_reshape(custom_data, unittest):
+    from dtale.views import build_dtypes_state
+
+    with app.test_client() as c:
+        data = {c.port: custom_data}
+        dtypes = {c.port: build_dtypes_state(custom_data)}
+        settings = {c.port: {}}
+        with ExitStack() as stack:
+            stack.enter_context(mock.patch('dtale.global_state.DATA', data))
+            stack.enter_context(mock.patch('dtale.global_state.DTYPES', dtypes))
+            stack.enter_context(mock.patch('dtale.global_state.SETTINGS', settings))
+            reshape_cfg = dict(index='date', columns='security_id', values=['Col0'])  # , aggfunc=None
+            resp = c.get(
+                '/dtale/reshape/{}'.format(c.port),
+                query_string=dict(output='new', type='pivot', cfg=json.dumps(reshape_cfg))
+            )
+            response_data = json.loads(resp.data)
+            new_key = str(int(c.port) + 1)
+            assert response_data['url'] == 'http://localhost:40000/dtale/main/{}'.format(new_key)
+            assert len(data.keys()) == 2
+            unittest.assertEqual([d['name'] for d in dtypes[new_key]], ['date', '100000', '100001'])
+            assert len(data[new_key]) == 365
+            assert settings[new_key].get('startup_code') is not None
+
+            resp = c.get('/dtale/cleanup/{}'.format(new_key))
+            assert json.loads(resp.data)['success']
+            assert len(data.keys()) == 1
+
+            reshape_cfg['aggfunc'] = 'sum'
+            resp = c.get(
+                '/dtale/reshape/{}'.format(c.port),
+                query_string=dict(output='new', type='pivot', cfg=json.dumps(reshape_cfg))
+            )
+            response_data = json.loads(resp.data)
+            assert response_data['url'] == 'http://localhost:40000/dtale/main/{}'.format(new_key)
+            assert len(data.keys()) == 2
+            unittest.assertEqual([d['name'] for d in dtypes[new_key]], ['date', '100000', '100001'])
+            assert len(data[new_key]) == 365
+            assert settings[new_key].get('startup_code') is not None
+            c.get('/dtale/cleanup/{}'.format(new_key))
+
+            reshape_cfg['values'] = ['Col0', 'Col1']
+            resp = c.get(
+                '/dtale/reshape/{}'.format(c.port),
+                query_string=dict(output='new', type='pivot', cfg=json.dumps(reshape_cfg))
+            )
+            response_data = json.loads(resp.data)
+            assert response_data['url'] == 'http://localhost:40000/dtale/main/{}'.format(new_key)
+            assert len(data.keys()) == 2
+            unittest.assertEqual(
+                [d['name'] for d in dtypes[new_key]],
+                ['date', 'Col0 100000', 'Col0 100001', 'Col1 100000', 'Col1 100001']
+            )
+            assert len(data[new_key]) == 365
+            assert settings[new_key].get('startup_code') is not None
+            c.get('/dtale/cleanup/{}'.format(new_key))
+
+            reshape_cfg = dict(index='date', agg=dict(type='col', cols={'Col0': ['sum', 'mean'], 'Col1': ['count']}))
+            resp = c.get(
+                '/dtale/reshape/{}'.format(c.port),
+                query_string=dict(output='new', type='aggregate', cfg=json.dumps(reshape_cfg))
+            )
+            response_data = json.loads(resp.data)
+            assert response_data['url'] == 'http://localhost:40000/dtale/main/{}'.format(new_key)
+            assert len(data.keys()) == 2
+            unittest.assertEqual([d['name'] for d in dtypes[new_key]], ['date', 'Col0 sum', 'Col0 mean', 'Col1 count'])
+            assert len(data[new_key]) == 365
+            assert settings[new_key].get('startup_code') is not None
+            c.get('/dtale/cleanup/{}'.format(new_key))
+
+            reshape_cfg = dict(index='date', agg=dict(type='func', func='mean', cols=['Col0', 'Col1']))
+            resp = c.get(
+                '/dtale/reshape/{}'.format(c.port),
+                query_string=dict(output='new', type='aggregate', cfg=json.dumps(reshape_cfg))
+            )
+            response_data = json.loads(resp.data)
+            assert response_data['url'] == 'http://localhost:40000/dtale/main/{}'.format(new_key)
+            assert len(data.keys()) == 2
+            unittest.assertEqual([d['name'] for d in dtypes[new_key]], ['date', 'Col0', 'Col1'])
+            assert len(data[new_key]) == 365
+            assert settings[new_key].get('startup_code') is not None
+            c.get('/dtale/cleanup/{}'.format(new_key))
+
+            reshape_cfg = dict(index='date', agg=dict(type='func', func='mean'))
+            resp = c.get(
+                '/dtale/reshape/{}'.format(c.port),
+                query_string=dict(output='new', type='aggregate', cfg=json.dumps(reshape_cfg))
+            )
+            response_data = json.loads(resp.data)
+            assert response_data['url'] == 'http://localhost:40000/dtale/main/{}'.format(new_key)
+            assert len(data.keys()) == 2
+            unittest.assertEqual(
+                [d['name'] for d in dtypes[new_key]],
+                ['date', 'security_id', 'int_val', 'Col0', 'Col1', 'Col2']
+            )
+            assert len(data[new_key]) == 365
+            assert settings[new_key].get('startup_code') is not None
+            c.get('/dtale/cleanup/{}'.format(new_key))
+
+            reshape_cfg = dict(index=['security_id'], columns=['Col0'])
+            resp = c.get(
+                '/dtale/reshape/{}'.format(c.port),
+                query_string=dict(output='new', type='transpose', cfg=json.dumps(reshape_cfg))
+            )
+            response_data = json.loads(resp.data)
+            assert 'error' in response_data
+
+            min_date = custom_data['date'].min().strftime('%Y-%m-%d')
+            settings[c.port] = dict(query="date == '{}'".format(min_date))
+            reshape_cfg = dict(index=['date', 'security_id'], columns=['Col0'])
+            resp = c.get(
+                '/dtale/reshape/{}'.format(c.port),
+                query_string=dict(output='new', type='transpose', cfg=json.dumps(reshape_cfg))
+            )
+            response_data = json.loads(resp.data)
+            assert response_data['url'] == 'http://localhost:40000/dtale/main/{}'.format(new_key)
+            assert len(data.keys()) == 2
+            print([d['name'] for d in dtypes[new_key]])
+            unittest.assertEqual(
+                [d['name'] for d in dtypes[new_key]],
+                ['{} 00:00:00 100000'.format(min_date), '{} 00:00:00 100001'.format(min_date)]
+            )
+            assert len(data[new_key]) == 1
+            assert settings[new_key].get('startup_code') is not None
+            c.get('/dtale/cleanup/{}'.format(new_key))
+
+            reshape_cfg = dict(index=['date', 'security_id'])
+            resp = c.get(
+                '/dtale/reshape/{}'.format(c.port),
+                query_string=dict(output='override', type='transpose', cfg=json.dumps(reshape_cfg))
+            )
+            response_data = json.loads(resp.data)
+            assert response_data['url'] == 'http://localhost:40000/dtale/main/{}'.format(c.port)
+
+
 @pytest.mark.unit
 def test_dtypes(test_data):
     from dtale.views import build_dtypes_state, format_data
@@ -1088,9 +1238,10 @@ def test_get_chart_data(unittest, test_data, rolling_data):
         response_data = json.loads(response.data)
         assert response_data['min']['security_id'] == 24.5
         assert response_data['max']['security_id'] == 24.5
-        assert response_data['data']['baz']['x'][-1] == '2000-01-05'
-        assert len(response_data['data']['baz']['security_id']) == 5
-        assert sum(response_data['data']['baz']['security_id']) == 122.5
+        series_key = "baz == 'baz'"
+        assert response_data['data'][series_key]['x'][-1] == '2000-01-05'
+        assert len(response_data['data'][series_key]['security_id']) == 5
+        assert sum(response_data['data'][series_key]['security_id']) == 122.5
 
     df, _ = views.format_data(rolling_data)
     with app.test_client() as c:
@@ -1193,6 +1344,80 @@ def test_version_info():
     assert 'unknown' in str(response.data)
 
 
+@pytest.mark.unit
+@pytest.mark.parametrize('custom_data', [dict(rows=1000, cols=3)], indirect=True)
+def test_chart_exports(custom_data):
+    import dtale.views as views
+
+    with app.test_client() as c:
+        with ExitStack() as stack:
+            stack.enter_context(mock.patch('dtale.global_state.DATA', {c.port: custom_data}))
+            stack.enter_context(
+                mock.patch('dtale.global_state.DTYPES', {c.port: views.build_dtypes_state(custom_data)})
+            )
+            params = dict(chart_type='invalid')
+            response = c.get('/dtale/chart-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'application/json'
+
+            params = dict(chart_type='line', x='date', y=json.dumps(['Col0']), agg='sum', query='Col5 == 50')
+            response = c.get('/dtale/chart-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'application/json'
+
+            params = dict(chart_type='bar', x='date', y=json.dumps(['Col0']), agg='sum')
+            response = c.get('/dtale/chart-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/html; charset=utf-8'
+
+            response = c.get('/dtale/chart-csv-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/csv; charset=utf-8'
+
+            params = dict(chart_type='line', x='date', y=json.dumps(['Col0']), agg='sum')
+            response = c.get('/dtale/chart-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/html; charset=utf-8'
+
+            response = c.get('/dtale/chart-csv-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/csv; charset=utf-8'
+
+            params = dict(chart_type='scatter', x='Col0', y=json.dumps(['Col1']))
+            response = c.get('/dtale/chart-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/html; charset=utf-8'
+
+            response = c.get('/dtale/chart-csv-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/csv; charset=utf-8'
+
+            params = dict(chart_type='3d_scatter', x='date', y=json.dumps(['security_id']), z='Col0')
+            response = c.get('/dtale/chart-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/html; charset=utf-8'
+
+            response = c.get('/dtale/chart-csv-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/csv; charset=utf-8'
+
+            params = dict(chart_type='surface', x='date', y=json.dumps(['security_id']), z='Col0')
+            response = c.get('/dtale/chart-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/html; charset=utf-8'
+
+            response = c.get('/dtale/chart-csv-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/csv; charset=utf-8'
+
+            params = dict(chart_type='pie', x='security_id', y=json.dumps(['Col0']), agg='sum',
+                          query='security_id >= 100000 and security_id <= 100010')
+            response = c.get('/dtale/chart-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/html; charset=utf-8'
+
+            response = c.get('/dtale/chart-csv-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/csv; charset=utf-8'
+
+            params = dict(chart_type='heatmap', x='date', y=json.dumps(['security_id']), z='Col0')
+            response = c.get('/dtale/chart-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/html; charset=utf-8'
+
+            response = c.get('/dtale/chart-csv-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'text/csv; charset=utf-8'
+
+            del params['x']
+            response = c.get('/dtale/chart-csv-export/{}'.format(c.port), query_string=params)
+            assert response.content_type == 'application/json'
+
+
 @pytest.mark.unit
 def test_main():
     import dtale.views as views
@@ -1201,6 +1426,7 @@ def test_main():
     test_data, _ = views.format_data(test_data)
     with app.test_client() as c:
         with ExitStack() as stack:
+            stack.enter_context(mock.patch('dtale.global_state.DATA', {c.port: test_data}))
             stack.enter_context(mock.patch('dtale.global_state.METADATA', {c.port: dict(name='test_name')}))
             stack.enter_context(mock.patch('dtale.global_state.SETTINGS', {c.port: dict(locked=[])}))
             response = c.get('/dtale/main/{}'.format(c.port))
@@ -1212,6 +1438,7 @@ def test_main():
 
     with app.test_client() as c:
         with ExitStack() as stack:
+            stack.enter_context(mock.patch('dtale.global_state.DATA', {c.port: test_data}))
            stack.enter_context(mock.patch('dtale.global_state.METADATA', {c.port: dict()}))
             stack.enter_context(mock.patch('dtale.global_state.SETTINGS', {c.port: dict(locked=[])}))
             response = c.get('/dtale/main/{}'.format(c.port))
@@ -1220,12 +1447,14 @@ def test_main():
 
 @pytest.mark.unit
 def test_200():
-    paths = ['/dtale/main/1', '/dtale/iframe/1', '/dtale/popup/test/1', 'site-map', 'version-info', 'health',
-             '/charts/1', '/charts/popup/1', '/dtale/code-popup']
+    paths = ['/dtale/main/{port}', '/dtale/iframe/{port}', '/dtale/popup/test/{port}', 'site-map', 'version-info',
+             'health', '/charts/{port}', '/charts/popup/{port}', '/dtale/code-popup']
     with app.test_client() as c:
-        for path in paths:
-            response = c.get(path)
-            assert response.status_code == 200, '{} should return 200 response'.format(path)
+        with ExitStack() as stack:
+            stack.enter_context(mock.patch('dtale.global_state.DATA', {c.port: None}))
+            for path in paths:
+                response = c.get(path.format(port=c.port))
+                assert response.status_code == 200, '{} should return 200 response'.format(path)
 
 
 @pytest.mark.unit
@@ -1282,7 +1511,7 @@ def test_jinja_output():
             stack.enter_context(mock.patch('dtale.global_state.DATA', {c.port: df}))
             stack.enter_context(mock.patch('dtale.global_state.DTYPES', {c.port: views.build_dtypes_state(df)}))
             stack.enter_context(mock.patch('dtale.global_state.DATA', {c.port: df}))
-            response = c.get('/dtale/main/1')
+            response = c.get('/dtale/main/{}'.format(c.port))
             assert 'span id="forkongithub"' in str(response.data)
             response = c.get('/charts/{}'.format(c.port))
             assert 'span id="forkongithub"' in str(response.data)
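The thread running through the `tests/dtale/test_views.py` changes: routes are no longer hit with a hard-coded data id of `1`. Each test now seeds `dtale.global_state.DATA` under the test client's port inside an `ExitStack` and builds its URLs from `c.port`, so every patch unwinds together and tests cannot collide on shared state. A minimal sketch of the pattern, with `app` and `df` standing in for the real fixtures:

```python
# Minimal sketch of the global-state seeding pattern adopted above;
# `app` is assumed to be dtale's Flask app, whose test client exposes a
# `port` attribute, and `df` stands in for the real test fixture data.
import mock  # the standalone mock package used throughout these tests

try:
    from contextlib import ExitStack   # Python 3
except ImportError:
    from contextlib2 import ExitStack  # Python 2


def get_main(app, df):
    with app.test_client() as c:
        with ExitStack() as stack:
            # seed the data for this client's port so /dtale/main/<id> resolves
            stack.enter_context(mock.patch('dtale.global_state.DATA', {c.port: df}))
            return c.get('/dtale/main/{}'.format(c.port))
```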