The function `_plotly_utils.basevalidators.copy_to_readonly_numpy_array` performs a full copy of `pd.Series` objects whose `values` attribute already holds `np.ndarray` data. We could use the `values` attribute directly to dramatically speed up trace generation, especially for large dataframes.
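For a plain numeric Series, accessing `.values` does not copy the underlying data; repeated accesses return arrays backed by the same buffer. A minimal check (assuming a numeric dtype, where `.values` is an `np.ndarray` rather than e.g. a `pd.Categorical`):

```python
import numpy as np
import pandas as pd

s = pd.Series(np.random.randint(0, 100, 1000))

# Two accesses to .values on a numeric Series yield arrays that share
# the same memory buffer -- no copy is made on access.
v1 = s.values
v2 = s.values
assert np.shares_memory(v1, v2)
```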
Environment: plotly version 3.1.0, macOS High Sierra 10.13.6, plotly installed via conda.
Working example:

```python
# using `ipython` time magic
import numpy as np
import pandas as pd
import plotly
import _plotly_utils.basevalidators

print('plotly version: {}'.format(plotly.__version__))

df = pd.DataFrame({'x': np.random.randint(0, 100, 1000000)})

print('\ncoercing series')
%time v1 = _plotly_utils.basevalidators.copy_to_readonly_numpy_array(df.x)

print('\naccessing np values directly')
%time v2 = _plotly_utils.basevalidators.copy_to_readonly_numpy_array(df.x.values)
```
Example output:

```
plotly version: 3.1.0

coercing series
CPU times: user 987 ms, sys: 35.5 ms, total: 1.02 s
Wall time: 854 ms

accessing np values directly
CPU times: user 1.45 ms, sys: 17 µs, total: 1.46 ms
Wall time: 1.49 ms
```
So, a performance difference of roughly 1000×.
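For readers reproducing the benchmark outside IPython, the `%time` magic can be approximated with a small helper (a sketch using only the standard library; single run, wall-clock only, so numbers will vary by machine):

```python
import time

def time_call(label, fn, *args):
    # Minimal stand-in for IPython's %time magic: time one call to fn(*args)
    # with a wall-clock timer and print the elapsed seconds.
    start = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - start
    print('{}: {:.4f} s'.format(label, elapsed))
    return result

# e.g., mirroring the benchmark above:
# v1 = time_call('coercing series', copy_to_readonly_numpy_array, df.x)
# v2 = time_call('accessing np values directly', copy_to_readonly_numpy_array, df.x.values)
```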
I may work on a pull request for this today. I believe this can be addressed directly by adding an extra `elif` branch that checks whether `v` is a `pd.Series` and whether `v.values` is an `np.ndarray` (it could be a `pd.Categorical`, e.g.).
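The proposed check could look something like the following. This is a hypothetical sketch, not plotly's actual implementation: the function name, signature, and surrounding logic are assumptions, and only the Series-unwrapping branch reflects the suggestion above.

```python
import numpy as np
import pandas as pd

def copy_to_readonly_numpy_array_sketch(v, dtype=None):
    # Hypothetical fast path: unwrap a pd.Series whose .values is already an
    # ndarray, so the single np.array() copy below operates on the raw buffer
    # instead of iterating the Series wrapper.
    if isinstance(v, pd.Series) and isinstance(v.values, np.ndarray):
        # Skipped when .values is not an ndarray, e.g. a pd.Categorical.
        v = v.values
    new_v = np.array(v, dtype=dtype)
    new_v.flags['WRITEABLE'] = False  # return a read-only array
    return new_v
```

Note this still performs one copy via `np.array`; the win is avoiding the much slower element-wise coercion of the Series object itself.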