🐛 Bug Report

There is an error when saving a very large pipeline whose archive exceeds 4 GB.

Exception:

```
RuntimeError: File size unexpectedly exceeded ZIP64 limit
```

Expected behavior

No exception.

It can probably be fixed by passing `force_zip64=True` to `ZipFile.open` when the archive is created.

How To Reproduce

It is an artificial example:
```python
import numpy as np

from etna.datasets import TSDataset, generate_ar_df
from etna.models import CatBoostMultiSegmentModel
from etna.pipeline import Pipeline
from etna.transforms import LagTransform

HORIZON = 7


def main():
    df = generate_ar_df(n_segments=10, start_time="2020-01-01", periods=100)
    df_wide = TSDataset.to_dataset(df)
    ts = TSDataset(df=df_wide, freq="D")

    model = CatBoostMultiSegmentModel()
    # ~8 GB of float64, so the saved archive exceeds the 4 GB ZIP limit
    model.some_attr = np.zeros((100_000, 10_000))

    transforms = [
        LagTransform(in_column="target", lags=list(range(HORIZON, 50)), out_column="lags")
    ]

    pipeline = Pipeline(model=model, transforms=transforms, horizon=HORIZON)
    pipeline.fit(ts)
    pipeline.save("fitted.zip")


if __name__ == "__main__":
    main()
```
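For illustration, the suggested fix could look like the following sketch. `save_member` is a hypothetical helper, not part of etna; it only shows how `force_zip64=True` on `ZipFile.open` lets a streamed archive member grow past the 4 GB boundary:

```python
import zipfile


def save_member(archive_path: str, member_name: str, data: bytes) -> None:
    """Write one member into a zip archive, allowing it to exceed 4 GB.

    Hypothetical helper sketching the proposed fix.
    """
    with zipfile.ZipFile(archive_path, mode="a") as archive:
        # force_zip64=True makes ZipFile.open emit ZIP64 extensions up front.
        # Without it, a member written in streaming mode is assumed to fit in
        # 4 GB and raises "RuntimeError: File size unexpectedly exceeded
        # ZIP64 limit" once it grows past that.
        with archive.open(member_name, mode="w", force_zip64=True) as member:
            member.write(data)
```

If etna's save logic writes pickled objects into the archive this way, the fix would presumably amount to adding this one keyword argument at the write site.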
Environment
No response
Additional context
No response