Support stream reading/parsing #351

Open
mlartz opened this issue Jan 29, 2022 · 2 comments

mlartz commented Jan 29, 2022

Given an Ion file containing a series of records (e.g. an object in S3), we'd like to be able to read and parse the file as a stream, ultimately creating an async (Tokio) Stream of Ion records. Due to memory constraints (Lambda, Fargate, etc.), we cannot read the entire file into memory and then iterate through the records.
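A minimal sketch of the consumer-side shape being asked for. The `Record` type and `record_iter()` below are hypothetical stand-ins for an incremental ion-rs reader loop (none of these names are real ion-rs APIs); the point is how a blocking, record-at-a-time parser could be surfaced as a Tokio `Stream` with bounded buffering:

```rust
use tokio::sync::mpsc;
use tokio_stream::{wrappers::ReceiverStream, Stream, StreamExt};

// Hypothetical record type; in practice this would be a parsed Ion value.
#[derive(Debug)]
struct Record(Vec<u8>);

// Hypothetical stand-in for a blocking, incremental record reader that never
// holds more than one record's worth of input at a time.
fn record_iter() -> impl Iterator<Item = Record> {
    (0u8..3).map(|i| Record(vec![i]))
}

// Expose the blocking iterator as an async Stream with bounded buffering:
// parsing runs on a blocking thread, records flow through a bounded channel.
fn record_stream() -> impl Stream<Item = Record> {
    let (tx, rx) = mpsc::channel(16); // at most 16 parsed records in flight
    tokio::task::spawn_blocking(move || {
        for record in record_iter() {
            // `blocking_send` applies back-pressure when the consumer lags.
            if tx.blocking_send(record).is_err() {
                break; // the consumer dropped the stream
            }
        }
    });
    ReceiverStream::new(rx)
}

#[tokio::main]
async fn main() {
    let mut records = record_stream();
    while let Some(record) = records.next().await {
        println!("{record:?}"); // per-record processing, no whole-file buffering
    }
}
```

The bounded channel provides back-pressure, so only a fixed number of parsed records are held in memory at once, regardless of the size of the source object.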


zslayton commented Feb 7, 2022

To clarify a bit: the Reader is already capable of reading a file without pulling the entire thing into memory. However, if the input source it is reading from contains incomplete data, the Reader would either block (e.g. waiting on a socket) or fail (e.g. at the end of a Vec<u8>).

We'd like to offer an API that lets the Reader handle this situation gracefully. For example, calls to next() or read_string() might return an IonError::InsufficientData (or IonError::WouldBlock?), indicating that the caller should try again once more data is available.
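A minimal sketch of the calling pattern this would enable. The `StreamingReader` trait, its `feed` method, and the error variants below are illustrative stand-ins, not the actual ion-rs API; they only show the shape of "pull values until the reader asks for more input, then feed it another chunk and retry" instead of blocking or failing:

```rust
use std::io::Read;

// Hypothetical error type mirroring the variants suggested above.
#[derive(Debug)]
enum IonError {
    InsufficientData,  // more input is needed before the next value can be parsed
    Decoding(String),  // the input is present but malformed (or the source failed)
}

// Hypothetical incremental reader: values are pulled with `next`, and more
// bytes are pushed in with `feed` whenever `InsufficientData` is returned.
trait StreamingReader {
    type Item;
    fn next(&mut self) -> Result<Option<Self::Item>, IonError>;
    fn feed(&mut self, bytes: &[u8]);
}

// Drive a streaming reader from any byte source without buffering the whole
// input, retrying whenever the reader reports that it needs more data.
fn drive<R, S, F>(reader: &mut R, source: &mut S, mut handle: F) -> Result<(), IonError>
where
    R: StreamingReader,
    S: Read,
    F: FnMut(R::Item),
{
    let mut chunk = [0u8; 8 * 1024];
    loop {
        match reader.next() {
            Ok(Some(item)) => handle(item),
            Ok(None) => return Ok(()), // end of the Ion stream
            Err(IonError::InsufficientData) => {
                // The reader ran out of input mid-value: fetch another chunk
                // and try again instead of blocking or failing outright.
                let n = source
                    .read(&mut chunk)
                    .map_err(|e| IonError::Decoding(e.to_string()))?;
                if n == 0 {
                    // The source is exhausted but the reader still wants data:
                    // the input was truncated.
                    return Err(IonError::Decoding("unexpected end of input".into()));
                }
                reader.feed(&chunk[..n]);
            }
            Err(other) => return Err(other),
        }
    }
}
```

In an async context the same loop would work with a WouldBlock-style variant: rather than calling a blocking read, the caller awaits more bytes from the socket or S3 body and then retries.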

@zslayton

We now have non-blocking text and binary readers, but they can't be constructed with the ReaderBuilder API yet. Once that's done (see #484), we can close this issue.
