
Infinite loop (and memory usage) #1815

Open
jneem opened this issue Feb 9, 2024 · 8 comments

Comments

jneem (Member) commented Feb 9, 2024

When trying to export the following file, nickel gets stuck and starts using up all my memory:

{ y = { foo = y } }
jneem added the type: bug label Feb 9, 2024
yannham (Member) commented Feb 14, 2024

I mean, it is an infinite loop, right? Although we should probably be able to detect that without black-holing, and I'm not sure why we don't.

What behavior do you think would be right?

jneem (Member, Author) commented Feb 14, 2024

I'd expect the same infinite recursion error that you get from, say, let rec y = y in y.
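For reference, that error comes from classic black-holing: a thunk is marked while it is under evaluation, and demanding it again in that window means the recursion is not productive. A minimal Python sketch of the idea (a toy model for illustration only; Thunk and InfiniteRecursion are invented names, not Nickel's actual internals):

```python
class InfiniteRecursion(Exception):
    """Raised when a thunk is demanded while already under evaluation."""

class Thunk:
    """Toy lazy thunk with black-holing (illustrative, not Nickel's internals)."""

    def __init__(self, compute):
        self.compute = compute    # zero-argument function producing a value
        self.state = "suspended"  # suspended | blackhole | evaluated
        self.value = None

    def force(self):
        if self.state == "blackhole":
            # Demanded again before evaluation finished: infinite recursion.
            raise InfiniteRecursion("infinite recursion")
        if self.state == "suspended":
            self.state = "blackhole"
            self.value = self.compute()
            self.state = "evaluated"
        return self.value

# let rec y = y in y: forcing y immediately demands y again.
y = Thunk(lambda: y.force())
try:
    y.force()
    caught = False
except InfiniteRecursion:
    caught = True

print(caught)  # prints True
```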

yannham (Member) commented Feb 16, 2024

I haven't tested this hypothesis, but I believe it's to be expected that black-holing doesn't detect such loops. In fact, as long as you don't use deep_seq (but both nickel eval and nickel export do use it, which is why it's looping), this term doesn't loop, in that it's always productive: if we evaluate y, we get a lazy record { foo = y }. If we then evaluate foo, we get in one step the record { foo = y }, and so on. It's basically an infinite record { y = { foo = { foo = { foo = ..., which is, at each step, at most one variable substitution away from being in weak head normal form.

So, evaluating y always terminates, and doesn't trigger the infinite recursion detection.

However, this is different when forcing/deep_sequing the record, of course, because we are now trying to recurse into an infinite data structure. I wonder if we can extend the blackholing mechanism to handle this case, by forbidding thunks to be evaluated again while we're still forcing the children of this thunk.
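One way to picture that extension is a toy model in Python (self-contained and purely illustrative: Thunk, deep_seq and the forcing flag are all made up here, not Nickel's evaluator). Each thunk carries a marker that deep_seq sets while it is still forcing the thunk's children, and demanding a marked thunk is treated like hitting a black hole:

```python
class InfiniteRecursion(Exception):
    """Raised when a thunk is demanded while under evaluation or being forced."""

class Thunk:
    """Toy lazy thunk (illustrative only, not Nickel's actual machinery)."""

    def __init__(self, compute):
        self.compute = compute
        self.state = "suspended"  # suspended | blackhole | evaluated
        self.forcing = False      # True while deep_seq forces our children
        self.value = None

    def force(self):
        if self.state == "blackhole" or self.forcing:
            raise InfiniteRecursion("infinite recursion")
        if self.state == "suspended":
            self.state = "blackhole"
            self.value = self.compute()
            self.state = "evaluated"
        return self.value

def deep_seq(thunk):
    """Force to weak head normal form, then recurse into record fields,
    forbidding the thunk from being demanded again in the meantime."""
    value = thunk.force()
    if isinstance(value, dict):  # records modelled as dicts of thunks
        thunk.forcing = True
        for child in value.values():
            deep_seq(child)
        thunk.forcing = False
    return value

# { y = { foo = y } }: the thunk of foo re-demands the thunk of y.
y = Thunk(lambda: {"foo": Thunk(lambda: y.force())})
outer = Thunk(lambda: {"y": y})

outer.force()         # evaluating to WHNF terminates, as described above
try:
    deep_seq(outer)   # recursing into the infinite record is now caught
    detected = False
except InfiniteRecursion:
    detected = True

print(detected)       # prints True
```

Without the forcing flag, the same deep_seq would recurse into { foo = { foo = ... without bound, which matches the observed behavior of nickel export.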

D-Brox commented Mar 19, 2024

Related to #1722

yannham (Member) commented Sep 26, 2024

I think this is also related to #1967: both the issue and the solution should be pretty similar.

jneem (Member, Author) commented Sep 27, 2024

I had a look at this today, but I couldn't figure out a version of:

forbidding thunks to be evaluated again while we're still forcing the children of this thunk

that works. For example, this input is fine:

{ y = { foo = z }, z = 1 }

but when forcing the child { foo = z }, we end up needing to re-evaluate the outermost thunk.

I guess one thing that might work (and maybe helps in #1967 too) is to have special handling for a field that evaluates to a parent thunk. I don't see how to fit this into the existing black-holing framework though; it seems like it might need an extra stack?
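To make the failure mode concrete, here is a self-contained toy model in Python of that naive scheme (Thunk, deep_seq and the forcing flag are invented for illustration, not Nickel's internals). The variable z inside y is modelled as a lookup through the enclosing record's thunk, so forcing the child { foo = z } re-demands the outermost thunk and trips a spurious error on a perfectly fine program:

```python
class InfiniteRecursion(Exception):
    """Raised when a thunk is demanded while under evaluation or being forced."""

class Thunk:
    """Toy lazy thunk (illustrative only, not Nickel's actual machinery)."""

    def __init__(self, compute):
        self.compute = compute
        self.state = "suspended"  # suspended | blackhole | evaluated
        self.forcing = False      # True while deep_seq forces our children
        self.value = None

    def force(self):
        if self.state == "blackhole" or self.forcing:
            raise InfiniteRecursion("infinite recursion")
        if self.state == "suspended":
            self.state = "blackhole"
            self.value = self.compute()
            self.state = "evaluated"
        return self.value

def deep_seq(thunk):
    """The naive scheme: recurse into record fields while forbidding the
    parent thunk from being demanded again."""
    value = thunk.force()
    if isinstance(value, dict):  # records modelled as dicts of thunks
        thunk.forcing = True
        for child in value.values():
            deep_seq(child)
        thunk.forcing = False
    return value

# { y = { foo = z }, z = 1 }: z inside y is resolved through the outer
# record, so the thunk of foo demands the outermost thunk again.
z = Thunk(lambda: 1)
y = Thunk(lambda: {"foo": Thunk(lambda: outer.force()["z"].force())})
outer = Thunk(lambda: {"y": y, "z": z})

try:
    deep_seq(outer)
    spurious = False
except InfiniteRecursion:
    spurious = True  # false positive: this program should evaluate fine

print(spurious)      # prints True
```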

yannham (Member) commented Sep 27, 2024

I have been looking at #1967 on my side, actually 😅. My idea is to black-hole the thunk you're currently evaluating, that is, the thunk of the field, not the thunk of the whole record. In other words, before starting to evaluate y, peek at the value (unless it's a very simple constant, it must be a Closure; that's an invariant of the evaluation) and black-hole it. Of course, it's a bit more complicated than that: if you just do that, evaluation will cause an immediate infinite recursion error, but I think we can make it work. I'll experiment and report back.

yannham (Member) commented Oct 2, 2024

Reproducing @jneem's remark about using the result of #2055 to fix this issue as well, but without a performance penalty: #2055 (review)
