Type instability in unflatten for medium / large tuples #600
Comments
I think this is an issue that requires some discussion (preferably involving @devmotion and @yebai too). We've previously been a bit too aggressive with sprinkling generated functions around, and whether that is indeed the correct approach is unclear 🙃 For example, I think maybe just 10 arguments is a bit "too few", while I'd be hard pressed to think of models where I can see us ending up with more than 32 tilde statements. And yeah, I guess a super-easy approach: make the changes in a PR, and then see how it affects compile times? 🤷
Yeah, that's fair. I agree it would have to be a rather large model for there to be more than 32 tilde statements. That being said, it is worth asking what we would like to happen in that case -- do we want inference to fall over, or do we want compilation to take slightly longer?
My general stance is that one should avoid fighting the compiler's heuristics. Of course, the concrete case is slightly different, because there might be instances where the default compiler thresholds seem just a tiny bit too low. Even in such cases I try to stay away from generated functions.
I agree with this in general. However, I'm unaware of another approach to handling largish collections of heterogeneous data in a type-stable way. Is there an alternative that I'm unaware of? Put differently, what should `unflatten` do instead?
I agree that this is something you have to keep on top of.
How should we get around using generated functions in such cases? Again, I might just be missing a good approach here...
Naive question: what is
See here :) |
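For context, the `ntuple` inference behaviour described in the report below can be demonstrated directly. The snippet is a hypothetical illustration (not the thread's actual MWE), assuming only Base's `ntuple` and the `Test` stdlib's `@inferred`:

```julia
using Test  # for @inferred

# Hypothetical example: `ntuple(f, n)` has hand-written, inferable
# branches only for n ≤ 10. When the length is known from the tuple's
# type, constant propagation selects the right branch and inference
# succeeds; for lengths above 10 it falls back to a non-specialised
# method whose return type is inferred only as a loose `Tuple`.
add_one(x) = ntuple(i -> x[i] + 1, length(x))

small = ntuple(identity, 10)   # NTuple{10, Int}: length const-props
@inferred add_one(small)       # passes: inferred as NTuple{10, Int}

large = ntuple(identity, 11)   # NTuple{11, Int}
# @inferred add_one(large)     # would throw: hits the fallback branch
```

This matches the report's observation that shrinking `x` to length 10 makes the method infer correctly.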
MWE:

Note that if you change the length of `x` to 10, this infers correctly. The use of `ntuple` in the implementation of this method of `unflatten`, here, means that inference bails out after 10 statements. You'll also hit another inference bailout at length 32, I believe, when `map` and `cumsum` hit their inference heuristics.

Assuming that in both instances you wish to avoid having inference bail out, you'll need to make use of generated functions. I've had to tackle this a fair bit in Tapir, and I've found that an effective strategy is to write a small number of generated functions which are higher-order, and to make use of them throughout. For example, I define a function called `tuple_map`, which is basically just `map`, but restricted to tuples, and which forces the compiler to specialise for any length of tuple. You also need `cumsum` to specialise, so you'll need something more than just a `map` replacement, of course.

I'd be happy to take a punt at this if you'd be interested @torfjelde?