In an early phase (adjs.lua:387), the Céu compiler detects the _ and replaces it with a hardwired int.
Type compatibility is checked much later (stmts.lua:506).
Alas, I'm not familiar enough with Céu's compiler internals or with Lua to know whether adjs.lua could leave a placeholder there, or supply a dummy type that gets overwritten later.
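To make the effect concrete, a minimal sketch (the loop body is just a placeholder):

```
// what the programmer writes: an anonymous control variable
loop _ in [0 -> 10] do
    nothing;
end

// after adjs.lua, '_' is effectively a hardwired "int" control variable,
// independent of the type of the range bounds
```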
Also, I'm not at all familiar with how this compiler infers types in the presence of literals. What integer type should be introduced in the following cases (written out as full loops after the list)?
[0 -> 10], 1      (I hope for the best: u8)
[0 -> 256], 1     (u16?)
[-10 <- 10], 1    (s8?)
native x,y,s; loop _ in [x -> y], s do ... ??
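Written out as complete loops, with the types I would hope for as comments (the bodies are just placeholders):

```
loop _ in [0 -> 10], 1 do       // hoped-for inference: u8
    nothing;
end

loop _ in [0 -> 256], 1 do      // u16? (256 no longer fits in u8)
    nothing;
end

loop _ in [-10 <- 10], 1 do     // s8? (negative bound)
    nothing;
end

native x, y, s;                 // types only known on the C side
loop _ in [x -> y], s do        // ?? Céu knows nothing about the actual types
    nothing;
end
```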
I believe that in the last case (the native one), Céu doesn't know anything about the actual types.
Should that be an error?
A warning ("unknown types, defaulting to int at your own risk")?
Use GCC's typeof(x) extension? (That would not work with native/const literals in C, only variables and derived expressions.)
The following loop doesn't compile:
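(A reconstruction for illustration; the uint bound n is an assumption chosen to match the error message below.)

```
var uint n = 10;            // assumed: some uint-typed bound

loop _ in [0 -> n] do       // '_' is hardwired to "int", but the bound is "uint"
    nothing;
end
```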
Error:
invalid control variable : types mismatch : "int" <= "uint"
Version: 7beb2ad