Make comment tokens available in parser #25
Hm, this is an excellent point! Implementation-wise it might be pretty hard to use literal comments here because of how they're handled currently (entirely skipped). The parser would have to grow a good bit of logic to always skip ahead to the next non-comment token, but only sometimes, because comments are only sometimes ignored. I'll be honest, though: I don't really have any past experience writing parsers that handle comments in a great way (other than simply skipping them), so there might be a pretty clever way to encode this that I'm not thinking of! Stylistically, though, I can definitely see the appeal of just using comments, because that allows some pretty concise definitions! Perhaps a leaf could be taken out of Rust's book here? A convention could be used for "doc comments", which are only allowed to appear in some locations, while other comments continue to get ignored. I think it all sort of depends on what syntax you're thinking of using. Mind gisting what you're thinking in terms of syntax?
A "doc comment" convention seems fine. How about `;;;`? That would look something like this: ;;; Close a file descriptor.
;;; Note: This is similar to `close` in POSIX.
(@interface func (export "fd_close")
(param $fd $fd_t)
(result $error $errno_t)
)
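For illustration, a `;;;` convention like the one above could be recognized with a tiny helper along these lines; this is a hedged sketch, and the function name and exact stripping behavior are assumptions, not actual witx or `wat` code:

```rust
/// If `comment` is a doc comment (i.e. it starts with ";;;"), return its
/// text with the leading delimiter and one optional space stripped;
/// otherwise return None. Hypothetical helper for illustration only.
fn doc_comment_text(comment: &str) -> Option<&str> {
    let rest = comment.strip_prefix(";;;")?;
    Some(rest.strip_prefix(' ').unwrap_or(rest))
}
```

Ordinary `;;` comments fall through to `None`, so they would keep being ignored as they are today.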
This commit adds functionality to `Parser` for parsing comments in the source, allowing access to `Comment` tokens from a `Cursor`. This in turn should allow scraping comments on items, perhaps for documentation and other purposes. All existing methods on `Cursor` will implicitly skip whitespace/comments (as they do today). Additionally, `Parser::is_empty` will only take `Token` tokens into account. Finally, `cur_span` will also only return the span for `Token` tokens, rather than comment tokens. These latter pieces may want to get tweaked over time... Closes #25
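As a rough sketch of the shape this could take (all type and method names here are illustrative assumptions, not the real `wat` crate API), a cursor over a pre-lexed token stream might skip comments in its normal token method while still exposing them on demand:

```rust
// Hedged sketch of a cursor that skips comments by default but lets
// callers opt in to observing them; names are hypothetical.

enum Token {
    Comment(String), // e.g. a ";;" line comment
    Token(String),   // any other (non-whitespace) token
}

struct Cursor<'a> {
    tokens: &'a [Token],
    pos: usize,
}

impl<'a> Cursor<'a> {
    fn new(tokens: &'a [Token]) -> Self {
        Cursor { tokens, pos: 0 }
    }

    /// Return the comment at the current position, if any, advancing past
    /// it. Callers that want doc comments call this explicitly.
    fn comment(&mut self) -> Option<&'a str> {
        match self.tokens.get(self.pos) {
            Some(Token::Comment(s)) => {
                self.pos += 1;
                Some(s.as_str())
            }
            _ => None,
        }
    }

    /// Skip past any comments and return the next real token, mirroring
    /// how parse methods would implicitly ignore comments by default.
    fn token(&mut self) -> Option<&'a str> {
        while let Some(Token::Comment(_)) = self.tokens.get(self.pos) {
            self.pos += 1;
        }
        match self.tokens.get(self.pos) {
            Some(Token::Token(s)) => {
                self.pos += 1;
                Some(s.as_str())
            }
            _ => None,
        }
    }
}
```

With this shape, `token()` preserves today's comment-skipping behavior, while a doc-comment-aware caller can probe `comment()` before parsing an item.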
Ok, after thinking about this I don't think it'll actually be all that hard! I've added a sample implementation of this at https://github.com/alexcrichton/wat/pull/26, with a test which shows a bit of how it might work. The comment token text includes the leading comment delimiters. Mind taking a look at that and seeing if it'd suffice for what y'all are thinking?
This is excellent, thank you! I left out some of the details, like inspecting for the `;;;` doc-comment prefix.
👍 Thanks for prototyping!
@sunfishcode and I talked about putting the canonical WASI docs into the witx file format as comments, in the style of Rust doc comments. Alternatively, we could put the docs into the s-expression syntax as strings.
If we were to go down the doc-comment route, how do you feel about exposing comments to the parser?