
Documention #2

Open · 24 of 49 tasks
TheDan64 opened this issue Jun 29, 2017 · 33 comments

Comments

TheDan64 (Owner) commented Jun 29, 2017

  • Full documentation:
    • attributes.rs: 8ecc815
    • basic_block.rs: ed7175c
    • builder.rs
    • comdat.rs: e185481
    • context.rs: e4c218c
    • data_layout.rs
    • debug_info.rs
    • execution_engine.rs
    • lib.rs
    • memory_buffer.rs
    • module.rs: 28a1de3
    • object_file.rs
    • passes.rs
    • support/
      • error_handling.rs: 8ecc815
      • mod.rs
    • targets.rs
    • types/
    • values/
      • array_value.rs: e4fc903
      • basic_value_use.rs
      • call_site_value.rs: ff941af
      • enums.rs
      • float_value.rs
      • fn_value.rs
      • generic_value.rs
      • global_value.rs
      • instruction_value.rs
      • int_value.rs
      • metadata_value.rs
      • mod.rs
      • phi_value.rs
      • ptr_value.rs
      • struct_value.rs
      • traits.rs
      • vec_value.rs
  • Tari's example converted to Inkwell in the README
  • Google Sheet Notes on what is or isn't supported yet: link
  • Complete a brief (general overview) CHANGELOG.md for 0.1.0
  • Kaleidoscope example
  • Docs on GH pages: https://thedan64.github.io/inkwell/
TheDan64 added this to the 0.1.0 milestone Jun 29, 2017
TheDan64 changed the title from "Examples in Documention" to "Documention" Jul 13, 2017
71 (Contributor) commented Sep 29, 2017

I'm currently working on the Kaleidoscope tutorial using Inkwell, and I think it would make a great self-documenting example. Any chance this would get added to the repo, once it fully works?

TheDan64 (Owner, Author) commented Sep 29, 2017

Definitely! I was actually thinking the same thing. I just can't decide if it should be a test or if it should live as its own binary in a /examples directory. The former has the advantage of being run in every build, so we can easily ensure it compiles every time. But the latter has the advantage of being more easily noticeable when someone is browsing the repo or root directory. They can also build it themselves and try it out if we add a simple CLI.

For reference, the example in the readme is also duplicated as a test but I think it's small enough that it isn't a big deal. A Kaleidoscope example would be much larger, for sure. I will add Kaleidoscope to the checklist in the original post.

71 (Contributor) commented Sep 29, 2017

I think living in the /examples directory is definitely a plus. Furthermore, cargo test automatically compiles examples without running them (or at least can be configured to do so), which removes this problem.

I'm actually almost done with the first three chapters of the tutorial: the basic lexer, parser, and compiler are more-or-less working.

TheDan64 (Owner, Author) commented:

Oh neat, I didn't know that. Let's do that then. You can put it in /examples/kaleidoscope

Michael-F-Bryan (Contributor) commented:

Are there any plans to get Travis to upload docs for master to GitHub Pages after every commit? It'd probably make hacking on the repository or importing inkwell directly from git a lot easier.

As an example, this is what we've got set up for mdbook.

TheDan64 (Owner, Author) commented Mar 8, 2018

No, I've never even thought about doing this, as I'm not really familiar with how GitHub Pages works.

That said, it does sound like an excellent idea, so thanks! I'll add it to the above checklist and try to prioritize it.

So if I understand correctly, you're basically committing the docs from cargo doc to the gh-pages branch? And GH pages will automatically host that?

Could you also clarify how it would make importing inkwell from git easier?

Michael-F-Bryan (Contributor) commented:

> So if I understand correctly, you're basically committing the docs from cargo doc to the gh-pages branch? And GH pages will automatically host that?

Yep. Essentially everything in the gh-pages branch will be served up by GitHub Pages automatically. You do need to generate an API token (I'm pretty sure you need the public_repo permission) to allow Travis to push to GitHub, though.

I recently found out Travis made the process even easier by adding the GitHub Pages provider.

deploy:
  provider: pages
  skip-cleanup: true
  github-token: $GITHUB_TOKEN  # Set in travis-ci.org dashboard, marked secure
  keep-history: false  # this tends to really bloat your .git directory with useless docs churn
  local-dir: target/doc/
  on:
    branch: master

> Could you also clarify how it would make importing inkwell from git easier?

Having access to an online version of the docs is useful when you need to link to things or want to browse the docs without needing to run cargo doc --open from the command line. As someone who doesn't mind trying out bleeding edge crates, it's nice to be able to browse a library's API before downloading it.

That said, this looks really cool! I'll try it out for a toy programming language I've been making in my spare time and see if there's anything I can do to help out.

TheDan64 (Owner, Author) commented:

Thanks for all this helpful info! I will be sure to look into doing this! And any assistance you can provide would be greatly appreciated.

Michael-F-Bryan (Contributor) commented:

I thought I'd make a more in-depth guide to using inkwell for making programming languages. It's not as simple as @6a's kaleidoscope example (I want to briefly mention parsing with lalrpop and linking to C libraries), but it'll probably serve as a good example of how to use inkwell in a wider program.

Depending on how far I get, it may be a useful link to add to inkwell's README.

TheDan64 (Owner, Author) commented:

That is super cool! Please feel free to file issues for any stumbling blocks you find along the way.

71 (Contributor) commented Mar 10, 2018

Oh, that's great! I hope you manage to get FFI through LLVM working, because I didn't manage to with the recommended methods.

Michael-F-Bryan (Contributor) commented:

I've finished the parsing section so next up is the fun part... Generating LLVM IR 😁

Do you guys have any suggestions for turning an AST into an LLVM Module? The compile_expr() function in examples/kaleidoscope/main.rs is essentially a 300-odd line match statement, which makes it pretty hard to understand how the IR builder is being used.

71 (Contributor) commented Mar 10, 2018

I went with a monolithic block of code because I stored expressions under a single enum, but if you're doing this differently you could go with the Visitor pattern, which splits the code gen into multiple functions.

Update: Looks like that's what you went with. You could still go with a single function and a match to take care of it, since your language is much simpler than the kaleidoscope one. Throwing out the if-then-else expression and for-in statements would make the compile_expr method much smaller.
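
For illustration, here is a minimal, hypothetical sketch of the shape being discussed: a small compile_expr entry point that matches on the expression enum and dispatches to one helper per variant, instead of one 300-odd-line match. The Expr and Compiler types and all method names are invented for this sketch and are not part of inkwell's API; the bodies stand in for calls to the IR builder.

// Hypothetical AST; in a real compiler the variants would mirror the language grammar.
enum Expr {
    Number(f64),
    Variable(String),
    Binary { op: char, lhs: Box<Expr>, rhs: Box<Expr> },
}

// Hypothetical codegen context; a real one would hold the inkwell Builder, Module, etc.
struct Compiler;

impl Compiler {
    // Small dispatcher: the match stays tiny because each arm delegates to a helper.
    fn compile_expr(&mut self, expr: &Expr) -> Result<f64, String> {
        match expr {
            Expr::Number(n) => self.compile_number(*n),
            Expr::Variable(name) => self.compile_variable(name),
            Expr::Binary { op, lhs, rhs } => self.compile_binary(*op, lhs, rhs),
        }
    }

    fn compile_number(&mut self, n: f64) -> Result<f64, String> {
        Ok(n) // would emit an LLVM float constant via the builder
    }

    fn compile_variable(&mut self, name: &str) -> Result<f64, String> {
        Err(format!("unknown variable `{}`", name)) // would look up a previously emitted value
    }

    fn compile_binary(&mut self, op: char, lhs: &Expr, rhs: &Expr) -> Result<f64, String> {
        let l = self.compile_expr(lhs)?;
        let r = self.compile_expr(rhs)?;
        match op {
            '+' => Ok(l + r), // would emit a float add instruction
            '-' => Ok(l - r), // would emit a float sub instruction
            _ => Err(format!("unknown operator `{}`", op)),
        }
    }
}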

Michael-F-Bryan (Contributor) commented Mar 11, 2018

Another important point that should be documented is inkwell's lifetimes/ownership story. Here are a couple questions I found myself asking when working on the AST->IR translation:

  • Is it safe for my Module to outlive the Context it was created with? (e.g. create a temporary Context in a function, use it to create a Module then return it, allowing the Context to be dropped)
  • Does LLVM use something like shared_ptr internally to make sure things don't get dropped too early?
  • Who owns the value and type nodes, and how long do they live for? For example you can point to a FloatValue or BasicBlock multiple times so presumably under the hood we're creating some sort of complex graph. Does LLVM use some sort of arena strategy where all nodes are owned by the Context and use raw pointers to link the graph nodes?
  • I got a segfault when trying to create an execution engine from a Module, before I'd even reached my unsafe code. Was this because I've somehow invalidated an assumption or requirement of the execution engine? If so, how can we encode this assumption in the type system so it's statically impossible to happen?

Michael-F-Bryan (Contributor) commented:

I also like enabling several lints on my projects; the missing_docs lint is awesome because it makes sure you don't accidentally leave parts of your API undocumented. This is the list of lints I usually use (taken from here):

#![deny(missing_docs,
        missing_debug_implementations, missing_copy_implementations,
        trivial_casts, trivial_numeric_casts,
        unsafe_code,
        unstable_features,
        unused_import_braces, unused_qualifications)]

TheDan64 (Owner, Author) commented Mar 12, 2018

> Is it safe for my Module to outlive the Context it was created with? (e.g. create a temporary Context in a function, use it to create a Module then return it, allowing the Context to be dropped)

Yes, in Inkwell it is safe to do so, because we ref-count non-global Contexts so that they stay alive for their Modules even if the Context struct is dropped. Not 100% sure if this is right, but it seems to work:
https://github.com/TheDan64/inkwell/blob/master/tests/test_module.rs#L162-L185

> Does LLVM use something like shared_ptr internally to make sure things don't get dropped too early?

I don't know. I think LLVM does a lot of different and weird things. I'm sure they do use shared_ptrs, but I doubt that's the only memory management strategy they have going on.

> Who owns the value and type nodes, and how long do they live for? For example you can point to a FloatValue or BasicBlock multiple times so presumably under the hood we're creating some sort of complex graph. Does LLVM use some sort of arena strategy where all nodes are owned by the Context and use raw pointers to link the graph nodes?

As best as I've been able to tell, types and values are sort of LLVM singletons, but owned by a Context. IIRC they don't segfault if you drop that Context and then try to use them, though (we should test this if we don't already...).

> I got a segfault when trying to create an execution engine from a Module, before I'd even reached my unsafe code. Was this because I've somehow invalidated an assumption or requirement of the execution engine? If so, how can we encode this assumption in the type system so it's statically impossible to happen?

Not sure without seeing code. It would be great if you could open an issue with a sample of code that replicates the segfault. But yeah - this would ideally be encoded into the type system
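
As a rough illustration of the ownership relationships discussed above, here is a minimal sketch using inkwell API names as they appear in recent releases (which may differ from the API at the time of this thread): the Context is the factory for modules, builders, and types, and the JIT execution engine is created from the Module.

use inkwell::context::Context;
use inkwell::OptimizationLevel;

fn main() {
    // The Context is the entry point; types, values, and modules created from it belong to it.
    let context = Context::create();

    // Modules, builders, and types are all created through the Context.
    let module = context.create_module("demo");
    let _builder = context.create_builder();
    let _i64_type = context.i64_type();

    // The JIT execution engine is created from the Module; a module that violates the
    // engine's assumptions is where errors tend to surface.
    let _engine = module
        .create_jit_execution_engine(OptimizationLevel::None)
        .expect("failed to create JIT execution engine");
}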

TheDan64 (Owner, Author) commented:

@Michael-F-Bryan I'm trying to get docs hosted on Pages as you suggested, but for some reason it only displays the inkwell readme: https://thedan64.github.io/inkwell/

...but the gh-pages branch doesn't have a readme, just the inkwell docs: https://github.com/TheDan64/inkwell/tree/gh-pages
Travis config: https://github.com/TheDan64/inkwell/blob/pages_docs/.travis.yml#L82-L89

Any ideas? I'm a bit at a loss here.

Michael-F-Bryan (Contributor) commented:

I'm pretty sure I know what it is. If you go to your project's settings and scroll down, there's an option which tells GitHub Pages which branch to use when hosting stuff.

So you go to this tab:

[screenshot: repository Settings tab]

And towards the end, before the "danger zone" section, you should see something like this:

[screenshot: GitHub Pages source branch setting]

TheDan64 (Owner, Author) commented:

That was it, thanks!!

TheDan64 (Owner, Author) commented:

@Michael-F-Bryan Docs are hosted here, though they only show the latest LLVM version, since there's a flaw in rustdoc regarding documenting multiple conflicting features. rust-lang/rust#43781 might work, but it's not yet stable, and I don't wish to make inkwell nightly-only just for that. Thanks again for the suggestion!

TheDan64 pinned this issue Apr 25, 2020
imbrem commented Jun 23, 2020

If you want, I can help by writing some docs, since I've been using this library a lot. Tell me what needs documenting most and I'll submit a pull request.

TheDan64 (Owner, Author) commented:

Thanks for offering to help! Feel free to pick anything from the list in the OP. The files in the values dir are probably the most straightforward/simplest to document.

Michael-F-Bryan (Contributor) commented:

I've found the easiest way to do this sort of thing is by adding #![deny(missing_docs)] to the top of lib.rs and following the compile errors. Then, once you've documented a couple dozen methods and are starting to get bored, you remove the attribute so the crate compiles again and you can merge the updated docs into master.
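
To make that workflow concrete, here is a tiny hypothetical snippet (not taken from inkwell) showing what the lint enforces: with the attribute at the crate root, any public item without a /// doc comment becomes a compile error until it is documented.

// Crate root (lib.rs) of a hypothetical crate.
#![deny(missing_docs)]

//! Crate-level documentation; `missing_docs` requires this at the root as well.

/// Adds two signed integers. Documented, so it passes the lint.
pub fn add(a: i64, b: i64) -> i64 {
    a + b
}

/// Public constants need doc comments under the lint too.
pub const ANSWER: i64 = 42;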

Dominilk commented:

Hey,

I have been looking through the documentation for quite a while now and I just can't find anything about lazy JIT compilation. Is any available, or can lazy JIT compilation be implemented using this wrapper library?

Thanks,
Dominik

TheDan64 (Owner, Author) commented:

I don't believe the LLVM C API supports any lazy JIT. Unless you mean Orc JIT - I do see LLVMOrcCreateLazyCompileCallback, but Orc isn't really beyond the experimental stage in inkwell

Dominilk commented:

> I don't believe the LLVM C API supports any lazy JIT. Unless you mean Orc JIT - I do see LLVMOrcCreateLazyCompileCallback, but Orc isn't really beyond the experimental stage in inkwell

Yes, I do mean Orc JIT, and I think it's an important part of LLVM, as JIT-compiling everything in advance (including sections that are not even used) is extremely inefficient. So to what extent is it already usable in inkwell?

TheDan64 (Owner, Author) commented:

Dominilk commented:

Are there any plans to improve support soon?

Dominilk commented:

Hi, I really need the ORC layer for my project and I think it's also quite an important feature. Should I create an issue so that people actually notice that this is still work to be done?

TheDan64 (Owner, Author) commented:

Unfortunately I do not currently have the free time to implement additional ORC support. I would certainly be willing and able to accept PRs which improve the existing implementation and add additional tests

ayazhafiz pushed a commit to ayazhafiz/inkwell that referenced this issue Mar 10, 2022
TheDan64 modified the milestones: 0.1.0, 0.2.0 Mar 29, 2022
TheDan64 unpinned this issue Jul 16, 2022
folkertdev added a commit to folkertdev/inkwell that referenced this issue Dec 19, 2022
Boscop commented Apr 8, 2023

@Dominilk

> Yes I do mean Orc JIT and I think it's an important part of LLVM as JIT compiling everything in advance (including sections that are not even used) is extremely inefficient.

Yes, do you know which crate would be better to use when on-request compilation is desired? Is only llvm-sys capable of this right now?

@TheDan64 Btw, for a use case where on-request compilation is not required, but it is required to link a pre-compiled runtime.ll file together with the generated LLVM IR, can this be done with inkwell?
(Or can it not be used for JIT-linking?)
The runtime.ll would result from compiling a runtime lib written in Rust with rustc --emit=llvm-ir (or cargo asm --llvm --lib), whereas the other IR code comes from compiling a custom language to LLVM IR. The runtime contains code that spawns a thread pool and processes, in parallel, reducible expressions coming from a work-stealing queue. It's the same for every program in this custom language, so it can be written in Rust and pre-compiled, but then it needs to be JIT-linked.
What do you think about this approach in general? Is there a better way to link to a multi-threaded runtime written in Rust when compiling/JITing a custom language that needs this runtime?
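
Purely as a sketch of the linking step being asked about (the thread doesn't confirm this workflow), inkwell does expose MemoryBuffer::create_from_file, Context::create_module_from_ir, and Module::link_in_module, so the approach might look roughly like this; the runtime.ll path and the empty "program" module are hypothetical stand-ins.

use std::path::Path;

use inkwell::context::Context;
use inkwell::memory_buffer::MemoryBuffer;
use inkwell::OptimizationLevel;

fn main() {
    let context = Context::create();

    // Stand-in for the module produced by compiling the custom language.
    let program = context.create_module("program");

    // Load and parse the pre-compiled runtime IR (path is hypothetical).
    let buffer = MemoryBuffer::create_from_file(Path::new("runtime.ll"))
        .expect("failed to read runtime.ll");
    let runtime = context
        .create_module_from_ir(buffer)
        .expect("failed to parse runtime IR");

    // Fold the runtime into the generated module, then JIT the combined module.
    program
        .link_in_module(runtime)
        .expect("failed to link runtime module");

    let _engine = program
        .create_jit_execution_engine(OptimizationLevel::None)
        .expect("failed to create JIT execution engine");
}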

KKould commented Jan 1, 2024

> [quoted @Boscop's comment above in full]

I'm also looking for a similar library for Rust.

TheDan64 (Owner, Author) commented Jan 1, 2024

@Boscop I'm not sure, unfortunately. But if it's possible in the LLVM C API, we should be able to do it in inkwell.

Please, everyone, let's keep this thread on topic: documentation. Feel free to create a question issue if there isn't already one for this matter.

TheDan64 modified the milestones: 0.5.0, 0.6.0 Jul 2, 2024