
jbuild plugins #56

Closed
ghost opened this issue Apr 10, 2017 · 12 comments

Comments

@ghost

ghost commented Apr 10, 2017

It is currently impossible for users to extend the jbuild syntax. The ml syntax offers an escape hatch but you have to switch the whole file. One possible way to allow this would be:

(plugin:foo <arg>)

where foo would be a command that would take the S-expression <arg> on stdin and output some jbuild syntax on stdout, which would be more or less inlined in the file. Possibly it could use a different version of the spec.

For that to work well with the jenga integration, the command would have to output what files it reads and globs it evaluates.
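A minimal sketch of what such a plugin executable could look like, in OCaml. Everything concrete here is invented for illustration: the generated `(rule ...)` stanza shape, the `gen` command, and the output path are assumptions, not part of the proposal.

```ocaml
(* Hypothetical sketch of an executable plugin under this proposal: it
   receives the <arg> S-expression on stdin and prints generated jbuild
   syntax on stdout. A real plugin would parse the argument as a sexp,
   and, for the jenga integration, also report the files it read and
   the globs it evaluated. *)

(* Turn the raw argument into a rule stanza (shape invented). *)
let generate arg =
  Printf.sprintf
    "(rule ((targets (out.ml)) (deps %s) (action (run gen ${<}))))" arg

let () =
  (* Read the <arg> sexp from stdin; fall back to "()" at EOF. *)
  let arg = try read_line () with End_of_file -> "()" in
  print_endline (generate arg)
```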

@rgrinberg
Member

I had a discussion with @lpw25 about the subject of jbuilder plugins and he has
an idea that is basically a superset of this. I've summarized his ideas below:

We extend the notion of plugins to be dynlinked libraries rather than just
executables. Such dynlinked libraries would register a handler. A plugin foo
registering a handler for bar would handle sexps of the form:

(foo:bar ...)

This would be the registration and plugin api:

module V1 : sig
  type 'a context =
    | Sexp : Sexp.t context
    | Ordered_set_lang : Ordered_set_lang.t context
    | Rules : Rules.t context

  type handler = {
    handle : 'a. 'a context -> Sexp.t -> 'a;
  }
  
  val interpret_set : Sexp.t -> Ordered_set_lang.t
  val interpret_sexp : Sexp.t -> Sexp.t
  val interpret_rules : Sexp.t -> Rules.t

  val register_extension : string -> (Version.t -> handler) -> unit
end

The idea is that every one of the types in context has a jbuilder function
handling it. This function would now be extensible by calling out to the
appropriate extension whenever it encounters a tagged sexp.

interpret_{rules,set,sexp} are useful for extensions that would like to reuse
other plugins to generate rules. They could simply generate appropriate tagged
sexps and call this function.

As an example, here's how a simplified library stanza would be implemented as a
plugin. Note how calling Jbuilder.interpret_{sexp,set} leaves our plugin
extensible by other plugins.

module Jbuilder = Jbuilder.V1

type lib = ...

let add_name : string -> lib -> lib = ...

let add_flags : Jbuilder.Ordered_set_lang.t -> lib -> lib = ...

let rules_for_lib : lib -> Jbuilder.Rules.t = ...

let handle_library_arg acc arg =
  match Jbuilder.interpret_sexp arg with
  | List [Atom "name"; Atom name] -> add_name name acc
  | List (Atom "flags" :: flags) ->
     add_flags (Jbuilder.interpret_set (List flags)) acc
  | sexp -> Jbuilder.Error.unexpected sexp "library stanza"

let rules_handler sexp =
  match sexp with
  | List (Atom "library" :: args) ->
     let lib = List.fold_left handle_library_arg empty_lib args in
     rules_for_lib lib
  | ...

let handler (type a) (context : a Jbuilder.context) sexp : a =
  match context with
  | Rules -> rules_handler sexp
  | ...

let () = Jbuilder.register_extension "jbuild" (fun _version -> { handler })

You would import jbuild plugins with a stanza like:

(lang (ext1 version) (ext2 version))

The first plugin listed will handle stanzas of the form (bar ..). This is a
useful generalization because it means that the old constructs defined by
jbuilder are no longer special. They are just a language that is built into
jbuilder.

This is important because we'd like to statically know which plugins are going
to be used by a jbuild file. Plugins may call out to other plugins by generating
appropriate sexps (for example), so it's important to know all the plugins that
will be used ahead of time.

Any library will be usable as a plugin, without any prior declaration. This is
so that plugins are usable from findlib libraries.
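To make the tagged-sexp routing concrete, here is a small sketch (assumed behavior, not actual jbuilder code) of how a tag like `foo:bar` could be split into plugin name and stanza name, with untagged stanzas falling through to the first language listed in the `(lang ...)` stanza; `"jbuild"` stands in for that first language here:

```ocaml
(* Sketch only: split a stanza tag such as "foo:bar" into the plugin
   name and the stanza it handles. Untagged stanzas go to the first
   language listed in (lang ...); "jbuild" is a placeholder for it. *)
let split_tag tag =
  match String.index_opt tag ':' with
  | Some i ->
      (String.sub tag 0 i,
       String.sub tag (i + 1) (String.length tag - i - 1))
  | None -> ("jbuild", tag)
```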

@ghost
Author

ghost commented Dec 7, 2017

Well, that's more a discussion about the technical API. For me the first question is the following: when you write (run foo), foo is looked up in the workspace, and for that you need to interpret all jbuild files. If plugins are arbitrary libraries, you need to build them to interpret jbuild files, and to build them you need to interpret jbuild files, you see the issue...

Another thing to discuss: if we make the system too general, jbuild files will essentially become black boxes, and we won't even be able to lint them without building a lot of stuff.

Relying entirely on dynlink is a bit annoying as well.

The question of plugins is a bit complicated if we want to get it right; I think we should have another brainstorming session about it.

@lpw25

lpw25 commented Dec 7, 2017

looked in the workspace and for that you need to interpret all jbuild files

You don't need to interpret all jbuild files -- you only need to interpret the ones needed for building foo. Now you don't know which ones those are in advance, but you just assume that the things using foo in their jbuilds aren't required for building foo and delay their loading until you've built foo.
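The deferral described above can be sketched as a two-stage loop. This is purely illustrative (the function names and the `needs_plugin` predicate are invented), not jbuilder internals:

```ocaml
(* Illustrative sketch of staged loading: interpret only the jbuild
   files that don't need the plugin, build the plugin, then come back
   for the deferred files. *)
let load_in_stages ~needs_plugin ~interpret ~build_plugin jbuilds =
  let deferred, now = List.partition needs_plugin jbuilds in
  List.iter interpret now;      (* enough to build the plugin itself *)
  build_plugin ();
  List.iter interpret deferred  (* the plugin is now available *)
```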

Another thing to discuss is that if we make the system too generalist, jbuild files are essentially going to become blackboxes, and we won't even be able to lint them without building a lot of stuff.

That's going to be true no matter how you approach plugins -- you can't lint things using plugins without building the plugin. But clearly most jbuilds will just use the builtin "jbuild" language and can be linted without building anything.

@ghost
Author

ghost commented Dec 7, 2017

To me this boils down to one choice: whether plugins and libraries should live in the same namespace. There are benefits to both approaches, but one is clearly simpler than the other. And if we start with separate namespaces, we can always migrate to a single namespace later. So we should discuss all this a bit more before starting the work on plugins.

That's going to be true no matter how you approach plugins -- you can't lint things using plugins without building the plugin. But clearly most jbuilds will just use the builtin "jbuild" language and can be linted without building anything.

Not necessarily: the current syntax could be entirely described using a static DSL, and we could require plugins to do the same. Maybe that's not worth it, but I think we should discuss it before we commit to one particular solution.

@rgrinberg
Member

rgrinberg commented Dec 7, 2017 via email

@rgrinberg
Member

rgrinberg commented Dec 7, 2017 via email

@ghost
Author

ghost commented Dec 7, 2017

If we have only one namespace then plugins can be used inside normal libraries and normal libraries can use plugins if they want to.

If we have two namespaces, normal libraries cannot use plugins and plugins cannot use normal libraries. Plugins can only use other plugins.

For instance, you could imagine that all plugins live under jbuild-plugin directories, and all these jbuild-plugin directories are treated as a separate universe that is built before the rest of the tree. The implementation is a lot simpler, and the story is much simpler when you don't have dynlink. On the other hand, you can't use arbitrary libraries inside plugins. Personally I don't think that's too bad, and we could always start this way and later move to a single namespace.

@rgrinberg
Member

OK, I see. Thanks.

, and the story is much simpler when you don't have dynlink.

Aren't plugins unusable without dynlink in either situation? How does the absence of dynlink simplify things between these different universe assumptions?

Also, about your original comment in this issue:

For that to work well with the jenga integration, the command would have to output what files it reads and globs it evaluates.

Could you explain why that is necessary for jenga integration? Would it still be necessary for dynlink-based plugins?

Finally (apologies for the barrage of questions), would we still want to implement plugins as in the original proposal (executables) alongside this new proposal?

@ghost
Author

ghost commented Dec 7, 2017

Aren't plugins unusable without dynlink in either situation? How does the absence of dynlink simplify things between these different universe assumptions?

With the two separate namespaces, the workflow is as follows:

  1. you consider only the jbuild-plugin directories and build all the plugins; this step doesn't require plugins
  2. you link the plugins into jbuilder
  3. you restart the build considering the whole tree

Step 2 can be done either with dynlink, by dynlinking a cmxs into the running jbuilder, or by linking a new jbuilder binary with the plugins statically linked in and chaining execution to this new binary. The two code paths are similar, so it's simple to support both.
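A sketch of the no-dynlink path in step 2. All concrete details here are assumptions for illustration (the `ocamlfind` invocation, the package name, and the output path are invented, not jbuilder's actual relinking logic):

```ocaml
(* Sketch: assemble the command line that would relink a fresh jbuilder
   binary with the plugin archives statically included. After running
   this command, jbuilder would exec the new binary, forwarding its own
   argv, so the build restarts with the plugins available. *)
let relink_command plugin_cmxas =
  ["ocamlfind"; "ocamlopt"; "-package"; "jbuilder"; "-linkpkg"]
  @ plugin_cmxas
  @ ["-o"; "_build/.jbuilder+plugins"]
```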

With a single namespace it is more complicated if you don't have dynlink.

Could you explain why that is necessary for jenga integration? Would it still be necessary for dynlink-based plugins?

Finally (apologies for the barrage of questions), would we still want to implement plugins as in the original proposal (executables) alongside this new proposal?

I don't remember the details off-hand, but no, let's not go in all directions. Plugins as libraries is a better model, so we should only support that.

@lpw25

lpw25 commented Dec 7, 2017

One possibility to get the separate namespaces but still make it easy to use plugins from ocamlfind/opam is to split the namespace with a prefix or suffix. So e.g. every library called foo-jbuild-plugin is considered as the plugin foo and building it cannot use plugins or depend on things other than other plugins.
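The suffix convention could be checked with something like the following sketch (the suffix string is as proposed above; the function name is invented):

```ocaml
(* Sketch: a findlib library named "foo-jbuild-plugin" is treated as
   the plugin "foo"; any other library name is an ordinary library. *)
let plugin_of_library lib =
  let suffix = "-jbuild-plugin" in
  if Filename.check_suffix lib suffix
  then Some (Filename.chop_suffix lib suffix)
  else None
```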

@avsm
Member

avsm commented Dec 15, 2017

It would be good to ensure that we don't require natdynlink support to use a native-code jbuilder out of the box. The link-executable-and-exec approach will work on non-natdynlink platforms as well. Once we have that, what are the benefits of maintaining a dynlink codepath at all? Once the executable is linked, it will be faster for subsequent runs than a repeated dynlink on every build.

@rgrinberg
Member

We now have a fresh plugin proposal #1855 that makes this one obsolete.
