
Builder API for Creating Spark DSLs #51

Open
erikareads opened this issue Aug 9, 2023 · 13 comments
Labels: enhancement (New feature or request)

Comments

@erikareads (Contributor)

Is your feature request related to a problem? Please describe.
Currently, to create a DSL with Spark, you must manually construct and manipulate `Entity` and `Section` structs.
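
For context, a sketch of the struct manipulation this refers to, assuming Spark's documented `Spark.Dsl.Entity` and `Spark.Dsl.Section` structs (the entity, target, and option names here are illustrative, not from Spark itself):

```elixir
defmodule MyLibrary.Dsl do
  # Status quo: entities and sections are defined by hand-building
  # %Spark.Dsl.Entity{} and %Spark.Dsl.Section{} structs.
  @my_entity %Spark.Dsl.Entity{
    name: :my_entity,
    target: MyLibrary.MyEntity,
    args: [:name],
    schema: [
      name: [type: :atom, required: true],
      default: [type: :any, doc: "An optional default value."]
    ]
  }

  @my_section %Spark.Dsl.Section{
    name: :my_section,
    entities: [@my_entity]
  }

  use Spark.Dsl.Extension, sections: [@my_section]
end
```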

Describe the solution you'd like
I propose a builder API that allows the specification of all the information that a DSL needs procedurally, leaning heavily on the pipe operator.

Describe alternatives you've considered
A meta DSL would be another way to do this.
As @jimsynz and I discussed in #47, a new API gives us the opportunity to improve the usability and understandability of Spark, including radical changes that don't need to be backwards compatible, since the API is new.
A builder API will be easier to design incrementally, and can serve as the backbone of the "Code as Data" work from the meta-DSL.

Express the feature either with a change to resource syntax, or with a change to the resource interface

For example

```elixir
entity =
  Spark.Builder.Entity.new(:my_entity, MyEntity)
  |> Spark.Builder.Entity.field(Spark.Builder.Field.new(...))
  |> Spark.Builder.Entity.field(...)
  |> Spark.Builder.Entity.subentity(...)
  ...
```

With builder API modules and functions for all of the data needed for a complete DSL.
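
Presumably the same style would extend up the hierarchy to sections and the top-level DSL; a purely hypothetical sketch (none of these modules or functions exist yet):

```elixir
# Hypothetical: assemble a section from entities, then a DSL from sections.
section =
  Spark.Builder.Section.new(:my_section)
  |> Spark.Builder.Section.entity(entity)

dsl =
  Spark.Builder.Dsl.new(MyLibrary.Dsl)
  |> Spark.Builder.Dsl.section(section)
  |> Spark.Builder.Dsl.build()
```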

@erikareads added the `enhancement` label on Aug 9, 2023.
@jimsynz (Contributor) commented Aug 9, 2023

I can't wait to see this @erikareads

@jimsynz (Contributor) commented Aug 9, 2023

An approach I've seen and like for building nested structures, which you can take or leave as you please:

```elixir
entity =
  Builder.Entity.new(:my_entity, MyEntity)
  |> Builder.Entity.field(name, fn field ->
    field
    |> Builder.Field.type(...)
    |> Builder.Field.default(nil)
  end)
  |> Builder.Entity.field(...)
  ...
```

@erikareads (Contributor, Author)

My plan was to use `new` at each layer:

```elixir
entity =
  Builder.Entity.new(:my_entity, MyEntity)
  |> Builder.Entity.required_field(
    Builder.Field.new(:field_name, Builder.Type.any())
    |> Builder.Field.default(...)
  )
  |> Builder.Entity.field(...)
  ...
```

Either way, `Field` will need both a name and a type as part of its constructor:

```elixir
entity =
  Builder.Entity.new(:my_entity, MyEntity)
  |> Builder.Entity.required_field(:field_name, Builder.Type.any(), fn field ->
    field
    |> Builder.Field.default(...)
  end)
  |> Builder.Entity.field(...)
  ...
```

I'll need to think about what that looks like for a complex type specification.

@erikareads (Contributor, Author)

Perhaps a combination: the `new` constructor of `Field` will take an optional anonymous function:

```elixir
entity =
  Builder.Entity.new(:my_entity, MyEntity)
  |> Builder.Entity.required_field(
    Builder.Field.new(
      :field_name,
      Builder.Type.type(
        Builder.Type.keyword_list(Builder.Type.string()),
        fn type ->
          type
          |> Builder.Type.or(
            Builder.Type.tuple([Builder.Type.string(), Builder.Type.string()])
          )
          |> Builder.Type.or(Builder.Type.string())
        end
      ),
      fn field ->
        field
        |> Builder.Field.default(...)
        |> Builder.Field.documentation(...)
      end
    )
  )
  ...
```

@erikareads (Contributor, Author)

This opens the door for:

```elixir
entity =
  Builder.Entity.new(:my_entity, MyEntity, fn entity ->
    ...
  end)
```

Which is relevant since entities will be nested in sections, and sections in the top-level DSL.

@erikareads (Contributor, Author)

What are all the ways to fill fields in a targeted Entity struct?

I'm thinking about cleaning up the interface by enforcing a `*_field()` naming convention.

For example:

```elixir
Builder.Entity.required_field(entity, name, type, fn field -> ... end)

Builder.Entity.optional_field(entity, name, type, fn field -> ... end)

Builder.Entity.subentity_field(entity, subentity, relationship: :one_to_one | :one_to_many)

# Alternatively:

Builder.Entity.subentity_field(entity, :one_to_one, subentity, fn subentity -> ... end)

Builder.Entity.recursive_field(entity, recursive_as)

Builder.Entity.preset_field(entity, literal)

# Alternatively:

Builder.Entity.required_field(entity, name, type, fn field ->
  field
  |> Builder.Field.default(literal)
end)

# Some variation of:

Builder.Field.argument() # Adds a field's name to the args list
```

With this arrangement, we can infer the schema directly while using a consistent interface.
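
As a hypothetical sketch of how that inference could work (module names and struct fields here are assumptions, not existing Spark code), each `*_field` function could fold its arguments into the entity's schema:

```elixir
defmodule Builder.Entity do
  # Hypothetical: each *_field function appends an option to the target
  # entity's schema, so the schema is inferred from the builder calls.
  def required_field(entity, name, type, fun \\ &Function.identity/1) do
    field = fun.(%{name: name, type: type, required: true, default: nil})
    %{entity | schema: entity.schema ++ [{name, [type: field.type, required: true]}]}
  end

  def optional_field(entity, name, type, fun \\ &Function.identity/1) do
    field = fun.(%{name: name, type: type, required: false, default: nil})
    %{entity | schema: entity.schema ++ [{name, [type: field.type, default: field.default]}]}
  end
end
```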

@erikareads (Contributor, Author)

Looking through the fields of https://hexdocs.pm/spark/Spark.Dsl.Entity.html#t:t/0 that I don't understand:

* `identifier`: Is this an attribute on a field, or a field type unto itself?
* `imports`: What does this do?
* `links`: What does this do?
* `modules`: What does this do?
* `no_depend_modules`: What does this do?

@erikareads (Contributor, Author)

Do we have any invalid states represented between hide, identifier, deprecated, and auto_set_fields? Can a field have all of those at the same time?

@erikareads (Contributor, Author)

> Looking through the fields of https://hexdocs.pm/spark/Spark.Dsl.Entity.html#t:t/0 that I don't understand:
>
> * `identifier`: Is this an attribute on a field, or a field type unto itself?
> * `imports`: What does this do?
> * `links`: What does this do?
> * `modules`: What does this do?
> * `no_depend_modules`: What does this do?

Answering my own questions:

`identifier` is its own kind of field, with special logic for uniquely identifying every entity created from that DSL macro. I plan to handle it as `Builder.Entity.identifier_field(entity, name)`.

`imports` is a list of modules that are auto-imported when the given entity or section is brought into scope. For example, Ash.Resource uses it to bring in helper functions: https://github.com/ash-project/ash/blob/main/lib/ash/resource/validation/builtins.ex. I plan to handle this as `Builder.Entity.import(entity, Module1)`.

`links` does nothing, as far as I can tell; it is ignored by Spark.

`modules` and `no_depend_modules` are both special cases of the `:module` type for a field, with special handling for expanding an alias into a fully qualified module name; `no_depend_modules` additionally works around adding a compile-time dependency on the named module. Since both are special handling for a type of field, I plan to handle them internally when a field is declared as `Builder.Type.module()`.
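
For reference, here is roughly how `imports` is used in Spark today (a hedged paraphrase of the Ash usage linked above; the section shape is abbreviated):

```elixir
# Modules listed in `imports` are auto-imported wherever this section is in
# scope, so DSL users can call helpers unqualified, e.g.
#   validate present(:name)
# instead of
#   validate Ash.Resource.Validation.Builtins.present(:name)
%Spark.Dsl.Section{
  name: :validations,
  imports: [Ash.Resource.Validation.Builtins],
  entities: [...]
}
```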

@erikareads (Contributor, Author)

Question: Is there ever a case where a field_name can represent both a schema and a subentity? Is this something we would like to allow?

That is, is subentity_field a special case of required_field or optional_field, or a distinct case that is neither of the two?

@joshprice (Contributor)

Mentioned this briefly to @jimsynz earlier. What's stopping us from defining a Spark DSL in a Spark DSL?

I agree that it feels very strange to build a DSL by editing structs when what you're trying to give your DSL users is all the bells and whistles needed to do exactly that. I don't think a pipeline approach as described here goes far enough beyond struct editing; it still seems less readable than a full-blown DSL.

@joshprice (Contributor)

Just answered my own question by immediately discovering #47 and #48 after I wrote the above comment. Is this issue still relevant?

@zachdaniel (Contributor)

The ultimate goal is to support leveraging these structures at runtime (for things like runtime dynamic resources), so I think the builder API is still very relevant.
