Builder API for Creating Spark DSLs #51
Comments
I can't wait to see this @erikareads
An approach I've seen and like for building nested structures, which you can take or leave as you please:

entity =
  Builder.Entity.new(:my_entity, MyEntity)
  |> Builder.Entity.field(name, fn field ->
    field
    |> Builder.Field.type(...)
    |> Builder.Field.default(nil)
  end)
  |> Builder.Entity.field(...)
  ...
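For what it's worth, the callback form needs very little machinery behind it. A minimal sketch, assuming hypothetical Builder.Entity and Builder.Field structs that only accumulate data (none of these names exist in Spark today):

defmodule Builder.Field do
  defstruct [:name, :type, :default]

  def type(%__MODULE__{} = field, type), do: %{field | type: type}
  def default(%__MODULE__{} = field, value), do: %{field | default: value}
end

defmodule Builder.Entity do
  defstruct [:name, :target, fields: []]

  def new(name, target), do: %__MODULE__{name: name, target: target}

  # Build a bare field struct, let the caller's callback refine it,
  # then append it to the entity's accumulated field list.
  def field(%__MODULE__{} = entity, name, fun) when is_function(fun, 1) do
    field = fun.(%Builder.Field{name: name})
    %{entity | fields: entity.fields ++ [field]}
  end
end

Because the callback only receives and returns plain structs, nesting stays local to each call and the outer pipeline never has to thread intermediate field variables around.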
My plan was to use:

entity =
  Builder.Entity.new(:my_entity, MyEntity)
  |> Builder.Entity.required_field(
    Builder.Field.new(:field_name, Builder.Type.any())
    |> Builder.Field.default(...)
  )
  |> Builder.Entity.field(...)
  ...

Either way:

entity =
  Builder.Entity.new(:my_entity, MyEntity)
  |> Builder.Entity.required_field(:field_name, Builder.Type.any(), fn field ->
    field
    |> Builder.Field.default(...)
  end)
  |> Builder.Entity.field(...)
  ...

I'll need to think about what that looks like for a complex type specification.
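Whichever of those shapes wins, the key property is that the pipeline only ever produces data. Assuming a %Builder.Field{} struct that carries a name, type, and required flag (purely illustrative), the first pipeline above boils down to a value along these lines:

entity = %Builder.Entity{
  name: :my_entity,
  target: MyEntity,
  fields: [
    %Builder.Field{name: :field_name, type: Builder.Type.any(), required: true}
  ]
}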
Perhaps a combination, something like:

entity =
  Builder.Entity.new(:my_entity, MyEntity)
  |> Builder.Entity.required_field(
    Builder.Field.new(
      :field_name,
      Builder.Type.type(
        Builder.Type.keyword_list(Builder.Type.string()),
        fn type ->
          type
          |> Builder.Type.or(
            Builder.Type.tuple([Builder.Type.string(), Builder.Type.string()])
          )
          |> Builder.Type.or(Builder.Type.string())
        end
      ),
      fn field ->
        field
        |> Builder.Field.default(...)
        |> Builder.Field.documentation(...)
      end
    )
  )
  ...
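Another way to keep the complex case readable, using the same hypothetical Builder.Type API as above: bind the composed type to a variable first and pass it to Builder.Field.new/2, so the callback is reserved for field-level options:

string = Builder.Type.string()
pair = Builder.Type.tuple([string, string])

field_type =
  Builder.Type.keyword_list(string)
  |> Builder.Type.or(pair)
  |> Builder.Type.or(string)

entity =
  Builder.Entity.new(:my_entity, MyEntity)
  |> Builder.Entity.required_field(
    Builder.Field.new(:field_name, field_type)
    |> Builder.Field.default(nil)
    |> Builder.Field.documentation("a keyword list of strings, a pair of strings, or a string")
  )

The union still reads in declaration order, and the entity pipeline stays flat.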
This opens the door for:

entity =
  Builder.Entity.new(:my_entity, MyEntity, fn entity ->
    ...
  end)

Which is relevant since entities will be nested in sections and other entities.
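Presumably the same callback shape would then apply one level up. A sketch of what that nesting could look like from a section's point of view (Builder.Section is an assumption here, not something proposed above):

section =
  Builder.Section.new(:my_section, fn section ->
    section
    |> Builder.Section.entity(:my_entity, MyEntity, fn entity ->
      entity
      |> Builder.Entity.required_field(:field_name, Builder.Type.any(), fn field ->
        Builder.Field.default(field, nil)
      end)
    end)
  end)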
What are all the ways to fill fields in a targeted Entity struct? I'm thinking about cleaning up the interface by forcing a consistent set of functions for each kind of field. For example:

Builder.Entity.required_field(entity, name, type, fn field -> ... end)
Builder.Entity.optional_field(entity, name, type, fn field -> ... end)

Builder.Entity.subentity_field(entity, subentity, relationship: :one_to_one | :one_to_many)
# Alternatively:
Builder.Entity.subentity_field(entity, :one_to_one, subentity, fn subentity -> ... end)

Builder.Entity.recursive_field(entity, recursive_as)

Builder.Entity.preset_field(entity, literal)
# Alternatively:
Builder.Entity.required_field(entity, name, type, fn field ->
  field
  |> Builder.Field.default(literal)
end)

# Some variation of:
Builder.Field.argument() # Adds a field's name to the args list

With this arrangement, we can infer the schema directly while using a consistent interface.
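To make the "infer the schema directly" point concrete: since every one of those calls records a name, type, required flag, default, and documentation, emitting the keyword-list schema that Spark.Dsl.Entity expects could be a single map over the accumulated fields. A rough sketch, where the field layout and option names are assumptions modeled on the NimbleOptions-style schemas Spark uses:

defmodule Builder.Schema do
  # Turn accumulated field data into a NimbleOptions-style schema keyword list.
  def to_schema(fields) do
    Enum.map(fields, fn field ->
      {field.name,
       [
         type: field.type,
         required: field.required,
         default: field.default,
         doc: field.documentation
       ]}
    end)
  end
end

The args list could presumably be inferred the same way, by collecting the names of fields flagged via Builder.Field.argument.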
Looking through the fields of https://hexdocs.pm/spark/Spark.Dsl.Entity.html#t:t/0, there are several that I don't understand.
Do we have any invalid states represented between these fields?
Answering my own questions:
Question: Is there ever a case where a field_name can represent both a schema and a subentity? Is this something we would like to allow?
Mentioned this briefly to @jimsynz earlier: what's stopping us from defining a Spark DSL in a Spark DSL? I agree that it's a very strange feeling to build a DSL in a struct, when what you're trying to give your DSL users is exactly the bells and whistles you would need to do that. I don't think a pipeline approach as described here goes far enough beyond struct editing; it still seems less readable than a full-blown DSL.
The ultimate goal is to support leveraging these structures for runtime things (like runtime dynamic resources), so I think the builder API is still very relevant.
Is your feature request related to a problem? Please describe.
Currently, to create a DSL with Spark, you need to manually manipulate Entity and Section structs.
Describe the solution you'd like
I propose a builder API that allows the specification of all the information that a DSL needs procedurally, leaning heavily on the pipe operator.
Describe alternatives you've considered
A meta DSL would be another way to do this.
As @jimsynz and I discussed in #47, a new API gives us the opportunity to improve the usability and understandability of Spark, including radical changes that don't need to be backwards compatible, since the API is new.
A Builder API will be easier to incrementally design and can serve as the backbone of what we do with the "Code as Data" from the meta-DSL.
Express the feature either with a change to resource syntax, or with a change to the resource interface
For example, with builder API modules and functions for all of the data needed for a complete DSL.
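To make the shape of the proposal concrete, one possible end-to-end flow (MyLibrary and all Builder.* names are placeholders for illustration; only `use Spark.Dsl.Extension, sections: ...` reflects Spark's existing API):

field_entity =
  Builder.Entity.new(:field, MyLibrary.Field)
  |> Builder.Entity.required_field(:name, Builder.Type.atom())
  |> Builder.Entity.optional_field(:default, Builder.Type.any(), fn field ->
    Builder.Field.documentation(field, "A default value for the field")
  end)

fields_section =
  Builder.Section.new(:fields)
  |> Builder.Section.entity(field_entity)

# Eventually, the builder output would be handed to Spark much as hand-written
# %Spark.Dsl.Entity{} / %Spark.Dsl.Section{} structs are today:
#
#   defmodule MyLibrary.Dsl do
#     use Spark.Dsl.Extension, sections: Builder.build([fields_section])
#   end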