Convert some association lists to homogeneous maps? #27
The most compelling for me is …

I believe this is useful for dhall-to-json and dhall-to-yaml, not just for dhall-to-terraform, due to the expectation that these structures already exist and are expected in the JSON-consuming world. I was evaluating extending the algebra to add new language extensions in use-case-specific compilers, but it seems as though the parser and algebra are not designed for this. I expect this was somewhat on purpose, to prevent fragmentation; if that was the case, maybe it makes sense to add a function that performs this transformation but is only enabled in certain contexts, similar to how only flat structures can be rendered to JSON. Thoughts?
Yeah, I agree that this should be part of dhall-to-json / dhall-to-yaml. My preference is to turn it on by default for reserved key/value field names. The reason why I want to standardize on the field names is to ensure that people can reuse or share utilities for programmatically generating homogeneous maps (which requires consensus on what the key name is). The reason why I think it should be on by default is to ensure that users don't need to transmit information out-of-band about what command-line flags they used when sharing code with each other. I usually reserve command-line flags for things that do not affect semantics (i.e. error messages or formatting). Ideally a Dhall expression is self-contained and doesn't require additional information or context for a user to correctly consume.
Having a unique type to represent this data structure seems like the correct approach to the problem. I would personally hope for a data type that does not have any overlap with an otherwise valid type, even if coercion between types exists, e.g. … Shy of that, I would have to recommend against choosing reserved key/value names that are "too" obvious. Dhall is far from the first application of JSON to encounter this specific sort of limitation: http://opentsdb.net/docs/build/html/api_http/search/lookup.html#example-request Perhaps …
Having a particular function that enables the conversion would be sufficient for my needs, though it really begs … just as a quick example. I needed this when representing some structures in Terraform that were almost entirely well typed other than the user-generated content.
Yeah, there isn't too much of an issue using a somewhat long or obscure key/value field name, because if people have an issue with it they can always define/share a helper function to convert from convenient non-reserved names to less convenient reserved names, i.e.:

```dhall
convert
  : ∀(a : Type) → List { key : Text, value : a } → List { mapKey : Text, mapValue : a }
```

Regarding …: I think it's appropriate for … My inclination is to go with something like `mapKey`/`mapValue` (as in the signature above). Another close contender was … Some other names I considered: …
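For concreteness, here is one way such a `convert` could be written; this is only a sketch, and the Prelude import URL is an assumption rather than something quoted from the thread:

```dhall
-- Sketch of a possible implementation of `convert` in terms of the
-- Prelude's List/map (the import URL is an assumption)
let List/map = https://prelude.dhall-lang.org/List/map

in    λ(a : Type)
    → λ(xs : List { key : Text, value : a })
    → List/map
        { key : Text, value : a }
        { mapKey : Text, mapValue : a }
        (λ(x : { key : Text, value : a }) → { mapKey = x.key, mapValue = x.value })
        xs
```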
I agree with your position on almost all points. What would you say to a structure that wraps a list, providing this functionality when serializing into any language, but that exposes a similar interface to a list? I believe this offers the best of both worlds: …
I think …
@blast-hardcheese: If I understand correctly, I think that you are proposing that the user could write code that optionally assumes three inputs, like this:

```dhall
  λ(Record : Type → Type)
→ λ(wrap : ∀(a : Type) → List { key : Text, value : a } → Record a)
→ λ(unwrap : ∀(a : Type) → Record a → List { key : Text, value : a })
→ { users = wrap { admin : Bool }
      [ { key = "john", value = { admin = False } }
      , { key = "mary", value = { admin = True } }
      , { key = "alice", value = { admin = False } }
      ]
  }
```

... where the user can name the three inputs however they like. I like that idea because it doesn't require any magic at all. All conversions are explicit and it doesn't collide with any existing namespace. If the user doesn't declare those function inputs of those types, then … Other languages that this concept would map onto are Python/Ruby/Perl, where this sort of idiom is also common (and technically a JavaScript integration, which would be a superset of the current JSON integration).
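One way to see that this stays compatible with an interpreter that has no special support: the same expression can simply be applied to ordinary arguments that leave the association list untouched. A sketch, where `./users.dhall` is a hypothetical file assumed to contain the expression above:

```dhall
-- Hypothetical usage: ./users.dhall is assumed to hold the
-- λ(Record) → λ(wrap) → λ(unwrap) → … expression shown above.
-- Instantiating Record as a plain association list and wrap/unwrap as
-- identities recovers the ordinary encoding with no special support.
  ./users.dhall
    (λ(a : Type) → List { key : Text, value : a })
    (λ(a : Type) → λ(xs : List { key : Text, value : a }) → xs)
    (λ(a : Type) → λ(xs : List { key : Text, value : a }) → xs)
```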
That's exactly what I was thinking, save for having an explicit function provided by the environment that does the wrapping and unwrapping, though I guess this isn't strictly necessary. How would you propose implementing this?
The main reason I propose the user's code accepts the "built-ins" as function arguments is so that the code is compatible with other interpreters (i.e. you could reuse the same code with the …). I can take care of implementing this (I've done this sort of thing before), but the way this works is: …
To me, this still seems somewhat magic, but at least it's more explicit magic; you definitely seem to be weighing some real tradeoffs. This implementation additionally opens the door for more domain-specific, type-driven functions without polluting the base language, which could also be good. I guess using this feature when loading the script into Haskell could just be …
Some more thoughts: … is extremely generic; would the implementation accidentally remove some existing functionality?
@blast-hardcheese: You can actually already load Dhall's association lists into Haskell. However, on more reflection I think we should go back to the original plan of using reserved key/value field names (i.e. …).
I'd be fine going back to even …
Chiming in just to say that I'm hitting this same problem - porting Terraform config to Dhall and realizing just now that it's not possible to express this idiom in the language so far - and that I like where this thread is going; my 2c on some points:
Alright, then I'll set the default behavior so that most code in the Dhall ecosystem is compatible with each other, but then allow people to opt out or change the reserved key/value field names.
I'm still expecting we'll need to revisit this later, but maybe having some code will help further the discussion. Thanks for your patience through the back and forth here.
Reflecting a bit more on this, I think I'll go with making a small wrapper (we can call it …). Examples: …
@f-f: I'll probably implement this anyway because I think it's generally useful regardless of whether it completely solves the … You probably want to do the processing using the Haskell API and then emit JSON from that using … I don't think type-checking is an issue here. The post-processing that I'm proposing happens after import resolution, type checking, and normalization, but before emitting JSON. I will have an implementation up soon so that you can see exactly what I have in mind.
Alright, I have an example implementation showing how this would work: https://github.com/dhall-lang/dhall-json/tree/gabriel/homogeneous_maps I still need to refactor the command-line API first to support the requested ability to opt out of or modify the behavior.
This switches to using the lower-level `optparse-applicative` for command-line option parsing. The main motivation for this is to support more complex command-line option parsing that may be necessary for #27
Fixes #27. This adds support for converting Dhall lists of the form:

```dhall
List { mapKey : Text, mapValue : v }
```

... to JSON records. For example, this Dhall association list:

```dhall
[ { mapKey = "foo", mapValue = 1 }
, { mapKey = "bar", mapValue = 2 }
]
```

... converts to this JSON record:

```json
{ "foo": 1
, "bar": 2
}
```

This functionality can be customized via the command-line API. For example, you can use `--noMaps` to disable this entirely, or you can use the `--key` and `--value` options to rename the expected fields for the association list.
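As an illustration of the renaming options (the `name`/`setting` field names and the exact invocation are only illustrative, not from the PR), pointing `--key` and `--value` at different field names would make entries of this shape be recognized as map entries instead:

```dhall
-- with the expected field names renamed to `name` and `setting` via the
-- --key/--value options described above, entries of this shape would be
-- recognized as map entries (field names chosen for illustration)
[ { name = "foo", setting = 1 }
, { name = "bar", setting = 2 }
]
```

... which would still render to the same `{ "foo": 1, "bar": 2 }` JSON record.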
Now there is a pull request with the fully-implemented feature. @blast-hardcheese: Give it a try and let me know if this works for you.
Using the MVP: …

I'm still shaky on how projects should be organized for modularity and reuse, but this definitely unblocks me for now. Thanks for the quick turnaround!
@blast-hardcheese: Wow, the latter link actually looks a lot like an HCL file :) I'll go ahead and merge #29 then
Another success story here, thanks @Gabriel439 :) However, I won't share my snippet here as the one that @blast-hardcheese posted looks much better 😅 (I took a slightly different approach, mostly data instead of lambdas).
@f-f I'm thinking the best we can get would be as described in https://github.com/blast-hardcheese/dhall-terraform/blob/master/CONTRIBUTING.md: a small library set published to IPFS or tracked as a git submodule or something. It'll always be a race between the components supported in Terraform and whatever providers and features are tracked in https://github.com/blast-hardcheese/dhall-terraform/. It would be ideal if each Terraform module were to expose a JSON schema or similar when queried, though that would require buy-in from HashiCorp, which seems unlikely.
@blast-hardcheese: The way I see it, if the Dhall-to-Terraform bindings are the only place to get a schema for Terraform features, that will encourage more people to use Dhall 🙂
@Gabriel439 Have you found any barrier to adoption by distributing via IPFS, or should I continue down that route?
@blast-hardcheese: I have run into issues using IPFS. The main problem is that the latest version seems to have a memory leak of some sort (possibly the same as ipfs/kubo#3532), meaning that I have to periodically restart the IPFS server every week or two. I still continue to host the IPFS mirror to avoid disruption to existing documentation, but I wouldn't recommend that others use it until that issue disappears. I'd recommend a simple static file server for now.
Hum. Another concern is that some enterprise environments aren't big on having critical infrastructure hosted externally. A caching proxy would be fine, though clunky. Offline development would also be tricky. IPFS does seem to lend itself to multiple resolution sources and caching layers, though; maybe a resolution hierarchy would help here. Is this a dhall-lang discussion or a dhall-haskell one? (I should say, whatever solution is discovered here should also apply to all URLs and possibly files, if that's desired and determined not to drastically increase complexity from a usage standpoint.)
Yeah, that was the reason I originally liked IPFS (and still buy into the vision despite the issues). It allows anybody to transparently increase the resiliency by just pinning the same expressions, builds integrity checks into the URI, and provides a way to mount any IPFS resource locally using … If you're willing to deal with the maintenance costs then I would say go for it, but just be willing to over-provision or restart the server if you are affected by the memory leak.
The context for this is: dhall-lang/dhall-lang#136
Many JSON schemas expect homogeneous maps (i.e. the JSON equivalent of Haskell's `Data.Map.Map`), such as in the following contrived JSON configuration:

... where you would not necessarily know the set of users statically in advance.
The idiomatic Dhall type corresponding to the above JSON configuration would be something like:
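Based on the `users`/`admin` example that appears later in the thread, the type in question is presumably along these lines (a reconstruction, not a quote):

```dhall
-- presumed shape of the idiomatic Dhall type for the users configuration
{ users : List { key : Text, value : { admin : Bool } } }
```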
The question here is whether or not `dhall-to-json` and `dhall-to-yaml` should support automatically converting association lists with a reserved field name to homogeneous maps. The easiest way to explain this is to show the following example encoding of the above JSON configuration:
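Based on the examples used later in the thread, the encoding in question is an association list with the reserved key/value field names, roughly:

```dhall
-- reconstruction based on the examples later in the thread
{ users =
    [ { key = "john",  value = { admin = False } }
    , { key = "mary",  value = { admin = True  } }
    , { key = "alice", value = { admin = False } }
    ]
}
```

... which the proposed feature would render as a JSON object keyed by the user names rather than as an array.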
Some variations on this proposal:

- Don't do this in `dhall-to-json`/`dhall-to-yaml` and instead implement this in more specific integrations (like `dhall-to-terraform`)
- Don't reserve the `value` field name; any field name is fine as long as the other field is named `key`