Consider switching back to aeson-generated json serialization? #109
For the code generation, there are two cases
For the second case, it may be useful for hie to be able to emit a http://swagger.io/ description, or to use http://json-schema.org/ (as previously suggested).
The shape of the protocol should be the source for the code, not the other way around. Rationale: the interface is and will be the most stable part, because changing it requires simultaneous changes to multiple editors. This is hard and therefore will not happen often. If you want to generate something from something else, then generate Haskell from a JSON schema, not the other way around.
@gracjan, I think we are in agreement on the protocol format being the driver for the code, but the question is how to manage its stability. @mgsloan has proposed a way of detecting changes so that there can be a version bump, indicating changes. I think the unstated question is whether we want "beautiful" JSON or whether the straight aeson instances are ok. This to some extent comes down to the experience the IDE plugin writers will have, and I do not have much insight into that. Any comments?
Generating Haskell from a JSON schema is a reasonable solution! It'd be interesting to see how well that works out. One of the reasons I like the idea of generating the schema from Haskell datatypes is that this way you can use more specific types (and so use newtypes that have additional checks, etc). I imagine that it's tricky to have a "schema --> Haskell" generator which supports the full gamut of possible datatypes, but maybe that isn't necessary.
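To make the point about "more specific types" concrete, here is a minimal sketch (names invented, not actual hie code) of the information a "schema --> Haskell" generator would typically lose: a newtype with a smart constructor carries an invariant that a generated bare `Int` field would not.

```haskell
-- Hypothetical example: a protocol-level identifier with a validity check.
-- A generator working from a plain JSON schema would likely emit just Int.
newtype SessionId = SessionId Int
  deriving (Show, Eq)

-- Reject invalid values at the protocol boundary.
mkSessionId :: Int -> Either String SessionId
mkSessionId n
  | n >= 0    = Right (SessionId n)
  | otherwise = Left ("negative session id: " ++ show n)
```

Going from Haskell to schema, the check lives in one place; going the other way, every consumer of the generated type would have to re-validate.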
As someone who has worked on IDE bindings, I don't particularly care how they look. We need documentation and examples either way (I hope I get to the automatic example generation part soon), so I don't think json that looks a bit unconventional is actually a problem.
Ok, so it seems that the important things are
I don't think it's a good idea. Automatic serialization will always leak some of our Haskell into the protocol. I saw it perfectly with the parameters serialization, where we ended up with objects called rp and op, which meant nothing to somebody who doesn't know the Haskell code. I don't buy the argument that the protocol can be ugly because it's machine to machine. An IDE writer who considers integrating HIE will look at the protocol; if it makes no sense or is too ugly, he's going to go "WTF is that mess" and move on to better things. Of course we need explicit versioning and solid tests, but we should keep the possibility to change the way the Haskell is written without changing the JSON protocol, so we'll end up with a mix of automatic and hand-written JSON code anyway.
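The rp/op leak described above can be sketched with aeson's `Options`: generic derivation turns Haskell record selectors straight into JSON keys, while `fieldLabelModifier` lets the Haskell side keep its terse names without exposing them on the wire. The renaming scheme here is an invented illustration, not the actual hie code.

```haskell
{-# LANGUAGE DeriveGeneric #-}
import Data.Aeson
import GHC.Generics (Generic)
import qualified Data.ByteString.Lazy.Char8 as BL

-- Field names echo the "rp"/"op" example mentioned above.
data Params = Params
  { rp :: [String]   -- required parameters
  , op :: [String]   -- optional parameters
  } deriving (Show, Generic)

-- Plain generic derivation would leak "rp"/"op" into the protocol;
-- fieldLabelModifier maps them to self-describing keys instead.
instance ToJSON Params where
  toJSON = genericToJSON defaultOptions { fieldLabelModifier = rename }
    where
      rename "rp"  = "required_params"
      rename "op"  = "optional_params"
      rename other = other

main :: IO ()
main = BL.putStrLn (encode (Params ["file"] ["line"]))
```

This is a middle ground between fully automatic and fully hand-written instances: the shape is still derived, but the names are controlled in one place.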
Ok, based on this and IRC discussion it seems reasonable to leave it the way it is. It would be ideal to specify the protocol datatypes as few times as possible, and generate as much code as possible. However, this ideal requires an implementation of such code generation, which does not yet exist. So, it seems pragmatic to continue with the manually written JSON instances. Sorry for the distraction, but perhaps this will be an interesting alternative to revisit in the future.
just for the record (not to reopen the issue)
I think there are two separate issues
The solution to the one should not affect the other, and it is up to the specific IDE integrator to decide what is the best way to tackle the task. |
Regarding code generation: again, I really understand that we can't force everyone to buy in. Regarding your sentence:
I do not completely agree. I think it just depends on the architecture you want. Some related thoughts I had yesterday while looking at the code (ping @mgsloan). Imagine if we were:
then
In such a scenario, code generation would make more sense in many ways. Also, ide-integrators wouldn't have to pick between several available linting engines, or type info engines, but would benefit from us having done all the selection and unification work. (really, I'm just trying to be helpful and give new ideas before the project matures too much) ⛵
It all sounds great, but it means that there is a global namespace of commands, which makes the task of a plugin-writer harder, and makes it harder for people to experiment with private plugins which may then grow into full-featured ones in the future. I do agree with
I am not sure about the other mapping suggestions though, unless they can be done in an extensible way, allowing local plugins. |
I share your reservations; I'm still thinking about all the alternatives. Here are some "answer elements" to your points. To mitigate some of the cons, we could imagine:
To insist on the cons: extensibility may really be a little bit more difficult, and we will need more consensus about canonical types and commands.
Perhaps we should do both, over time. I think the current approach will allow us to identify the semantic types. Once we have a handle on this and the way it works, we can see about the alternative. They can possibly run concurrently, or have a translation layer at various points.
#107 caused me to consider code generation for other editors to be an advantage of generated JSON instances. Once more editor code is written, it'd be good not to have sweeping changes to the protocol, so we ought to be sure the current approach is best. Despite the consensus in #32, it seems worthwhile to revisit the decision.
Pros
Cons
Solvable Cons
Solution: Generate a file on every compilation, possibly by `pprint`ing all of the types, sorted by name. Stuff that isn't relevant to serialization, such as which instances are derived, would get set to some default (`[]`). The file would be part of the repo and would store a protocol version number. When this file differs from the HEAD version of the file, it forces a version bump. Some differences may be benign; in that case, you can force the version number back to what it was, and commit the new version of the file.
The file could also store an aeson version number. While this usually shouldn't make a difference to the protocol, it would be a good idea to make sure that aeson didn't change any serialization details when upgrading it.
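The generate-and-compare scheme above can be sketched in a few lines. This is a hypothetical illustration, not the proposed implementation: each protocol datatype is pretty-printed to a canonical string, the dumps are sorted by type name, and the result is compared against the copy committed to the repo; any difference forces a version bump (or an explicit override).

```haskell
import Data.List (sortOn)

-- Stand-in for the pprinted form of one protocol datatype (names invented).
data TypeDump = TypeDump
  { typeName :: String  -- name of the protocol datatype
  , rendered :: String  -- its pprinted definition
  }

-- Canonical rendering: sort by type name so diffs are stable across runs.
renderProtocol :: [TypeDump] -> String
renderProtocol = unlines . map rendered . sortOn typeName

-- True when the freshly generated dump differs from the committed one,
-- i.e. a version bump (or an explicit benign-change override) is needed.
needsVersionBump :: String -> String -> Bool
needsVersionBump committed fresh = committed /= fresh
```

Sorting by name is what makes the file a reliable fingerprint: reordering declarations in the source then produces no spurious diff.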
Have the `FromJSON` instance accept old formats. Keep the generated `ToJSON` instance, and add an explicit `FromJSON` instance. This way, code generation will still work.