Add new Postgresql.JSON.Experimental #283
base: master
Conversation
@parsonsmatt when you get a chance, would you mind taking a look at this? I added the typeclasses with the intention of getting as much reuse between MySQL and Postgres as possible, but MySQL JSON support is still working its way through the underlying libraries.
viewFieldBuilder :: SqlExpr (Entity val) -> IdentInfo -> FieldDef -> TLB.Builder
viewFieldBuilder (ERaw m f) info fieldDef
    | Just alias <- sqlExprMetaAlias m =
        fst (f Never info) <> ("." <> useIdent info (aliasedEntityColumnIdent alias fieldDef))
Isn't fst (f Never info) literally sourceIdent info? Maybe define sourceIdent as fst (f Never info) and use it in both places?
A better question: does this branch ever differ from the otherwise branch? Both branches appear to produce identical results in the Just alias case.
Ah, very true. fieldIdent's Just case will never get triggered 🙃 (And is indeed the same.)
Great ideas but we gotta nail down the details. Some of this is pretty tricky stuff to get really right.
Since there are changes to types of exposed functions, this is a breaking change, so we can slot it in with 3.6
-- | (Internal) Coerce a SqlExpr from any arbitrary a to any arbitrary b.
-- You should /not/ use this function unless you know what you're doing!
veryVeryUnsafeCoerceSqlExpr :: SqlExpr a -> SqlExpr b
veryVeryUnsafeCoerceSqlExpr = coerce
Wait, we have a Coercible instance on SqlExpr? That's no good :|
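To illustrate the concern, here is a base-only sketch (Expr is a stand-in, not esqueleto's real SqlExpr): a newtype whose phantom parameter has a representational role can be retagged freely with coerce, with no runtime check, which is exactly what makes an exposed Coercible path dangerous.

```haskell
import Data.Coerce (coerce)

-- Stand-in for SqlExpr: the phantom parameter `a` does not appear in
-- the representation, so GHC infers a phantom role for it.
newtype Expr a = Expr String deriving Show

-- The analogue of veryVeryUnsafeCoerceSqlExpr: compiles for any a and b.
castExpr :: Expr a -> Expr b
castExpr = coerce

-- Adding `type role Expr nominal` (with the RoleAnnotations extension)
-- would make castExpr a compile error, forcing callers through an
-- explicitly-named unsafe function instead.

main :: IO ()
main = print (castExpr (Expr "42" :: Expr Int) :: Expr Bool)
```

The retag succeeds silently; only a nominal role annotation (or hiding the constructor) closes this hole.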
where
    ed = entityDef $ Proxy @a
    baseFields = fmap (\fieldDef ->
        ( unsafeSqlValue $ TLB.fromText $ "'" <> unFieldNameHS (fieldHaskell fieldDef) <> "'"
This doesn't seem right. We're using the Haskell names for the database representation of the Entity, which isn't going to be how the database wants to handle the JSON object.
This is one of the big problems I ran into with trying to have jsonb_agg on Entity a. The {To,From}JSON instances don't work out. So I ended up writing a newtype JSONViaPostgres a that would dig into the EntityDef and figure out how to decode the Postgres JSON. But that also requires that you have a more-or-less standardly defined JSON instance for the record in question.
Does persistent support a way to read the to/from JSON field name for a given field?
We can sidestep the to/from issue by using a custom decoder. As long as we are consistent at the boundaries, we could even avoid the need to specify JSON instances for the types.
Am I right when I say this SqlToJSON instance tries to put all the columns in their own field of a JSON object? (Just checking if I'm reading this right.)
Correct, the goal being to make something that automatically just works with the default JSON parser that gets generated.
which isn't going to be how the database wants to handle the JSON object.
@parsonsmatt I'm not actually clear on what you mean by this. We aren't letting Postgres do its default conversion.
Yeah, I think I'm unsure that this is the right approach for converting something to a JSON representation.
I think I'd rather see something like:
newtype JsonEntity a = JsonEntity (Entity a)
instance PersistEntity a => FromJSON (JsonEntity a) where
parseJSON = withObject "JsonEntity" $ \o -> do
let edef = entityDef (Proxy @a)
...
which definitely parses how a database would convert things to JSON. Then we aren't having to worry about custom encoders or decoders.
persistent-2.14's tabulateEntityA could be useful here, but we may also need the FieldDict class from prairie to make it work how we want.
This could work, sure. If we used a more general newtype SqlJson a = SqlJson a, we could have it default to the underlying FromJSON and avoid surfacing the actual newtype to the end user. The entity instance could then be defined as overlapping that default.
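The overlapping-default idea can be sketched in base alone (all names here are hypothetical, and Read stands in for FromJSON): a general newtype gets an overlappable instance delegating to the underlying type's decoder, while an entity-like case provides a more specific overlapping instance.

```haskell
{-# LANGUAGE FlexibleInstances #-}

-- Hypothetical general wrapper, analogous to the SqlJson suggested above.
newtype SqlJson a = SqlJson a deriving Show

class DecodeSql a where
  decodeSql :: String -> Maybe a

-- Overlappable default: delegate to the underlying type's decoder
-- (Read here stands in for a FromJSON dictionary).
instance {-# OVERLAPPABLE #-} Read a => DecodeSql (SqlJson a) where
  decodeSql s = case reads s of
    [(x, "")] -> Just (SqlJson x)
    _         -> Nothing

-- Entity-like special case overlapping the default; it needs no Read.
data FakeEntity = FakeEntity String deriving Show

instance DecodeSql (SqlJson FakeEntity) where
  decodeSql s = Just (SqlJson (FakeEntity s))

main :: IO ()
main = do
  print (decodeSql "42" :: Maybe (SqlJson Int))        -- uses the default
  print (decodeSql "row" :: Maybe (SqlJson FakeEntity)) -- uses the override
```

The end user only ever sees the class method; the newtype stays an internal detail, which is the point of the suggestion.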
So there isn't really a great way to do this in general without that FieldDict class, since we need to know that the value on the other side has a FromJSON dictionary.
Is there an example of someone using an exotic JSON parser? The thing that I need the default parser for is the underlying record, not the Entity record, since entityIdFromJSON delegates to the FromJSON instance on the record.
It should be noted that using the QQ's json tag will actually only allow you to configure the outer encoder/decoder, but the record itself is always just haskellName: someValue.
I don't think it even has to be an exotic parser, just a different set of options. Like,
mkPersist sqlSettings [persistLowerCase|
User
name Text
age Int
|]
deriveJSON defaultOptions ''User
This would expect the JSON record to have the type name prefixes. I don't think the haskellName has those. Any other modification to those options would break parsing the records from the database.
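The mismatch can be illustrated with a base-only sketch (field names are assumptions for illustration, not persistent's real output): with deriveJSON defaultOptions, aeson keeps full record-field names like userName, while a haskellName-based builder would emit the unprefixed name.

```haskell
import Data.Char (toLower)

-- An aeson-style fieldLabelModifier that strips a type-name prefix;
-- this is what the derived instance would need for the two shapes to
-- line up with the SQL-side builder's keys.
stripPrefixModifier :: String -> String -> String
stripPrefixModifier prefix field =
  case splitAt (length prefix) field of
    (p, rest) | p == prefix -> lowerFirst rest
    _                       -> field
  where
    lowerFirst (c:cs) = toLower c : cs
    lowerFirst []     = []

main :: IO ()
main = do
  -- keys aeson's defaultOptions would expect for a User record:
  print ["userName", "userAge"]
  -- keys after stripping the prefix, i.e. the haskellName-style keys:
  print (map (stripPrefixModifier "user") ["userName", "userAge"])
```

Any deviation between these two shapes means the database-built JSON will not round-trip through the derived instance, which is the worry raised above.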
So I think we do need a newtype with a custom FromJSON instance, and we also need the FieldDict class. I'll see about re-exporting that from persistent and generating those instances.
        , ERaw noMeta $ \_ info -> (viewFieldBuilder ent info fieldDef, [])
        )) (getEntityFields ed)
    idField = fmap (\fieldDef ->
        ( unsafeSqlValue "'id'"
This isn't necessarily proper, as you can specify the primary column name:
User
Id UUID sql="my_user_id"
The actual field name in the SQL is irrelevant; it's the representation that FromJSON expects that matters. Entity assumes a key of id.
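The point can be made concrete with a base-only sketch (these helpers are hypothetical, not esqueleto's API): the JSON key and the SQL column expression are independent arguments to jsonb_build_object, so the key can be "id" regardless of what the primary column is named.

```haskell
import Data.List (intercalate)

-- Build the text of a Postgres jsonb_build_object call from
-- (jsonKey, sqlExpression) pairs. The key is a JSON-side choice;
-- the expression is whatever column the schema actually uses.
jsonbBuildObject :: [(String, String)] -> String
jsonbBuildObject pairs =
  "jsonb_build_object("
    <> intercalate ", " [ "'" <> k <> "', " <> e | (k, e) <- pairs ]
    <> ")"

main :: IO ()
main = putStrLn $ jsonbBuildObject
  [ ("id", "my_user_id")    -- JSON key "id" even though the column is my_user_id
  , ("name", "person.name")
  ]
```

So a custom primary column name like sql="my_user_id" changes only the right-hand expression, never the "id" key the FromJSON instance expects.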
Is it not possible to use the database names? 🤔 To me that sounds more robust/portable.
Though I guess if it only happens within one query, it doesn't really matter.
EDIT: Oh, that might also make it possible to get back easily using jsonb_to_record
Indeed, the db names could be used, with a custom JSON decoder to extract the fields.
@@ -49,21 +49,21 @@ share [mkPersist sqlSettings, mkMigrate "migrateAll"] [persistUpperCase|
 YetAnother
     argh ShoopId
 |]
-Person
+Person json
To give this a real whirl we'd want to also customize the json options in MkPersistSettings (or whichever thing allows you to do that).
, JSONE.multiset $ do
    posts <- Experimental.from $ table @BlogPost
    where_ $ posts ^. BlogPostAuthorId ==. person ^. PersonId
    pure ( posts
         , JSONE.multiset $ do
             comments <- Experimental.from $ table @Comment
             where_ $ comments ^. CommentBlog ==. posts ^. BlogPostId
             pure comments
         )
this is dope
@belevy And maybe also
class JsonBuildArray jsonValue where
    unsafeJsonbBuildArray :: UnsafeSqlFunctionArgument a => a -> SqlExpr (jsonValue b)

class JsonBuildObject jsonValue where
Are the jsonValue type variables here in these class definitions so this can later be ported over to the JSONB type, given that this is considered experimental at the moment and you want to keep it separate?
MySQL supports JSON as well. The intention is to add support in a db-agnostic manner, with as much as possible independent of db specifics.
Yeah, my only quibble here is over returning stuff that isn't going to parse how we want. We should expect people to be doing interesting things with their FromJSON instances, and I don't think we should require that the JSON is in any particular shape that we can't control ourselves.
Committing to a JSON format for this stuff is going to make changing that format and/or testing regressions difficult.
where
    ed = entityDef $ Proxy @a
    baseFields = fmap (\fieldDef ->
        ( unsafeSqlValue $ TLB.fromText $ "'" <> unFieldNameHS (fieldHaskell fieldDef) <> "'"
Ohhh, I see what's going on. We're explicitly unwrapping the table and making the (key, value) pair ourselves in this. Neat!
Still, if there's a custom To/FromJSON instance, then it may not serialize or deserialize correctly. I wonder if there's a good way we can ensure we're pulling it out in the way we'd expect based on the EntityDef 🤔
…nb on newtype JsonValue. Also added multiset and multisetAgg convenience functions
Force-pushed from ac9cd12 to 1d2fc6d
sqlSelectProcessRowJSON :: (Applicative f, Aeson.FromJSON r)
                        => [PersistValue] -> Either Text (f r)
sqlSelectProcessRowJSON [PersistByteString bs] =
    case Aeson.eitherDecode $ LBS.fromStrict bs of
No need to import Data.ByteString.Lazy, aeson's got your back:
either (Left . Text.pack) (Right . pure) $
    Aeson.eitherDecodeStrict bs
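The shape of that rewrite — map the error side into text form, lift the success into an Applicative — can be shown with base alone (String stands in for Text, and reads stands in for Aeson.eitherDecodeStrict; the names are illustrative):

```haskell
import Data.Bifunctor (bimap)

-- Stand-in decoder, playing the role of Aeson.eitherDecodeStrict.
decodeEither :: String -> Either String Int
decodeEither s = case reads s of
  [(x, "")] -> Right x
  _         -> Left ("could not parse: " <> s)

-- One bimap handles both sides of the Either in a single pass,
-- matching the Either Text (f r) result sqlSelectProcessRowJSON wants.
processRow :: Applicative f => String -> Either String (f Int)
processRow = bimap ("decode error: " <>) pure . decodeEither

main :: IO ()
main = do
  print (processRow "42"   :: Either String (Maybe Int))
  print (processRow "oops" :: Either String (Maybe Int))
```

bimap from Data.Bifunctor is equivalent to the either (Left . f) (Right . g) spelling and reads a little more directly.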
I still don't see why the
@Vlix that would require anyone using the JSONB type to agree on how it works. If there are any inconsistencies between different implementations, the type would need to be split. As for reusing the type internally with postgres, it's possible but would require ensuring that the existing operators all work as expected there.
#282
Adds a toJsonb function on Value, Entity, and tuples thereof (up to 8).
Before submitting your PR, check that you've:
- Added @since declarations to the Haddock.
- Run stylish-haskell and otherwise adhered to the style guide.
After submitting your PR: