Add new Postgresql.JSON.Experimental #283

Open · wants to merge 5 commits into base: master
2 changes: 2 additions & 0 deletions esqueleto.cabal
@@ -33,11 +33,13 @@ library
Database.Esqueleto.Legacy
Database.Esqueleto.Experimental
Database.Esqueleto.Internal.Internal
Database.Esqueleto.Internal.JSON
Database.Esqueleto.Internal.ExprParser
Database.Esqueleto.MySQL
Database.Esqueleto.PostgreSQL
Database.Esqueleto.PostgreSQL.JSON
Database.Esqueleto.Record
Database.Esqueleto.PostgreSQL.JSON.Experimental
Database.Esqueleto.SQLite
Database.Esqueleto.Experimental.From
Database.Esqueleto.Experimental.From.CommonTableExpression
53 changes: 27 additions & 26 deletions src/Database/Esqueleto/Internal/Internal.hs
@@ -60,7 +60,7 @@
import Data.Kind (Type)
import qualified Data.List as List
import qualified Data.Map.Strict as Map
import qualified Data.Monoid as Monoid

Check warning on line 63 in src/Database/Esqueleto/Internal/Internal.hs (GitHub Actions / build, multiple GHC versions): The qualified import of ‘Data.Monoid’ is redundant
import Data.Proxy (Proxy(..))
import Data.Set (Set)
import qualified Data.Set as Set
@@ -70,7 +70,7 @@
import Data.Typeable (Typeable)
import Database.Esqueleto.Internal.ExprParser (TableAccess(..), parseOnExpr)
import Database.Esqueleto.Internal.PersistentImport
import Database.Persist (EntityNameDB(..), FieldNameDB(..), SymbolToField(..))

Check warning on line 73 in src/Database/Esqueleto/Internal/Internal.hs (GitHub Actions / build, multiple GHC versions): The import of ‘Database.Persist’ is redundant
import qualified Database.Persist
import Database.Persist.Sql.Util
( entityColumnCount
@@ -560,20 +560,17 @@
-- 'subSelectMaybe'. For the most common safe use of this, see 'subSelectCount'.
--
-- @since 3.2.0
subSelectUnsafe :: PersistField a => SqlQuery (SqlExpr (Value a)) -> SqlExpr (Value a)
subSelectUnsafe :: SqlSelect (SqlExpr a) r => SqlQuery (SqlExpr a) -> SqlExpr a
subSelectUnsafe = sub SELECT

-- | Project a field of an entity.
(^.) :: forall typ val . (PersistEntity val, PersistField typ)
=> SqlExpr (Entity val)
-> EntityField val typ
-> SqlExpr (Value typ)
ERaw m f ^. field
ent ^. field
| isIdField field = idFieldValue
| Just alias <- sqlExprMetaAlias m =
ERaw noMeta $ \_ info ->
f Never info <> ("." <> useIdent info (aliasedEntityColumnIdent alias fieldDef), [])
| otherwise = ERaw noMeta $ \_ info -> (dot info $ persistFieldDef field, [])
| otherwise = ERaw noMeta $ \_ info -> (viewFieldBuilder ent info fieldDef, [])
where
fieldDef =
if isIdField field then
@@ -583,25 +580,25 @@
persistFieldDef field
idFieldValue =
case getEntityKeyFields ed of
idField :| [] ->
ERaw noMeta $ \_ info -> (dot info idField, [])
idField :| [] -> ERaw noMeta $ \_ info -> (viewFieldBuilder ent info idField, [])

idFields ->
let renderedFields info = dot info <$> NEL.toList idFields
let renderedFields info = viewFieldBuilder ent info <$> NEL.toList idFields
in ERaw noMeta{ sqlExprMetaCompositeFields = Just renderedFields} $
\p info -> (parensM p $ uncommas $ renderedFields info, [])

ed = entityDef $ getEntityVal (Proxy :: Proxy (SqlExpr (Entity val)))

dot info fieldDef' =
sourceIdent info <> "." <> fieldIdent
where
sourceIdent = fmap fst $ f Never
fieldIdent
| Just baseI <- sqlExprMetaAlias m =
useIdent info $ aliasedEntityColumnIdent baseI fieldDef'
| otherwise =
fromDBName info (coerce $ fieldDB fieldDef')
viewFieldBuilder :: SqlExpr (Entity val) -> IdentInfo -> FieldDef -> TLB.Builder
viewFieldBuilder (ERaw m f) info fieldDef =
sourceIdent info <> "." <> fieldIdent
where
sourceIdent = fst <$> f Never
fieldIdent
| Just baseI <- sqlExprMetaAlias m =
useIdent info $ aliasedEntityColumnIdent baseI fieldDef
| otherwise =
fromDBName info (coerce $ fieldDB fieldDef)

-- | Project an SqlExpression that may be null, guarding against null cases.
withNonNull
@@ -2374,7 +2371,7 @@
(valueToSet, valueVals) = valueF Parens info
in (fieldName info field <> " = " <> valueToSet, valueVals)

sub :: PersistField a => Mode -> SqlQuery (SqlExpr (Value a)) -> SqlExpr (Value a)
sub :: (SqlSelect (SqlExpr a) r) => Mode -> SqlQuery (SqlExpr a) -> SqlExpr a
sub mode query = ERaw noMeta $ \_ info -> first parens $ toRawSql mode info query

fromDBName :: IdentInfo -> DBName -> TLB.Builder
@@ -2521,7 +2518,7 @@
-- from 'unsafeSqlBinOp' applies to this function as well.
unsafeSqlFunction
:: UnsafeSqlFunctionArgument a
=> TLB.Builder -> a -> SqlExpr (Value b)
=> TLB.Builder -> a -> SqlExpr b
unsafeSqlFunction name arg =
ERaw noMeta $ \_ info ->
let (argsTLB, argsVals) =
@@ -2557,7 +2554,7 @@

-- | (Internal) An explicit SQL type cast using CAST(value as type).
-- See 'unsafeSqlBinOp' for warnings.
unsafeSqlCastAs :: T.Text -> SqlExpr (Value a) -> SqlExpr (Value b)
unsafeSqlCastAs :: T.Text -> SqlExpr a -> SqlExpr b
unsafeSqlCastAs t (ERaw _ f) = ERaw noMeta $ \_ -> ((first (\value -> "CAST" <> parens (value <> " AS " <> TLB.fromText t))) . f Never)

-- | (Internal) This class allows 'unsafeSqlFunction' to work with different
@@ -2576,8 +2573,8 @@
instance UnsafeSqlFunctionArgument () where
toArgList _ = []

instance (a ~ Value b) => UnsafeSqlFunctionArgument (SqlExpr a) where
toArgList = (:[]) . veryUnsafeCoerceSqlExprValue
instance UnsafeSqlFunctionArgument (SqlExpr a) where
toArgList = (:[]) . veryVeryUnsafeCoerceSqlExpr

instance UnsafeSqlFunctionArgument a => UnsafeSqlFunctionArgument [a] where
toArgList = concatMap toArgList
@@ -2684,18 +2681,22 @@
) => UnsafeSqlFunctionArgument (a, b, c, d, e, f, g, h, i, j) where
toArgList = toArgList . from10

-- | (Internal) Coerce a SqlExpr from any arbitrary a to any arbitrary b
-- You should /not/ use this function unless you know what you're doing!
veryVeryUnsafeCoerceSqlExpr :: SqlExpr a -> SqlExpr b
veryVeryUnsafeCoerceSqlExpr = coerce
Comment from Collaborator:
wait we have a Coercible instance on SqlExpr? That's no good :|
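
For context (not part of the PR): the type parameter of SqlExpr is phantom, so GHC derives Coercible between any two SqlExpr types, which is what makes coerce compile here. A nominal role annotation in the defining module would forbid that, at the cost of also forbidding the coerce-based helpers below; a sketch:

{-# LANGUAGE RoleAnnotations #-}
-- in the module that defines SqlExpr:
type role SqlExpr nominal
-- with this in place, 'coerce :: SqlExpr a -> SqlExpr b' no longer type-checks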


-- | (Internal) Coerce a value's type from 'SqlExpr (Value a)' to
-- 'SqlExpr (Value b)'. You should /not/ use this function
-- unless you know what you're doing!
veryUnsafeCoerceSqlExprValue :: SqlExpr (Value a) -> SqlExpr (Value b)
veryUnsafeCoerceSqlExprValue = coerce
veryUnsafeCoerceSqlExprValue = veryVeryUnsafeCoerceSqlExpr


-- | (Internal) Coerce a value's type from 'SqlExpr (ValueList
-- a)' to 'SqlExpr (Value a)'. Does not work with empty lists.
veryUnsafeCoerceSqlExprValueList :: SqlExpr (ValueList a) -> SqlExpr (Value a)
veryUnsafeCoerceSqlExprValueList = coerce
veryUnsafeCoerceSqlExprValueList = veryVeryUnsafeCoerceSqlExpr


----------------------------------------------------------------------
@@ -3381,7 +3382,7 @@
sqlSelectProcessRow pvs = Value <$> fromPersistValue (PersistList pvs)

-- | Materialize a @SqlExpr (Value a)@.
materializeExpr :: IdentInfo -> SqlExpr (Value a) -> (TLB.Builder, [PersistValue])
materializeExpr :: IdentInfo -> SqlExpr a -> (TLB.Builder, [PersistValue])
materializeExpr info (ERaw m f)
| Just fields <- sqlExprMetaCompositeFields m = (uncommas $ fmap parens $ fields info, [])
| Just alias <- sqlExprMetaAlias m
196 changes: 196 additions & 0 deletions src/Database/Esqueleto/Internal/JSON.hs
@@ -0,0 +1,196 @@
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE FunctionalDependencies #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TypeApplications #-}
{-# LANGUAGE TypeFamilies #-}
{-# LANGUAGE UndecidableInstances #-}
module Database.Esqueleto.Internal.JSON
where

import qualified Data.Aeson as Aeson
import Data.Bifunctor (first)
import qualified Data.ByteString.Lazy as LBS
import Data.Proxy (Proxy(..))
import Data.Text (Text)
import qualified Data.Text as Text
import Data.Text.Encoding (encodeUtf8)
import qualified Data.Text.Lazy.Builder as TLB
import Database.Esqueleto.Internal.Internal
import Database.Persist

class SqlToJson jsonValue a b | jsonValue a -> b where
toJson :: a -> SqlExpr (jsonValue b)

class JsonAgg jsonValue where
jsonAgg :: SqlExpr (jsonValue a) -> SqlExpr (jsonValue [a])

class JsonBuildArray jsonValue where
unsafeJsonbBuildArray :: UnsafeSqlFunctionArgument a => a -> SqlExpr (jsonValue b)

class JsonBuildObject jsonValue where
Comment from Contributor @Vlix (Mar 10, 2022):
Are the jsonValue type variables all here in these class definitions so this can later be ported over to the JSONB type, because this is considered experimental at the moment so you want to keep it separate?

Comment from Collaborator (author):
MySQL supports JSON as well. The intention is to add support in a db-agnostic manner, keeping as much as possible independent of db specifics.

unsafeJsonbBuildObject :: [(SqlExpr (Value Text), SqlExpr SomeValue)] -> SqlExpr (jsonValue a)
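
A hedged sketch of how the jsonValue parameter discussed in the thread above could later be instantiated per backend; the wrapper types here are hypothetical and not part of this diff (the string literals rely on OverloadedStrings):

newtype PgJSONB a = PgJSONB a      -- hypothetical PostgreSQL wrapper
newtype MySqlJSON a = MySqlJSON a  -- hypothetical MySQL wrapper

instance JsonAgg PgJSONB where
    jsonAgg = unsafeSqlFunction "jsonb_agg"     -- PostgreSQL's JSONB aggregate

instance JsonAgg MySqlJSON where
    jsonAgg = unsafeSqlFunction "json_arrayagg" -- MySQL's JSON aggregate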

multiset :: forall jsonValue a b r.
( Aeson.FromJSON b
, SqlToJson jsonValue a b
, JsonAgg jsonValue
, SqlSelect (SqlExpr (jsonValue [b])) r
)
=> SqlQuery a -> SqlExpr (jsonValue [b])
multiset q =
subSelectUnsafe $ jsonAgg . toJson <$> q
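
A rough usage sketch, assuming the new PostgreSQL.JSON.Experimental module wires these classes up to the existing JSONB wrapper and that an Author/Post schema with json-derived instances is in scope (all names below are assumptions, not code from this PR):

-- each author paired with all of their posts, aggregated to a JSON array
-- inside the database, so everything comes back in a single query
authorsWithPosts = select $ do
    author <- from $ table @Author
    let posts :: SqlExpr (JSONB [Entity Post])
        posts = multiset $ do
            post <- from $ table @Post
            where_ $ post ^. PostAuthorId ==. author ^. AuthorId
            pure post
    pure (author, posts)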

instance SqlToJson jsonValue (SqlExpr a) b
=> SqlToJson jsonValue (SqlExpr (Maybe a)) (Maybe b) where
toJson =
let unMaybe :: SqlExpr (Maybe a) -> SqlExpr a
unMaybe = veryVeryUnsafeCoerceSqlExpr
in veryVeryUnsafeCoerceSqlExpr . toJson @jsonValue . unMaybe


instance forall a jsonValue . (PersistEntity a, JsonBuildObject jsonValue)
=> SqlToJson jsonValue (SqlExpr (Entity a)) (Entity a) where
toJson ent =
unsafeJsonbBuildObject fields
where
ed = entityDef $ Proxy @a
baseFields = fmap (\fieldDef ->
( unsafeSqlValue $ TLB.fromText $ "'" <> unFieldNameHS (fieldHaskell fieldDef) <> "'"
Comment from Collaborator:
This doesn't seem right. We're using the Haskell names for the database representation of the Entity, which isn't going to be how the database wants to handle the JSON object.

This is one of the big problems I ran into with trying to have jsonb_agg on Entity a. The {To,From}JSON instances don't work out. So I ended up writing a newtype JSONViaPostgres a that would dig into the EntityDef and figure out how to decode the Postgres JSON. But that also requires that you have a more-or-less standardly defined JSON instance for the record in question.

Comment from Collaborator (author):
Does persistent support a way to read the to/from json field name given a field?

We can sidestep the issue of to/from by using a custom decoder. As long as we are consistent at the boundaries we could even avoid the need to specify JSON instances for the types.

Comment from Contributor:
Am I right when I say this SqlToJSON instance tries to put all the columns in their own field of a JSON object? (Just checking if I'm reading this right)

Comment from Collaborator (author):
Correct, the goal being to make something that automatically just works with the default JSON parser that gets generated.

Comment from Collaborator @belevy (author, Mar 14, 2022):
> which isn't going to be how the database wants to handle the JSON object.

@parsonsmatt I am not actually clear on what you mean by this? We aren't letting postgres do its default conversion.

Comment from Collaborator:
Yeah, I think I'm unsure that this is the right approach for converting something to a JSON representation.

I think I'd rather see something like:

newtype JsonEntity a = JsonEntity (Entity a)

instance PersistEntity a => FromJSON (JsonEntity a) where
    parseJSON = withObject "JsonEntity" $ \o -> do
        let edef = entityDef (Proxy @a)
        ...

which definitely parses how a database would convert things to JSON. Then we aren't having to worry about custom encoders or decoders.

persistent-2.14's tabulateEntityA could be useful here, but we may also need the FieldDict class from prairie to make it work how we want.

Comment from Collaborator (author):
This could work, sure. If we used a more general newtype SqlJson a = SqlJson a we could have it default to the underlying FromJSON and avoid surfacing the actual newtype to the end user. The Entity instance could then be defined as overlapping that default.
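
A rough sketch of that idea (the names are hypothetical, nothing here is in the PR):

{-# LANGUAGE FlexibleInstances #-}
import qualified Data.Aeson as Aeson

newtype SqlJson a = SqlJson a

-- default: reuse whatever FromJSON instance 'a' already has
instance {-# OVERLAPPABLE #-} Aeson.FromJSON a => Aeson.FromJSON (SqlJson a) where
    parseJSON = fmap SqlJson . Aeson.parseJSON

-- the Entity case would then be the more specific, overlapping instance
-- that knows how the query rendered the row (the "id" key, field names, ...):
-- instance PersistEntity a => Aeson.FromJSON (SqlJson (Entity a)) where ...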

Comment from Collaborator (author):
So there isn't really a great way to do this in general without that FieldDict class since we need to know that the value on the other side has a FromJSON dictionary.

Is there an example of someone using an exotic JSON parser? The thing I need the default parser for is the underlying record, not the Entity record, since entityIdFromJSON delegates to the FromJSON instance on the record.

Comment from Collaborator (author):
It should be noted that using the QQ's json tag will actually only allow you to configure the outer encoder/decoder; the record itself is always just haskellName: someValue.

Comment from Collaborator:
I don't think it even has to be an exotic parser, just a different set of options. Like,

mkPersist sqlSettings [persistLowerCase|
    User
        name Text
        age Int
|]

deriveJSON defaultOptions ''User

This would expect the JSON record to have the type name prefixes. I don't think the haskellName has those. Any other modification to those options would break parsing the records from the database.

So I think we do need a newtype with a custom FromJSON instance, and we also need the FieldDict class. I'll see about re-exporting that from persistent and generating those instances.

, ERaw noMeta $ \_ info -> (viewFieldBuilder ent info fieldDef, [])
)) (getEntityFields ed)
idField = fmap (\fieldDef ->
( unsafeSqlValue "'id'"
Comment from Collaborator:
This isn't necessarily proper, as you can specify the primary column name:

User
    Id UUID sql="my_user_id"

Comment from Collaborator (author):
The actual field name in the SQL is irrelevant; it's the representation that FromJSON expects that matters. Entity assumes a key of "id".
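
For illustration (hypothetical User record with name and age fields, not from this PR): the default Entity decoding expects an object shaped like { "id": 42, "name": "alice", "age": 30 }, i.e. the record's own JSON fields plus an "id" key, which is exactly what the builder above assembles.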

Comment from Contributor @Vlix (Mar 10, 2022):
Is it not possible to use the database names? 🤔 To me that sounds more robust/portable.

Though I guess if it only happens within one query, it doesn't really matter.

EDIT: Oh, that might also make it possible to get back easily using jsonb_to_record

Comment from Collaborator (author):
Indeed the db names could be used and a custom JSON decoder could be used to extract the fields.

, ERaw noMeta $ \_ info -> (viewFieldBuilder ent info fieldDef, [])
)) (getEntityIdField ed)

fields = maybe baseFields (:baseFields) idField


instance ( SqlToJson jsonValue a a'
, SqlToJson jsonValue b b'
, JsonBuildArray jsonValue
)
=> SqlToJson jsonValue (a, b) (a', b') where
toJson (a, b) =
unsafeJsonbBuildArray
( toJson @jsonValue a
, toJson @jsonValue b
)

instance ( SqlToJson jsonValue a a'
, SqlToJson jsonValue b b'
, SqlToJson jsonValue c c'
, JsonBuildArray jsonValue
)
=> SqlToJson jsonValue (a, b, c) (a', b', c') where
toJson (a, b, c) =
unsafeJsonbBuildArray
( toJson @jsonValue a
, toJson @jsonValue b
, toJson @jsonValue c
)

instance ( SqlToJson jsonValue a a'
, SqlToJson jsonValue b b'
, SqlToJson jsonValue c c'
, SqlToJson jsonValue d d'
, JsonBuildArray jsonValue
)
=> SqlToJson jsonValue (a, b, c, d) (a', b', c', d') where
toJson (a, b, c, d) =
unsafeJsonbBuildArray
( toJson @jsonValue a
, toJson @jsonValue b
, toJson @jsonValue c
, toJson @jsonValue d
)
instance ( SqlToJson jsonValue a a'
, SqlToJson jsonValue b b'
, SqlToJson jsonValue c c'
, SqlToJson jsonValue d d'
, SqlToJson jsonValue e e'
, JsonBuildArray jsonValue
)
=> SqlToJson jsonValue (a, b, c, d, e) (a', b', c', d', e') where
toJson (a, b, c, d, e) =
unsafeJsonbBuildArray
( toJson @jsonValue a
, toJson @jsonValue b
, toJson @jsonValue c
, toJson @jsonValue d
, toJson @jsonValue e
)
instance ( SqlToJson jsonValue a a'
, SqlToJson jsonValue b b'
, SqlToJson jsonValue c c'
, SqlToJson jsonValue d d'
, SqlToJson jsonValue e e'
, SqlToJson jsonValue f f'
, JsonBuildArray jsonValue
)
=> SqlToJson jsonValue (a, b, c, d, e, f) (a', b', c', d', e', f') where
toJson (a, b, c, d, e, f) =
unsafeJsonbBuildArray
( toJson @jsonValue a
, toJson @jsonValue b
, toJson @jsonValue c
, toJson @jsonValue d
, toJson @jsonValue e
, toJson @jsonValue f
)
instance ( SqlToJson jsonValue a a'
, SqlToJson jsonValue b b'
, SqlToJson jsonValue c c'
, SqlToJson jsonValue d d'
, SqlToJson jsonValue e e'
, SqlToJson jsonValue f f'
, SqlToJson jsonValue g g'
, JsonBuildArray jsonValue
)
=> SqlToJson jsonValue (a, b, c, d, e, f, g) (a', b', c', d', e', f', g') where
toJson (a, b, c, d, e, f, g) =
unsafeJsonbBuildArray
( toJson @jsonValue a
, toJson @jsonValue b
, toJson @jsonValue c
, toJson @jsonValue d
, toJson @jsonValue e
, toJson @jsonValue f
, toJson @jsonValue g
)
instance ( SqlToJson jsonValue a a'
, SqlToJson jsonValue b b'
, SqlToJson jsonValue c c'
, SqlToJson jsonValue d d'
, SqlToJson jsonValue e e'
, SqlToJson jsonValue f f'
, SqlToJson jsonValue g g'
, SqlToJson jsonValue h h'
, JsonBuildArray jsonValue
)
=> SqlToJson jsonValue (a, b, c, d, e, f, g, h) (a', b', c', d', e', f', g', h') where
toJson (a, b, c, d, e, f, g, h) =
unsafeJsonbBuildArray
( toJson @jsonValue a
, toJson @jsonValue b
, toJson @jsonValue c
, toJson @jsonValue d
, toJson @jsonValue e
, toJson @jsonValue f
, toJson @jsonValue g
, toJson @jsonValue h
)

sqlSelectProcessRowJSON :: (Applicative f, Aeson.FromJSON r)
=> [PersistValue] -> Either Text (f r)
sqlSelectProcessRowJSON [PersistByteString bs] =
case Aeson.eitherDecode $ LBS.fromStrict bs of
Comment from Contributor:
No need to import Data.ByteString.Lazy, aeson's got your back:

either Text.pack pure $
    Aeson.eitherDecodeStrict bs

Right r -> Right $ pure r
Left e -> Left $ Text.pack e
sqlSelectProcessRowJSON [PersistText t] =
first (<> (" " <> t)) $ sqlSelectProcessRowJSON [PersistByteString (encodeUtf8 t)]

sqlSelectProcessRowJSON _ = Left "Expected ByteString but database returned unexpected type"
