Add support for OFFSET #451

Merged: 32 commits, merged on Oct 12, 2021

Commits (32)
38f01c6 Changed adding default timezone offset to TIME WITH TIME ZONE from pa… (lziq, Sep 22, 2021)
ea06c69 Remove unnecessary files (lziq, Sep 23, 2021)
9728308 Modified comments (lziq, Sep 23, 2021)
49d23b1 Fix issue 410 (lziq, Sep 27, 2021)
b86a559 Fix issue 410 (lziq, Sep 27, 2021)
82147af Provide an option in EvaluationSession to configure default timezone. (lziq, Sep 28, 2021)
dab08fc Added curly braces for line 2419 in SqlParser (lziq, Sep 28, 2021)
2bd6c56 Added "DataTimeFormatter" qualifier for line 2420 and 2423 in SqlParser (lziq, Sep 28, 2021)
a407439 Removed git merge conflict in ask.kt (lziq, Sep 28, 2021)
1fe1b27 Removed git merge conflict in ask.kt (lziq, Sep 28, 2021)
27e167d Changed code style (lziq, Sep 28, 2021)
5297a5e Changed code style (lziq, Sep 28, 2021)
d5bfa8b Changed code style (lziq, Sep 28, 2021)
6580491 Changed code style (lziq, Sep 28, 2021)
0e5d497 Changed code style (lziq, Sep 28, 2021)
fb754b3 Changed code style (lziq, Sep 28, 2021)
572acc8 Added more test cases for time evaluation (lziq, Sep 28, 2021)
27101f4 Changed new field type from Environment to EvaluationSession for `Exp… (lziq, Sep 28, 2021)
3353c93 Update lang/test/org/partiql/lang/eval/EvaluatingCompilerCastTest.kt (lziq, Sep 28, 2021)
5882c2e Update lang/test/org/partiql/lang/eval/EvaluatingCompilerDateTimeTest… (lziq, Sep 28, 2021)
9748e44 Removed unused package import (lziq, Sep 29, 2021)
ff122ee Provided kdoc for `defaultTimezoneOffset` in `EvaluationSession` and … (lziq, Sep 29, 2021)
919f725 Made changes to `settingDefaultTimezoneOffset` in `EvaluationSessionT… (lziq, Sep 29, 2021)
ba59139 Merge branch 'configure-default-timezone' of https://github.com/parti… (lziq, Sep 29, 2021)
f92ee58 Created helper function for default timezone offset cast test cases. (lziq, Sep 29, 2021)
b7ee8bf Created helper function for default timezone offset time evaluation t… (lziq, Sep 29, 2021)
6740ce1 Merge branch 'main' of https://github.com/partiql/partiql-lang-kotlin… (lziq, Sep 29, 2021)
fded498 Added OFFSET in PIG domain. (lziq, Sep 30, 2021)
7ad23ee Reverse the order of limit and offset for visitors. (lziq, Sep 30, 2021)
1523c50 Modified comments for EvaluationOrder in VisitorTransformBase. (lziq, Sep 30, 2021)
96f781f Add offset to parser and ExprNode conversions (#453) (lziq, Oct 6, 2021)
ebc5977 Added OFFSET in evaluator (#455) (lziq, Oct 12, 2021)
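
For context, here is a minimal end-to-end sketch of what this PR enables, written against the public CompilerPipeline API. It is not part of the diff below; the query and data are illustrative, and the expected output assumes the list's source order is preserved by the engine.

import com.amazon.ion.system.IonSystemBuilder
import org.partiql.lang.CompilerPipeline
import org.partiql.lang.eval.EvaluationSession

fun main() {
    val pipeline = CompilerPipeline.standard(IonSystemBuilder.standard().build())

    // OFFSET skips the first two bindings, then LIMIT keeps at most two of what remains.
    val expression = pipeline.compile("SELECT v FROM [1, 2, 3, 4, 5] AS v LIMIT 2 OFFSET 2")
    val result = expression.eval(EvaluationSession.standard())

    // Expected: a bag equivalent to <<{'v': 3}, {'v': 4}>>
    println(result.ionValue)
}
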
7 changes: 5 additions & 2 deletions docs/dev/RewriterToVisitorTransformGuide.md
@@ -78,11 +78,11 @@ In the rewriter code, you may encounter [`innerRewriteSelect`](https://github.co

However, currently there are some slight differences between the order of transforms in `innerRewriteSelect` and `transformExprSelect`. `innerRewriteSelect` follows the traversal order (SQL semantic order) that is

FROM → (FROM LET) → (WHERE) → (GROUP BY) → (HAVING) → PROJECTION → (LIMIT)
FROM → (FROM LET) → (WHERE) → (GROUP BY) → (HAVING) → PROJECTION → (OFFSET) → (LIMIT)

`transformExprSelect` transforms the clauses in the order they are written for a PartiQL query, that is

PROJECTION → FROM → (FROM LET) → (WHERE) → (GROUP BY) → (HAVING) → (LIMIT)
PROJECTION → FROM → (FROM LET) → (WHERE) → (GROUP BY) → (HAVING) → (LIMIT) → (OFFSET)

This slight difference can lead to different behaviors when translating rewriters that use `innerRewriteSelect` (e.g. StaticTypeRewriter). To avoid this, have your VisitorTransform implement [`VisitorTransformBase`](https://github.com/partiql/partiql-lang-kotlin/blob/master/lang/src/org/partiql/lang/eval/visitors/VisitorTransformBase.kt) and call `transformExprSelectEvaluationOrder` rather than `transformExprSelect` to get the same behavior.

@@ -97,6 +97,7 @@ fun transformExprSelectEvaluationOrder(node: PartiqlAst.Expr.Select): PartiqlAst
val having = transformExprSelect_having(node)
val setq = transformExprSelect_setq(node)
val project = transformExprSelect_project(node)
val offset = transformExprSelect_offset(node)
val limit = transformExprSelect_limit(node)
val metas = transformExprSelect_metas(node)
return PartiqlAst.build {
@@ -109,6 +110,7 @@ fun transformExprSelectEvaluationOrder(node: PartiqlAst.Expr.Select): PartiqlAst
group = group,
having = having,
limit = limit,
offset = offset,
metas = metas)
}
}
@@ -158,6 +160,7 @@ private fun copyProjectionToSelect(node: PartiqlAst.Expr.Select, newProjection:
group = node.group,
having = node.having,
limit = node.limit,
offset = node.offset,
metas = node.metas)
}
}
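
To make the guidance above concrete, here is a hedged sketch (not taken from this PR) of a transform that opts into the evaluation-order traversal, so that clauses, now including OFFSET, are visited in SQL semantic order; `MyEvaluationOrderTransform` is a hypothetical name.

import org.partiql.lang.domains.PartiqlAst
import org.partiql.lang.eval.visitors.VisitorTransformBase

class MyEvaluationOrderTransform : VisitorTransformBase() {
    // Route every SELECT through the evaluation-order variant instead of the written-order default.
    override fun transformExprSelect(node: PartiqlAst.Expr.Select): PartiqlAst.Expr =
        transformExprSelectEvaluationOrder(node)
}
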
3 changes: 2 additions & 1 deletion lang/resources/org/partiql/type-domains/partiql.ion
@@ -121,7 +121,8 @@
(group (? group_by))
(having (? expr))
(order (? order_by))
(limit (? expr))))
(limit (? expr))
(offset (? expr))))
// end of sum expr

// Time
7 changes: 6 additions & 1 deletion lang/src/org/partiql/lang/ast/AstSerialization.kt
@@ -207,7 +207,12 @@ private class AstSerializerImpl(val astVersion: AstVersion, val ion: IonSystem):
}

private fun IonWriterContext.writeSelect(expr: Select) {
val (setQuantifier, projection, from, fromLet, where, groupBy, having, orderBy, limit, _: MetaContainer) = expr
val (setQuantifier, projection, from, fromLet, where, groupBy, having, orderBy, limit, offset, _: MetaContainer) = expr

if (offset != null) {
throw UnsupportedOperationException("OFFSET clause is not supported by the V0 AST")
}

if (orderBy != null) {
throw UnsupportedOperationException("ORDER BY clause is not supported by the V0 AST")
}
1 change: 1 addition & 0 deletions lang/src/org/partiql/lang/ast/ExprNodeToStatement.kt
@@ -160,6 +160,7 @@ fun ExprNode.toAstExpr(): PartiqlAst.Expr {
group = node.groupBy?.toAstGroupSpec(),
having = node.having?.toAstExpr(),
limit = node.limit?.toAstExpr(),
offset = node.offset?.toAstExpr(),
metas = metas)
is Struct -> struct(node.fields.map { exprPair(it.name.toAstExpr(), it.expr.toAstExpr()) }, metas)
is Seq ->
1 change: 1 addition & 0 deletions lang/src/org/partiql/lang/ast/StatementToExprNode.kt
@@ -190,6 +190,7 @@ private class StatementTransformer(val ion: IonSystem) {
having = having?.toExprNode(),
orderBy = order?.toOrderBy(),
limit = limit?.toExprNode(),
offset = offset?.toExprNode(),
metas = metas
)
is Expr.Date ->
3 changes: 2 additions & 1 deletion lang/src/org/partiql/lang/ast/ast.kt
@@ -440,9 +440,10 @@ data class Select(
val having: ExprNode? = null,
val orderBy: OrderBy? = null,
val limit: ExprNode? = null,
val offset: ExprNode? = null,
override val metas: MetaContainer
) : ExprNode() {
override val children: List<AstNode> = listOfNotNull(projection, from, fromLet, where, groupBy, having, orderBy, limit)
override val children: List<AstNode> = listOfNotNull(projection, from, fromLet, where, groupBy, having, orderBy, limit, offset)
}

//********************************
7 changes: 6 additions & 1 deletion lang/src/org/partiql/lang/ast/passes/AstRewriterBase.kt
@@ -154,7 +154,8 @@ open class AstRewriterBase : AstRewriter {
* 5. `HAVING`
* 6. *projection*
* 7. `ORDER BY` (to be implemented)
* 8. `LIMIT`
* 8. `OFFSET`
* 9. `LIMIT`
*/
protected open fun innerRewriteSelect(selectExpr: Select): Select {
val from = rewriteFromSource(selectExpr.from)
@@ -164,6 +165,7 @@
val having = selectExpr.having?.let { rewriteSelectHaving(it) }
val projection = rewriteSelectProjection(selectExpr.projection)
val orderBy = selectExpr.orderBy?.let { rewriteOrderBy(it) }
val offset = selectExpr.offset?.let { rewriteSelectOffset(it) }
val limit = selectExpr.limit?.let { rewriteSelectLimit(it) }
val metas = rewriteSelectMetas(selectExpr)

@@ -177,6 +179,7 @@
having = having,
orderBy = orderBy,
limit = limit,
offset = offset,
metas = metas)
}

@@ -186,6 +189,8 @@

open fun rewriteSelectLimit(node: ExprNode): ExprNode = rewriteExprNode(node)

open fun rewriteSelectOffset(node: ExprNode): ExprNode = rewriteExprNode(node)

open fun rewriteSelectMetas(selectExpr: Select): MetaContainer = rewriteMetas(selectExpr)

open fun rewriteSelectProjection(projection: SelectProjection): SelectProjection =
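
As an illustration of the new extension point added above, a hypothetical ExprNode rewriter could override `rewriteSelectOffset`; `OffsetLoggingRewriter` is a sketch, not part of the PR.

import org.partiql.lang.ast.ExprNode
import org.partiql.lang.ast.passes.AstRewriterBase

class OffsetLoggingRewriter : AstRewriterBase() {
    // Observe the OFFSET expression before delegating to the default rewrite.
    override fun rewriteSelectOffset(node: ExprNode): ExprNode {
        println("rewriting OFFSET expression: $node")
        return super.rewriteSelectOffset(node)
    }
}
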
3 changes: 2 additions & 1 deletion lang/src/org/partiql/lang/ast/passes/AstWalker.kt
@@ -88,7 +88,7 @@ open class AstWalker(private val visitor: AstVisitor) {
}
}
is Select -> case {
val (_, projection, from, fromLet, where, groupBy, having, orderBy, limit, _: MetaContainer) = expr
val (_, projection, from, fromLet, where, groupBy, having, orderBy, limit, offset, _: MetaContainer) = expr
walkSelectProjection(projection)
walkFromSource(from)
walkExprNode(where)
@@ -107,6 +107,7 @@
}
}
walkExprNode(limit)
walkExprNode(offset)
}
is DataManipulation -> case {
val (dmlOperation, from, where, returning, _: MetaContainer) = expr
13 changes: 13 additions & 0 deletions lang/src/org/partiql/lang/errors/ErrorCode.kt
@@ -654,6 +654,19 @@ enum class ErrorCode(private val category: ErrorCategory,
LOCATION,
"LIMIT must not be negative"),

EVALUATOR_NON_INT_OFFSET_VALUE (
ErrorCategory.EVALUATOR,
LOCATION + setOf(Property.ACTUAL_TYPE),
"") {
override fun getErrorMessage(errorContext: PropertyValueMap?): String =
"OFFSET value must be an integer but found ${errorContext.getProperty(Property.ACTUAL_TYPE)}"
},

EVALUATOR_NEGATIVE_OFFSET(
ErrorCategory.EVALUATOR,
LOCATION,
"OFFSET must not be negative"),

EVALUATOR_DIVIDE_BY_ZERO(
ErrorCategory.EVALUATOR,
LOCATION,
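
A hedged sketch of queries that should surface the two new error codes; the pipeline setup mirrors the earlier sketch and the queries themselves are illustrative only.

import com.amazon.ion.system.IonSystemBuilder
import org.partiql.lang.CompilerPipeline
import org.partiql.lang.eval.EvaluationException
import org.partiql.lang.eval.EvaluationSession

fun main() {
    val pipeline = CompilerPipeline.standard(IonSystemBuilder.standard().build())
    val session = EvaluationSession.standard()

    // Non-integer OFFSET: expected to fail with EVALUATOR_NON_INT_OFFSET_VALUE.
    try {
        pipeline.compile("SELECT v FROM [1, 2, 3] AS v LIMIT 1 OFFSET '1'").eval(session)
    } catch (e: EvaluationException) {
        println(e.errorCode)
    }

    // Negative OFFSET: expected to fail with EVALUATOR_NEGATIVE_OFFSET.
    try {
        pipeline.compile("SELECT v FROM [1, 2, 3] AS v LIMIT 1 OFFSET -1").eval(session)
    } catch (e: EvaluationException) {
        println(e.errorCode)
    }
}
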
87 changes: 58 additions & 29 deletions lang/src/org/partiql/lang/eval/EvaluatingCompiler.kt
@@ -910,7 +910,7 @@ internal class EvaluatingCompiler(
private fun evalLimit(limitThunk: ThunkEnv, env: Environment, limitLocationMeta: SourceLocationMeta?): Long {
val limitExprValue = limitThunk(env)

if(limitExprValue.type != ExprValueType.INT) {
if (limitExprValue.type != ExprValueType.INT) {
err("LIMIT value was not an integer",
ErrorCode.EVALUATOR_NON_INT_LIMIT_VALUE,
errorContextFrom(limitLocationMeta).also {
@@ -927,7 +927,7 @@
// We throw an exception here if the value exceeds the supported range (say if we change that
// restriction or if a custom [ExprValue] is provided which exceeds that value).
val limitIonValue = limitExprValue.ionValue as IonInt
if(limitIonValue.integerSize == IntegerSize.BIG_INTEGER) {
if (limitIonValue.integerSize == IntegerSize.BIG_INTEGER) {
err("IntegerSize.BIG_INTEGER not supported for LIMIT values",
errorContextFrom(limitLocationMeta),
internal = true)
@@ -947,6 +947,43 @@
return limitValue
}

private fun evalOffset(offsetThunk: ThunkEnv, env: Environment, offsetLocationMeta: SourceLocationMeta?): Long {
val offsetExprValue = offsetThunk(env)

if (offsetExprValue.type != ExprValueType.INT) {
err("OFFSET value was not an integer",
ErrorCode.EVALUATOR_NON_INT_OFFSET_VALUE,
errorContextFrom(offsetLocationMeta).also {
it[Property.ACTUAL_TYPE] = offsetExprValue.type.toString()
},
internal = false)
}

// `Number.toLong()` (used below) does *not* cause an overflow exception if the underlying [Number]
// implementation (i.e. Decimal or BigInteger) exceeds the range that can be represented by Longs.
// This can cause very confusing behavior if the user specifies an OFFSET value that exceeds
// Long.MAX_VALUE, because no results will be returned from their query. That no overflow exception
// is thrown is not a problem as long as PartiQL's restriction of integer values to +/- 2^63 remains.
// We throw an exception here if the value exceeds the supported range (say if we change that
// restriction or if a custom [ExprValue] is provided which exceeds that value).
val offsetIonValue = offsetExprValue.ionValue as IonInt
if (offsetIonValue.integerSize == IntegerSize.BIG_INTEGER) {
err("IntegerSize.BIG_INTEGER not supported for OFFSET values",
errorContextFrom(offsetLocationMeta),
internal = true)
}

val offsetValue = offsetExprValue.numberValue().toLong()

if (offsetValue < 0) {
err("negative OFFSET",
ErrorCode.EVALUATOR_NEGATIVE_OFFSET,
errorContextFrom(offsetLocationMeta),
internal = false)
}

return offsetValue
}

private fun compileSelect(selectExpr: Select): ThunkEnv {
selectExpr.orderBy?.let {
@@ -980,15 +1017,28 @@
.union(pigGeneratedAst.fromLet?.let { fold.walkLet(pigGeneratedAst.fromLet, emptySet()) } ?: emptySet())

return nestCompilationContext(ExpressionContext.NORMAL, emptySet()) {
val (setQuantifier, projection, from, fromLet, _, groupBy, having, _, limit, metas: MetaContainer) = selectExpr
val (setQuantifier, projection, from, fromLet, _, groupBy, having, _, limit, offset, metas: MetaContainer) = selectExpr

val fromSourceThunks = compileFromSources(from)
val letSourceThunks = fromLet?.let { compileLetSources(it) }
val sourceThunks = compileQueryWithoutProjection(selectExpr, fromSourceThunks, letSourceThunks)

val limitThunk = limit?.let { compileExprNode(limit) }
val offsetThunk = offset?.let { compileExprNode(it) }
val offsetLocationMeta = offset?.metas?.sourceLocationMeta
val limitThunk = limit?.let { compileExprNode(it) }
val limitLocationMeta = limit?.metas?.sourceLocationMeta

fun <T> rowsWithOffsetAndLimit(rows: Sequence<T>, env: Environment): Sequence<T> {
val rowsWithOffset = when (offsetThunk) {
null -> rows
else -> rows.drop(evalOffset(offsetThunk, env, offsetLocationMeta))
}
return when (limitThunk) {
null -> rowsWithOffset
else -> rowsWithOffset.take(evalLimit(limitThunk, env, limitLocationMeta))
}
}

// Returns a thunk that invokes [sourceThunks], and invokes [projectionThunk] to perform the projection.
fun getQueryThunk(selectProjectionThunk: ThunkEnvValue<List<ExprValue>>): ThunkEnv {
val (_, groupByItems, groupAsName) = groupBy ?: GroupBy(GroupingStrategy.FULL, listOf())
@@ -1009,12 +1059,7 @@
// wrap the ExprValue to use ExprValue.equals as the equality
SetQuantifier.DISTINCT -> projectedRows.filter(createUniqueExprValueFilter())
SetQuantifier.ALL -> projectedRows
}.let { rows ->
when (limitThunk) {
null -> rows
else -> rows.take(evalLimit(limitThunk, env, limitLocationMeta))
}
}
}.let { rowsWithOffsetAndLimit(it, env) }

valueFactory.newBag(quantifiedRows.map {
// TODO make this expose the ordinal for ordered sequences
@@ -1056,12 +1101,7 @@
// Create a closure that groups all the rows in the FROM source into a single group.
thunkFactory.thunkEnv(metas) { env ->
// Evaluate the FROM clause
val fromProductions: Sequence<FromProduction> = sourceThunks(env).let { rows ->
when (limitThunk) {
null -> rows
else -> rows.take(evalLimit(limitThunk, env, limitLocationMeta))
}
}
val fromProductions: Sequence<FromProduction> = rowsWithOffsetAndLimit(sourceThunks(env), env)
val registers = createRegisterBank()

// note: the group key can be anything here because we only ever have a single
@@ -1144,12 +1184,7 @@
val projectedRows = env.groups.mapNotNull { g ->
val groupByEnv = getGroupEnv(env, g.value)
filterHavingAndProject(groupByEnv, g.value)
}.asSequence().let { rows ->
when (limitThunk) {
null -> rows
else -> rows.take(evalLimit(limitThunk, env, limitLocationMeta))
}
}
}.asSequence().let { rowsWithOffsetAndLimit(it, env) }

valueFactory.newBag(projectedRows)
}
@@ -1177,13 +1212,7 @@
val asThunk = compileExprNode(asExpr)
val atThunk = compileExprNode(atExpr)
thunkFactory.thunkEnv(metas) { env ->
val sourceValue = sourceThunks(env).asSequence().let { rows ->
when (limitThunk) {
null -> rows
else -> rows.take(evalLimit(limitThunk, env, limitLocationMeta))
}
}

val sourceValue = rowsWithOffsetAndLimit(sourceThunks(env).asSequence(), env)
val seq = sourceValue
.map { (_, env) -> Pair(asThunk(env), atThunk(env)) }
.filter { (name, _) -> name.type.isText }
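
A small standalone sketch of the semantics `rowsWithOffsetAndLimit` gives the evaluator above: OFFSET drops rows first, then LIMIT caps what remains. Values are illustrative.

fun main() {
    val rows = sequenceOf("r1", "r2", "r3", "r4", "r5")

    // Equivalent to `... LIMIT 2 OFFSET 3`: skip three rows, then keep at most two.
    val offsetThenLimit = rows.drop(3).take(2).toList()
    println(offsetThenLimit) // prints [r4, r5]
}
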
lang/src/org/partiql/lang/eval/visitors/GroupByPathExpressionVisitorTransform.kt
@@ -103,6 +103,7 @@ class GroupByPathExpressionVisitorTransform(
val groupBy = unshadowedTransformer.transformExprSelect_group(node)
val having = currentAndUnshadowedTransformer.transformExprSelect_having(node)
val order = currentAndUnshadowedTransformer.transformExprSelect_order(node)
val offset = unshadowedTransformer.transformExprSelect_offset(node)
val limit = unshadowedTransformer.transformExprSelect_limit(node)
val metas = unshadowedTransformer.transformExprSelect_metas(node)

@@ -116,6 +117,7 @@
group = groupBy,
having = having,
order = order,
offset = offset,
limit = limit,
metas = metas)
}
lang/src/org/partiql/lang/eval/visitors/SelectStarVisitorTransform.kt
@@ -24,6 +24,7 @@ class SelectStarVisitorTransform : VisitorTransformBase() {
having = node.having,
order = node.order,
limit = node.limit,
offset = node.offset,
metas = node.metas)
}
}
lang/src/org/partiql/lang/eval/visitors/VisitorTransformBase.kt
@@ -27,8 +27,9 @@ abstract class VisitorTransformBase : PartiqlAst.VisitorTransform() {
* 4. `GROUP BY`
* 5. `HAVING`
* 6. *projection*
* 7. `LIMIT`
* 8. The metas.
* 7. `OFFSET`
* 8. `LIMIT`
* 9. The metas.
*
* This differs from [transformExprSelect], which executes following the written order of clauses.
*/
@@ -41,6 +42,7 @@
val setq = transformExprSelect_setq(node)
val project = transformExprSelect_project(node)
val order = transformExprSelect_order(node)
val offset = transformExprSelect_offset(node)
val limit = transformExprSelect_limit(node)
val metas = transformExprSelect_metas(node)
return PartiqlAst.build {
@@ -53,6 +55,7 @@
group = group,
having = having,
order = order,
offset = offset,
limit = limit,
metas = metas)
}
1 change: 1 addition & 0 deletions lang/src/org/partiql/lang/syntax/LexerConstants.kt
@@ -258,6 +258,7 @@ internal val DATE_PART_KEYWORDS: Set<String> = DatePart.values()
"pivot",
"unpivot",
"limit",
"offset",
"tuple",
"remove",
"index",