Description
What is the problem the feature request solves?
Note: This issue was generated with AI assistance. The specification details have been extracted from Spark documentation and may need verification.
Comet does not currently support the Spark array_position function, causing queries using this function to fall back to Spark's JVM execution instead of running natively on DataFusion.
The ArrayPosition expression finds the first occurrence of a specified element within an array and returns its 1-based position, or 0 if the element is not found in the array.
Supporting this expression would allow more Spark workloads to benefit from Comet's native acceleration.
Describe the potential solution
Spark Specification
Syntax:
array_position(array, element)

// DataFrame API usage
import org.apache.spark.sql.functions._
df.select(array_position(col("array_column"), lit(value)))

Arguments:
| Argument | Type | Description |
|---|---|---|
| array | ArrayType | The input array to search within |
| element | Any (matching array element type) | The element to find within the array |
Return Type: LongType - Returns a 1-based position index as a long integer, or 0 if the element is not found.
Supported Data Types:
- Array element types must be orderable (support comparison operations)
- The search element type must be compatible with the array's element type through type coercion
- Null types are explicitly rejected and will cause a type mismatch error
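As a quick illustration of the coercion rule, an Int literal can be searched for in an array&lt;bigint&gt; column because the literal is implicitly cast to the array's element type. The sketch below is minimal and self-contained; the column name and session setup are made up for illustration:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().master("local[1]").appName("array_position_coercion").getOrCreate()
import spark.implicits._

// "values" is array<bigint>; the Int literal 2 is implicitly cast to bigint,
// so the element is found and its 1-based position (2) is returned.
val df = Seq(Seq(1L, 2L, 3L)).toDF("values")
df.select(array_position($"values", lit(2)).as("pos")).show()
// prints a single row with pos = 2
```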
Edge Cases:
- Null array input: Returns null due to nullIntolerant = true
- Null search element: Returns null due to nullIntolerant = true
- Null elements in array: Skipped during comparison; they never match the search element
- Empty array: Returns 0 (no elements to match)
- Element type mismatch: Analysis-time error with detailed type mismatch information
- Multiple occurrences: Only returns the position of the first occurrence
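The two nullIntolerant cases above are not exercised by the SQL examples below, so here is a small sketch of them through spark.sql (typed NULLs via CAST, since an untyped NULL literal is rejected at analysis time):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[1]").appName("array_position_nulls").getOrCreate()

// Null array input: the whole expression evaluates to NULL.
spark.sql("SELECT array_position(CAST(NULL AS ARRAY<INT>), 1)").show()

// Null search element: also NULL, even though the array is non-null.
spark.sql("SELECT array_position(array(1, 2, 3), CAST(NULL AS INT))").show()
```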
Examples:
-- Find position of element in array
SELECT array_position(array(312, 773, 708, 708), 414);
-- Returns: 0
SELECT array_position(array(312, 773, 708, 708), 773);
-- Returns: 2
SELECT array_position(array('a', 'b', 'c', 'b'), 'b');
-- Returns: 2 (first occurrence)
-- With null values
SELECT array_position(array(1, null, 3, null), 3);
-- Returns: 3

// DataFrame API examples
import org.apache.spark.sql.functions._
// Find position of specific value
df.select(array_position(col("numbers"), lit(42)))
// Find position with column reference
df.select(array_position(col("items"), col("search_value")))
// Using in filter conditions
df.filter(array_position(col("tags"), lit("important")) > 0)

Implementation Approach
See the Comet guide on adding new expressions for detailed instructions.
- Scala Serde: Add an expression handler in spark/src/main/scala/org/apache/comet/serde/ (a shape-only sketch follows this list)
- Register: Add to the appropriate map in QueryPlanSerde.scala
- Protobuf: Add a message type in native/proto/src/proto/expr.proto if needed
- Rust: Implement in native/spark-expr/src/ (check whether DataFusion has built-in support first)
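For the Scala serde step, handlers for array expressions generally pattern-match on the Catalyst expression, serialize its children, and wrap them in a named scalar-function call, returning None when conversion is not possible so that Comet falls back to Spark. The sketch below only illustrates that shape under stated assumptions: ProtoExpr, exprToProto, and scalarFunctionExprToProto are local stand-ins, not Comet's actual types or helper names, which should be taken from QueryPlanSerde.scala and the existing array-expression handlers.

```scala
// Shape-only sketch: ProtoExpr, exprToProto and scalarFunctionExprToProto are local
// stand-ins, NOT Comet's real serde types/helpers.
import org.apache.spark.sql.catalyst.expressions.{ArrayPosition, Attribute, Expression}

object ArrayPositionSerdeSketch {

  // Stand-in for the protobuf Expr message that the real serde layer produces.
  final case class ProtoExpr(description: String)

  // Stand-in for Comet's generic child-expression serializer.
  def exprToProto(e: Expression, inputs: Seq[Attribute], binding: Boolean): Option[ProtoExpr] =
    Some(ProtoExpr(e.sql))

  // Stand-in for a helper that wraps serialized children in a named scalar-function call.
  def scalarFunctionExprToProto(name: String, args: Option[ProtoExpr]*): Option[ProtoExpr] =
    if (args.forall(_.isDefined)) {
      Some(ProtoExpr(s"$name(${args.flatten.map(_.description).mkString(", ")})"))
    } else {
      None
    }

  // The handler itself: serialize both children, then emit an array_position scalar call.
  // Returning None signals "unsupported", which makes Comet fall back to Spark.
  def convert(expr: ArrayPosition, inputs: Seq[Attribute], binding: Boolean): Option[ProtoExpr] = {
    val arrayProto = exprToProto(expr.left, inputs, binding)
    val elementProto = exprToProto(expr.right, inputs, binding)
    scalarFunctionExprToProto("array_position", arrayProto, elementProto)
  }
}
```

On the native side, if DataFusion's built-in array_position is reused, its not-found and null semantics should be verified against the Spark behavior described above (0 when the element is absent, null for null inputs) and wrapped in native/spark-expr/src/ if they differ.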
Additional context
Difficulty: Medium
Spark Expression Class: org.apache.spark.sql.catalyst.expressions.ArrayPosition
Related:
- array_contains - Check if an array contains an element (boolean result)
- element_at - Get the element at a specific position in an array
- array_remove - Remove all occurrences of an element from an array
- size - Get the size/length of an array
This issue was auto-generated from Spark reference documentation.