
Conversation

@cloud-fan
Contributor

What changes were proposed in this pull request?

This is a follow-up of #28534, to make the TIMESTAMP_SECONDS function support fractional input as well.

Why are the changes needed?

Previously, the cast function could cast fractional values to timestamp. Now we recommend that users use these new functions instead, so we need to cover all the cast use cases.

Does this PR introduce any user-facing change?

Yes, now TIMESTAMP_SECONDS function accepts fractional input.

How was this patch tested?

new tests
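As a rough illustration of the intended semantics (this is not Spark's actual implementation, which is written in Scala inside the `SecondsToTimestamp` expression), the function interprets its input as possibly fractional seconds since the Unix epoch and produces a timestamp with microsecond precision:

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def timestamp_seconds(seconds):
    """Sketch of TIMESTAMP_SECONDS semantics: treat the input as
    (possibly fractional) seconds since the Unix epoch and build a
    timestamp with microsecond precision, dropping any digits beyond
    microseconds."""
    micros = int(seconds * 1_000_000)  # fractional seconds -> whole microseconds
    return EPOCH + timedelta(microseconds=micros)

print(timestamp_seconds(1230219000))  # 2008-12-25 15:30:00+00:00
print(timestamp_seconds(1.23))        # 1970-01-01 00:00:01.230000+00:00
```

The handling of negative fractional input and of out-of-range values is elided here; the PR's SQL tests below exercise those cases against the real implementation.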

}
}

test("SPARK-31710:Adds TIMESTAMP_SECONDS, " +
Contributor Author


This test is rewritten to increase test coverage.


@SparkQA

SparkQA commented Jul 1, 2020

Test build #124646 has finished for PR 28956 at commit f0256db.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • case class SecondsToTimestamp(child: Expression) extends UnaryExpression

extends NumberToTimestampBase {
// scalastyle:on line.size.limit
case class SecondsToTimestamp(child: Expression) extends UnaryExpression
with ExpectsInputTypes with NullIntolerant{
Member


nit: NullIntolerant{ -> NullIntolerant {.


-- [SPARK-31710] TIMESTAMP_SECONDS, TIMESTAMP_MILLISECONDS and TIMESTAMP_MICROSECONDS conversion to timestamp
select TIMESTAMP_SECONDS(1230219000), TIMESTAMP_SECONDS(-1230219000), TIMESTAMP_SECONDS(null);
select TIMESTAMP_SECONDS(1.23), TIMESTAMP_SECONDS(1.23d);
Member


Since this covers Decimal and Double, can we also cover Float by adding TIMESTAMP_SECONDS(CAST(1.23 AS FLOAT))?

select TIMESTAMP_MILLIS(92233720368547758);
select TIMESTAMP_MILLIS(-92233720368547758);
-- truncate exception
select TIMESTAMP_SECONDS(0.1234567);
Member

@dongjoon-hyun Jul 1, 2020


Shall we also have a test case for allowed truncation, since this PR allows truncation for the double type?

Member


This PR aims to allow truncation for both ANSI and legacy mode. Did I understand correctly?

Contributor Author


Yes, float/double are approximate values, so truncation should always be allowed.
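To make the distinction concrete, here is a minimal sketch of the two behaviors being discussed (the helper names are hypothetical, not Spark APIs): an exact type like decimal can detect and reject precision loss, while an approximate type like double simply truncates anything beyond microseconds.

```python
from decimal import Decimal

MICROS_PER_SECOND = 1_000_000

def seconds_to_micros_decimal(value: Decimal) -> int:
    # Exact type: reject input whose sub-microsecond digits would be
    # lost (analogous to the "truncate exception" test above).
    micros = value * MICROS_PER_SECOND
    if micros != micros.to_integral_value():
        raise ArithmeticError(f"{value} loses precision beyond microseconds")
    return int(micros)

def seconds_to_micros_double(value: float) -> int:
    # Approximate type: the value is already inexact, so silently
    # drop any digits beyond microsecond precision.
    return int(value * MICROS_PER_SECOND)

print(seconds_to_micros_double(0.1234567))  # 123456
```

Under this model, TIMESTAMP_SECONDS(0.1234567) fails for a decimal literal but succeeds (with truncation) for a double, in both ANSI and legacy modes.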

Member

@dongjoon-hyun left a comment


+1, LGTM. Thank you for updating, @cloud-fan .

@SparkQA

SparkQA commented Jul 1, 2020

Test build #124747 has finished for PR 28956 at commit e41095d.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@dongjoon-hyun
Member

Merged to master for Apache Spark 3.1.0.

@HyukjinKwon
Member

+1 looks good to me too.

