Upgrade to Spark 2.0 and prepare 0.4.0-SNAPSHOT #150
Conversation
README.md
Outdated
  ## Requirements

- This library requires Spark 1.3+
+ This library requires Spark 2.0+ for 0.4.x. For Spark 1.3.x, the 0.3.x versions work with it.
I will create a branch for 0.3.x soon and correct this documentation.
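For reference, a hedged sketch of how a build might pin the matching artifact line; the coordinates follow the usual spark-xml README and the exact version numbers are illustrative assumptions, not taken from this diff:

```scala
// build.sbt sketch; version numbers are illustrative and should be checked on Maven Central.
// Spark 2.0+ users track the 0.4.x line:
libraryDependencies += "com.databricks" %% "spark-xml" % "0.4.0"
// Spark 1.3+ users stay on the 0.3.x line, e.g.:
// libraryDependencies += "com.databricks" %% "spark-xml" % "0.3.5"
```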
Hi @liancheng and @JoshRosen, this PR resembles the ones in …
Current coverage is 87.85% (diff: 89.28%)

@@            master    #150    diff @@
=======================================
  Files           15      15
  Lines          697     708     +11
  Methods        638     641      +3
  Messages         0       0
  Branches        59      67      +8
=======================================
- Hits           631     622      -9
- Misses          66      86     +20
  Partials         0       0
def saveAsXmlFile(
    path: String, parameters: Map[String, String] = Map(),
    compressionCodec: Class[_ <: CompressionCodec] = null): Unit = {
  val options = XmlOptions(parameters.toMap)
I moved the code below to src/main/scala/com/databricks/spark/xml/util/XmlFile.scala and deprecated this method.
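For reference, a minimal sketch of the promoted write() path that replaces the deprecated saveAsXmlFile; the DataFrame `df`, the rootTag/rowTag values, and the output path are placeholder assumptions, not taken from this PR's diff:

```scala
// Sketch of the promoted DataFrame writer usage; `df`, the tag names,
// and "books.xml" are placeholders.
df.write
  .format("com.databricks.spark.xml")
  .option("rootTag", "books")
  .option("rowTag", "book")
  .save("books.xml")
```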
@JoshRosen @liancheng Could you please take a quick look?
case (ByteType, v: Byte) => writer.writeCharacters(v.toString)
case (BooleanType, v: Boolean) => writer.writeCharacters(v.toString)
case (DateType, v) => writer.writeCharacters(v.toString)
case (udt: UserDefinedType[_], v) => writeElement(udt.sqlType, udt.serialize(v))
I had to remove this case because UserDefinedType was made private (hidden) in Spark 2.0.
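For context, the cases above write each primitive value through the StAX writer's writeCharacters. A minimal, self-contained sketch of that mechanism, using plain javax.xml.stream rather than the project's actual StaxXmlGenerator:

```scala
import java.io.StringWriter
import javax.xml.stream.XMLOutputFactory

object WriteCharactersSketch {
  def main(args: Array[String]): Unit = {
    val out = new StringWriter()
    val writer = XMLOutputFactory.newInstance().createXMLStreamWriter(out)
    writer.writeStartElement("age")
    // Primitive values are written as their string form, as in the match cases above.
    writer.writeCharacters(27.toString)
    writer.writeEndElement()
    writer.flush()
    println(out.toString) // prints <age>27</age>
  }
}
```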
@JoshRosen @liancheng Could you quickly scan this for any changes that don't make sense?
If you don't have time to review this, I may merge it after double-checking it myself (but I will wait a bit longer just in case) - @JoshRosen @liancheng
Will this be released soon?
Yep, I am thinking of doing this this weekend.
Thanks for the information.
I am going to merge this as soon as the tests pass.
This PR prepares the release for 0.4.0. It includes the changes below:

- Support for PERMISSIVE/DROPMALFORMED modes and a corrupt-record option. #107
- Change the default values for valueTag and attributePrefix so that field escaping is not always required for some APIs. #142
- Deprecate saveAsXmlFile and promote the usage of write(). #150
- Deprecate xmlFile and promote the usage of read(). #150
- Drop Spark 1.x compatibility from 0.4.0. #150
- Drop support for UserDefinedType as it became private. #150

Author: hyukjinkwon <gurwls223@gmail.com>

Closes #176 from HyukjinKwon/version-0.4.0.
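To illustrate the read() path promoted in the notes above, a hedged sketch follows; the SparkSession `spark`, the file name, and the rowTag value are placeholders, and the attributePrefix/valueTag options are shown only as explicitly settable knobs rather than as the new defaults:

```scala
// Sketch of reading XML through the DataFrameReader instead of the deprecated xmlFile().
// `spark`, "books.xml", and "book" are placeholder assumptions.
val df = spark.read
  .format("com.databricks.spark.xml")
  .option("rowTag", "book")
  .option("attributePrefix", "_") // options touched by #142 can still be set explicitly
  .option("valueTag", "_VALUE")
  .load("books.xml")
```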
I am using …
This PR migrates spark-xml to Spark 2.0.
This PR:

- Deprecates saveAsXmlFile and promotes the usage of write().
- Deprecates xmlFile and promotes the usage of read().
- Drops Spark 1.x compatibility.
- Drops support for UserDefinedType as it became private.
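As a hedged illustration of what such a deprecation typically looks like (a sketch under assumed names, not the literal code added in this PR):

```scala
import org.apache.spark.sql.DataFrame

object XmlWriteCompat {
  // Sketch: the old convenience entry point delegates to the promoted writer path
  // and is marked deprecated so callers migrate to write().
  @deprecated("Use df.write.format(\"com.databricks.spark.xml\").save(path) instead.", "0.4.0")
  def saveAsXmlFile(df: DataFrame, path: String, parameters: Map[String, String] = Map()): Unit =
    df.write.format("com.databricks.spark.xml").options(parameters).save(path)
}
```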