Commit
[SNAP-1136] Pooled version of Kryo serializer which works for closures (#426)

- new PooledKryoSerializer that pools Kryo objects, since performance suffers if a new instance (which has to register and walk many classes) is created for every call; see the pooling sketch after this list
- overridden handling for ASCII strings to fix EsotericSoftware/kryo#128; currently makes a copy, but will be changed to use one extra byte to mark the end of the string
- optimized external serializers for StructType, and for Externalizable classes having a readResolve() method; the latter is used for StorageLevel and BlockManagerId
- added optimized serialization for the closure used by SparkSQLExecuteImpl (now a proper class instead)
- fixed index column determination in RowFormatRelation (was off by one due to 0-based vs 1-based indexing)
- set serializer/codec options explicitly in ClusterManagerTestBase since it does not use the Lead API
- formatting changes and fixed some compiler warnings
- Kryo serialization for RowFormatScanRDD, SparkShellRowRDD, ColumnarStorePartitionedRDD, SparkShellCachedBatchRDD and MultiBucketExecutorPartition
- added base RDDKryo to encapsulate serialization of the bare minimum RDD fields (using reflection where required)
- removed unused SparkShellRDDHelper.mapBucketsToPartitions
- updated log4j.properties for core/cluster tests
- changed Attribute to StructField in column decoders since StructType has an efficient serializer and is also cleaner, not depending on Attribute (which carries a potentially invalid ExprId on a remote node, though those fields are not used)
- updated the Spark link to fix AQP dunits with the new Kryo serializer
- skip DUnitSingleTest in the AQP test target since those are really dunits which should not be run like normal JUnit tests
- re-create the snappy catalog connection for MetaException failures too (message says "... we don't support retries ...")
- clear the serializer/codec system properties when stopping Spark so that they are not carried over to subsequent tests in the same JVM
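A minimal sketch of the pooling idea behind PooledKryoSerializer, using only plain Kryo APIs (Kryo, Input, Output) and a concurrent queue; the object name KryoObjectPool and the borrow/release helpers are illustrative assumptions, not the actual implementation in this commit:

```scala
import java.util.concurrent.ConcurrentLinkedQueue

import com.esotericsoftware.kryo.Kryo
import com.esotericsoftware.kryo.io.{Input, Output}

// Hypothetical pool of reusable Kryo instances. Creating a Kryo instance is
// expensive (it has to register and walk many classes), so instances are
// borrowed for a serialization call and returned afterwards.
object KryoObjectPool {
  private val pool = new ConcurrentLinkedQueue[Kryo]()

  private def borrow(): Kryo = {
    val kryo = pool.poll()
    if (kryo ne null) kryo else new Kryo()
  }

  private def release(kryo: Kryo): Unit = {
    pool.offer(kryo)
  }

  def serialize(obj: AnyRef): Array[Byte] = {
    val kryo = borrow()
    try {
      // 4KB initial buffer, unbounded maximum
      val output = new Output(4096, -1)
      kryo.writeClassAndObject(output, obj)
      output.toBytes
    } finally {
      release(kryo)
    }
  }

  def deserialize(bytes: Array[Byte]): AnyRef = {
    val kryo = borrow()
    try {
      kryo.readClassAndObject(new Input(bytes))
    } finally {
      release(kryo)
    }
  }
}
```

The essential point is that registration work is paid once per pooled instance rather than once per call; the real PooledKryoSerializer additionally registers the project-specific serializers described above.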
Authored by Sumedh Wale on Nov 28, 2016 · 1 parent 46f3c1e · commit 9e75292
Showing 39 changed files with 1,436 additions and 402 deletions.