Binary compatibility report for the succinct-0.1.2 library between versions 1.3.0 and 1.5.0 (relating to the portability of the client application succinct-0.1.2.jar)
Test Info

| Library Name | succinct-0.1.2 |
| Version #1   | 1.3.0 |
| Version #2   | 1.5.0 |
| Java Version | 1.7.0_75 |
Test Results

| Total Java ARchives     | 1 |
| Total Methods / Classes | 2586 / 463 |
| Verdict                 | Incompatible (67.3%) |
Problem Summary

|                             | Severity | Count |
| Added Methods               | -        | 352   |
| Removed Methods             | High     | 902   |
| Problems with Data Types    | High     | 104   |
|                             | Medium   | 13    |
|                             | Low      | 36    |
| Problems with Methods       | High     | 0     |
|                             | Medium   | 1     |
|                             | Low      | 1     |
| Other Changes in Data Types | -        | 14    |
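The 902 High-severity removals are what drive the Incompatible verdict: code compiled against the 1.3.0 API can fail at link time with NoSuchMethodError or NoClassDefFoundError when run against 1.5.0. As a minimal, illustrative sketch (not produced by the checker), a client can probe by reflection for one of the methods added in 1.5.0 before relying on it; Column.isNaN is taken from the Added Methods list below, while the helper itself is hypothetical:

```scala
// Illustrative only: probe whether the spark-sql jar on the classpath
// exposes a method added in 1.5.0 (Column.isNaN, per the list below).
object ApiProbe {
  def hasMethod(className: String, method: String, paramTypes: Class[_]*): Boolean =
    try {
      Class.forName(className).getMethod(method, paramTypes: _*)
      true
    } catch {
      case _: ClassNotFoundException | _: NoSuchMethodException => false
    }

  def main(args: Array[String]): Unit = {
    val isSpark15Plus = hasMethod("org.apache.spark.sql.Column", "isNaN")
    println(s"spark-sql >= 1.5.0 API detected: $isSpark15Plus")
  }
}
```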
Added Methods (352)
spark-sql_2.10-1.5.0.jar, Aggregate.class
package org.apache.spark.sql.execution
Aggregate.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/Aggregate.doExecute:()Lorg/apache/spark/rdd/RDD;]
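Throughout this report, each [mangled: ...] line is the JVM-internal descriptor of the entry above it: Z, I, J and V stand for boolean, int, long and void, Lpkg/Class; for an object type, and a leading [ for an array.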
Aggregate.metrics ( ) : scala.collection.immutable.Map<String,metric.LongSQLMetric>
[mangled: org/apache/spark/sql/execution/Aggregate.metrics:()Lscala/collection/immutable/Map;]
Aggregate.Aggregate..newAggregateBuffer ( ) : org.apache.spark.sql.catalyst.expressions.AggregateFunction1[ ]
[mangled: org/apache/spark/sql/execution/Aggregate.org.apache.spark.sql.execution.Aggregate..newAggregateBuffer:()[Lorg/apache/spark/sql/catalyst/expressions/AggregateFunction1;]
Aggregate.requiredChildDistribution ( ) : scala.collection.immutable.List<org.apache.spark.sql.catalyst.plans.physical.Distribution>
[mangled: org/apache/spark/sql/execution/Aggregate.requiredChildDistribution:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.5.0.jar, BaseRelation.class
package org.apache.spark.sql.sources
BaseRelation.needConversion ( ) : boolean
[mangled: org/apache/spark/sql/sources/BaseRelation.needConversion:()Z]
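needConversion is a hook on the public data source API, so client relations (such as those in succinct) see it directly. A minimal sketch of a relation overriding it, assuming spark-sql 1.5.0 on the classpath; the class name and schema are illustrative:

```scala
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.sources.BaseRelation
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Illustrative relation, not from the report. needConversion (new in
// 1.5.0) defaults to true, meaning Spark converts the produced rows to
// its internal format; return false only if the relation already emits
// internal rows.
class ExampleRelation(ctx: SQLContext) extends BaseRelation {
  override def sqlContext: SQLContext = ctx
  override def schema: StructType =
    StructType(Seq(StructField("value", StringType)))
  override def needConversion: Boolean = true
}
```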
spark-sql_2.10-1.5.0.jar, BatchPythonEvaluation.class
package org.apache.spark.sql.execution
BatchPythonEvaluation.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/BatchPythonEvaluation.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.5.0.jar, BinaryNode.class
package org.apache.spark.sql.execution
BinaryNode.children ( ) [abstract] : scala.collection.Seq<SparkPlan>
[mangled: org/apache/spark/sql/execution/BinaryNode.children:()Lscala/collection/Seq;]
BinaryNode.left ( ) [abstract] : SparkPlan
[mangled: org/apache/spark/sql/execution/BinaryNode.left:()Lorg/apache/spark/sql/execution/SparkPlan;]
BinaryNode.right ( ) [abstract] : SparkPlan
[mangled: org/apache/spark/sql/execution/BinaryNode.right:()Lorg/apache/spark/sql/execution/SparkPlan;]
spark-sql_2.10-1.5.0.jar, BroadcastHashJoin.class
package org.apache.spark.sql.execution.joins
BroadcastHashJoin.canProcessSafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.canProcessSafeRows:()Z]
BroadcastHashJoin.canProcessUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.canProcessUnsafeRows:()Z]
BroadcastHashJoin.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.doExecute:()Lorg/apache/spark/rdd/RDD;]
BroadcastHashJoin.doPrepare ( ) : void
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.doPrepare:()V]
BroadcastHashJoin.hashJoin ( scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow> streamIter, org.apache.spark.sql.execution.metric.LongSQLMetric numStreamRows, HashedRelation hashedRelation, org.apache.spark.sql.execution.metric.LongSQLMetric numOutputRows ) : scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.hashJoin:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;Lorg/apache/spark/sql/execution/joins/HashedRelation;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;)Lscala/collection/Iterator;]
BroadcastHashJoin.isUnsafeMode ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.isUnsafeMode:()Z]
BroadcastHashJoin.metrics ( ) : scala.collection.immutable.Map<String,org.apache.spark.sql.execution.metric.LongSQLMetric>
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.metrics:()Lscala/collection/immutable/Map;]
BroadcastHashJoin.outputsUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.outputsUnsafeRows:()Z]
BroadcastHashJoin.streamSideKeyGenerator ( ) : org.apache.spark.sql.catalyst.expressions.package.Projection
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.streamSideKeyGenerator:()Lorg/apache/spark/sql/catalyst/expressions/package$Projection;]
spark-sql_2.10-1.5.0.jar, BroadcastLeftSemiJoinHash.class
package org.apache.spark.sql.execution.joins
BroadcastLeftSemiJoinHash.BroadcastLeftSemiJoinHash ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> leftKeys, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> rightKeys, org.apache.spark.sql.execution.SparkPlan left, org.apache.spark.sql.execution.SparkPlan right, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> condition )
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash."<init>":(Lscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;Lorg/apache/spark/sql/execution/SparkPlan;Lscala/Option;)V]
BroadcastLeftSemiJoinHash.buildKeyHashSet ( scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow> buildIter, org.apache.spark.sql.execution.metric.LongSQLMetric numBuildRows ) : java.util.Set<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.buildKeyHashSet:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;)Ljava/util/Set;]
BroadcastLeftSemiJoinHash.canProcessSafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.canProcessSafeRows:()Z]
BroadcastLeftSemiJoinHash.canProcessUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.canProcessUnsafeRows:()Z]
BroadcastLeftSemiJoinHash.condition ( ) : scala.Option<org.apache.spark.sql.catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.condition:()Lscala/Option;]
BroadcastLeftSemiJoinHash.copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> leftKeys, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> rightKeys, org.apache.spark.sql.execution.SparkPlan left, org.apache.spark.sql.execution.SparkPlan right, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> condition ) : BroadcastLeftSemiJoinHash
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.copy:(Lscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;Lorg/apache/spark/sql/execution/SparkPlan;Lscala/Option;)Lorg/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash;]
BroadcastLeftSemiJoinHash.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.doExecute:()Lorg/apache/spark/rdd/RDD;]
BroadcastLeftSemiJoinHash.hashSemiJoin ( scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow> streamIter, org.apache.spark.sql.execution.metric.LongSQLMetric numStreamRows, java.util.Set<org.apache.spark.sql.catalyst.InternalRow> hashSet, org.apache.spark.sql.execution.metric.LongSQLMetric numOutputRows ) : scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.hashSemiJoin:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;Ljava/util/Set;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;)Lscala/collection/Iterator;]
BroadcastLeftSemiJoinHash.hashSemiJoin ( scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow> streamIter, org.apache.spark.sql.execution.metric.LongSQLMetric numStreamRows, HashedRelation hashedRelation, org.apache.spark.sql.execution.metric.LongSQLMetric numOutputRows ) : scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.hashSemiJoin:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;Lorg/apache/spark/sql/execution/joins/HashedRelation;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;)Lscala/collection/Iterator;]
BroadcastLeftSemiJoinHash.leftKeyGenerator ( ) : org.apache.spark.sql.catalyst.expressions.package.Projection
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.leftKeyGenerator:()Lorg/apache/spark/sql/catalyst/expressions/package$Projection;]
BroadcastLeftSemiJoinHash.metrics ( ) : scala.collection.immutable.Map<String,org.apache.spark.sql.execution.metric.LongSQLMetric>
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.metrics:()Lscala/collection/immutable/Map;]
BroadcastLeftSemiJoinHash.HashSemiJoin..boundCondition ( ) : scala.Function1<org.apache.spark.sql.catalyst.InternalRow,Object>
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.org.apache.spark.sql.execution.joins.HashSemiJoin..boundCondition:()Lscala/Function1;]
BroadcastLeftSemiJoinHash.outputsUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.outputsUnsafeRows:()Z]
BroadcastLeftSemiJoinHash.rightKeyGenerator ( ) : org.apache.spark.sql.catalyst.expressions.package.Projection
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.rightKeyGenerator:()Lorg/apache/spark/sql/catalyst/expressions/package$Projection;]
BroadcastLeftSemiJoinHash.supportUnsafe ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.supportUnsafe:()Z]
spark-sql_2.10-1.5.0.jar, BroadcastNestedLoopJoin.class
package org.apache.spark.sql.execution.joins
BroadcastNestedLoopJoin.canProcessUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/BroadcastNestedLoopJoin.canProcessUnsafeRows:()Z]
BroadcastNestedLoopJoin.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/BroadcastNestedLoopJoin.doExecute:()Lorg/apache/spark/rdd/RDD;]
BroadcastNestedLoopJoin.metrics ( ) : scala.collection.immutable.Map<String,org.apache.spark.sql.execution.metric.LongSQLMetric>
[mangled: org/apache/spark/sql/execution/joins/BroadcastNestedLoopJoin.metrics:()Lscala/collection/immutable/Map;]
BroadcastNestedLoopJoin.BroadcastNestedLoopJoin..genResultProjection ( ) : scala.Function1<org.apache.spark.sql.catalyst.InternalRow,org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/BroadcastNestedLoopJoin.org.apache.spark.sql.execution.joins.BroadcastNestedLoopJoin..genResultProjection:()Lscala/Function1;]
BroadcastNestedLoopJoin.outputsUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/BroadcastNestedLoopJoin.outputsUnsafeRows:()Z]
spark-sql_2.10-1.5.0.jar, ByteArrayColumnType.class
package org.apache.spark.sql.columnar
ByteArrayColumnType.ByteArrayColumnType ( int typeId, int defaultSize )
[mangled: org/apache/spark/sql/columnar/ByteArrayColumnType."<init>":(II)V]
spark-sql_2.10-1.5.0.jar, CachedBatch.class
package org.apache.spark.sql.columnar
CachedBatch.CachedBatch ( byte[ ][ ] buffers, org.apache.spark.sql.catalyst.InternalRow stats )
[mangled: org/apache/spark/sql/columnar/CachedBatch."<init>":([[BLorg/apache/spark/sql/catalyst/InternalRow;)V]
CachedBatch.copy ( byte[ ][ ] buffers, org.apache.spark.sql.catalyst.InternalRow stats ) : CachedBatch
[mangled: org/apache/spark/sql/columnar/CachedBatch.copy:([[BLorg/apache/spark/sql/catalyst/InternalRow;)Lorg/apache/spark/sql/columnar/CachedBatch;]
CachedBatch.stats ( ) : org.apache.spark.sql.catalyst.InternalRow
[mangled: org/apache/spark/sql/columnar/CachedBatch.stats:()Lorg/apache/spark/sql/catalyst/InternalRow;]
spark-sql_2.10-1.5.0.jar, CacheTableCommand.class
package org.apache.spark.sql.execution
CacheTableCommand.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/CacheTableCommand.children:()Lscala/collection/Seq;]
spark-sql_2.10-1.5.0.jar, CartesianProduct.class
package org.apache.spark.sql.execution.joins
CartesianProduct.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/CartesianProduct.doExecute:()Lorg/apache/spark/rdd/RDD;]
CartesianProduct.metrics ( ) : scala.collection.immutable.Map<String,org.apache.spark.sql.execution.metric.LongSQLMetric>
[mangled: org/apache/spark/sql/execution/joins/CartesianProduct.metrics:()Lscala/collection/immutable/Map;]
spark-sql_2.10-1.5.0.jar, Column.class
package org.apache.spark.sql
Column.alias ( String alias ) : Column
[mangled: org/apache/spark/sql/Column.alias:(Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
Column.apply ( Object extraction ) : Column
[mangled: org/apache/spark/sql/Column.apply:(Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.as ( scala.collection.Seq<String> aliases ) : Column
[mangled: org/apache/spark/sql/Column.as:(Lscala/collection/Seq;)Lorg/apache/spark/sql/Column;]
Column.as ( String alias, types.Metadata metadata ) : Column
[mangled: org/apache/spark/sql/Column.as:(Ljava/lang/String;Lorg/apache/spark/sql/types/Metadata;)Lorg/apache/spark/sql/Column;]
Column.as ( String[ ] aliases ) : Column
[mangled: org/apache/spark/sql/Column.as:([Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
Column.between ( Object lowerBound, Object upperBound ) : Column
[mangled: org/apache/spark/sql/Column.between:(Ljava/lang/Object;Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.bitwiseAND ( Object other ) : Column
[mangled: org/apache/spark/sql/Column.bitwiseAND:(Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.bitwiseOR ( Object other ) : Column
[mangled: org/apache/spark/sql/Column.bitwiseOR:(Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.bitwiseXOR ( Object other ) : Column
[mangled: org/apache/spark/sql/Column.bitwiseXOR:(Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.equals ( Object that ) : boolean
[mangled: org/apache/spark/sql/Column.equals:(Ljava/lang/Object;)Z]
Column.getItem ( Object key ) : Column
[mangled: org/apache/spark/sql/Column.getItem:(Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.hashCode ( ) : int
[mangled: org/apache/spark/sql/Column.hashCode:()I]
Column.in ( Object... list ) : Column
[mangled: org/apache/spark/sql/Column.in:([Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.isin ( Object... list ) : Column
[mangled: org/apache/spark/sql/Column.isin:([Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.isin ( scala.collection.Seq<Object> list ) : Column
[mangled: org/apache/spark/sql/Column.isin:(Lscala/collection/Seq;)Lorg/apache/spark/sql/Column;]
Column.isNaN ( ) : Column
[mangled: org/apache/spark/sql/Column.isNaN:()Lorg/apache/spark/sql/Column;]
Column.isTraceEnabled ( ) : boolean
[mangled: org/apache/spark/sql/Column.isTraceEnabled:()Z]
Column.log ( ) : org.slf4j.Logger
[mangled: org/apache/spark/sql/Column.log:()Lorg/slf4j/Logger;]
Column.logDebug ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/Column.logDebug:(Lscala/Function0;)V]
Column.logDebug ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/Column.logDebug:(Lscala/Function0;Ljava/lang/Throwable;)V]
Column.logError ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/Column.logError:(Lscala/Function0;)V]
Column.logError ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/Column.logError:(Lscala/Function0;Ljava/lang/Throwable;)V]
Column.logInfo ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/Column.logInfo:(Lscala/Function0;)V]
Column.logInfo ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/Column.logInfo:(Lscala/Function0;Ljava/lang/Throwable;)V]
Column.logName ( ) : String
[mangled: org/apache/spark/sql/Column.logName:()Ljava/lang/String;]
Column.logTrace ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/Column.logTrace:(Lscala/Function0;)V]
Column.logTrace ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/Column.logTrace:(Lscala/Function0;Ljava/lang/Throwable;)V]
Column.logWarning ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/Column.logWarning:(Lscala/Function0;)V]
Column.logWarning ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/Column.logWarning:(Lscala/Function0;Ljava/lang/Throwable;)V]
Column.org.apache.spark.Logging..log_ ( ) : org.slf4j.Logger
[mangled: org/apache/spark/sql/Column.org.apache.spark.Logging..log_:()Lorg/slf4j/Logger;]
Column.org.apache.spark.Logging..log__.eq ( org.slf4j.Logger p1 ) : void
[mangled: org/apache/spark/sql/Column.org.apache.spark.Logging..log__.eq:(Lorg/slf4j/Logger;)V]
Column.otherwise ( Object value ) : Column
[mangled: org/apache/spark/sql/Column.otherwise:(Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.over ( expressions.WindowSpec window ) : Column
[mangled: org/apache/spark/sql/Column.over:(Lorg/apache/spark/sql/expressions/WindowSpec;)Lorg/apache/spark/sql/Column;]
Column.when ( Column condition, Object value ) : Column
[mangled: org/apache/spark/sql/Column.when:(Lorg/apache/spark/sql/Column;Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
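Most of these Column additions are user-facing expression builders. A minimal usage sketch, assuming an existing DataFrame df with numeric columns a and b, exercising between, isNaN, isin, bitwiseAND and when/otherwise from the list above:

```scala
import org.apache.spark.sql.functions._

// df is an assumed existing DataFrame with numeric columns "a" and "b".
val result = df
  .where(col("a").between(0, 100) && !col("b").isNaN) // between, isNaN
  .where(col("a").isin(1, 2, 3))                      // isin
  .select(
    col("a").bitwiseAND(0xFF).as("low_byte"),         // bitwiseAND
    when(col("b") > 0, "pos").otherwise("non-pos").as("sign") // when/otherwise
  )
```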
spark-sql_2.10-1.5.0.jar, ColumnBuilder.class
package org.apache.spark.sql.columnar
ColumnBuilder.appendFrom ( org.apache.spark.sql.catalyst.InternalRow p1, int p2 ) [abstract] : void
[mangled: org/apache/spark/sql/columnar/ColumnBuilder.appendFrom:(Lorg/apache/spark/sql/catalyst/InternalRow;I)V]
spark-sql_2.10-1.5.0.jar, ColumnStats.class
package org.apache.spark.sql.columnar
ColumnStats.collectedStatistics ( ) [abstract] : org.apache.spark.sql.catalyst.expressions.GenericInternalRow
[mangled: org/apache/spark/sql/columnar/ColumnStats.collectedStatistics:()Lorg/apache/spark/sql/catalyst/expressions/GenericInternalRow;]
ColumnStats.gatherStats ( org.apache.spark.sql.catalyst.InternalRow p1, int p2 ) [abstract] : void
[mangled: org/apache/spark/sql/columnar/ColumnStats.gatherStats:(Lorg/apache/spark/sql/catalyst/InternalRow;I)V]
spark-sql_2.10-1.5.0.jar, DataFrame.class
package org.apache.spark.sql
DataFrame.coalesce ( int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.coalesce:(I)Lorg/apache/spark/sql/DataFrame;]
DataFrame.cube ( Column... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.cube ( scala.collection.Seq<Column> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:(Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.cube ( String col1, scala.collection.Seq<String> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.cube ( String col1, String... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.describe ( scala.collection.Seq<String> cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.describe:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.describe ( String... cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.describe:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.drop ( Column col ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.drop:(Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.drop ( String colName ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.drop:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.dropDuplicates ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.dropDuplicates:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.dropDuplicates ( scala.collection.Seq<String> colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.dropDuplicates:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.dropDuplicates ( String[ ] colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.dropDuplicates:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.inputFiles ( ) : String[ ]
[mangled: org/apache/spark/sql/DataFrame.inputFiles:()[Ljava/lang/String;]
DataFrame.join ( DataFrame right, scala.collection.Seq<String> usingColumns ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, String usingColumn ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.na ( ) : DataFrameNaFunctions
[mangled: org/apache/spark/sql/DataFrame.na:()Lorg/apache/spark/sql/DataFrameNaFunctions;]
DataFrame.DataFrame..logicalPlanToDataFrame ( catalyst.plans.logical.LogicalPlan logicalPlan ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.org.apache.spark.sql.DataFrame..logicalPlanToDataFrame:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.randomSplit ( double[ ] weights ) : DataFrame[ ]
[mangled: org/apache/spark/sql/DataFrame.randomSplit:([D)[Lorg/apache/spark/sql/DataFrame;]
DataFrame.randomSplit ( double[ ] weights, long seed ) : DataFrame[ ]
[mangled: org/apache/spark/sql/DataFrame.randomSplit:([DJ)[Lorg/apache/spark/sql/DataFrame;]
DataFrame.randomSplit ( scala.collection.immutable.List<Object> weights, long seed ) : DataFrame[ ]
[mangled: org/apache/spark/sql/DataFrame.randomSplit:(Lscala/collection/immutable/List;J)[Lorg/apache/spark/sql/DataFrame;]
DataFrame.rollup ( Column... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.rollup ( scala.collection.Seq<Column> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:(Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.rollup ( String col1, scala.collection.Seq<String> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.rollup ( String col1, String... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.show ( boolean truncate ) : void
[mangled: org/apache/spark/sql/DataFrame.show:(Z)V]
DataFrame.show ( int numRows, boolean truncate ) : void
[mangled: org/apache/spark/sql/DataFrame.show:(IZ)V]
DataFrame.showString ( int _numRows, boolean truncate ) : String
[mangled: org/apache/spark/sql/DataFrame.showString:(IZ)Ljava/lang/String;]
DataFrame.stat ( ) : DataFrameStatFunctions
[mangled: org/apache/spark/sql/DataFrame.stat:()Lorg/apache/spark/sql/DataFrameStatFunctions;]
DataFrame.where ( String conditionExpr ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.where:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.withNewExecutionId ( scala.Function0<T> body ) : T
[mangled: org/apache/spark/sql/DataFrame.withNewExecutionId:(Lscala/Function0;)Ljava/lang/Object;]
DataFrame.write ( ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrame.write:()Lorg/apache/spark/sql/DataFrameWriter;]
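Several of the DataFrame additions compose naturally. A minimal sketch, assuming existing DataFrames df and other that share an "id" column and that df has a "tmp_col" column:

```scala
// df and other are assumed existing DataFrames sharing an "id" column.
val cleaned = df
  .dropDuplicates(Seq("id"))                  // dropDuplicates(Seq[String])
  .drop("tmp_col")                            // drop(String)

val joined = cleaned.join(other, Seq("id"))   // join on using-columns
val Array(train, test) =
  joined.randomSplit(Array(0.8, 0.2), 42L)    // randomSplit(double[], long)
train.show(20, false)                         // show(numRows, truncate)
```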
spark-sql_2.10-1.5.0.jar, DataFrameNaFunctions.class
package org.apache.spark.sql
DataFrameNaFunctions.DataFrameNaFunctions ( DataFrame df )
[mangled: org/apache/spark/sql/DataFrameNaFunctions."<init>":(Lorg/apache/spark/sql/DataFrame;)V]
spark-sql_2.10-1.5.0.jar, DataFrameWriter.class
package org.apache.spark.sql
DataFrameWriter.format ( String source ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrameWriter.format:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrameWriter;]
DataFrameWriter.save ( String path ) : void
[mangled: org/apache/spark/sql/DataFrameWriter.save:(Ljava/lang/String;)V]
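The writer entry points chain off DataFrame.write, and DataFrameNaFunctions (above) is obtained via DataFrame.na. A minimal sketch; the fill value and output path are illustrative, and fill itself is a standard DataFrameNaFunctions method not listed in this section:

```scala
val filled = df.na.fill(0.0)    // DataFrameNaFunctions via DataFrame.na
filled.write                    // DataFrame.write : DataFrameWriter
  .format("parquet")            // DataFrameWriter.format(String)
  .save("/tmp/example_output")  // DataFrameWriter.save(String)
```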
spark-sql_2.10-1.5.0.jar, DescribeCommand.class
package org.apache.spark.sql.execution
DescribeCommand.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/DescribeCommand.children:()Lscala/collection/Seq;]
spark-sql_2.10-1.5.0.jar, Encoder<T>.class
package org.apache.spark.sql.columnar.compression
Encoder<T>.gatherCompressibilityStats ( org.apache.spark.sql.catalyst.InternalRow p1, int p2 ) [abstract] : void
[mangled: org/apache/spark/sql/columnar/compression/Encoder<T>.gatherCompressibilityStats:(Lorg/apache/spark/sql/catalyst/InternalRow;I)V]
spark-sql_2.10-1.5.0.jar, EvaluatePython.class
package org.apache.spark.sql.execution
EvaluatePython.javaToPython ( org.apache.spark.rdd.RDD<Object> p1 ) [static] : org.apache.spark.rdd.RDD<byte[ ]>
[mangled: org/apache/spark/sql/execution/EvaluatePython.javaToPython:(Lorg/apache/spark/rdd/RDD;)Lorg/apache/spark/rdd/RDD;]
EvaluatePython.registerPicklers ( ) [static] : void
[mangled: org/apache/spark/sql/execution/EvaluatePython.registerPicklers:()V]
spark-sql_2.10-1.5.0.jar, Except.class
package org.apache.spark.sql.execution
Except.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/Except.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.5.0.jar, Exchange.class
package org.apache.spark.sql.execution
Exchange.canProcessSafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/Exchange.canProcessSafeRows:()Z]
Exchange.canProcessUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/Exchange.canProcessUnsafeRows:()Z]
Exchange.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/Exchange.doExecute:()Lorg/apache/spark/rdd/RDD;]
Exchange.nodeName ( ) : String
[mangled: org/apache/spark/sql/execution/Exchange.nodeName:()Ljava/lang/String;]
Exchange.Exchange..needToCopyObjectsBeforeShuffle ( org.apache.spark.Partitioner partitioner, org.apache.spark.serializer.Serializer serializer ) : boolean
[mangled: org/apache/spark/sql/execution/Exchange.org.apache.spark.sql.execution.Exchange..needToCopyObjectsBeforeShuffle:(Lorg/apache/spark/Partitioner;Lorg/apache/spark/serializer/Serializer;)Z]
Exchange.Exchange..serializer ( ) : org.apache.spark.serializer.Serializer
[mangled: org/apache/spark/sql/execution/Exchange.org.apache.spark.sql.execution.Exchange..serializer:()Lorg/apache/spark/serializer/Serializer;]
Exchange.outputsUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/Exchange.outputsUnsafeRows:()Z]
spark-sql_2.10-1.5.0.jar, ExecutedCommand.class
package org.apache.spark.sql.execution
ExecutedCommand.argString ( ) : String
[mangled: org/apache/spark/sql/execution/ExecutedCommand.argString:()Ljava/lang/String;]
ExecutedCommand.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/ExecutedCommand.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.5.0.jar, Expand.class
package org.apache.spark.sql.execution
Expand.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/Expand.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.5.0.jar, ExplainCommand.class
package org.apache.spark.sql.execution
ExplainCommand.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/ExplainCommand.children:()Lscala/collection/Seq;]
spark-sql_2.10-1.5.0.jar, ExternalSort.class
package org.apache.spark.sql.execution
ExternalSort.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/ExternalSort.doExecute:()Lorg/apache/spark/rdd/RDD;]
ExternalSort.outputOrdering ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/ExternalSort.outputOrdering:()Lscala/collection/Seq;]
spark-sql_2.10-1.5.0.jar, Filter.class
package org.apache.spark.sql.execution
Filter.canProcessSafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/Filter.canProcessSafeRows:()Z]
Filter.canProcessUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/Filter.canProcessUnsafeRows:()Z]
Filter.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/Filter.doExecute:()Lorg/apache/spark/rdd/RDD;]
Filter.metrics ( ) : scala.collection.immutable.Map<String,metric.LongSQLMetric>
[mangled: org/apache/spark/sql/execution/Filter.metrics:()Lscala/collection/immutable/Map;]
Filter.outputOrdering ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/Filter.outputOrdering:()Lscala/collection/Seq;]
Filter.outputsUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/Filter.outputsUnsafeRows:()Z]
spark-sql_2.10-1.5.0.jar, Generate.class
package org.apache.spark.sql.execution
Generate.copy ( org.apache.spark.sql.catalyst.expressions.Generator generator, boolean join, boolean outer, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, SparkPlan child ) : Generate
[mangled: org/apache/spark/sql/execution/Generate.copy:(Lorg/apache/spark/sql/catalyst/expressions/Generator;ZZLscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/Generate;]
Generate.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/Generate.doExecute:()Lorg/apache/spark/rdd/RDD;]
Generate.Generate ( org.apache.spark.sql.catalyst.expressions.Generator generator, boolean join, boolean outer, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, SparkPlan child )
[mangled: org/apache/spark/sql/execution/Generate."<init>":(Lorg/apache/spark/sql/catalyst/expressions/Generator;ZZLscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;)V]
spark-sql_2.10-1.5.0.jar, HashedRelation.class
package org.apache.spark.sql.execution.joins
HashedRelation.get ( org.apache.spark.sql.catalyst.InternalRow p1 ) [abstract] : scala.collection.Seq<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/HashedRelation.get:(Lorg/apache/spark/sql/catalyst/InternalRow;)Lscala/collection/Seq;]
HashedRelation.readBytes ( java.io.ObjectInput p1 ) [abstract] : byte[ ]
[mangled: org/apache/spark/sql/execution/joins/HashedRelation.readBytes:(Ljava/io/ObjectInput;)[B]
HashedRelation.writeBytes ( java.io.ObjectOutput p1, byte[ ] p2 ) [abstract] : void
[mangled: org/apache/spark/sql/execution/joins/HashedRelation.writeBytes:(Ljava/io/ObjectOutput;[B)V]
spark-sql_2.10-1.5.0.jar, HashJoin.class
package org.apache.spark.sql.execution.joins
HashJoin.canProcessSafeRows ( ) [abstract] : boolean
[mangled: org/apache/spark/sql/execution/joins/HashJoin.canProcessSafeRows:()Z]
HashJoin.canProcessUnsafeRows ( ) [abstract] : boolean
[mangled: org/apache/spark/sql/execution/joins/HashJoin.canProcessUnsafeRows:()Z]
HashJoin.hashJoin ( scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow> p1, org.apache.spark.sql.execution.metric.LongSQLMetric p2, HashedRelation p3, org.apache.spark.sql.execution.metric.LongSQLMetric p4 ) [abstract] : scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/HashJoin.hashJoin:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;Lorg/apache/spark/sql/execution/joins/HashedRelation;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;)Lscala/collection/Iterator;]
HashJoin.isUnsafeMode ( ) [abstract] : boolean
[mangled: org/apache/spark/sql/execution/joins/HashJoin.isUnsafeMode:()Z]
HashJoin.outputsUnsafeRows ( ) [abstract] : boolean
[mangled: org/apache/spark/sql/execution/joins/HashJoin.outputsUnsafeRows:()Z]
HashJoin.streamSideKeyGenerator ( ) [abstract] : org.apache.spark.sql.catalyst.expressions.package.Projection
[mangled: org/apache/spark/sql/execution/joins/HashJoin.streamSideKeyGenerator:()Lorg/apache/spark/sql/catalyst/expressions/package$Projection;]
spark-sql_2.10-1.5.0.jar, HashOuterJoin.class
package org.apache.spark.sql.execution.joins
HashOuterJoin.buildHashTable ( scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow> p1, org.apache.spark.sql.execution.metric.LongSQLMetric p2, org.apache.spark.sql.catalyst.expressions.package.Projection p3 ) [abstract] : java.util.HashMap<org.apache.spark.sql.catalyst.InternalRow,org.apache.spark.util.collection.CompactBuffer<org.apache.spark.sql.catalyst.InternalRow>>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.buildHashTable:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;Lorg/apache/spark/sql/catalyst/expressions/package$Projection;)Ljava/util/HashMap;]
HashOuterJoin.buildKeyGenerator ( ) [abstract] : org.apache.spark.sql.catalyst.expressions.package.Projection
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.buildKeyGenerator:()Lorg/apache/spark/sql/catalyst/expressions/package$Projection;]
HashOuterJoin.buildKeys ( ) [abstract] : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.buildKeys:()Lscala/collection/Seq;]
HashOuterJoin.buildPlan ( ) [abstract] : org.apache.spark.sql.execution.SparkPlan
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.buildPlan:()Lorg/apache/spark/sql/execution/SparkPlan;]
HashOuterJoin.canProcessSafeRows ( ) [abstract] : boolean
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.canProcessSafeRows:()Z]
HashOuterJoin.canProcessUnsafeRows ( ) [abstract] : boolean
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.canProcessUnsafeRows:()Z]
HashOuterJoin.EMPTY_LIST ( ) [abstract] : org.apache.spark.util.collection.CompactBuffer<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.EMPTY_LIST:()Lorg/apache/spark/util/collection/CompactBuffer;]
HashOuterJoin.fullOuterIterator ( org.apache.spark.sql.catalyst.InternalRow p1, scala.collection.Iterable<org.apache.spark.sql.catalyst.InternalRow> p2, scala.collection.Iterable<org.apache.spark.sql.catalyst.InternalRow> p3, org.apache.spark.sql.catalyst.expressions.JoinedRow p4, org.apache.spark.sql.execution.metric.LongSQLMetric p5 ) [abstract] : scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.fullOuterIterator:(Lorg/apache/spark/sql/catalyst/InternalRow;Lscala/collection/Iterable;Lscala/collection/Iterable;Lorg/apache/spark/sql/catalyst/expressions/JoinedRow;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;)Lscala/collection/Iterator;]
HashOuterJoin.isUnsafeMode ( ) [abstract] : boolean
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.isUnsafeMode:()Z]
HashOuterJoin.leftOuterIterator ( org.apache.spark.sql.catalyst.InternalRow p1, org.apache.spark.sql.catalyst.expressions.JoinedRow p2, scala.collection.Iterable<org.apache.spark.sql.catalyst.InternalRow> p3, scala.Function1<org.apache.spark.sql.catalyst.InternalRow,org.apache.spark.sql.catalyst.InternalRow> p4, org.apache.spark.sql.execution.metric.LongSQLMetric p5 ) [abstract] : scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.leftOuterIterator:(Lorg/apache/spark/sql/catalyst/InternalRow;Lorg/apache/spark/sql/catalyst/expressions/JoinedRow;Lscala/collection/Iterable;Lscala/Function1;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;)Lscala/collection/Iterator;]
HashOuterJoin.HashOuterJoin..DUMMY_LIST ( ) [abstract] : org.apache.spark.util.collection.CompactBuffer<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.org.apache.spark.sql.execution.joins.HashOuterJoin..DUMMY_LIST:()Lorg/apache/spark/util/collection/CompactBuffer;]
HashOuterJoin.HashOuterJoin..leftNullRow ( ) [abstract] : org.apache.spark.sql.catalyst.expressions.GenericInternalRow
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.org.apache.spark.sql.execution.joins.HashOuterJoin..leftNullRow:()Lorg/apache/spark/sql/catalyst/expressions/GenericInternalRow;]
HashOuterJoin.HashOuterJoin..rightNullRow ( ) [abstract] : org.apache.spark.sql.catalyst.expressions.GenericInternalRow
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.org.apache.spark.sql.execution.joins.HashOuterJoin..rightNullRow:()Lorg/apache/spark/sql/catalyst/expressions/GenericInternalRow;]
HashOuterJoin.outputsUnsafeRows ( ) [abstract] : boolean
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.outputsUnsafeRows:()Z]
HashOuterJoin.resultProjection ( ) [abstract] : scala.Function1<org.apache.spark.sql.catalyst.InternalRow,org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.resultProjection:()Lscala/Function1;]
HashOuterJoin.rightOuterIterator ( org.apache.spark.sql.catalyst.InternalRow p1, scala.collection.Iterable<org.apache.spark.sql.catalyst.InternalRow> p2, org.apache.spark.sql.catalyst.expressions.JoinedRow p3, scala.Function1<org.apache.spark.sql.catalyst.InternalRow,org.apache.spark.sql.catalyst.InternalRow> p4, org.apache.spark.sql.execution.metric.LongSQLMetric p5 ) [abstract] : scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.rightOuterIterator:(Lorg/apache/spark/sql/catalyst/InternalRow;Lscala/collection/Iterable;Lorg/apache/spark/sql/catalyst/expressions/JoinedRow;Lscala/Function1;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;)Lscala/collection/Iterator;]
HashOuterJoin.streamedKeyGenerator ( ) [abstract] : org.apache.spark.sql.catalyst.expressions.package.Projection
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.streamedKeyGenerator:()Lorg/apache/spark/sql/catalyst/expressions/package$Projection;]
HashOuterJoin.streamedKeys ( ) [abstract] : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.streamedKeys:()Lscala/collection/Seq;]
HashOuterJoin.streamedPlan ( ) [abstract] : org.apache.spark.sql.execution.SparkPlan
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.streamedPlan:()Lorg/apache/spark/sql/execution/SparkPlan;]
spark-sql_2.10-1.5.0.jar, InMemoryColumnarTableScan.class
package org.apache.spark.sql.columnar
InMemoryColumnarTableScan.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/columnar/InMemoryColumnarTableScan.doExecute:()Lorg/apache/spark/rdd/RDD;]
InMemoryColumnarTableScan.enableAccumulators ( ) : boolean
[mangled: org/apache/spark/sql/columnar/InMemoryColumnarTableScan.enableAccumulators:()Z]
spark-sql_2.10-1.5.0.jar, InMemoryRelation.class
package org.apache.spark.sql.columnar
InMemoryRelation.copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, boolean useCompression, int batchSize, org.apache.spark.storage.StorageLevel storageLevel, org.apache.spark.sql.execution.SparkPlan child, scala.Option<String> tableName, org.apache.spark.rdd.RDD<CachedBatch> _cachedColumnBuffers, org.apache.spark.sql.catalyst.plans.logical.Statistics _statistics, org.apache.spark.Accumulable<scala.collection.mutable.ArrayBuffer<org.apache.spark.sql.catalyst.InternalRow>,org.apache.spark.sql.catalyst.InternalRow> _batchStats ) : InMemoryRelation
[mangled: org/apache/spark/sql/columnar/InMemoryRelation.copy:(Lscala/collection/Seq;ZILorg/apache/spark/storage/StorageLevel;Lorg/apache/spark/sql/execution/SparkPlan;Lscala/Option;Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/catalyst/plans/logical/Statistics;Lorg/apache/spark/Accumulable;)Lorg/apache/spark/sql/columnar/InMemoryRelation;]
InMemoryRelation.InMemoryRelation ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, boolean useCompression, int batchSize, org.apache.spark.storage.StorageLevel storageLevel, org.apache.spark.sql.execution.SparkPlan child, scala.Option<String> tableName, org.apache.spark.rdd.RDD<CachedBatch> _cachedColumnBuffers, org.apache.spark.sql.catalyst.plans.logical.Statistics _statistics, org.apache.spark.Accumulable<scala.collection.mutable.ArrayBuffer<org.apache.spark.sql.catalyst.InternalRow>,org.apache.spark.sql.catalyst.InternalRow> _batchStats )
[mangled: org/apache/spark/sql/columnar/InMemoryRelation."<init>":(Lscala/collection/Seq;ZILorg/apache/spark/storage/StorageLevel;Lorg/apache/spark/sql/execution/SparkPlan;Lscala/Option;Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/catalyst/plans/logical/Statistics;Lorg/apache/spark/Accumulable;)V]
InMemoryRelation.newInstance ( ) : org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/columnar/InMemoryRelation.newInstance:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
InMemoryRelation.uncache ( boolean blocking ) : void
[mangled: org/apache/spark/sql/columnar/InMemoryRelation.uncache:(Z)V]
spark-sql_2.10-1.5.0.jar, IntColumnStats.class
package org.apache.spark.sql.columnar
IntColumnStats.collectedStatistics ( ) : org.apache.spark.sql.catalyst.expressions.GenericInternalRow
[mangled: org/apache/spark/sql/columnar/IntColumnStats.collectedStatistics:()Lorg/apache/spark/sql/catalyst/expressions/GenericInternalRow;]
IntColumnStats.gatherStats ( org.apache.spark.sql.catalyst.InternalRow row, int ordinal ) : void
[mangled: org/apache/spark/sql/columnar/IntColumnStats.gatherStats:(Lorg/apache/spark/sql/catalyst/InternalRow;I)V]
spark-sql_2.10-1.5.0.jar, Intersect.class
package org.apache.spark.sql.execution
Intersect.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/Intersect.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.5.0.jar, LeafNode.class
package org.apache.spark.sql.execution
LeafNode.children ( ) [abstract] : scala.collection.Seq<SparkPlan>
[mangled: org/apache/spark/sql/execution/LeafNode.children:()Lscala/collection/Seq;]
spark-sql_2.10-1.5.0.jar, LeftSemiJoinBNL.class
package org.apache.spark.sql.execution.joins
LeftSemiJoinBNL.canProcessUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinBNL.canProcessUnsafeRows:()Z]
LeftSemiJoinBNL.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinBNL.doExecute:()Lorg/apache/spark/rdd/RDD;]
LeftSemiJoinBNL.metrics ( ) : scala.collection.immutable.Map<String,org.apache.spark.sql.execution.metric.LongSQLMetric>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinBNL.metrics:()Lscala/collection/immutable/Map;]
LeftSemiJoinBNL.outputsUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinBNL.outputsUnsafeRows:()Z]
spark-sql_2.10-1.5.0.jar, LeftSemiJoinHash.class
package org.apache.spark.sql.execution.joins
LeftSemiJoinHash.buildKeyHashSet ( scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow> buildIter, org.apache.spark.sql.execution.metric.LongSQLMetric numBuildRows ) : java.util.Set<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.buildKeyHashSet:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;)Ljava/util/Set;]
LeftSemiJoinHash.canProcessSafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.canProcessSafeRows:()Z]
LeftSemiJoinHash.canProcessUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.canProcessUnsafeRows:()Z]
LeftSemiJoinHash.condition ( ) : scala.Option<org.apache.spark.sql.catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.condition:()Lscala/Option;]
LeftSemiJoinHash.copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> leftKeys, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> rightKeys, org.apache.spark.sql.execution.SparkPlan left, org.apache.spark.sql.execution.SparkPlan right, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> condition ) : LeftSemiJoinHash
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.copy:(Lscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;Lorg/apache/spark/sql/execution/SparkPlan;Lscala/Option;)Lorg/apache/spark/sql/execution/joins/LeftSemiJoinHash;]
LeftSemiJoinHash.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.doExecute:()Lorg/apache/spark/rdd/RDD;]
LeftSemiJoinHash.hashSemiJoin ( scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow> streamIter, org.apache.spark.sql.execution.metric.LongSQLMetric numStreamRows, java.util.Set<org.apache.spark.sql.catalyst.InternalRow> hashSet, org.apache.spark.sql.execution.metric.LongSQLMetric numOutputRows ) : scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.hashSemiJoin:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;Ljava/util/Set;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;)Lscala/collection/Iterator;]
LeftSemiJoinHash.hashSemiJoin ( scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow> streamIter, org.apache.spark.sql.execution.metric.LongSQLMetric numStreamRows, HashedRelation hashedRelation, org.apache.spark.sql.execution.metric.LongSQLMetric numOutputRows ) : scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.hashSemiJoin:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;Lorg/apache/spark/sql/execution/joins/HashedRelation;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;)Lscala/collection/Iterator;]
LeftSemiJoinHash.leftKeyGenerator ( ) : org.apache.spark.sql.catalyst.expressions.package.Projection
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.leftKeyGenerator:()Lorg/apache/spark/sql/catalyst/expressions/package$Projection;]
LeftSemiJoinHash.LeftSemiJoinHash ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> leftKeys, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> rightKeys, org.apache.spark.sql.execution.SparkPlan left, org.apache.spark.sql.execution.SparkPlan right, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> condition )
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash."<init>":(Lscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;Lorg/apache/spark/sql/execution/SparkPlan;Lscala/Option;)V]
LeftSemiJoinHash.metrics ( ) : scala.collection.immutable.Map<String,org.apache.spark.sql.execution.metric.LongSQLMetric>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.metrics:()Lscala/collection/immutable/Map;]
LeftSemiJoinHash.HashSemiJoin..boundCondition ( ) : scala.Function1<org.apache.spark.sql.catalyst.InternalRow,Object>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.org.apache.spark.sql.execution.joins.HashSemiJoin..boundCondition:()Lscala/Function1;]
LeftSemiJoinHash.outputPartitioning ( ) : org.apache.spark.sql.catalyst.plans.physical.Partitioning
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.outputPartitioning:()Lorg/apache/spark/sql/catalyst/plans/physical/Partitioning;]
LeftSemiJoinHash.outputsUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.outputsUnsafeRows:()Z]
LeftSemiJoinHash.rightKeyGenerator ( ) : org.apache.spark.sql.catalyst.expressions.package.Projection
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.rightKeyGenerator:()Lorg/apache/spark/sql/catalyst/expressions/package$Projection;]
LeftSemiJoinHash.supportUnsafe ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.supportUnsafe:()Z]
spark-sql_2.10-1.5.0.jar, Limit.class
package org.apache.spark.sql.execution
Limit.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/Limit.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.5.0.jar, LocalTableScan.class
package org.apache.spark.sql.execution
LocalTableScan.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/LocalTableScan.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.5.0.jar, LogicalLocalTable.class
package org.apache.spark.sql.execution
LogicalLocalTable.newInstance ( ) : org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/execution/LogicalLocalTable.newInstance:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
spark-sql_2.10-1.5.0.jar, LogicalRDD.class
package org.apache.spark.sql.execution
LogicalRDD.newInstance ( ) : org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/execution/LogicalRDD.newInstance:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
spark-sql_2.10-1.5.0.jar, NativeColumnType<T>.class
package org.apache.spark.sql.columnar
NativeColumnType<T>.dataType ( ) : org.apache.spark.sql.types.DataType
[mangled: org/apache/spark/sql/columnar/NativeColumnType<T>.dataType:()Lorg/apache/spark/sql/types/DataType;]
NativeColumnType<T>.dataType ( ) : T
[mangled: org/apache/spark/sql/columnar/NativeColumnType<T>.dataType:()Lorg/apache/spark/sql/types/AtomicType;]
NativeColumnType<T>.defaultSize ( ) : int
[mangled: org/apache/spark/sql/columnar/NativeColumnType<T>.defaultSize:()I]
NativeColumnType<T>.NativeColumnType ( T dataType, int typeId, int defaultSize )
[mangled: org/apache/spark/sql/columnar/NativeColumnType<T>.org.apache.spark.sql.columnar.NativeColumnType:(Lorg/apache/spark/sql/types/AtomicType;II)V]
NativeColumnType<T>.typeId ( ) : int
[mangled: org/apache/spark/sql/columnar/NativeColumnType<T>.typeId:()I]
spark-sql_2.10-1.5.0.jar, NullableColumnBuilder.class
package org.apache.spark.sql.columnar
NullableColumnBuilder.appendFrom ( org.apache.spark.sql.catalyst.InternalRow p1, int p2 ) [abstract] : void
[mangled: org/apache/spark/sql/columnar/NullableColumnBuilder.appendFrom:(Lorg/apache/spark/sql/catalyst/InternalRow;I)V]
NullableColumnBuilder.NullableColumnBuilder..super.appendFrom ( org.apache.spark.sql.catalyst.InternalRow p1, int p2 ) [abstract] : void
[mangled: org/apache/spark/sql/columnar/NullableColumnBuilder.org.apache.spark.sql.columnar.NullableColumnBuilder..super.appendFrom:(Lorg/apache/spark/sql/catalyst/InternalRow;I)V]
spark-sql_2.10-1.5.0.jar, OutputFaker.class
package org.apache.spark.sql.execution
OutputFaker.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/OutputFaker.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.5.0.jar, PhysicalRDD.class
package org.apache.spark.sql.execution
PhysicalRDD.copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow> rdd, String extraInformation ) : PhysicalRDD
[mangled: org/apache/spark/sql/execution/PhysicalRDD.copy:(Lscala/collection/Seq;Lorg/apache/spark/rdd/RDD;Ljava/lang/String;)Lorg/apache/spark/sql/execution/PhysicalRDD;]
PhysicalRDD.createFromDataSource ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> p1, org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow> p2, org.apache.spark.sql.sources.BaseRelation p3 ) [static] : PhysicalRDD
[mangled: org/apache/spark/sql/execution/PhysicalRDD.createFromDataSource:(Lscala/collection/Seq;Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/sources/BaseRelation;)Lorg/apache/spark/sql/execution/PhysicalRDD;]
PhysicalRDD.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/PhysicalRDD.doExecute:()Lorg/apache/spark/rdd/RDD;]
PhysicalRDD.extraInformation ( ) : String
[mangled: org/apache/spark/sql/execution/PhysicalRDD.extraInformation:()Ljava/lang/String;]
PhysicalRDD.PhysicalRDD ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow> rdd, String extraInformation )
[mangled: org/apache/spark/sql/execution/PhysicalRDD."<init>":(Lscala/collection/Seq;Lorg/apache/spark/rdd/RDD;Ljava/lang/String;)V]
PhysicalRDD.simpleString ( ) : String
[mangled: org/apache/spark/sql/execution/PhysicalRDD.simpleString:()Ljava/lang/String;]
spark-sql_2.10-1.5.0.jar, Project.class
package org.apache.spark.sql.execution
Project.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/Project.doExecute:()Lorg/apache/spark/rdd/RDD;]
Project.metrics ( ) : scala.collection.immutable.Map<String,metric.LongSQLMetric>
[mangled: org/apache/spark/sql/execution/Project.metrics:()Lscala/collection/immutable/Map;]
Project.outputOrdering ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/Project.outputOrdering:()Lscala/collection/Seq;]
spark-sql_2.10-1.5.0.jar, PythonUDF.class
package org.apache.spark.sql.execution
PythonUDF.copy ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, String pythonVer, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, org.apache.spark.sql.types.DataType dataType, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> children ) : PythonUDF
[mangled: org/apache/spark/sql/execution/PythonUDF.copy:(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;Lscala/collection/Seq;)Lorg/apache/spark/sql/execution/PythonUDF;]
PythonUDF.eval ( org.apache.spark.sql.catalyst.InternalRow input ) : Object
[mangled: org/apache/spark/sql/execution/PythonUDF.eval:(Lorg/apache/spark/sql/catalyst/InternalRow;)Ljava/lang/Object;]
PythonUDF.genCode ( org.apache.spark.sql.catalyst.expressions.codegen.CodeGenContext ctx, org.apache.spark.sql.catalyst.expressions.codegen.GeneratedExpressionCode ev ) : String
[mangled: org/apache/spark/sql/execution/PythonUDF.genCode:(Lorg/apache/spark/sql/catalyst/expressions/codegen/CodeGenContext;Lorg/apache/spark/sql/catalyst/expressions/codegen/GeneratedExpressionCode;)Ljava/lang/String;]
PythonUDF.PythonUDF ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, String pythonVer, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, org.apache.spark.sql.types.DataType dataType, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> children )
[mangled: org/apache/spark/sql/execution/PythonUDF."<init>":(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;Lscala/collection/Seq;)V]
PythonUDF.pythonVer ( ) : String
[mangled: org/apache/spark/sql/execution/PythonUDF.pythonVer:()Ljava/lang/String;]
spark-sql_2.10-1.5.0.jar, RunnableCommand.class
package org.apache.spark.sql.execution
RunnableCommand.children ( ) [abstract] : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/RunnableCommand.children:()Lscala/collection/Seq;]
RunnableCommand.output ( ) [abstract] : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/execution/RunnableCommand.output:()Lscala/collection/Seq;]
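These two newly abstract members define the whole logical-plan surface of a command. A minimal sketch of what a 1.5.0 implementor supplies (note: the trait is private[sql] in the Spark sources, so this only compiles inside that namespace, and the run() method is part of the trait even though it does not appear in this report; the class name is illustrative):

    import org.apache.spark.sql.{Row, SQLContext}
    import org.apache.spark.sql.catalyst.expressions.Attribute
    import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
    import org.apache.spark.sql.execution.RunnableCommand

    // A leaf command: no children, no output schema, empty result.
    case class NoopCommand() extends RunnableCommand {
      override def children: Seq[LogicalPlan] = Nil
      override def output: Seq[Attribute] = Nil
      def run(sqlContext: SQLContext): Seq[Row] = Seq.empty
    }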
spark-sql_2.10-1.5.0.jar, Sample.class
package org.apache.spark.sql.execution
Sample.canProcessSafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/Sample.canProcessSafeRows:()Z]
Sample.canProcessUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/Sample.canProcessUnsafeRows:()Z]
Sample.copy ( double lowerBound, double upperBound, boolean withReplacement, long seed, SparkPlan child ) : Sample
[mangled: org/apache/spark/sql/execution/Sample.copy:(DDZJLorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/Sample;]
Sample.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/Sample.doExecute:()Lorg/apache/spark/rdd/RDD;]
Sample.lowerBound ( ) : double
[mangled: org/apache/spark/sql/execution/Sample.lowerBound:()D]
Sample.outputsUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/Sample.outputsUnsafeRows:()Z]
Sample.Sample ( double lowerBound, double upperBound, boolean withReplacement, long seed, SparkPlan child )
[mangled: org/apache/spark/sql/execution/Sample."<init>":(DDZJLorg/apache/spark/sql/execution/SparkPlan;)V]
Sample.upperBound ( ) : double
[mangled: org/apache/spark/sql/execution/Sample.upperBound:()D]
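The constructor's shape changed between the versions: 1.3.0's single fraction argument became the (lowerBound, upperBound) pair above, which is why 1.3.0-compiled callers no longer link. A hedged construction sketch against the 1.5.0 signature (internal API; the helper name and the child plan are assumptions):

    import org.apache.spark.sql.execution.{Sample, SparkPlan}

    // Keep rows whose sampling key falls in [0.0, 0.1); no replacement.
    def sampleTenPercent(child: SparkPlan): Sample =
      Sample(0.0, 0.1, withReplacement = false, seed = 42L, child = child)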
spark-sql_2.10-1.5.0.jar, SetCommand.class
package org.apache.spark.sql.execution
SetCommand.andThen ( scala.Function1<SetCommand,A> p1 ) [static] : scala.Function1<scala.Option<scala.Tuple2<String,scala.Option<String>>>,A>
[mangled: org/apache/spark/sql/execution/SetCommand.andThen:(Lscala/Function1;)Lscala/Function1;]
SetCommand.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/SetCommand.children:()Lscala/collection/Seq;]
SetCommand.compose ( scala.Function1<A,scala.Option<scala.Tuple2<String,scala.Option<String>>>> p1 ) [static] : scala.Function1<A,SetCommand>
[mangled: org/apache/spark/sql/execution/SetCommand.compose:(Lscala/Function1;)Lscala/Function1;]
SetCommand.copy ( scala.Option<scala.Tuple2<String,scala.Option<String>>> kv ) : SetCommand
[mangled: org/apache/spark/sql/execution/SetCommand.copy:(Lscala/Option;)Lorg/apache/spark/sql/execution/SetCommand;]
SetCommand.SetCommand ( scala.Option<scala.Tuple2<String,scala.Option<String>>> kv )
[mangled: org/apache/spark/sql/execution/SetCommand."<init>":(Lscala/Option;)V]
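The kv parameter folds the three forms of the SQL SET statement into one Option. A hedged sketch of the mapping implied by the 1.5.0 signature (the property name is illustrative):

    import org.apache.spark.sql.execution.SetCommand

    val setKv   = SetCommand(Some(("spark.sql.shuffle.partitions", Some("8"))))  // SET key=value
    val showKey = SetCommand(Some(("spark.sql.shuffle.partitions", None)))       // SET key
    val showAll = SetCommand(None)                                               // SET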
spark-sql_2.10-1.5.0.jar, ShowTablesCommand.class
package org.apache.spark.sql.execution
ShowTablesCommand.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/ShowTablesCommand.children:()Lscala/collection/Seq;]
spark-sql_2.10-1.5.0.jar, ShuffledHashJoin.class
package org.apache.spark.sql.execution.joins
ShuffledHashJoin.canProcessSafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/ShuffledHashJoin.canProcessSafeRows:()Z]
ShuffledHashJoin.canProcessUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/ShuffledHashJoin.canProcessUnsafeRows:()Z]
ShuffledHashJoin.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/ShuffledHashJoin.doExecute:()Lorg/apache/spark/rdd/RDD;]
ShuffledHashJoin.hashJoin ( scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow> streamIter, org.apache.spark.sql.execution.metric.LongSQLMetric numStreamRows, HashedRelation hashedRelation, org.apache.spark.sql.execution.metric.LongSQLMetric numOutputRows ) : scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/joins/ShuffledHashJoin.hashJoin:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;Lorg/apache/spark/sql/execution/joins/HashedRelation;Lorg/apache/spark/sql/execution/metric/LongSQLMetric;)Lscala/collection/Iterator;]
ShuffledHashJoin.isUnsafeMode ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/ShuffledHashJoin.isUnsafeMode:()Z]
ShuffledHashJoin.metrics ( ) : scala.collection.immutable.Map<String,org.apache.spark.sql.execution.metric.LongSQLMetric>
[mangled: org/apache/spark/sql/execution/joins/ShuffledHashJoin.metrics:()Lscala/collection/immutable/Map;]
ShuffledHashJoin.outputsUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/joins/ShuffledHashJoin.outputsUnsafeRows:()Z]
ShuffledHashJoin.streamSideKeyGenerator ( ) : org.apache.spark.sql.catalyst.expressions.package.Projection
[mangled: org/apache/spark/sql/execution/joins/ShuffledHashJoin.streamSideKeyGenerator:()Lorg/apache/spark/sql/catalyst/expressions/package$Projection;]
spark-sql_2.10-1.5.0.jar, Sort.class
package org.apache.spark.sql.execution
Sort.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/Sort.doExecute:()Lorg/apache/spark/rdd/RDD;]
Sort.outputOrdering ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/Sort.outputOrdering:()Lscala/collection/Seq;]
spark-sql_2.10-1.5.0.jar, SparkPlan.class
package org.apache.spark.sql.execution
SparkPlan.canProcessSafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/SparkPlan.canProcessSafeRows:()Z]
SparkPlan.canProcessUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/SparkPlan.canProcessUnsafeRows:()Z]
SparkPlan.doExecute ( ) [abstract] : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/SparkPlan.doExecute:()Lorg/apache/spark/rdd/RDD;]
SparkPlan.doPrepare ( ) : void
[mangled: org/apache/spark/sql/execution/SparkPlan.doPrepare:()V]
SparkPlan.longMetric ( String name ) : metric.LongSQLMetric
[mangled: org/apache/spark/sql/execution/SparkPlan.longMetric:(Ljava/lang/String;)Lorg/apache/spark/sql/execution/metric/LongSQLMetric;]
SparkPlan.metrics ( ) : scala.collection.immutable.Map<String,metric.SQLMetric<?,?>>
[mangled: org/apache/spark/sql/execution/SparkPlan.metrics:()Lscala/collection/immutable/Map;]
SparkPlan.newNaturalAscendingOrdering ( scala.collection.Seq<org.apache.spark.sql.types.DataType> dataTypes ) : scala.math.Ordering<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/SparkPlan.newNaturalAscendingOrdering:(Lscala/collection/Seq;)Lscala/math/Ordering;]
SparkPlan.outputOrdering ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/SparkPlan.outputOrdering:()Lscala/collection/Seq;]
SparkPlan.outputsUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/SparkPlan.outputsUnsafeRows:()Z]
SparkPlan.prepare ( ) : void
[mangled: org/apache/spark/sql/execution/SparkPlan.prepare:()V]
SparkPlan.requiredChildOrdering ( ) : scala.collection.Seq<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>>
[mangled: org/apache/spark/sql/execution/SparkPlan.requiredChildOrdering:()Lscala/collection/Seq;]
SparkPlan.unsafeEnabled ( ) : boolean
[mangled: org/apache/spark/sql/execution/SparkPlan.unsafeEnabled:()Z]
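Taken together, doExecute()/doPrepare() are the template-method split behind most of the execution-side churn in this report: subclasses now override the protected hooks while execute()/prepare() handle the shared bookkeeping, and rows flow as InternalRow rather than Row. A minimal sketch of a 1.5.0-style operator (the class name and body are illustrative, not from the report; the execution traits are private[sql] in the Spark sources, so this compiles only inside that namespace):

    import org.apache.spark.rdd.RDD
    import org.apache.spark.sql.catalyst.InternalRow
    import org.apache.spark.sql.catalyst.expressions.Attribute
    import org.apache.spark.sql.execution.{SparkPlan, UnaryNode}

    // Pass-through operator: same output schema and rows as its child.
    case class PassThrough(child: SparkPlan) extends UnaryNode {
      override def output: Seq[Attribute] = child.output
      // 1.3.0 subclasses overrode execute() directly; in 1.5.0 the overridable
      // hook is doExecute(), and it works on InternalRow.
      protected override def doExecute(): RDD[InternalRow] = child.execute()
    }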
spark-sql_2.10-1.5.0.jar, SparkSQLParser.class
package org.apache.spark.sql
SparkSQLParser.DESCRIBE ( ) : catalyst.AbstractSparkSQLParser.Keyword
[mangled: org/apache/spark/sql/SparkSQLParser.DESCRIBE:()Lorg/apache/spark/sql/catalyst/AbstractSparkSQLParser$Keyword;]
SparkSQLParser.EXTENDED ( ) : catalyst.AbstractSparkSQLParser.Keyword
[mangled: org/apache/spark/sql/SparkSQLParser.EXTENDED:()Lorg/apache/spark/sql/catalyst/AbstractSparkSQLParser$Keyword;]
SparkSQLParser.FUNCTION ( ) : catalyst.AbstractSparkSQLParser.Keyword
[mangled: org/apache/spark/sql/SparkSQLParser.FUNCTION:()Lorg/apache/spark/sql/catalyst/AbstractSparkSQLParser$Keyword;]
SparkSQLParser.FUNCTIONS ( ) : catalyst.AbstractSparkSQLParser.Keyword
[mangled: org/apache/spark/sql/SparkSQLParser.FUNCTIONS:()Lorg/apache/spark/sql/catalyst/AbstractSparkSQLParser$Keyword;]
SparkSQLParser.SparkSQLParser..desc ( ) : scala.util.parsing.combinator.Parsers.Parser<catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/SparkSQLParser.org.apache.spark.sql.SparkSQLParser..desc:()Lscala/util/parsing/combinator/Parsers$Parser;]
spark-sql_2.10-1.5.0.jar, SparkStrategies.class
package org.apache.spark.sql.execution
SparkStrategies.Aggregation ( ) : SparkStrategies.Aggregation.
[mangled: org/apache/spark/sql/execution/SparkStrategies.Aggregation:()Lorg/apache/spark/sql/execution/SparkStrategies$Aggregation$;]
SparkStrategies.CanBroadcast ( ) : SparkStrategies.CanBroadcast.
[mangled: org/apache/spark/sql/execution/SparkStrategies.CanBroadcast:()Lorg/apache/spark/sql/execution/SparkStrategies$CanBroadcast$;]
SparkStrategies.EquiJoinSelection ( ) : SparkStrategies.EquiJoinSelection.
[mangled: org/apache/spark/sql/execution/SparkStrategies.EquiJoinSelection:()Lorg/apache/spark/sql/execution/SparkStrategies$EquiJoinSelection$;]
SparkStrategies.TakeOrderedAndProject ( ) : SparkStrategies.TakeOrderedAndProject.
[mangled: org/apache/spark/sql/execution/SparkStrategies.TakeOrderedAndProject:()Lorg/apache/spark/sql/execution/SparkStrategies$TakeOrderedAndProject$;]
spark-sql_2.10-1.5.0.jar, SQLContext.class
package org.apache.spark.sql
SQLContext.cacheManager ( ) : execution.CacheManager
[mangled: org/apache/spark/sql/SQLContext.cacheManager:()Lorg/apache/spark/sql/execution/CacheManager;]
SQLContext.createDataFrame ( org.apache.spark.rdd.RDD<Row> rowRDD, types.StructType schema, boolean needsConversion ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;Z)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.createSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.currentSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.currentSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.ddlParser ( ) : execution.datasources.DDLParser
[mangled: org/apache/spark/sql/SQLContext.ddlParser:()Lorg/apache/spark/sql/execution/datasources/DDLParser;]
SQLContext.defaultSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.defaultSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.detachSession ( ) : void
[mangled: org/apache/spark/sql/SQLContext.detachSession:()V]
SQLContext.dialectClassName ( ) : String
[mangled: org/apache/spark/sql/SQLContext.dialectClassName:()Ljava/lang/String;]
SQLContext.getConf ( SQLConf.SQLConfEntry<T> entry ) : T
[mangled: org/apache/spark/sql/SQLContext.getConf:(Lorg/apache/spark/sql/SQLConf$SQLConfEntry;)Ljava/lang/Object;]
SQLContext.getConf ( SQLConf.SQLConfEntry<T> entry, T defaultValue ) : T
[mangled: org/apache/spark/sql/SQLContext.getConf:(Lorg/apache/spark/sql/SQLConf$SQLConfEntry;Ljava/lang/Object;)Ljava/lang/Object;]
SQLContext.getOrCreate ( org.apache.spark.SparkContext p1 ) [static] : SQLContext
[mangled: org/apache/spark/sql/SQLContext.getOrCreate:(Lorg/apache/spark/SparkContext;)Lorg/apache/spark/sql/SQLContext;]
SQLContext.getSQLDialect ( ) : catalyst.ParserDialect
[mangled: org/apache/spark/sql/SQLContext.getSQLDialect:()Lorg/apache/spark/sql/catalyst/ParserDialect;]
SQLContext.internalCreateDataFrame ( org.apache.spark.rdd.RDD<catalyst.InternalRow> catalystRows, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.internalCreateDataFrame:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.listener ( ) : execution.ui.SQLListener
[mangled: org/apache/spark/sql/SQLContext.listener:()Lorg/apache/spark/sql/execution/ui/SQLListener;]
SQLContext.openSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.openSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.range ( long end ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.range:(J)Lorg/apache/spark/sql/DataFrame;]
SQLContext.range ( long start, long end ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.range:(JJ)Lorg/apache/spark/sql/DataFrame;]
SQLContext.range ( long start, long end, long step, int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.range:(JJJI)Lorg/apache/spark/sql/DataFrame;]
SQLContext.read ( ) : DataFrameReader
[mangled: org/apache/spark/sql/SQLContext.read:()Lorg/apache/spark/sql/DataFrameReader;]
SQLContext.setConf ( SQLConf.SQLConfEntry<T> entry, T value ) : void
[mangled: org/apache/spark/sql/SQLContext.setConf:(Lorg/apache/spark/sql/SQLConf$SQLConfEntry;Ljava/lang/Object;)V]
SQLContext.setSession ( SQLContext.SQLSession session ) : void
[mangled: org/apache/spark/sql/SQLContext.setSession:(Lorg/apache/spark/sql/SQLContext$SQLSession;)V]
SQLContext.tlSession ( ) : ThreadLocal<SQLContext.SQLSession>
[mangled: org/apache/spark/sql/SQLContext.tlSession:()Ljava/lang/ThreadLocal;]
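Several of these additions are stable public API rather than internals: the singleton accessor, the reader entry point and the range() family can be exercised directly. A short usage sketch (the local master string and file path are illustrative):

    import org.apache.spark.SparkContext
    import org.apache.spark.sql.SQLContext

    val sc = new SparkContext("local[2]", "compat-demo")
    val sqlContext = SQLContext.getOrCreate(sc)       // new singleton accessor

    val ids = sqlContext.range(0L, 1000L)             // DataFrame with one `id` column
    val people = sqlContext.read.json("people.json")  // DataFrameReader entry point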
spark-sql_2.10-1.5.0.jar, StringContains.class
package org.apache.spark.sql.sources
StringContains.attribute ( ) : String
[mangled: org/apache/spark/sql/sources/StringContains.attribute:()Ljava/lang/String;]
StringContains.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/StringContains.canEqual:(Ljava/lang/Object;)Z]
StringContains.copy ( String attribute, String value ) : StringContains
[mangled: org/apache/spark/sql/sources/StringContains.copy:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/sources/StringContains;]
StringContains.curried ( ) [static] : scala.Function1<String,scala.Function1<String,StringContains>>
[mangled: org/apache/spark/sql/sources/StringContains.curried:()Lscala/Function1;]
StringContains.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/StringContains.equals:(Ljava/lang/Object;)Z]
StringContains.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/StringContains.hashCode:()I]
StringContains.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/StringContains.productArity:()I]
StringContains.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/StringContains.productElement:(I)Ljava/lang/Object;]
StringContains.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/StringContains.productIterator:()Lscala/collection/Iterator;]
StringContains.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/StringContains.productPrefix:()Ljava/lang/String;]
StringContains.StringContains ( String attribute, String value )
[mangled: org/apache/spark/sql/sources/StringContains."<init>":(Ljava/lang/String;Ljava/lang/String;)V]
StringContains.toString ( ) : String
[mangled: org/apache/spark/sql/sources/StringContains.toString:()Ljava/lang/String;]
StringContains.tupled ( ) [static] : scala.Function1<scala.Tuple2<String,String>,StringContains>
[mangled: org/apache/spark/sql/sources/StringContains.tupled:()Lscala/Function1;]
StringContains.value ( ) : String
[mangled: org/apache/spark/sql/sources/StringContains.value:()Ljava/lang/String;]
spark-sql_2.10-1.5.0.jar, StringEndsWith.class
package org.apache.spark.sql.sources
StringEndsWith.attribute ( ) : String
[mangled: org/apache/spark/sql/sources/StringEndsWith.attribute:()Ljava/lang/String;]
StringEndsWith.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/StringEndsWith.canEqual:(Ljava/lang/Object;)Z]
StringEndsWith.copy ( String attribute, String value ) : StringEndsWith
[mangled: org/apache/spark/sql/sources/StringEndsWith.copy:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/sources/StringEndsWith;]
StringEndsWith.curried ( ) [static] : scala.Function1<String,scala.Function1<String,StringEndsWith>>
[mangled: org/apache/spark/sql/sources/StringEndsWith.curried:()Lscala/Function1;]
StringEndsWith.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/StringEndsWith.equals:(Ljava/lang/Object;)Z]
StringEndsWith.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/StringEndsWith.hashCode:()I]
StringEndsWith.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/StringEndsWith.productArity:()I]
StringEndsWith.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/StringEndsWith.productElement:(I)Ljava/lang/Object;]
StringEndsWith.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/StringEndsWith.productIterator:()Lscala/collection/Iterator;]
StringEndsWith.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/StringEndsWith.productPrefix:()Ljava/lang/String;]
StringEndsWith.StringEndsWith ( String attribute, String value )
[mangled: org/apache/spark/sql/sources/StringEndsWith."<init>":(Ljava/lang/String;Ljava/lang/String;)V]
StringEndsWith.toString ( ) : String
[mangled: org/apache/spark/sql/sources/StringEndsWith.toString:()Ljava/lang/String;]
StringEndsWith.tupled ( ) [static] : scala.Function1<scala.Tuple2<String,String>,StringEndsWith>
[mangled: org/apache/spark/sql/sources/StringEndsWith.tupled:()Lscala/Function1;]
StringEndsWith.value ( ) : String
[mangled: org/apache/spark/sql/sources/StringEndsWith.value:()Ljava/lang/String;]
spark-sql_2.10-1.5.0.jar, StringStartsWith.class
package org.apache.spark.sql.sources
StringStartsWith.attribute ( ) : String
[mangled: org/apache/spark/sql/sources/StringStartsWith.attribute:()Ljava/lang/String;]
StringStartsWith.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/StringStartsWith.canEqual:(Ljava/lang/Object;)Z]
StringStartsWith.copy ( String attribute, String value ) : StringStartsWith
[mangled: org/apache/spark/sql/sources/StringStartsWith.copy:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/sources/StringStartsWith;]
StringStartsWith.curried ( ) [static] : scala.Function1<String,scala.Function1<String,StringStartsWith>>
[mangled: org/apache/spark/sql/sources/StringStartsWith.curried:()Lscala/Function1;]
StringStartsWith.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/StringStartsWith.equals:(Ljava/lang/Object;)Z]
StringStartsWith.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/StringStartsWith.hashCode:()I]
StringStartsWith.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/StringStartsWith.productArity:()I]
StringStartsWith.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/StringStartsWith.productElement:(I)Ljava/lang/Object;]
StringStartsWith.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/StringStartsWith.productIterator:()Lscala/collection/Iterator;]
StringStartsWith.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/StringStartsWith.productPrefix:()Ljava/lang/String;]
StringStartsWith.StringStartsWith ( String attribute, String value )
[mangled: org/apache/spark/sql/sources/StringStartsWith."<init>":(Ljava/lang/String;Ljava/lang/String;)V]
StringStartsWith.toString ( ) : String
[mangled: org/apache/spark/sql/sources/StringStartsWith.toString:()Ljava/lang/String;]
StringStartsWith.tupled ( ) [static] : scala.Function1<scala.Tuple2<String,String>,StringStartsWith>
[mangled: org/apache/spark/sql/sources/StringStartsWith.tupled:()Lscala/Function1;]
StringStartsWith.value ( ) : String
[mangled: org/apache/spark/sql/sources/StringStartsWith.value:()Ljava/lang/String;]
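The three String* case classes above are the new string-matching filters of the data-source filter API; a relation that implements filtered scans receives them pushed down and may translate them into its own predicates. A hedged translation sketch (toPredicate is an illustrative helper, not part of the API):

    import org.apache.spark.sql.sources._

    // Map a pushed-down filter to a plain String predicate; anything we do
    // not recognise degrades to "accept all" and is re-checked by Spark.
    def toPredicate(f: Filter): String => Boolean = f match {
      case StringStartsWith(_, prefix) => _.startsWith(prefix)
      case StringEndsWith(_, suffix)   => _.endsWith(suffix)
      case StringContains(_, substr)   => _.contains(substr)
      case _                           => _ => true
    }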
spark-sql_2.10-1.5.0.jar, UnaryNode.class
package org.apache.spark.sql.execution
UnaryNode.child ( ) [abstract] : SparkPlan
[mangled: org/apache/spark/sql/execution/UnaryNode.child:()Lorg/apache/spark/sql/execution/SparkPlan;]
UnaryNode.children ( ) [abstract] : scala.collection.Seq<SparkPlan>
[mangled: org/apache/spark/sql/execution/UnaryNode.children:()Lscala/collection/Seq;]
spark-sql_2.10-1.5.0.jar, UncacheTableCommand.class
package org.apache.spark.sql.execution
UncacheTableCommand.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/UncacheTableCommand.children:()Lscala/collection/Seq;]
spark-sql_2.10-1.5.0.jar, Union.class
package org.apache.spark.sql.execution
Union.canProcessSafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/Union.canProcessSafeRows:()Z]
Union.canProcessUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/Union.canProcessUnsafeRows:()Z]
Union.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.catalyst.InternalRow>
[mangled: org/apache/spark/sql/execution/Union.doExecute:()Lorg/apache/spark/rdd/RDD;]
Union.outputsUnsafeRows ( ) : boolean
[mangled: org/apache/spark/sql/execution/Union.outputsUnsafeRows:()Z]
spark-sql_2.10-1.5.0.jar, UserDefinedFunction.class
package org.apache.spark.sql
UserDefinedFunction.copy ( Object f, types.DataType dataType, scala.collection.Seq<types.DataType> inputTypes ) : UserDefinedFunction
[mangled: org/apache/spark/sql/UserDefinedFunction.copy:(Ljava/lang/Object;Lorg/apache/spark/sql/types/DataType;Lscala/collection/Seq;)Lorg/apache/spark/sql/UserDefinedFunction;]
UserDefinedFunction.inputTypes ( ) : scala.collection.Seq<types.DataType>
[mangled: org/apache/spark/sql/UserDefinedFunction.inputTypes:()Lscala/collection/Seq;]
UserDefinedFunction.UserDefinedFunction ( Object f, types.DataType dataType, scala.collection.Seq<types.DataType> inputTypes )
[mangled: org/apache/spark/sql/UserDefinedFunction."<init>":(Ljava/lang/Object;Lorg/apache/spark/sql/types/DataType;Lscala/collection/Seq;)V]
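The new inputTypes field changes the case-class arity, which is why the constructor and copy signatures above both appear as additions. The supported way to obtain a UserDefinedFunction is unchanged, and in 1.5.0 it records the argument types automatically. Sketch (df is an assumed existing DataFrame):

    import org.apache.spark.sql.functions.udf

    // udf() derives both dataType and, in 1.5.0, the new inputTypes field
    // from the Scala function's type parameters.
    val strLen = udf((s: String) => s.length)
    // df.select(strLen(df("name")))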
spark-sql_2.10-1.5.0.jar, UserDefinedPythonFunction.class
package org.apache.spark.sql
UserDefinedPythonFunction.builder ( scala.collection.Seq<catalyst.expressions.Expression> e ) : execution.PythonUDF
[mangled: org/apache/spark/sql/UserDefinedPythonFunction.builder:(Lscala/collection/Seq;)Lorg/apache/spark/sql/execution/PythonUDF;]
UserDefinedPythonFunction.copy ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, String pythonVer, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, types.DataType dataType ) : UserDefinedPythonFunction
[mangled: org/apache/spark/sql/UserDefinedPythonFunction.copy:(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;)Lorg/apache/spark/sql/UserDefinedPythonFunction;]
UserDefinedPythonFunction.pythonVer ( ) : String
[mangled: org/apache/spark/sql/UserDefinedPythonFunction.pythonVer:()Ljava/lang/String;]
UserDefinedPythonFunction.UserDefinedPythonFunction ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, String pythonVer, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, types.DataType dataType )
[mangled: org/apache/spark/sql/UserDefinedPythonFunction."<init>":(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;)V]
Removed Methods (902)
spark-sql_2.10-1.3.0.jar, AddExchange.class
package org.apache.spark.sql.execution
AddExchange.AddExchange ( org.apache.spark.sql.SQLContext sqlContext )
[mangled: org/apache/spark/sql/execution/AddExchange."<init>":(Lorg/apache/spark/sql/SQLContext;)V]
AddExchange.andThen ( scala.Function1<AddExchange,A> p1 ) [static] : scala.Function1<org.apache.spark.sql.SQLContext,A>
[mangled: org/apache/spark/sql/execution/AddExchange.andThen:(Lscala/Function1;)Lscala/Function1;]
AddExchange.apply ( org.apache.spark.sql.catalyst.trees.TreeNode plan ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/AddExchange.apply:(Lorg/apache/spark/sql/catalyst/trees/TreeNode;)Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
AddExchange.apply ( SparkPlan plan ) : SparkPlan
[mangled: org/apache/spark/sql/execution/AddExchange.apply:(Lorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/SparkPlan;]
AddExchange.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/AddExchange.canEqual:(Ljava/lang/Object;)Z]
AddExchange.compose ( scala.Function1<A,org.apache.spark.sql.SQLContext> p1 ) [static] : scala.Function1<A,AddExchange>
[mangled: org/apache/spark/sql/execution/AddExchange.compose:(Lscala/Function1;)Lscala/Function1;]
AddExchange.copy ( org.apache.spark.sql.SQLContext sqlContext ) : AddExchange
[mangled: org/apache/spark/sql/execution/AddExchange.copy:(Lorg/apache/spark/sql/SQLContext;)Lorg/apache/spark/sql/execution/AddExchange;]
AddExchange.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/AddExchange.equals:(Ljava/lang/Object;)Z]
AddExchange.hashCode ( ) : int
[mangled: org/apache/spark/sql/execution/AddExchange.hashCode:()I]
AddExchange.numPartitions ( ) : int
[mangled: org/apache/spark/sql/execution/AddExchange.numPartitions:()I]
AddExchange.productArity ( ) : int
[mangled: org/apache/spark/sql/execution/AddExchange.productArity:()I]
AddExchange.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/execution/AddExchange.productElement:(I)Ljava/lang/Object;]
AddExchange.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/execution/AddExchange.productIterator:()Lscala/collection/Iterator;]
AddExchange.productPrefix ( ) : String
[mangled: org/apache/spark/sql/execution/AddExchange.productPrefix:()Ljava/lang/String;]
AddExchange.sqlContext ( ) : org.apache.spark.sql.SQLContext
[mangled: org/apache/spark/sql/execution/AddExchange.sqlContext:()Lorg/apache/spark/sql/SQLContext;]
AddExchange.toString ( ) : String
[mangled: org/apache/spark/sql/execution/AddExchange.toString:()Ljava/lang/String;]
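The removals are where the report's "Incompatible" verdict comes from: a client compiled against 1.3.0 that references a removed class or method fails at link time on a 1.5.0 classpath, with no recompilation involved. A hedged illustration using the class above (the helper name is illustrative):

    import org.apache.spark.sql.SQLContext
    import org.apache.spark.sql.execution.AddExchange   // exists only in 1.3.0

    // Compiles and runs against spark-sql 1.3.0.
    def addShuffles(sqlContext: SQLContext) = AddExchange(sqlContext)
    // Deploy the same jar on a 1.5.0 classpath and calling addShuffles raises
    //   java.lang.NoClassDefFoundError: org/apache/spark/sql/execution/AddExchange
    // the first time this code path is reached.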
spark-sql_2.10-1.3.0.jar, Aggregate.class
package org.apache.spark.sql.execution
Aggregate.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Aggregate.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
Aggregate.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Aggregate.children:()Lscala/collection/immutable/List;]
Aggregate.Aggregate..newAggregateBuffer ( ) : org.apache.spark.sql.catalyst.expressions.AggregateFunction[ ]
[mangled: org/apache/spark/sql/execution/Aggregate.org.apache.spark.sql.execution.Aggregate..newAggregateBuffer:()[Lorg/apache/spark/sql/catalyst/expressions/AggregateFunction;]
spark-sql_2.10-1.3.0.jar, AggregateEvaluation.class
package org.apache.spark.sql.execution
AggregateEvaluation.AggregateEvaluation ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> schema, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> initialValues, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> update, org.apache.spark.sql.catalyst.expressions.Expression result )
[mangled: org/apache/spark/sql/execution/AggregateEvaluation."<init>":(Lscala/collection/Seq;Lscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/catalyst/expressions/Expression;)V]
AggregateEvaluation.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.canEqual:(Ljava/lang/Object;)Z]
AggregateEvaluation.copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> schema, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> initialValues, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> update, org.apache.spark.sql.catalyst.expressions.Expression result ) : AggregateEvaluation
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.copy:(Lscala/collection/Seq;Lscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/execution/AggregateEvaluation;]
AggregateEvaluation.curried ( ) [static] : scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>,scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.Function1<org.apache.spark.sql.catalyst.expressions.Expression,AggregateEvaluation>>>>
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.curried:()Lscala/Function1;]
AggregateEvaluation.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.equals:(Ljava/lang/Object;)Z]
AggregateEvaluation.hashCode ( ) : int
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.hashCode:()I]
AggregateEvaluation.initialValues ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.initialValues:()Lscala/collection/Seq;]
AggregateEvaluation.productArity ( ) : int
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.productArity:()I]
AggregateEvaluation.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.productElement:(I)Ljava/lang/Object;]
AggregateEvaluation.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.productIterator:()Lscala/collection/Iterator;]
AggregateEvaluation.productPrefix ( ) : String
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.productPrefix:()Ljava/lang/String;]
AggregateEvaluation.result ( ) : org.apache.spark.sql.catalyst.expressions.Expression
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.result:()Lorg/apache/spark/sql/catalyst/expressions/Expression;]
AggregateEvaluation.schema ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.schema:()Lscala/collection/Seq;]
AggregateEvaluation.toString ( ) : String
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.toString:()Ljava/lang/String;]
AggregateEvaluation.tupled ( ) [static] : scala.Function1<scala.Tuple4<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>,scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,org.apache.spark.sql.catalyst.expressions.Expression>,AggregateEvaluation>
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.tupled:()Lscala/Function1;]
AggregateEvaluation.update ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/execution/AggregateEvaluation.update:()Lscala/collection/Seq;]
spark-sql_2.10-1.3.0.jar, AppendingParquetOutputFormat.class
package org.apache.spark.sql.parquet
AppendingParquetOutputFormat.AppendingParquetOutputFormat ( int offset )
[mangled: org/apache/spark/sql/parquet/AppendingParquetOutputFormat."<init>":(I)V]
spark-sql_2.10-1.3.0.jar, BatchPythonEvaluation.class
package org.apache.spark.sql.execution
BatchPythonEvaluation.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/BatchPythonEvaluation.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, BroadcastHashJoin.class
package org.apache.spark.sql.execution.joins
BroadcastHashJoin.curried ( ) [static] : scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.Function1<package.BuildSide,scala.Function1<org.apache.spark.sql.execution.SparkPlan,scala.Function1<org.apache.spark.sql.execution.SparkPlan,BroadcastHashJoin>>>>>
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.curried:()Lscala/Function1;]
BroadcastHashJoin.hashJoin ( scala.collection.Iterator<org.apache.spark.sql.Row> streamIter, HashedRelation hashedRelation ) : scala.collection.Iterator<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.hashJoin:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/joins/HashedRelation;)Lscala/collection/Iterator;]
BroadcastHashJoin.left ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.left:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
BroadcastHashJoin.right ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.right:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
BroadcastHashJoin.streamSideKeyGenerator ( ) : scala.Function0<org.apache.spark.sql.catalyst.expressions.package.MutableProjection>
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.streamSideKeyGenerator:()Lscala/Function0;]
BroadcastHashJoin.tupled ( ) [static] : scala.Function1<scala.Tuple5<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,package.BuildSide,org.apache.spark.sql.execution.SparkPlan,org.apache.spark.sql.execution.SparkPlan>,BroadcastHashJoin>
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, BroadcastLeftSemiJoinHash.class
package org.apache.spark.sql.execution.joins
BroadcastLeftSemiJoinHash.BroadcastLeftSemiJoinHash ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> leftKeys, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> rightKeys, org.apache.spark.sql.execution.SparkPlan left, org.apache.spark.sql.execution.SparkPlan right )
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash."<init>":(Lscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;Lorg/apache/spark/sql/execution/SparkPlan;)V]
BroadcastLeftSemiJoinHash.buildKeys ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.buildKeys:()Lscala/collection/Seq;]
BroadcastLeftSemiJoinHash.buildPlan ( ) : org.apache.spark.sql.execution.SparkPlan
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.buildPlan:()Lorg/apache/spark/sql/execution/SparkPlan;]
BroadcastLeftSemiJoinHash.buildSide ( ) : package.BuildRight.
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.buildSide:()Lorg/apache/spark/sql/execution/joins/package$BuildRight$;]
BroadcastLeftSemiJoinHash.buildSide ( ) : package.BuildSide
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.buildSide:()Lorg/apache/spark/sql/execution/joins/package$BuildSide;]
BroadcastLeftSemiJoinHash.buildSideKeyGenerator ( ) : org.apache.spark.sql.catalyst.expressions.package.Projection
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.buildSideKeyGenerator:()Lorg/apache/spark/sql/catalyst/expressions/package$Projection;]
BroadcastLeftSemiJoinHash.copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> leftKeys, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> rightKeys, org.apache.spark.sql.execution.SparkPlan left, org.apache.spark.sql.execution.SparkPlan right ) : BroadcastLeftSemiJoinHash
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.copy:(Lscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;Lorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash;]
BroadcastLeftSemiJoinHash.hashJoin ( scala.collection.Iterator<org.apache.spark.sql.Row> streamIter, HashedRelation hashedRelation ) : scala.collection.Iterator<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.hashJoin:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/joins/HashedRelation;)Lscala/collection/Iterator;]
BroadcastLeftSemiJoinHash.left ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.left:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
BroadcastLeftSemiJoinHash.right ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.right:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
BroadcastLeftSemiJoinHash.streamedKeys ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.streamedKeys:()Lscala/collection/Seq;]
BroadcastLeftSemiJoinHash.streamedPlan ( ) : org.apache.spark.sql.execution.SparkPlan
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.streamedPlan:()Lorg/apache/spark/sql/execution/SparkPlan;]
BroadcastLeftSemiJoinHash.streamSideKeyGenerator ( ) : scala.Function0<org.apache.spark.sql.catalyst.expressions.package.MutableProjection>
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.streamSideKeyGenerator:()Lscala/Function0;]
spark-sql_2.10-1.3.0.jar, BroadcastNestedLoopJoin.class
package org.apache.spark.sql.execution.joins
BroadcastNestedLoopJoin.left ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/BroadcastNestedLoopJoin.left:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
BroadcastNestedLoopJoin.right ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/BroadcastNestedLoopJoin.right:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
spark-sql_2.10-1.3.0.jar, CachedBatch.class
package org.apache.spark.sql.columnar
CachedBatch.CachedBatch ( byte[ ][ ] buffers, org.apache.spark.sql.Row stats )
[mangled: org/apache/spark/sql/columnar/CachedBatch."<init>":([[BLorg/apache/spark/sql/Row;)V]
CachedBatch.copy ( byte[ ][ ] buffers, org.apache.spark.sql.Row stats ) : CachedBatch
[mangled: org/apache/spark/sql/columnar/CachedBatch.copy:([[BLorg/apache/spark/sql/Row;)Lorg/apache/spark/sql/columnar/CachedBatch;]
CachedBatch.stats ( ) : org.apache.spark.sql.Row
[mangled: org/apache/spark/sql/columnar/CachedBatch.stats:()Lorg/apache/spark/sql/Row;]
spark-sql_2.10-1.3.0.jar, CachedData.class
package org.apache.spark.sql
CachedData.CachedData ( catalyst.plans.logical.LogicalPlan plan, columnar.InMemoryRelation cachedRepresentation )
[mangled: org/apache/spark/sql/CachedData."<init>":(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;Lorg/apache/spark/sql/columnar/InMemoryRelation;)V]
CachedData.cachedRepresentation ( ) : columnar.InMemoryRelation
[mangled: org/apache/spark/sql/CachedData.cachedRepresentation:()Lorg/apache/spark/sql/columnar/InMemoryRelation;]
CachedData.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/CachedData.canEqual:(Ljava/lang/Object;)Z]
CachedData.copy ( catalyst.plans.logical.LogicalPlan plan, columnar.InMemoryRelation cachedRepresentation ) : CachedData
[mangled: org/apache/spark/sql/CachedData.copy:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;Lorg/apache/spark/sql/columnar/InMemoryRelation;)Lorg/apache/spark/sql/CachedData;]
CachedData.curried ( ) [static] : scala.Function1<catalyst.plans.logical.LogicalPlan,scala.Function1<columnar.InMemoryRelation,CachedData>>
[mangled: org/apache/spark/sql/CachedData.curried:()Lscala/Function1;]
CachedData.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/CachedData.equals:(Ljava/lang/Object;)Z]
CachedData.hashCode ( ) : int
[mangled: org/apache/spark/sql/CachedData.hashCode:()I]
CachedData.plan ( ) : catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/CachedData.plan:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
CachedData.productArity ( ) : int
[mangled: org/apache/spark/sql/CachedData.productArity:()I]
CachedData.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/CachedData.productElement:(I)Ljava/lang/Object;]
CachedData.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/CachedData.productIterator:()Lscala/collection/Iterator;]
CachedData.productPrefix ( ) : String
[mangled: org/apache/spark/sql/CachedData.productPrefix:()Ljava/lang/String;]
CachedData.toString ( ) : String
[mangled: org/apache/spark/sql/CachedData.toString:()Ljava/lang/String;]
CachedData.tupled ( ) [static] : scala.Function1<scala.Tuple2<catalyst.plans.logical.LogicalPlan,columnar.InMemoryRelation>,CachedData>
[mangled: org/apache/spark/sql/CachedData.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, CacheManager.class
package org.apache.spark.sql
CacheManager.CacheManager ( SQLContext sqlContext )
[mangled: org/apache/spark/sql/CacheManager."<init>":(Lorg/apache/spark/sql/SQLContext;)V]
CacheManager.cacheQuery ( DataFrame query, scala.Option<String> tableName, org.apache.spark.storage.StorageLevel storageLevel ) : void
[mangled: org/apache/spark/sql/CacheManager.cacheQuery:(Lorg/apache/spark/sql/DataFrame;Lscala/Option;Lorg/apache/spark/storage/StorageLevel;)V]
CacheManager.cacheTable ( String tableName ) : void
[mangled: org/apache/spark/sql/CacheManager.cacheTable:(Ljava/lang/String;)V]
CacheManager.clearCache ( ) : void
[mangled: org/apache/spark/sql/CacheManager.clearCache:()V]
CacheManager.invalidateCache ( catalyst.plans.logical.LogicalPlan plan ) : void
[mangled: org/apache/spark/sql/CacheManager.invalidateCache:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)V]
CacheManager.isCached ( String tableName ) : boolean
[mangled: org/apache/spark/sql/CacheManager.isCached:(Ljava/lang/String;)Z]
CacheManager.tryUncacheQuery ( DataFrame query, boolean blocking ) : boolean
[mangled: org/apache/spark/sql/CacheManager.tryUncacheQuery:(Lorg/apache/spark/sql/DataFrame;Z)Z]
CacheManager.uncacheTable ( String tableName ) : void
[mangled: org/apache/spark/sql/CacheManager.uncacheTable:(Ljava/lang/String;)V]
CacheManager.useCachedData ( catalyst.plans.logical.LogicalPlan plan ) : catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/CacheManager.useCachedData:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
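This class did not so much disappear as move: the added SQLContext.cacheManager() signature earlier in this report returns execution.CacheManager, i.e. the same component relocated to org.apache.spark.sql.execution, which is binary-incompatible all the same. Client code is better served by the SQLContext facade, which carries the same operations in both versions:

    import org.apache.spark.sql.SQLContext

    // Stable across 1.3.0 and 1.5.0; the table name is illustrative.
    def cycleCache(sqlContext: SQLContext): Unit = {
      sqlContext.cacheTable("people")
      assert(sqlContext.isCached("people"))
      sqlContext.uncacheTable("people")
      sqlContext.clearCache()
    }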
spark-sql_2.10-1.3.0.jar, CartesianProduct.class
package org.apache.spark.sql.execution.joins
CartesianProduct.left ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/CartesianProduct.left:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
CartesianProduct.right ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/CartesianProduct.right:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
spark-sql_2.10-1.3.0.jar, CaseInsensitiveMap.class
package org.apache.spark.sql.sources
CaseInsensitiveMap.CaseInsensitiveMap ( scala.collection.immutable.Map<String,String> map )
[mangled: org/apache/spark/sql/sources/CaseInsensitiveMap."<init>":(Lscala/collection/immutable/Map;)V]
spark-sql_2.10-1.3.0.jar, CatalystArrayContainsNullConverter.class
package org.apache.spark.sql.parquet
CatalystArrayContainsNullConverter.CatalystArrayContainsNullConverter ( org.apache.spark.sql.types.DataType elementType, int index, CatalystConverter parent )
[mangled: org/apache/spark/sql/parquet/CatalystArrayContainsNullConverter."<init>":(Lorg/apache/spark/sql/types/DataType;ILorg/apache/spark/sql/parquet/CatalystConverter;)V]
spark-sql_2.10-1.3.0.jar, CatalystArrayConverter.class
package org.apache.spark.sql.parquet
CatalystArrayConverter.CatalystArrayConverter ( org.apache.spark.sql.types.DataType elementType, int index, CatalystConverter parent )
[mangled: org/apache/spark/sql/parquet/CatalystArrayConverter."<init>":(Lorg/apache/spark/sql/types/DataType;ILorg/apache/spark/sql/parquet/CatalystConverter;)V]
spark-sql_2.10-1.3.0.jar, CatalystConverter.class
package org.apache.spark.sql.parquet
CatalystConverter.ARRAY_CONTAINS_NULL_BAG_SCHEMA_NAME ( ) [static] : String
[mangled: org/apache/spark/sql/parquet/CatalystConverter.ARRAY_CONTAINS_NULL_BAG_SCHEMA_NAME:()Ljava/lang/String;]
CatalystConverter.ARRAY_ELEMENTS_SCHEMA_NAME ( ) [static] : String
[mangled: org/apache/spark/sql/parquet/CatalystConverter.ARRAY_ELEMENTS_SCHEMA_NAME:()Ljava/lang/String;]
CatalystConverter.CatalystConverter ( )
[mangled: org/apache/spark/sql/parquet/CatalystConverter."<init>":()V]
CatalystConverter.clearBuffer ( ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.clearBuffer:()V]
CatalystConverter.getCurrentRecord ( ) : org.apache.spark.sql.Row
[mangled: org/apache/spark/sql/parquet/CatalystConverter.getCurrentRecord:()Lorg/apache/spark/sql/Row;]
CatalystConverter.index ( ) [abstract] : int
[mangled: org/apache/spark/sql/parquet/CatalystConverter.index:()I]
CatalystConverter.isRootConverter ( ) : boolean
[mangled: org/apache/spark/sql/parquet/CatalystConverter.isRootConverter:()Z]
CatalystConverter.MAP_KEY_SCHEMA_NAME ( ) [static] : String
[mangled: org/apache/spark/sql/parquet/CatalystConverter.MAP_KEY_SCHEMA_NAME:()Ljava/lang/String;]
CatalystConverter.MAP_SCHEMA_NAME ( ) [static] : String
[mangled: org/apache/spark/sql/parquet/CatalystConverter.MAP_SCHEMA_NAME:()Ljava/lang/String;]
CatalystConverter.MAP_VALUE_SCHEMA_NAME ( ) [static] : String
[mangled: org/apache/spark/sql/parquet/CatalystConverter.MAP_VALUE_SCHEMA_NAME:()Ljava/lang/String;]
CatalystConverter.parent ( ) [abstract] : CatalystConverter
[mangled: org/apache/spark/sql/parquet/CatalystConverter.parent:()Lorg/apache/spark/sql/parquet/CatalystConverter;]
CatalystConverter.readDecimal ( org.apache.spark.sql.types.Decimal dest, parquet.io.api.Binary value, org.apache.spark.sql.types.DecimalType ctype ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.readDecimal:(Lorg/apache/spark/sql/types/Decimal;Lparquet/io/api/Binary;Lorg/apache/spark/sql/types/DecimalType;)V]
CatalystConverter.readTimestamp ( parquet.io.api.Binary value ) : java.sql.Timestamp
[mangled: org/apache/spark/sql/parquet/CatalystConverter.readTimestamp:(Lparquet/io/api/Binary;)Ljava/sql/Timestamp;]
CatalystConverter.size ( ) [abstract] : int
[mangled: org/apache/spark/sql/parquet/CatalystConverter.size:()I]
CatalystConverter.THRIFT_ARRAY_ELEMENTS_SCHEMA_NAME_SUFFIX ( ) [static] : String
[mangled: org/apache/spark/sql/parquet/CatalystConverter.THRIFT_ARRAY_ELEMENTS_SCHEMA_NAME_SUFFIX:()Ljava/lang/String;]
CatalystConverter.updateBinary ( int fieldIndex, parquet.io.api.Binary value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateBinary:(ILparquet/io/api/Binary;)V]
CatalystConverter.updateBoolean ( int fieldIndex, boolean value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateBoolean:(IZ)V]
CatalystConverter.updateByte ( int fieldIndex, byte value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateByte:(IB)V]
CatalystConverter.updateDecimal ( int fieldIndex, parquet.io.api.Binary value, org.apache.spark.sql.types.DecimalType ctype ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateDecimal:(ILparquet/io/api/Binary;Lorg/apache/spark/sql/types/DecimalType;)V]
CatalystConverter.updateDouble ( int fieldIndex, double value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateDouble:(ID)V]
CatalystConverter.updateField ( int p1, Object p2 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateField:(ILjava/lang/Object;)V]
CatalystConverter.updateFloat ( int fieldIndex, float value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateFloat:(IF)V]
CatalystConverter.updateInt ( int fieldIndex, int value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateInt:(II)V]
CatalystConverter.updateLong ( int fieldIndex, long value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateLong:(IJ)V]
CatalystConverter.updateShort ( int fieldIndex, short value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateShort:(IS)V]
CatalystConverter.updateString ( int fieldIndex, String value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateString:(ILjava/lang/String;)V]
CatalystConverter.updateTimestamp ( int fieldIndex, parquet.io.api.Binary value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateTimestamp:(ILparquet/io/api/Binary;)V]
spark-sql_2.10-1.3.0.jar, CatalystGroupConverter.class
package org.apache.spark.sql.parquet
CatalystGroupConverter.buffer ( ) : scala.collection.mutable.ArrayBuffer<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.buffer:()Lscala/collection/mutable/ArrayBuffer;]
CatalystGroupConverter.buffer_.eq ( scala.collection.mutable.ArrayBuffer<org.apache.spark.sql.Row> p1 ) : void
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.buffer_.eq:(Lscala/collection/mutable/ArrayBuffer;)V]
CatalystGroupConverter.CatalystGroupConverter ( org.apache.spark.sql.catalyst.expressions.Attribute[ ] attributes )
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter."<init>":([Lorg/apache/spark/sql/catalyst/expressions/Attribute;)V]
CatalystGroupConverter.CatalystGroupConverter ( org.apache.spark.sql.types.StructField[ ] schema, int index, CatalystConverter parent )
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter."<init>":([Lorg/apache/spark/sql/types/StructField;ILorg/apache/spark/sql/parquet/CatalystConverter;)V]
CatalystGroupConverter.CatalystGroupConverter ( org.apache.spark.sql.types.StructField[ ] schema, int index, CatalystConverter parent, scala.collection.mutable.ArrayBuffer<Object> current, scala.collection.mutable.ArrayBuffer<org.apache.spark.sql.Row> buffer )
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter."<init>":([Lorg/apache/spark/sql/types/StructField;ILorg/apache/spark/sql/parquet/CatalystConverter;Lscala/collection/mutable/ArrayBuffer;Lscala/collection/mutable/ArrayBuffer;)V]
CatalystGroupConverter.clearBuffer ( ) : void
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.clearBuffer:()V]
CatalystGroupConverter.converters ( ) : parquet.io.api.Converter[ ]
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.converters:()[Lparquet/io/api/Converter;]
CatalystGroupConverter.current ( ) : scala.collection.mutable.ArrayBuffer<Object>
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.current:()Lscala/collection/mutable/ArrayBuffer;]
CatalystGroupConverter.current_.eq ( scala.collection.mutable.ArrayBuffer<Object> p1 ) : void
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.current_.eq:(Lscala/collection/mutable/ArrayBuffer;)V]
CatalystGroupConverter.end ( ) : void
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.end:()V]
CatalystGroupConverter.getConverter ( int fieldIndex ) : parquet.io.api.Converter
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.getConverter:(I)Lparquet/io/api/Converter;]
CatalystGroupConverter.getCurrentRecord ( ) : org.apache.spark.sql.Row
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.getCurrentRecord:()Lorg/apache/spark/sql/Row;]
CatalystGroupConverter.index ( ) : int
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.index:()I]
CatalystGroupConverter.parent ( ) : CatalystConverter
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.parent:()Lorg/apache/spark/sql/parquet/CatalystConverter;]
CatalystGroupConverter.schema ( ) : org.apache.spark.sql.types.StructField[ ]
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.schema:()[Lorg/apache/spark/sql/types/StructField;]
CatalystGroupConverter.size ( ) : int
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.size:()I]
CatalystGroupConverter.start ( ) : void
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.start:()V]
CatalystGroupConverter.updateField ( int fieldIndex, Object value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystGroupConverter.updateField:(ILjava/lang/Object;)V]
spark-sql_2.10-1.3.0.jar, CatalystMapConverter.class
package org.apache.spark.sql.parquet
CatalystMapConverter.CatalystMapConverter ( org.apache.spark.sql.types.StructField[ ] schema, int index, CatalystConverter parent )
[mangled: org/apache/spark/sql/parquet/CatalystMapConverter."<init>":([Lorg/apache/spark/sql/types/StructField;ILorg/apache/spark/sql/parquet/CatalystConverter;)V]
spark-sql_2.10-1.3.0.jar, CatalystNativeArrayConverter.class
package org.apache.spark.sql.parquet
CatalystNativeArrayConverter.CatalystNativeArrayConverter ( org.apache.spark.sql.types.NativeType elementType, int index, CatalystConverter parent, int capacity )
[mangled: org/apache/spark/sql/parquet/CatalystNativeArrayConverter."<init>":(Lorg/apache/spark/sql/types/NativeType;ILorg/apache/spark/sql/parquet/CatalystConverter;I)V]
spark-sql_2.10-1.3.0.jar, CatalystPrimitiveConverter.class
package org.apache.spark.sql.parquet
CatalystPrimitiveConverter.addBinary ( parquet.io.api.Binary value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystPrimitiveConverter.addBinary:(Lparquet/io/api/Binary;)V]
CatalystPrimitiveConverter.addBoolean ( boolean value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystPrimitiveConverter.addBoolean:(Z)V]
CatalystPrimitiveConverter.addDouble ( double value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystPrimitiveConverter.addDouble:(D)V]
CatalystPrimitiveConverter.addFloat ( float value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystPrimitiveConverter.addFloat:(F)V]
CatalystPrimitiveConverter.addInt ( int value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystPrimitiveConverter.addInt:(I)V]
CatalystPrimitiveConverter.addLong ( long value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystPrimitiveConverter.addLong:(J)V]
CatalystPrimitiveConverter.CatalystPrimitiveConverter ( CatalystConverter parent, int fieldIndex )
[mangled: org/apache/spark/sql/parquet/CatalystPrimitiveConverter."<init>":(Lorg/apache/spark/sql/parquet/CatalystConverter;I)V]
spark-sql_2.10-1.3.0.jar, CatalystPrimitiveRowConverter.class
package org.apache.spark.sql.parquet
CatalystPrimitiveRowConverter.CatalystPrimitiveRowConverter ( org.apache.spark.sql.catalyst.expressions.Attribute[ ] attributes )
[mangled: org/apache/spark/sql/parquet/CatalystPrimitiveRowConverter."<init>":([Lorg/apache/spark/sql/catalyst/expressions/Attribute;)V]
spark-sql_2.10-1.3.0.jar, CatalystPrimitiveStringConverter.class
package org.apache.spark.sql.parquet
CatalystPrimitiveStringConverter.CatalystPrimitiveStringConverter ( CatalystConverter parent, int fieldIndex )
[mangled: org/apache/spark/sql/parquet/CatalystPrimitiveStringConverter."<init>":(Lorg/apache/spark/sql/parquet/CatalystConverter;I)V]
spark-sql_2.10-1.3.0.jar, CatalystStructConverter.class
package org.apache.spark.sql.parquet
CatalystStructConverter.CatalystStructConverter ( org.apache.spark.sql.types.StructField[ ] schema, int index, CatalystConverter parent )
[mangled: org/apache/spark/sql/parquet/CatalystStructConverter."<init>":([Lorg/apache/spark/sql/types/StructField;ILorg/apache/spark/sql/parquet/CatalystConverter;)V]
spark-sql_2.10-1.3.0.jar, Column.class
package org.apache.spark.sql
Column.apply ( catalyst.expressions.Expression p1 ) [static] : Column
[mangled: org/apache/spark/sql/Column.apply:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/Column;]
Column.apply ( String p1 ) [static] : Column
[mangled: org/apache/spark/sql/Column.apply:(Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
Column.getItem ( int ordinal ) : Column
[mangled: org/apache/spark/sql/Column.getItem:(I)Lorg/apache/spark/sql/Column;]
Column.in ( Column... list ) : Column
[mangled: org/apache/spark/sql/Column.in:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/Column;]
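For the two removed instance methods the 1.5.0 line offers near-direct replacements, to the best of our knowledge: getItem now takes Any rather than int, and in(Column...) was renamed to isin, taking plain values. Hedged migration sketch (df and its columns are assumptions):

    import org.apache.spark.sql.DataFrame

    def migrated(df: DataFrame) = (
      df("letters").getItem(0),   // 1.3.0: getItem(int); 1.5.0: getItem(Any)
      df("id").isin(1, 2, 3)      // 1.3.0: in(Column*);  1.5.0: isin(Any*)
    )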
spark-sql_2.10-1.3.0.jar, ColumnBuilder.class
package org.apache.spark.sql.columnar
ColumnBuilder.appendFrom ( org.apache.spark.sql.Row p1, int p2 ) [abstract] : void
[mangled: org/apache/spark/sql/columnar/ColumnBuilder.appendFrom:(Lorg/apache/spark/sql/Row;I)V]
spark-sql_2.10-1.3.0.jar, ColumnStats.class
package org.apache.spark.sql.columnar
ColumnStats.collectedStatistics ( ) [abstract] : org.apache.spark.sql.Row
[mangled: org/apache/spark/sql/columnar/ColumnStats.collectedStatistics:()Lorg/apache/spark/sql/Row;]
ColumnStats.gatherStats ( org.apache.spark.sql.Row p1, int p2 ) [abstract] : void
[mangled: org/apache/spark/sql/columnar/ColumnStats.gatherStats:(Lorg/apache/spark/sql/Row;I)V]
spark-sql_2.10-1.3.0.jar, CreateTableUsing.class
package org.apache.spark.sql.sources
CreateTableUsing.allowExisting ( ) : boolean
[mangled: org/apache/spark/sql/sources/CreateTableUsing.allowExisting:()Z]
CreateTableUsing.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/CreateTableUsing.canEqual:(Ljava/lang/Object;)Z]
CreateTableUsing.copy ( String tableName, scala.Option<org.apache.spark.sql.types.StructType> userSpecifiedSchema, String provider, boolean temporary, scala.collection.immutable.Map<String,String> options, boolean allowExisting, boolean managedIfNoPath ) : CreateTableUsing
[mangled: org/apache/spark/sql/sources/CreateTableUsing.copy:(Ljava/lang/String;Lscala/Option;Ljava/lang/String;ZLscala/collection/immutable/Map;ZZ)Lorg/apache/spark/sql/sources/CreateTableUsing;]
CreateTableUsing.CreateTableUsing ( String tableName, scala.Option<org.apache.spark.sql.types.StructType> userSpecifiedSchema, String provider, boolean temporary, scala.collection.immutable.Map<String,String> options, boolean allowExisting, boolean managedIfNoPath )
[mangled: org/apache/spark/sql/sources/CreateTableUsing."<init>":(Ljava/lang/String;Lscala/Option;Ljava/lang/String;ZLscala/collection/immutable/Map;ZZ)V]
CreateTableUsing.curried ( ) [static] : scala.Function1<String,scala.Function1<scala.Option<org.apache.spark.sql.types.StructType>,scala.Function1<String,scala.Function1<Object,scala.Function1<scala.collection.immutable.Map<String,String>,scala.Function1<Object,scala.Function1<Object,CreateTableUsing>>>>>>>
[mangled: org/apache/spark/sql/sources/CreateTableUsing.curried:()Lscala/Function1;]
CreateTableUsing.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/CreateTableUsing.equals:(Ljava/lang/Object;)Z]
CreateTableUsing.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/CreateTableUsing.hashCode:()I]
CreateTableUsing.managedIfNoPath ( ) : boolean
[mangled: org/apache/spark/sql/sources/CreateTableUsing.managedIfNoPath:()Z]
CreateTableUsing.options ( ) : scala.collection.immutable.Map<String,String>
[mangled: org/apache/spark/sql/sources/CreateTableUsing.options:()Lscala/collection/immutable/Map;]
CreateTableUsing.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/CreateTableUsing.productArity:()I]
CreateTableUsing.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/CreateTableUsing.productElement:(I)Ljava/lang/Object;]
CreateTableUsing.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/CreateTableUsing.productIterator:()Lscala/collection/Iterator;]
CreateTableUsing.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/CreateTableUsing.productPrefix:()Ljava/lang/String;]
CreateTableUsing.provider ( ) : String
[mangled: org/apache/spark/sql/sources/CreateTableUsing.provider:()Ljava/lang/String;]
CreateTableUsing.tableName ( ) : String
[mangled: org/apache/spark/sql/sources/CreateTableUsing.tableName:()Ljava/lang/String;]
CreateTableUsing.temporary ( ) : boolean
[mangled: org/apache/spark/sql/sources/CreateTableUsing.temporary:()Z]
CreateTableUsing.tupled ( ) [static] : scala.Function1<scala.Tuple7<String,scala.Option<org.apache.spark.sql.types.StructType>,String,Object,scala.collection.immutable.Map<String,String>,Object,Object>,CreateTableUsing>
[mangled: org/apache/spark/sql/sources/CreateTableUsing.tupled:()Lscala/Function1;]
CreateTableUsing.userSpecifiedSchema ( ) : scala.Option<org.apache.spark.sql.types.StructType>
[mangled: org/apache/spark/sql/sources/CreateTableUsing.userSpecifiedSchema:()Lscala/Option;]
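Note: almost everything listed for CreateTableUsing (the field accessors plus canEqual, copy, curried, tupled and the product* members) is what scalac derives for a seven-field case class, so the whole block disappears together when the class is reshaped or relocated in 1.5.0. The corresponding source shape, reconstructed from the accessors above (a sketch under an assumed name):

    import org.apache.spark.sql.types.StructType

    // Field names and types taken from the accessor listing above.
    case class CreateTableUsingSketch(
        tableName: String,
        userSpecifiedSchema: Option[StructType],
        provider: String,
        temporary: Boolean,
        options: Map[String, String],
        allowExisting: Boolean,
        managedIfNoPath: Boolean)

Its companion's curried value then has type String => Option[StructType] => String => Boolean => Map[String, String] => Boolean => Boolean => CreateTableUsingSketch, which is exactly the boxed Function1 chain shown for curried ( ) above.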
spark-sql_2.10-1.3.0.jar, CreateTableUsingAsSelect.class
package org.apache.spark.sql.sources
CreateTableUsingAsSelect.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.canEqual:(Ljava/lang/Object;)Z]
CreateTableUsingAsSelect.child ( ) : org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.child:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
CreateTableUsingAsSelect.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
CreateTableUsingAsSelect.copy ( String tableName, String provider, boolean temporary, org.apache.spark.sql.SaveMode mode, scala.collection.immutable.Map<String,String> options, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan child ) : CreateTableUsingAsSelect
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.copy:(Ljava/lang/String;Ljava/lang/String;ZLorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/sources/CreateTableUsingAsSelect;]
CreateTableUsingAsSelect.CreateTableUsingAsSelect ( String tableName, String provider, boolean temporary, org.apache.spark.sql.SaveMode mode, scala.collection.immutable.Map<String,String> options, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan child )
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect."<init>":(Ljava/lang/String;Ljava/lang/String;ZLorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)V]
CreateTableUsingAsSelect.curried ( ) [static] : scala.Function1<String,scala.Function1<String,scala.Function1<Object,scala.Function1<org.apache.spark.sql.SaveMode,scala.Function1<scala.collection.immutable.Map<String,String>,scala.Function1<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,CreateTableUsingAsSelect>>>>>>
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.curried:()Lscala/Function1;]
CreateTableUsingAsSelect.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.equals:(Ljava/lang/Object;)Z]
CreateTableUsingAsSelect.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.hashCode:()I]
CreateTableUsingAsSelect.mode ( ) : org.apache.spark.sql.SaveMode
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.mode:()Lorg/apache/spark/sql/SaveMode;]
CreateTableUsingAsSelect.options ( ) : scala.collection.immutable.Map<String,String>
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.options:()Lscala/collection/immutable/Map;]
CreateTableUsingAsSelect.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.output:()Lscala/collection/Seq;]
CreateTableUsingAsSelect.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.productArity:()I]
CreateTableUsingAsSelect.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.productElement:(I)Ljava/lang/Object;]
CreateTableUsingAsSelect.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.productIterator:()Lscala/collection/Iterator;]
CreateTableUsingAsSelect.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.productPrefix:()Ljava/lang/String;]
CreateTableUsingAsSelect.provider ( ) : String
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.provider:()Ljava/lang/String;]
CreateTableUsingAsSelect.tableName ( ) : String
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.tableName:()Ljava/lang/String;]
CreateTableUsingAsSelect.temporary ( ) : boolean
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.temporary:()Z]
CreateTableUsingAsSelect.tupled ( ) [static] : scala.Function1<scala.Tuple6<String,String,Object,org.apache.spark.sql.SaveMode,scala.collection.immutable.Map<String,String>,org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>,CreateTableUsingAsSelect>
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, CreateTempTableUsing.class
package org.apache.spark.sql.sources
CreateTempTableUsing.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.canEqual:(Ljava/lang/Object;)Z]
CreateTempTableUsing.copy ( String tableName, scala.Option<org.apache.spark.sql.types.StructType> userSpecifiedSchema, String provider, scala.collection.immutable.Map<String,String> options ) : CreateTempTableUsing
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.copy:(Ljava/lang/String;Lscala/Option;Ljava/lang/String;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/sources/CreateTempTableUsing;]
CreateTempTableUsing.CreateTempTableUsing ( String tableName, scala.Option<org.apache.spark.sql.types.StructType> userSpecifiedSchema, String provider, scala.collection.immutable.Map<String,String> options )
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing."<init>":(Ljava/lang/String;Lscala/Option;Ljava/lang/String;Lscala/collection/immutable/Map;)V]
CreateTempTableUsing.curried ( ) [static] : scala.Function1<String,scala.Function1<scala.Option<org.apache.spark.sql.types.StructType>,scala.Function1<String,scala.Function1<scala.collection.immutable.Map<String,String>,CreateTempTableUsing>>>>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.curried:()Lscala/Function1;]
CreateTempTableUsing.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.equals:(Ljava/lang/Object;)Z]
CreateTempTableUsing.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.hashCode:()I]
CreateTempTableUsing.options ( ) : scala.collection.immutable.Map<String,String>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.options:()Lscala/collection/immutable/Map;]
CreateTempTableUsing.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.productArity:()I]
CreateTempTableUsing.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.productElement:(I)Ljava/lang/Object;]
CreateTempTableUsing.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.productIterator:()Lscala/collection/Iterator;]
CreateTempTableUsing.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.productPrefix:()Ljava/lang/String;]
CreateTempTableUsing.provider ( ) : String
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.provider:()Ljava/lang/String;]
CreateTempTableUsing.run ( org.apache.spark.sql.SQLContext sqlContext ) : scala.collection.Seq<scala.runtime.Nothing.>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.run:(Lorg/apache/spark/sql/SQLContext;)Lscala/collection/Seq;]
CreateTempTableUsing.tableName ( ) : String
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.tableName:()Ljava/lang/String;]
CreateTempTableUsing.tupled ( ) [static] : scala.Function1<scala.Tuple4<String,scala.Option<org.apache.spark.sql.types.StructType>,String,scala.collection.immutable.Map<String,String>>,CreateTempTableUsing>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.tupled:()Lscala/Function1;]
CreateTempTableUsing.userSpecifiedSchema ( ) : scala.Option<org.apache.spark.sql.types.StructType>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.userSpecifiedSchema:()Lscala/Option;]
spark-sql_2.10-1.3.0.jar, CreateTempTableUsingAsSelect.class
package org.apache.spark.sql.sources
CreateTempTableUsingAsSelect.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.canEqual:(Ljava/lang/Object;)Z]
CreateTempTableUsingAsSelect.copy ( String tableName, String provider, org.apache.spark.sql.SaveMode mode, scala.collection.immutable.Map<String,String> options, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan query ) : CreateTempTableUsingAsSelect
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.copy:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/sources/CreateTempTableUsingAsSelect;]
CreateTempTableUsingAsSelect.CreateTempTableUsingAsSelect ( String tableName, String provider, org.apache.spark.sql.SaveMode mode, scala.collection.immutable.Map<String,String> options, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan query )
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect."<init>":(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)V]
CreateTempTableUsingAsSelect.curried ( ) [static] : scala.Function1<String,scala.Function1<String,scala.Function1<org.apache.spark.sql.SaveMode,scala.Function1<scala.collection.immutable.Map<String,String>,scala.Function1<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,CreateTempTableUsingAsSelect>>>>>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.curried:()Lscala/Function1;]
CreateTempTableUsingAsSelect.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.equals:(Ljava/lang/Object;)Z]
CreateTempTableUsingAsSelect.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.hashCode:()I]
CreateTempTableUsingAsSelect.mode ( ) : org.apache.spark.sql.SaveMode
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.mode:()Lorg/apache/spark/sql/SaveMode;]
CreateTempTableUsingAsSelect.options ( ) : scala.collection.immutable.Map<String,String>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.options:()Lscala/collection/immutable/Map;]
CreateTempTableUsingAsSelect.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.productArity:()I]
CreateTempTableUsingAsSelect.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.productElement:(I)Ljava/lang/Object;]
CreateTempTableUsingAsSelect.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.productIterator:()Lscala/collection/Iterator;]
CreateTempTableUsingAsSelect.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.productPrefix:()Ljava/lang/String;]
CreateTempTableUsingAsSelect.provider ( ) : String
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.provider:()Ljava/lang/String;]
CreateTempTableUsingAsSelect.query ( ) : org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.query:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
CreateTempTableUsingAsSelect.run ( org.apache.spark.sql.SQLContext sqlContext ) : scala.collection.Seq<scala.runtime.Nothing.>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.run:(Lorg/apache/spark/sql/SQLContext;)Lscala/collection/Seq;]
CreateTempTableUsingAsSelect.tableName ( ) : String
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.tableName:()Ljava/lang/String;]
CreateTempTableUsingAsSelect.tupled ( ) [static] : scala.Function1<scala.Tuple5<String,String,org.apache.spark.sql.SaveMode,scala.collection.immutable.Map<String,String>,org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>,CreateTempTableUsingAsSelect>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, DataFrame.class
package org.apache.spark.sql
DataFrame.cache ( ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.cache:()Lorg/apache/spark/sql/RDDApi;]
DataFrame.collect ( ) : Object
[mangled: org/apache/spark/sql/DataFrame.collect:()Ljava/lang/Object;]
DataFrame.first ( ) : Object
[mangled: org/apache/spark/sql/DataFrame.first:()Ljava/lang/Object;]
DataFrame.persist ( ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.persist:()Lorg/apache/spark/sql/RDDApi;]
DataFrame.persist ( org.apache.spark.storage.StorageLevel newLevel ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.persist:(Lorg/apache/spark/storage/StorageLevel;)Lorg/apache/spark/sql/RDDApi;]
DataFrame.showString ( int numRows ) : String
[mangled: org/apache/spark/sql/DataFrame.showString:(I)Ljava/lang/String;]
DataFrame.take ( int n ) : Object
[mangled: org/apache/spark/sql/DataFrame.take:(I)Ljava/lang/Object;]
DataFrame.unpersist ( ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.unpersist:()Lorg/apache/spark/sql/RDDApi;]
DataFrame.unpersist ( boolean blocking ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.unpersist:(Z)Lorg/apache/spark/sql/RDDApi;]
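Note: these DataFrame entries are flagged because return types changed, not because the operations vanished: in 1.3.0 cache/persist/unpersist are declared against the internal RDDApi trait and collect/first/take erase to Object, whereas 1.5.0 declares them on DataFrame itself and types collect/take as Array[Row]. Identical source therefore links different descriptors; a sketch (df is an assumed DataFrame):

    // Same source against both versions; only the linked descriptors differ.
    df.cache()             // 1.3.0: cache:()Lorg/apache/spark/sql/RDDApi;
                           // 1.5.0: cache:()Lorg/apache/spark/sql/DataFrame;
    val rows = df.take(5)  // 1.3.0: take:(I)Ljava/lang/Object;
                           // 1.5.0: take:(I)[Lorg/apache/spark/sql/Row;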
spark-sql_2.10-1.3.0.jar, DDLParser.class
package org.apache.spark.sql.sources
DDLParser.apply ( String input, boolean exceptionOnError ) : scala.Option<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/sources/DDLParser.apply:(Ljava/lang/String;Z)Lscala/Option;]
DDLParser.DDLParser ( scala.Function1<String,org.apache.spark.sql.catalyst.plans.logical.LogicalPlan> parseQuery )
[mangled: org/apache/spark/sql/sources/DDLParser."<init>":(Lscala/Function1;)V]
spark-sql_2.10-1.3.0.jar, DescribeCommand.class
package org.apache.spark.sql.sources
DescribeCommand.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/DescribeCommand.canEqual:(Ljava/lang/Object;)Z]
DescribeCommand.copy ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan table, boolean isExtended ) : DescribeCommand
[mangled: org/apache/spark/sql/sources/DescribeCommand.copy:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;Z)Lorg/apache/spark/sql/sources/DescribeCommand;]
DescribeCommand.curried ( ) [static] : scala.Function1<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,scala.Function1<Object,DescribeCommand>>
[mangled: org/apache/spark/sql/sources/DescribeCommand.curried:()Lscala/Function1;]
DescribeCommand.DescribeCommand ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan table, boolean isExtended )
[mangled: org/apache/spark/sql/sources/DescribeCommand."<init>":(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;Z)V]
DescribeCommand.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/DescribeCommand.equals:(Ljava/lang/Object;)Z]
DescribeCommand.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/DescribeCommand.hashCode:()I]
DescribeCommand.isExtended ( ) : boolean
[mangled: org/apache/spark/sql/sources/DescribeCommand.isExtended:()Z]
DescribeCommand.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.AttributeReference>
[mangled: org/apache/spark/sql/sources/DescribeCommand.output:()Lscala/collection/Seq;]
DescribeCommand.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/DescribeCommand.productArity:()I]
DescribeCommand.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/DescribeCommand.productElement:(I)Ljava/lang/Object;]
DescribeCommand.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/DescribeCommand.productIterator:()Lscala/collection/Iterator;]
DescribeCommand.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/DescribeCommand.productPrefix:()Ljava/lang/String;]
DescribeCommand.table ( ) : org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/sources/DescribeCommand.table:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
DescribeCommand.tupled ( ) [static] : scala.Function1<scala.Tuple2<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,Object>,DescribeCommand>
[mangled: org/apache/spark/sql/sources/DescribeCommand.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, Distinct.class
package org.apache.spark.sql.execution
Distinct.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/Distinct.canEqual:(Ljava/lang/Object;)Z]
Distinct.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Distinct.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
Distinct.child ( ) : SparkPlan
[mangled: org/apache/spark/sql/execution/Distinct.child:()Lorg/apache/spark/sql/execution/SparkPlan;]
Distinct.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Distinct.children:()Lscala/collection/immutable/List;]
Distinct.children ( ) : scala.collection.Seq
[mangled: org/apache/spark/sql/execution/Distinct.children:()Lscala/collection/Seq;]
Distinct.copy ( boolean partial, SparkPlan child ) : Distinct
[mangled: org/apache/spark/sql/execution/Distinct.copy:(ZLorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/Distinct;]
Distinct.curried ( ) [static] : scala.Function1<Object,scala.Function1<SparkPlan,Distinct>>
[mangled: org/apache/spark/sql/execution/Distinct.curried:()Lscala/Function1;]
Distinct.Distinct ( boolean partial, SparkPlan child )
[mangled: org/apache/spark/sql/execution/Distinct."<init>":(ZLorg/apache/spark/sql/execution/SparkPlan;)V]
Distinct.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/Distinct.equals:(Ljava/lang/Object;)Z]
Distinct.execute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Distinct.execute:()Lorg/apache/spark/rdd/RDD;]
Distinct.hashCode ( ) : int
[mangled: org/apache/spark/sql/execution/Distinct.hashCode:()I]
Distinct.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/execution/Distinct.output:()Lscala/collection/Seq;]
Distinct.outputPartitioning ( ) : org.apache.spark.sql.catalyst.plans.physical.Partitioning
[mangled: org/apache/spark/sql/execution/Distinct.outputPartitioning:()Lorg/apache/spark/sql/catalyst/plans/physical/Partitioning;]
Distinct.partial ( ) : boolean
[mangled: org/apache/spark/sql/execution/Distinct.partial:()Z]
Distinct.productArity ( ) : int
[mangled: org/apache/spark/sql/execution/Distinct.productArity:()I]
Distinct.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/execution/Distinct.productElement:(I)Ljava/lang/Object;]
Distinct.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/execution/Distinct.productIterator:()Lscala/collection/Iterator;]
Distinct.productPrefix ( ) : String
[mangled: org/apache/spark/sql/execution/Distinct.productPrefix:()Ljava/lang/String;]
Distinct.requiredChildDistribution ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.physical.Distribution>
[mangled: org/apache/spark/sql/execution/Distinct.requiredChildDistribution:()Lscala/collection/Seq;]
Distinct.tupled ( ) [static] : scala.Function1<scala.Tuple2<Object,SparkPlan>,Distinct>
[mangled: org/apache/spark/sql/execution/Distinct.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, DriverQuirks.class
package org.apache.spark.sql.jdbc
DriverQuirks.DriverQuirks ( )
[mangled: org/apache/spark/sql/jdbc/DriverQuirks."<init>":()V]
DriverQuirks.get ( String p1 ) [static] : DriverQuirks
[mangled: org/apache/spark/sql/jdbc/DriverQuirks.get:(Ljava/lang/String;)Lorg/apache/spark/sql/jdbc/DriverQuirks;]
DriverQuirks.getCatalystType ( int p1, String p2, int p3, org.apache.spark.sql.types.MetadataBuilder p4 ) [abstract] : org.apache.spark.sql.types.DataType
[mangled: org/apache/spark/sql/jdbc/DriverQuirks.getCatalystType:(ILjava/lang/String;ILorg/apache/spark/sql/types/MetadataBuilder;)Lorg/apache/spark/sql/types/DataType;]
DriverQuirks.getJDBCType ( org.apache.spark.sql.types.DataType p1 ) [abstract] : scala.Tuple2<String,scala.Option<Object>>
[mangled: org/apache/spark/sql/jdbc/DriverQuirks.getJDBCType:(Lorg/apache/spark/sql/types/DataType;)Lscala/Tuple2;]
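Note: DriverQuirks is the 1.3.0 hook for per-driver JDBC type mapping; it was superseded by the public JdbcDialects API in Spark 1.4, which is why the whole class is gone by 1.5.0. A minimal subclass sketch; returning null / (null, None) to defer to the default mappings mirrors the built-in NoQuirks behaviour, which is an assumption about 1.3.0 internals rather than documented API:

    import org.apache.spark.sql.jdbc.DriverQuirks
    import org.apache.spark.sql.types.{DataType, MetadataBuilder}

    class PassThroughQuirks extends DriverQuirks {
      // null defers to the built-in JDBC-to-Catalyst mapping.
      def getCatalystType(sqlType: Int, typeName: String, size: Int,
                          md: MetadataBuilder): DataType = null
      // (null, None) defers to the built-in Catalyst-to-JDBC mapping.
      def getJDBCType(dt: DataType): (String, Option[Int]) = (null, None)
    }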
spark-sql_2.10-1.3.0.jar, Encoder<T>.class
package org.apache.spark.sql.columnar.compression
Encoder<T>.gatherCompressibilityStats ( org.apache.spark.sql.Row p1, int p2 ) [abstract] : void
[mangled: org/apache/spark/sql/columnar/compression/Encoder<T>.gatherCompressibilityStats:(Lorg/apache/spark/sql/Row;I)V]
spark-sql_2.10-1.3.0.jar, EvaluatePython.class
package org.apache.spark.sql.execution
EvaluatePython.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/EvaluatePython.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
EvaluatePython.rowToArray ( org.apache.spark.sql.Row p1, scala.collection.Seq<org.apache.spark.sql.types.DataType> p2 ) [static] : Object[ ]
[mangled: org/apache/spark/sql/execution/EvaluatePython.rowToArray:(Lorg/apache/spark/sql/Row;Lscala/collection/Seq;)[Ljava/lang/Object;]
spark-sql_2.10-1.3.0.jar, Except.class
package org.apache.spark.sql.execution
Except.left ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Except.left:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
Except.right ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Except.right:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
spark-sql_2.10-1.3.0.jar, Exchange.class
package org.apache.spark.sql.execution
Exchange.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Exchange.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
Exchange.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Exchange.children:()Lscala/collection/immutable/List;]
Exchange.Exchange..bypassMergeThreshold ( ) : int
[mangled: org/apache/spark/sql/execution/Exchange.org.apache.spark.sql.execution.Exchange..bypassMergeThreshold:()I]
Exchange.sortBasedShuffleOn ( ) : boolean
[mangled: org/apache/spark/sql/execution/Exchange.sortBasedShuffleOn:()Z]
spark-sql_2.10-1.3.0.jar, ExecutedCommand.class
package org.apache.spark.sql.execution
ExecutedCommand.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/execution/ExecutedCommand.children:()Lscala/collection/immutable/Nil$;]
spark-sql_2.10-1.3.0.jar, Expand.class
package org.apache.spark.sql.execution
Expand.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Expand.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
Expand.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Expand.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, ExternalSort.class
package org.apache.spark.sql.execution
ExternalSort.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/ExternalSort.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
ExternalSort.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/ExternalSort.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, Filter.class
package org.apache.spark.sql.execution
Filter.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Filter.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
Filter.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Filter.children:()Lscala/collection/immutable/List;]
Filter.conditionEvaluator ( ) : scala.Function1<org.apache.spark.sql.Row,Object>
[mangled: org/apache/spark/sql/execution/Filter.conditionEvaluator:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, Generate.class
package org.apache.spark.sql.execution
Generate.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Generate.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
Generate.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Generate.children:()Lscala/collection/immutable/List;]
Generate.copy ( org.apache.spark.sql.catalyst.expressions.Generator generator, boolean join, boolean outer, SparkPlan child ) : Generate
[mangled: org/apache/spark/sql/execution/Generate.copy:(Lorg/apache/spark/sql/catalyst/expressions/Generator;ZZLorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/Generate;]
Generate.Generate ( org.apache.spark.sql.catalyst.expressions.Generator generator, boolean join, boolean outer, SparkPlan child )
[mangled: org/apache/spark/sql/execution/Generate."<init>":(Lorg/apache/spark/sql/catalyst/expressions/Generator;ZZLorg/apache/spark/sql/execution/SparkPlan;)V]
Generate.generatorOutput ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/execution/Generate.generatorOutput:()Lscala/collection/Seq;]
spark-sql_2.10-1.3.0.jar, GeneratedAggregate.class
package org.apache.spark.sql.execution
GeneratedAggregate.aggregateExpressions ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.aggregateExpressions:()Lscala/collection/Seq;]
GeneratedAggregate.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.canEqual:(Ljava/lang/Object;)Z]
GeneratedAggregate.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
GeneratedAggregate.child ( ) : SparkPlan
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.child:()Lorg/apache/spark/sql/execution/SparkPlan;]
GeneratedAggregate.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.children:()Lscala/collection/immutable/List;]
GeneratedAggregate.children ( ) : scala.collection.Seq
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.children:()Lscala/collection/Seq;]
GeneratedAggregate.copy ( boolean partial, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> groupingExpressions, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression> aggregateExpressions, SparkPlan child ) : GeneratedAggregate
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.copy:(ZLscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/GeneratedAggregate;]
GeneratedAggregate.curried ( ) [static] : scala.Function1<Object,scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>,scala.Function1<SparkPlan,GeneratedAggregate>>>>
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.curried:()Lscala/Function1;]
GeneratedAggregate.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.equals:(Ljava/lang/Object;)Z]
GeneratedAggregate.execute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.execute:()Lorg/apache/spark/rdd/RDD;]
GeneratedAggregate.GeneratedAggregate ( boolean partial, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> groupingExpressions, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression> aggregateExpressions, SparkPlan child )
[mangled: org/apache/spark/sql/execution/GeneratedAggregate."<init>":(ZLscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;)V]
GeneratedAggregate.groupingExpressions ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.groupingExpressions:()Lscala/collection/Seq;]
GeneratedAggregate.hashCode ( ) : int
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.hashCode:()I]
GeneratedAggregate.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.output:()Lscala/collection/Seq;]
GeneratedAggregate.outputPartitioning ( ) : org.apache.spark.sql.catalyst.plans.physical.Partitioning
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.outputPartitioning:()Lorg/apache/spark/sql/catalyst/plans/physical/Partitioning;]
GeneratedAggregate.partial ( ) : boolean
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.partial:()Z]
GeneratedAggregate.productArity ( ) : int
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.productArity:()I]
GeneratedAggregate.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.productElement:(I)Ljava/lang/Object;]
GeneratedAggregate.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.productIterator:()Lscala/collection/Iterator;]
GeneratedAggregate.productPrefix ( ) : String
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.productPrefix:()Ljava/lang/String;]
GeneratedAggregate.requiredChildDistribution ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.physical.Distribution>
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.requiredChildDistribution:()Lscala/collection/Seq;]
GeneratedAggregate.tupled ( ) [static] : scala.Function1<scala.Tuple4<Object,scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>,SparkPlan>,GeneratedAggregate>
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, GenericColumnAccessor.class
package org.apache.spark.sql.columnar
GenericColumnAccessor.GenericColumnAccessor ( java.nio.ByteBuffer buffer )
[mangled: org/apache/spark/sql/columnar/GenericColumnAccessor."<init>":(Ljava/nio/ByteBuffer;)V]
spark-sql_2.10-1.3.0.jar, GenericColumnBuilder.class
package org.apache.spark.sql.columnar
GenericColumnBuilder.GenericColumnBuilder ( )
[mangled: org/apache/spark/sql/columnar/GenericColumnBuilder."<init>":()V]
spark-sql_2.10-1.3.0.jar, GenericColumnStats.class
package org.apache.spark.sql.columnar
GenericColumnStats.GenericColumnStats ( )
[mangled: org/apache/spark/sql/columnar/GenericColumnStats."<init>":()V]
spark-sql_2.10-1.3.0.jar, GroupedData.class
package org.apache.spark.sql
GroupedData.GroupedData ( DataFrame df, scala.collection.Seq<catalyst.expressions.Expression> groupingExprs )
[mangled: org/apache/spark/sql/GroupedData."<init>":(Lorg/apache/spark/sql/DataFrame;Lscala/collection/Seq;)V]
spark-sql_2.10-1.3.0.jar, HashedRelation.class
package org.apache.spark.sql.execution.joins
HashedRelation.get ( org.apache.spark.sql.Row p1 ) [abstract] : org.apache.spark.util.collection.CompactBuffer<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/HashedRelation.get:(Lorg/apache/spark/sql/Row;)Lorg/apache/spark/util/collection/CompactBuffer;]
spark-sql_2.10-1.3.0.jar, HashJoin.class
package org.apache.spark.sql.execution.joins
HashJoin.hashJoin ( scala.collection.Iterator<org.apache.spark.sql.Row> p1, HashedRelation p2 ) [abstract] : scala.collection.Iterator<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/HashJoin.hashJoin:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/joins/HashedRelation;)Lscala/collection/Iterator;]
HashJoin.streamSideKeyGenerator ( ) [abstract] : scala.Function0<org.apache.spark.sql.catalyst.expressions.package.MutableProjection>
[mangled: org/apache/spark/sql/execution/joins/HashJoin.streamSideKeyGenerator:()Lscala/Function0;]
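Note: these two members describe the probe side of the 1.3.0 hash join: streamSideKeyGenerator supplies a key projection, and hashJoin walks the stream iterator looking each key up in the prebuilt HashedRelation. A conceptual sketch with a plain Map standing in for HashedRelation (which is internal to Spark):

    import org.apache.spark.sql.Row

    // For each stream row, look up the matching build rows by join key and
    // emit one joined row per match; key and join are assumed callbacks.
    def probe(streamIter: Iterator[Row],
              buildTable: collection.Map[Row, Seq[Row]],
              key: Row => Row,
              join: (Row, Row) => Row): Iterator[Row] =
      streamIter.flatMap { s =>
        buildTable.getOrElse(key(s), Nil).iterator.map(b => join(s, b))
      }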
spark-sql_2.10-1.3.0.jar, HashOuterJoin.class
package org.apache.spark.sql.execution.joins
HashOuterJoin.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.canEqual:(Ljava/lang/Object;)Z]
HashOuterJoin.children ( ) : scala.collection.Seq<org.apache.spark.sql.execution.SparkPlan>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.children:()Lscala/collection/Seq;]
HashOuterJoin.copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> leftKeys, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> rightKeys, org.apache.spark.sql.catalyst.plans.JoinType joinType, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> condition, org.apache.spark.sql.execution.SparkPlan left, org.apache.spark.sql.execution.SparkPlan right ) : HashOuterJoin
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.copy:(Lscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/catalyst/plans/JoinType;Lscala/Option;Lorg/apache/spark/sql/execution/SparkPlan;Lorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/joins/HashOuterJoin;]
HashOuterJoin.curried ( ) [static] : scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.Function1<org.apache.spark.sql.catalyst.plans.JoinType,scala.Function1<scala.Option<org.apache.spark.sql.catalyst.expressions.Expression>,scala.Function1<org.apache.spark.sql.execution.SparkPlan,scala.Function1<org.apache.spark.sql.execution.SparkPlan,HashOuterJoin>>>>>>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.curried:()Lscala/Function1;]
HashOuterJoin.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.equals:(Ljava/lang/Object;)Z]
HashOuterJoin.execute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.execute:()Lorg/apache/spark/rdd/RDD;]
HashOuterJoin.hashCode ( ) : int
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.hashCode:()I]
HashOuterJoin.HashOuterJoin ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> leftKeys, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> rightKeys, org.apache.spark.sql.catalyst.plans.JoinType joinType, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> condition, org.apache.spark.sql.execution.SparkPlan left, org.apache.spark.sql.execution.SparkPlan right )
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin."<init>":(Lscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/catalyst/plans/JoinType;Lscala/Option;Lorg/apache/spark/sql/execution/SparkPlan;Lorg/apache/spark/sql/execution/SparkPlan;)V]
HashOuterJoin.left ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.left:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
HashOuterJoin.HashOuterJoin..buildHashTable ( scala.collection.Iterator<org.apache.spark.sql.Row> iter, org.apache.spark.sql.catalyst.expressions.package.Projection keyGenerator ) : java.util.HashMap<org.apache.spark.sql.Row,org.apache.spark.util.collection.CompactBuffer<org.apache.spark.sql.Row>>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.org.apache.spark.sql.execution.joins.HashOuterJoin..buildHashTable:(Lscala/collection/Iterator;Lorg/apache/spark/sql/catalyst/expressions/package$Projection;)Ljava/util/HashMap;]
HashOuterJoin.HashOuterJoin..DUMMY_LIST ( ) : scala.collection.Seq<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.org.apache.spark.sql.execution.joins.HashOuterJoin..DUMMY_LIST:()Lscala/collection/Seq;]
HashOuterJoin.HashOuterJoin..EMPTY_LIST ( ) : scala.collection.Seq<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.org.apache.spark.sql.execution.joins.HashOuterJoin..EMPTY_LIST:()Lscala/collection/Seq;]
HashOuterJoin.HashOuterJoin..fullOuterIterator ( org.apache.spark.sql.Row key, scala.collection.Iterable<org.apache.spark.sql.Row> leftIter, scala.collection.Iterable<org.apache.spark.sql.Row> rightIter, org.apache.spark.sql.catalyst.expressions.JoinedRow joinedRow ) : scala.collection.Iterator<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.org.apache.spark.sql.execution.joins.HashOuterJoin..fullOuterIterator:(Lorg/apache/spark/sql/Row;Lscala/collection/Iterable;Lscala/collection/Iterable;Lorg/apache/spark/sql/catalyst/expressions/JoinedRow;)Lscala/collection/Iterator;]
HashOuterJoin.HashOuterJoin..leftNullRow ( ) : org.apache.spark.sql.catalyst.expressions.GenericRow
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.org.apache.spark.sql.execution.joins.HashOuterJoin..leftNullRow:()Lorg/apache/spark/sql/catalyst/expressions/GenericRow;]
HashOuterJoin.HashOuterJoin..leftOuterIterator ( org.apache.spark.sql.Row key, org.apache.spark.sql.catalyst.expressions.JoinedRow joinedRow, scala.collection.Iterable<org.apache.spark.sql.Row> rightIter ) : scala.collection.Iterator<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.org.apache.spark.sql.execution.joins.HashOuterJoin..leftOuterIterator:(Lorg/apache/spark/sql/Row;Lorg/apache/spark/sql/catalyst/expressions/JoinedRow;Lscala/collection/Iterable;)Lscala/collection/Iterator;]
HashOuterJoin.HashOuterJoin..rightNullRow ( ) : org.apache.spark.sql.catalyst.expressions.GenericRow
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.org.apache.spark.sql.execution.joins.HashOuterJoin..rightNullRow:()Lorg/apache/spark/sql/catalyst/expressions/GenericRow;]
HashOuterJoin.HashOuterJoin..rightOuterIterator ( org.apache.spark.sql.Row key, scala.collection.Iterable<org.apache.spark.sql.Row> leftIter, org.apache.spark.sql.catalyst.expressions.JoinedRow joinedRow ) : scala.collection.Iterator<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.org.apache.spark.sql.execution.joins.HashOuterJoin..rightOuterIterator:(Lorg/apache/spark/sql/Row;Lscala/collection/Iterable;Lorg/apache/spark/sql/catalyst/expressions/JoinedRow;)Lscala/collection/Iterator;]
HashOuterJoin.outputPartitioning ( ) : org.apache.spark.sql.catalyst.plans.physical.Partitioning
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.outputPartitioning:()Lorg/apache/spark/sql/catalyst/plans/physical/Partitioning;]
HashOuterJoin.productArity ( ) : int
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.productArity:()I]
HashOuterJoin.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.productElement:(I)Ljava/lang/Object;]
HashOuterJoin.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.productIterator:()Lscala/collection/Iterator;]
HashOuterJoin.productPrefix ( ) : String
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.productPrefix:()Ljava/lang/String;]
HashOuterJoin.requiredChildDistribution ( ) : scala.collection.immutable.List<org.apache.spark.sql.catalyst.plans.physical.ClusteredDistribution>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.requiredChildDistribution:()Lscala/collection/immutable/List;]
HashOuterJoin.requiredChildDistribution ( ) : scala.collection.Seq
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.requiredChildDistribution:()Lscala/collection/Seq;]
HashOuterJoin.right ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.right:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
HashOuterJoin.tupled ( ) [static] : scala.Function1<scala.Tuple6<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,org.apache.spark.sql.catalyst.plans.JoinType,scala.Option<org.apache.spark.sql.catalyst.expressions.Expression>,org.apache.spark.sql.execution.SparkPlan,org.apache.spark.sql.execution.SparkPlan>,HashOuterJoin>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.tupled:()Lscala/Function1;]
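Note: the private helpers above spell the algorithm out: buildHashTable materializes one side into a java.util.HashMap, and the left/right/fullOuterIterator variants pad unmatched keys with the prebuilt leftNullRow/rightNullRow. A conceptual left-outer sketch in the same spirit (names are illustrative):

    import org.apache.spark.sql.Row

    // Every left row is emitted; rows without a match are padded with nulls.
    def leftOuter(leftIter: Iterator[Row],
                  rightTable: collection.Map[Row, Seq[Row]],
                  key: Row => Row,
                  join: (Row, Row) => Row,
                  rightNullRow: Row): Iterator[Row] =
      leftIter.flatMap { l =>
        rightTable.get(key(l)) match {
          case Some(rs) if rs.nonEmpty => rs.iterator.map(r => join(l, r))
          case _                       => Iterator(join(l, rightNullRow))
        }
      }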
spark-sql_2.10-1.3.0.jar, InMemoryColumnarTableScan.class
package org.apache.spark.sql.columnar
InMemoryColumnarTableScan.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/columnar/InMemoryColumnarTableScan.children:()Lscala/collection/immutable/Nil$;]
spark-sql_2.10-1.3.0.jar, InMemoryRelation.class
package org.apache.spark.sql.columnar
InMemoryRelation.copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, boolean useCompression, int batchSize, org.apache.spark.storage.StorageLevel storageLevel, org.apache.spark.sql.execution.SparkPlan child, scala.Option<String> tableName, org.apache.spark.rdd.RDD<CachedBatch> _cachedColumnBuffers, org.apache.spark.sql.catalyst.plans.logical.Statistics _statistics ) : InMemoryRelation
[mangled: org/apache/spark/sql/columnar/InMemoryRelation.copy:(Lscala/collection/Seq;ZILorg/apache/spark/storage/StorageLevel;Lorg/apache/spark/sql/execution/SparkPlan;Lscala/Option;Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/catalyst/plans/logical/Statistics;)Lorg/apache/spark/sql/columnar/InMemoryRelation;]
InMemoryRelation.InMemoryRelation ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, boolean useCompression, int batchSize, org.apache.spark.storage.StorageLevel storageLevel, org.apache.spark.sql.execution.SparkPlan child, scala.Option<String> tableName, org.apache.spark.rdd.RDD<CachedBatch> _cachedColumnBuffers, org.apache.spark.sql.catalyst.plans.logical.Statistics _statistics )
[mangled: org/apache/spark/sql/columnar/InMemoryRelation."<init>":(Lscala/collection/Seq;ZILorg/apache/spark/storage/StorageLevel;Lorg/apache/spark/sql/execution/SparkPlan;Lscala/Option;Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/catalyst/plans/logical/Statistics;)V]
InMemoryRelation.newInstance ( ) : org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation
[mangled: org/apache/spark/sql/columnar/InMemoryRelation.newInstance:()Lorg/apache/spark/sql/catalyst/analysis/MultiInstanceRelation;]
spark-sql_2.10-1.3.0.jar, InsertIntoDataSource.class
package org.apache.spark.sql.sources
InsertIntoDataSource.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.canEqual:(Ljava/lang/Object;)Z]
InsertIntoDataSource.copy ( LogicalRelation logicalRelation, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan query, boolean overwrite ) : InsertIntoDataSource
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.copy:(Lorg/apache/spark/sql/sources/LogicalRelation;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;Z)Lorg/apache/spark/sql/sources/InsertIntoDataSource;]
InsertIntoDataSource.curried ( ) [static] : scala.Function1<LogicalRelation,scala.Function1<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,scala.Function1<Object,InsertIntoDataSource>>>
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.curried:()Lscala/Function1;]
InsertIntoDataSource.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.equals:(Ljava/lang/Object;)Z]
InsertIntoDataSource.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.hashCode:()I]
InsertIntoDataSource.InsertIntoDataSource ( LogicalRelation logicalRelation, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan query, boolean overwrite )
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource."<init>":(Lorg/apache/spark/sql/sources/LogicalRelation;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;Z)V]
InsertIntoDataSource.logicalRelation ( ) : LogicalRelation
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.logicalRelation:()Lorg/apache/spark/sql/sources/LogicalRelation;]
InsertIntoDataSource.overwrite ( ) : boolean
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.overwrite:()Z]
InsertIntoDataSource.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.productArity:()I]
InsertIntoDataSource.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.productElement:(I)Ljava/lang/Object;]
InsertIntoDataSource.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.productIterator:()Lscala/collection/Iterator;]
InsertIntoDataSource.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.productPrefix:()Ljava/lang/String;]
InsertIntoDataSource.query ( ) : org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.query:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
InsertIntoDataSource.run ( org.apache.spark.sql.SQLContext sqlContext ) : scala.collection.Seq<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.run:(Lorg/apache/spark/sql/SQLContext;)Lscala/collection/Seq;]
InsertIntoDataSource.tupled ( ) [static] : scala.Function1<scala.Tuple3<LogicalRelation,org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,Object>,InsertIntoDataSource>
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, InsertIntoParquetTable.class
package org.apache.spark.sql.parquet
InsertIntoParquetTable.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.canEqual:(Ljava/lang/Object;)Z]
InsertIntoParquetTable.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
InsertIntoParquetTable.child ( ) : org.apache.spark.sql.execution.SparkPlan
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.child:()Lorg/apache/spark/sql/execution/SparkPlan;]
InsertIntoParquetTable.children ( ) : scala.collection.immutable.List<org.apache.spark.sql.execution.SparkPlan>
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.children:()Lscala/collection/immutable/List;]
InsertIntoParquetTable.children ( ) : scala.collection.Seq
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.children:()Lscala/collection/Seq;]
InsertIntoParquetTable.copy ( ParquetRelation relation, org.apache.spark.sql.execution.SparkPlan child, boolean overwrite ) : InsertIntoParquetTable
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.copy:(Lorg/apache/spark/sql/parquet/ParquetRelation;Lorg/apache/spark/sql/execution/SparkPlan;Z)Lorg/apache/spark/sql/parquet/InsertIntoParquetTable;]
InsertIntoParquetTable.curried ( ) [static] : scala.Function1<ParquetRelation,scala.Function1<org.apache.spark.sql.execution.SparkPlan,scala.Function1<Object,InsertIntoParquetTable>>>
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.curried:()Lscala/Function1;]
InsertIntoParquetTable.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.equals:(Ljava/lang/Object;)Z]
InsertIntoParquetTable.execute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.execute:()Lorg/apache/spark/rdd/RDD;]
InsertIntoParquetTable.hashCode ( ) : int
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.hashCode:()I]
InsertIntoParquetTable.InsertIntoParquetTable ( ParquetRelation relation, org.apache.spark.sql.execution.SparkPlan child, boolean overwrite )
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable."<init>":(Lorg/apache/spark/sql/parquet/ParquetRelation;Lorg/apache/spark/sql/execution/SparkPlan;Z)V]
InsertIntoParquetTable.newJobContext ( org.apache.hadoop.conf.Configuration conf, org.apache.hadoop.mapreduce.JobID jobId ) : org.apache.hadoop.mapreduce.JobContext
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.newJobContext:(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/mapreduce/JobID;)Lorg/apache/hadoop/mapreduce/JobContext;]
InsertIntoParquetTable.newTaskAttemptContext ( org.apache.hadoop.conf.Configuration conf, org.apache.hadoop.mapreduce.TaskAttemptID attemptId ) : org.apache.hadoop.mapreduce.TaskAttemptContext
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.newTaskAttemptContext:(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/mapreduce/TaskAttemptID;)Lorg/apache/hadoop/mapreduce/TaskAttemptContext;]
InsertIntoParquetTable.newTaskAttemptID ( String jtIdentifier, int jobId, boolean isMap, int taskId, int attemptId ) : org.apache.hadoop.mapreduce.TaskAttemptID
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.newTaskAttemptID:(Ljava/lang/String;IZII)Lorg/apache/hadoop/mapreduce/TaskAttemptID;]
InsertIntoParquetTable.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.output:()Lscala/collection/Seq;]
InsertIntoParquetTable.outputPartitioning ( ) : org.apache.spark.sql.catalyst.plans.physical.Partitioning
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.outputPartitioning:()Lorg/apache/spark/sql/catalyst/plans/physical/Partitioning;]
InsertIntoParquetTable.overwrite ( ) : boolean
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.overwrite:()Z]
InsertIntoParquetTable.productArity ( ) : int
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.productArity:()I]
InsertIntoParquetTable.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.productElement:(I)Ljava/lang/Object;]
InsertIntoParquetTable.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.productIterator:()Lscala/collection/Iterator;]
InsertIntoParquetTable.productPrefix ( ) : String
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.productPrefix:()Ljava/lang/String;]
InsertIntoParquetTable.relation ( ) : ParquetRelation
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.relation:()Lorg/apache/spark/sql/parquet/ParquetRelation;]
InsertIntoParquetTable.tupled ( ) [static] : scala.Function1<scala.Tuple3<ParquetRelation,org.apache.spark.sql.execution.SparkPlan,Object>,InsertIntoParquetTable>
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, IntColumnStats.class
package org.apache.spark.sql.columnar
IntColumnStats.collectedStatistics ( ) : org.apache.spark.sql.Row
[mangled: org/apache/spark/sql/columnar/IntColumnStats.collectedStatistics:()Lorg/apache/spark/sql/Row;]
IntColumnStats.gatherStats ( org.apache.spark.sql.Row row, int ordinal ) : void
[mangled: org/apache/spark/sql/columnar/IntColumnStats.gatherStats:(Lorg/apache/spark/sql/Row;I)V]
spark-sql_2.10-1.3.0.jar, Intersect.class
package org.apache.spark.sql.execution
Intersect.left ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Intersect.left:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
Intersect.right ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Intersect.right:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
spark-sql_2.10-1.3.0.jar, JDBCPartition.class
package org.apache.spark.sql.jdbc
JDBCPartition.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.canEqual:(Ljava/lang/Object;)Z]
JDBCPartition.copy ( String whereClause, int idx ) : JDBCPartition
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.copy:(Ljava/lang/String;I)Lorg/apache/spark/sql/jdbc/JDBCPartition;]
JDBCPartition.curried ( ) [static] : scala.Function1<String,scala.Function1<Object,JDBCPartition>>
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.curried:()Lscala/Function1;]
JDBCPartition.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.equals:(Ljava/lang/Object;)Z]
JDBCPartition.hashCode ( ) : int
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.hashCode:()I]
JDBCPartition.idx ( ) : int
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.idx:()I]
JDBCPartition.index ( ) : int
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.index:()I]
JDBCPartition.JDBCPartition ( String whereClause, int idx )
[mangled: org/apache/spark/sql/jdbc/JDBCPartition."<init>":(Ljava/lang/String;I)V]
JDBCPartition.productArity ( ) : int
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.productArity:()I]
JDBCPartition.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.productElement:(I)Ljava/lang/Object;]
JDBCPartition.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.productIterator:()Lscala/collection/Iterator;]
JDBCPartition.productPrefix ( ) : String
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.productPrefix:()Ljava/lang/String;]
JDBCPartition.toString ( ) : String
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.toString:()Ljava/lang/String;]
JDBCPartition.tupled ( ) [static] : scala.Function1<scala.Tuple2<String,Object>,JDBCPartition>
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.tupled:()Lscala/Function1;]
JDBCPartition.whereClause ( ) : String
[mangled: org/apache/spark/sql/jdbc/JDBCPartition.whereClause:()Ljava/lang/String;]
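JDBCPartition is a plain case class pairing a WHERE-clause fragment with a partition index; a minimal sketch of the 1.3.0 shape (the predicate is hypothetical; the package declaration works around the source-level private[sql] modifier):

package org.apache.spark.sql.jdbc

object JDBCPartitionSketch {
  def main(args: Array[String]): Unit = {
    val part = JDBCPartition("id >= 0 AND id < 1000", 0)
    println(part.whereClause)       // id >= 0 AND id < 1000
    // index() satisfies org.apache.spark.Partition and mirrors idx().
    println(part.index == part.idx) // true
    // The generated case-class members listed above behave as usual:
    println(part.copy(idx = 1))
  }
}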
spark-sql_2.10-1.3.0.jar, JDBCPartitioningInfo.class
package org.apache.spark.sql.jdbc
JDBCPartitioningInfo.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.canEqual:(Ljava/lang/Object;)Z]
JDBCPartitioningInfo.column ( ) : String
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.column:()Ljava/lang/String;]
JDBCPartitioningInfo.copy ( String column, long lowerBound, long upperBound, int numPartitions ) : JDBCPartitioningInfo
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.copy:(Ljava/lang/String;JJI)Lorg/apache/spark/sql/jdbc/JDBCPartitioningInfo;]
JDBCPartitioningInfo.curried ( ) [static] : scala.Function1<String,scala.Function1<Object,scala.Function1<Object,scala.Function1<Object,JDBCPartitioningInfo>>>>
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.curried:()Lscala/Function1;]
JDBCPartitioningInfo.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.equals:(Ljava/lang/Object;)Z]
JDBCPartitioningInfo.hashCode ( ) : int
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.hashCode:()I]
JDBCPartitioningInfo.JDBCPartitioningInfo ( String column, long lowerBound, long upperBound, int numPartitions )
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo."<init>":(Ljava/lang/String;JJI)V]
JDBCPartitioningInfo.lowerBound ( ) : long
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.lowerBound:()J]
JDBCPartitioningInfo.numPartitions ( ) : int
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.numPartitions:()I]
JDBCPartitioningInfo.productArity ( ) : int
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.productArity:()I]
JDBCPartitioningInfo.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.productElement:(I)Ljava/lang/Object;]
JDBCPartitioningInfo.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.productIterator:()Lscala/collection/Iterator;]
JDBCPartitioningInfo.productPrefix ( ) : String
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.productPrefix:()Ljava/lang/String;]
JDBCPartitioningInfo.toString ( ) : String
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.toString:()Ljava/lang/String;]
JDBCPartitioningInfo.tupled ( ) [static] : scala.Function1<scala.Tuple4<String,Object,Object,Object>,JDBCPartitioningInfo>
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.tupled:()Lscala/Function1;]
JDBCPartitioningInfo.upperBound ( ) : long
[mangled: org/apache/spark/sql/jdbc/JDBCPartitioningInfo.upperBound:()J]
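JDBCPartitioningInfo bundles the column and numeric range that JDBCRelation.columnPartition slices into per-partition WHERE clauses; a sketch under the same private[sql] caveat (column name and bounds are hypothetical):

package org.apache.spark.sql.jdbc

object JDBCPartitioningInfoSketch {
  def main(args: Array[String]): Unit = {
    // Split the values of "id" between 0 and 10000 across 4 partitions.
    val info = JDBCPartitioningInfo("id", 0L, 10000L, 4)
    // columnPartition (listed under JDBCRelation below) turns this into
    // JDBCPartition instances, each carrying a range predicate on "id".
    JDBCRelation.columnPartition(info).foreach(println)
  }
}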
spark-sql_2.10-1.3.0.jar, JDBCRDD.class
package org.apache.spark.sql.jdbc
JDBCRDD.BinaryConversion ( ) : JDBCRDD.BinaryConversion.
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.BinaryConversion:()Lorg/apache/spark/sql/jdbc/JDBCRDD$BinaryConversion$;]
JDBCRDD.BinaryLongConversion ( ) : JDBCRDD.BinaryLongConversion.
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.BinaryLongConversion:()Lorg/apache/spark/sql/jdbc/JDBCRDD$BinaryLongConversion$;]
JDBCRDD.BooleanConversion ( ) : JDBCRDD.BooleanConversion.
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.BooleanConversion:()Lorg/apache/spark/sql/jdbc/JDBCRDD$BooleanConversion$;]
JDBCRDD.compute ( org.apache.spark.Partition thePart, org.apache.spark.TaskContext context ) : scala.collection.Iterator<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.compute:(Lorg/apache/spark/Partition;Lorg/apache/spark/TaskContext;)Lscala/collection/Iterator;]
JDBCRDD.DateConversion ( ) : JDBCRDD.DateConversion.
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.DateConversion:()Lorg/apache/spark/sql/jdbc/JDBCRDD$DateConversion$;]
JDBCRDD.DecimalConversion ( ) : JDBCRDD.DecimalConversion.
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.DecimalConversion:()Lorg/apache/spark/sql/jdbc/JDBCRDD$DecimalConversion$;]
JDBCRDD.DoubleConversion ( ) : JDBCRDD.DoubleConversion.
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.DoubleConversion:()Lorg/apache/spark/sql/jdbc/JDBCRDD$DoubleConversion$;]
JDBCRDD.FloatConversion ( ) : JDBCRDD.FloatConversion.
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.FloatConversion:()Lorg/apache/spark/sql/jdbc/JDBCRDD$FloatConversion$;]
JDBCRDD.getConnector ( String p1, String p2 ) [static] : scala.Function0<java.sql.Connection>
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.getConnector:(Ljava/lang/String;Ljava/lang/String;)Lscala/Function0;]
JDBCRDD.getConversions ( org.apache.spark.sql.types.StructType schema ) : JDBCRDD.JDBCConversion[ ]
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.getConversions:(Lorg/apache/spark/sql/types/StructType;)[Lorg/apache/spark/sql/jdbc/JDBCRDD$JDBCConversion;]
JDBCRDD.getPartitions ( ) : org.apache.spark.Partition[ ]
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.getPartitions:()[Lorg/apache/spark/Partition;]
JDBCRDD.IntegerConversion ( ) : JDBCRDD.IntegerConversion.
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.IntegerConversion:()Lorg/apache/spark/sql/jdbc/JDBCRDD$IntegerConversion$;]
JDBCRDD.JDBCRDD ( org.apache.spark.SparkContext sc, scala.Function0<java.sql.Connection> getConnection, org.apache.spark.sql.types.StructType schema, String fqTable, String[ ] columns, org.apache.spark.sql.sources.Filter[ ] filters, org.apache.spark.Partition[ ] partitions )
[mangled: org/apache/spark/sql/jdbc/JDBCRDD."<init>":(Lorg/apache/spark/SparkContext;Lscala/Function0;Lorg/apache/spark/sql/types/StructType;Ljava/lang/String;[Ljava/lang/String;[Lorg/apache/spark/sql/sources/Filter;[Lorg/apache/spark/Partition;)V]
JDBCRDD.LongConversion ( ) : JDBCRDD.LongConversion.
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.LongConversion:()Lorg/apache/spark/sql/jdbc/JDBCRDD$LongConversion$;]
JDBCRDD.JDBCRDD..columnList ( ) : String
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.org.apache.spark.sql.jdbc.JDBCRDD..columnList:()Ljava/lang/String;]
JDBCRDD.JDBCRDD..compileFilter ( org.apache.spark.sql.sources.Filter f ) : String
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.org.apache.spark.sql.jdbc.JDBCRDD..compileFilter:(Lorg/apache/spark/sql/sources/Filter;)Ljava/lang/String;]
JDBCRDD.JDBCRDD..getWhereClause ( JDBCPartition part ) : String
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.org.apache.spark.sql.jdbc.JDBCRDD..getWhereClause:(Lorg/apache/spark/sql/jdbc/JDBCPartition;)Ljava/lang/String;]
JDBCRDD.resolveTable ( String p1, String p2 ) [static] : org.apache.spark.sql.types.StructType
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.resolveTable:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/types/StructType;]
JDBCRDD.scanTable ( org.apache.spark.SparkContext p1, org.apache.spark.sql.types.StructType p2, String p3, String p4, String p5, String[ ] p6, org.apache.spark.sql.sources.Filter[ ] p7, org.apache.spark.Partition[ ] p8 ) [static] : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.scanTable:(Lorg/apache/spark/SparkContext;Lorg/apache/spark/sql/types/StructType;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;[Ljava/lang/String;[Lorg/apache/spark/sql/sources/Filter;[Lorg/apache/spark/Partition;)Lorg/apache/spark/rdd/RDD;]
JDBCRDD.StringConversion ( ) : JDBCRDD.StringConversion.
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.StringConversion:()Lorg/apache/spark/sql/jdbc/JDBCRDD$StringConversion$;]
JDBCRDD.TimestampConversion ( ) : JDBCRDD.TimestampConversion.
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.TimestampConversion:()Lorg/apache/spark/sql/jdbc/JDBCRDD$TimestampConversion$;]
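JDBCRDD.scanTable is the 1.3.0 entry point that ties the pieces above together. A sketch under stated assumptions: the report erases the static parameter names to p1..p8, so the driver/url/table/column names below follow the Spark 1.3 source, and the H2 connection details are hypothetical (an H2 driver and a populated PEOPLE table are assumed):

package org.apache.spark.sql.jdbc

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.sources.Filter

object JDBCRDDSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("jdbc-rdd"))
    val url = "jdbc:h2:mem:testdb" // hypothetical database
    // resolveTable opens a connection and reads the table's schema.
    val schema = JDBCRDD.resolveTable(url, "PEOPLE")
    // Passing null partitioning info yields a single whole-table partition in the 1.3 source.
    val parts = JDBCRelation.columnPartition(null)
    val rows = JDBCRDD.scanTable(
      sc, schema, "org.h2.Driver", url, "PEOPLE",
      Array("NAME", "AGE"), Array.empty[Filter], parts)
    rows.collect().foreach(println)
    sc.stop()
  }
}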
spark-sql_2.10-1.3.0.jar, JDBCRelation.class
package org.apache.spark.sql.jdbc
JDBCRelation.buildScan ( String[ ] requiredColumns, org.apache.spark.sql.sources.Filter[ ] filters ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.buildScan:([Ljava/lang/String;[Lorg/apache/spark/sql/sources/Filter;)Lorg/apache/spark/rdd/RDD;]
JDBCRelation.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.canEqual:(Ljava/lang/Object;)Z]
JDBCRelation.columnPartition ( JDBCPartitioningInfo p1 ) [static] : org.apache.spark.Partition[ ]
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.columnPartition:(Lorg/apache/spark/sql/jdbc/JDBCPartitioningInfo;)[Lorg/apache/spark/Partition;]
JDBCRelation.copy ( String url, String table, org.apache.spark.Partition[ ] parts, org.apache.spark.sql.SQLContext sqlContext ) : JDBCRelation
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.copy:(Ljava/lang/String;Ljava/lang/String;[Lorg/apache/spark/Partition;Lorg/apache/spark/sql/SQLContext;)Lorg/apache/spark/sql/jdbc/JDBCRelation;]
JDBCRelation.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.equals:(Ljava/lang/Object;)Z]
JDBCRelation.hashCode ( ) : int
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.hashCode:()I]
JDBCRelation.JDBCRelation ( String url, String table, org.apache.spark.Partition[ ] parts, org.apache.spark.sql.SQLContext sqlContext )
[mangled: org/apache/spark/sql/jdbc/JDBCRelation."<init>":(Ljava/lang/String;Ljava/lang/String;[Lorg/apache/spark/Partition;Lorg/apache/spark/sql/SQLContext;)V]
JDBCRelation.parts ( ) : org.apache.spark.Partition[ ]
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.parts:()[Lorg/apache/spark/Partition;]
JDBCRelation.productArity ( ) : int
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.productArity:()I]
JDBCRelation.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.productElement:(I)Ljava/lang/Object;]
JDBCRelation.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.productIterator:()Lscala/collection/Iterator;]
JDBCRelation.productPrefix ( ) : String
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.productPrefix:()Ljava/lang/String;]
JDBCRelation.schema ( ) : org.apache.spark.sql.types.StructType
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.schema:()Lorg/apache/spark/sql/types/StructType;]
JDBCRelation.sqlContext ( ) : org.apache.spark.sql.SQLContext
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.sqlContext:()Lorg/apache/spark/sql/SQLContext;]
JDBCRelation.table ( ) : String
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.table:()Ljava/lang/String;]
JDBCRelation.toString ( ) : String
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.toString:()Ljava/lang/String;]
JDBCRelation.url ( ) : String
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.url:()Ljava/lang/String;]
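JDBCRelation wraps those partitions into a BaseRelation whose buildScan pushes column pruning and filters down to the database. A sketch of the 1.3.0 shape; the 4-argument constructor above is the flattened bytecode form, while at source level sqlContext sits in a second parameter list (connection details hypothetical, with an H2 driver assumed):

package org.apache.spark.sql.jdbc

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.sources.Filter

object JDBCRelationSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("jdbc-relation"))
    val sqlCtx = new SQLContext(sc)
    val parts = JDBCRelation.columnPartition(JDBCPartitioningInfo("ID", 0L, 1000L, 2))
    val rel = JDBCRelation("jdbc:h2:mem:testdb", "PEOPLE", parts)(sqlCtx)
    println(rel.schema) // resolved by querying the database
    val scan = rel.buildScan(Array("NAME"), Array.empty[Filter])
    println(scan.partitions.length) // 2, one per JDBCPartition
    sc.stop()
  }
}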
spark-sql_2.10-1.3.0.jar, JSONRelation.class
package org.apache.spark.sql.json
JSONRelation.buildScan ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/json/JSONRelation.buildScan:()Lorg/apache/spark/rdd/RDD;]
JSONRelation.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/json/JSONRelation.canEqual:(Ljava/lang/Object;)Z]
JSONRelation.copy ( String path, double samplingRatio, scala.Option<org.apache.spark.sql.types.StructType> userSpecifiedSchema, org.apache.spark.sql.SQLContext sqlContext ) : JSONRelation
[mangled: org/apache/spark/sql/json/JSONRelation.copy:(Ljava/lang/String;DLscala/Option;Lorg/apache/spark/sql/SQLContext;)Lorg/apache/spark/sql/json/JSONRelation;]
JSONRelation.equals ( Object other ) : boolean
[mangled: org/apache/spark/sql/json/JSONRelation.equals:(Ljava/lang/Object;)Z]
JSONRelation.hashCode ( ) : int
[mangled: org/apache/spark/sql/json/JSONRelation.hashCode:()I]
JSONRelation.insert ( org.apache.spark.sql.DataFrame data, boolean overwrite ) : void
[mangled: org/apache/spark/sql/json/JSONRelation.insert:(Lorg/apache/spark/sql/DataFrame;Z)V]
JSONRelation.JSONRelation ( String path, double samplingRatio, scala.Option<org.apache.spark.sql.types.StructType> userSpecifiedSchema, org.apache.spark.sql.SQLContext sqlContext )
[mangled: org/apache/spark/sql/json/JSONRelation."<init>":(Ljava/lang/String;DLscala/Option;Lorg/apache/spark/sql/SQLContext;)V]
JSONRelation.JSONRelation..baseRDD ( ) : org.apache.spark.rdd.RDD<String>
[mangled: org/apache/spark/sql/json/JSONRelation.org.apache.spark.sql.json.JSONRelation..baseRDD:()Lorg/apache/spark/rdd/RDD;]
JSONRelation.path ( ) : String
[mangled: org/apache/spark/sql/json/JSONRelation.path:()Ljava/lang/String;]
JSONRelation.productArity ( ) : int
[mangled: org/apache/spark/sql/json/JSONRelation.productArity:()I]
JSONRelation.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/json/JSONRelation.productElement:(I)Ljava/lang/Object;]
JSONRelation.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/json/JSONRelation.productIterator:()Lscala/collection/Iterator;]
JSONRelation.productPrefix ( ) : String
[mangled: org/apache/spark/sql/json/JSONRelation.productPrefix:()Ljava/lang/String;]
JSONRelation.samplingRatio ( ) : double
[mangled: org/apache/spark/sql/json/JSONRelation.samplingRatio:()D]
JSONRelation.schema ( ) : org.apache.spark.sql.types.StructType
[mangled: org/apache/spark/sql/json/JSONRelation.schema:()Lorg/apache/spark/sql/types/StructType;]
JSONRelation.sqlContext ( ) : org.apache.spark.sql.SQLContext
[mangled: org/apache/spark/sql/json/JSONRelation.sqlContext:()Lorg/apache/spark/sql/SQLContext;]
JSONRelation.toString ( ) : String
[mangled: org/apache/spark/sql/json/JSONRelation.toString:()Ljava/lang/String;]
JSONRelation.userSpecifiedSchema ( ) : scala.Option<org.apache.spark.sql.types.StructType>
[mangled: org/apache/spark/sql/json/JSONRelation.userSpecifiedSchema:()Lscala/Option;]
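JSONRelation keys everything off a single path plus a sampling ratio for schema inference; supplying userSpecifiedSchema skips the inference pass. A sketch of the 1.3.0 shape (path hypothetical; as with JDBCRelation, sqlContext is a second source-level parameter list that the bytecode constructor above flattens):

package org.apache.spark.sql.json

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.types.{StringType, StructField, StructType}

object JSONRelationSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("json-relation"))
    val sqlCtx = new SQLContext(sc)
    val schema = StructType(StructField("name", StringType, nullable = true) :: Nil)
    val rel = JSONRelation("/tmp/people.json", 1.0, Some(schema))(sqlCtx)
    println(rel.schema)        // the user-specified schema; no data is sampled
    val rows = rel.buildScan() // a full scan over the JSON lines
    println(rows.partitions.length)
    sc.stop()
  }
}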
spark-sql_2.10-1.3.0.jar, LeftSemiJoinBNL.class
package org.apache.spark.sql.execution.joins
LeftSemiJoinBNL.left ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinBNL.left:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
LeftSemiJoinBNL.right ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinBNL.right:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
spark-sql_2.10-1.3.0.jar, LeftSemiJoinHash.class
package org.apache.spark.sql.execution.joins
LeftSemiJoinHash.buildKeys ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.buildKeys:()Lscala/collection/Seq;]
LeftSemiJoinHash.buildPlan ( ) : org.apache.spark.sql.execution.SparkPlan
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.buildPlan:()Lorg/apache/spark/sql/execution/SparkPlan;]
LeftSemiJoinHash.buildSide ( ) : package.BuildRight.
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.buildSide:()Lorg/apache/spark/sql/execution/joins/package$BuildRight$;]
LeftSemiJoinHash.buildSide ( ) : package.BuildSide
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.buildSide:()Lorg/apache/spark/sql/execution/joins/package$BuildSide;]
LeftSemiJoinHash.buildSideKeyGenerator ( ) : org.apache.spark.sql.catalyst.expressions.package.Projection
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.buildSideKeyGenerator:()Lorg/apache/spark/sql/catalyst/expressions/package$Projection;]
LeftSemiJoinHash.copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> leftKeys, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> rightKeys, org.apache.spark.sql.execution.SparkPlan left, org.apache.spark.sql.execution.SparkPlan right ) : LeftSemiJoinHash
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.copy:(Lscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;Lorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/joins/LeftSemiJoinHash;]
LeftSemiJoinHash.hashJoin ( scala.collection.Iterator<org.apache.spark.sql.Row> streamIter, HashedRelation hashedRelation ) : scala.collection.Iterator<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.hashJoin:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/joins/HashedRelation;)Lscala/collection/Iterator;]
LeftSemiJoinHash.left ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.left:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
LeftSemiJoinHash.LeftSemiJoinHash ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> leftKeys, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> rightKeys, org.apache.spark.sql.execution.SparkPlan left, org.apache.spark.sql.execution.SparkPlan right )
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash."<init>":(Lscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;Lorg/apache/spark/sql/execution/SparkPlan;)V]
LeftSemiJoinHash.right ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.right:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
LeftSemiJoinHash.streamedKeys ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.streamedKeys:()Lscala/collection/Seq;]
LeftSemiJoinHash.streamedPlan ( ) : org.apache.spark.sql.execution.SparkPlan
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.streamedPlan:()Lorg/apache/spark/sql/execution/SparkPlan;]
LeftSemiJoinHash.streamSideKeyGenerator ( ) : scala.Function0<org.apache.spark.sql.catalyst.expressions.package.MutableProjection>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.streamSideKeyGenerator:()Lscala/Function0;]
spark-sql_2.10-1.3.0.jar, Limit.class
package org.apache.spark.sql.execution
Limit.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Limit.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
Limit.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Limit.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, LocalTableScan.class
package org.apache.spark.sql.execution
LocalTableScan.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/execution/LocalTableScan.children:()Lscala/collection/immutable/Nil$;]
spark-sql_2.10-1.3.0.jar, LogicalLocalTable.class
package org.apache.spark.sql.execution
LogicalLocalTable.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/execution/LogicalLocalTable.children:()Lscala/collection/immutable/Nil$;]
LogicalLocalTable.newInstance ( ) : org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation
[mangled: org/apache/spark/sql/execution/LogicalLocalTable.newInstance:()Lorg/apache/spark/sql/catalyst/analysis/MultiInstanceRelation;]
spark-sql_2.10-1.3.0.jar, LogicalRDD.class
package org.apache.spark.sql.execution
LogicalRDD.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/execution/LogicalRDD.children:()Lscala/collection/immutable/Nil$;]
LogicalRDD.newInstance ( ) : org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation
[mangled: org/apache/spark/sql/execution/LogicalRDD.newInstance:()Lorg/apache/spark/sql/catalyst/analysis/MultiInstanceRelation;]
spark-sql_2.10-1.3.0.jar, LogicalRelation.class
package org.apache.spark.sql.sources
LogicalRelation.andThen ( scala.Function1<LogicalRelation,A> p1 ) [static] : scala.Function1<BaseRelation,A>
[mangled: org/apache/spark/sql/sources/LogicalRelation.andThen:(Lscala/Function1;)Lscala/Function1;]
LogicalRelation.attributeMap ( ) : org.apache.spark.sql.catalyst.expressions.AttributeMap<org.apache.spark.sql.catalyst.expressions.AttributeReference>
[mangled: org/apache/spark/sql/sources/LogicalRelation.attributeMap:()Lorg/apache/spark/sql/catalyst/expressions/AttributeMap;]
LogicalRelation.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/LogicalRelation.canEqual:(Ljava/lang/Object;)Z]
LogicalRelation.compose ( scala.Function1<A,BaseRelation> p1 ) [static] : scala.Function1<A,LogicalRelation>
[mangled: org/apache/spark/sql/sources/LogicalRelation.compose:(Lscala/Function1;)Lscala/Function1;]
LogicalRelation.copy ( BaseRelation relation ) : LogicalRelation
[mangled: org/apache/spark/sql/sources/LogicalRelation.copy:(Lorg/apache/spark/sql/sources/BaseRelation;)Lorg/apache/spark/sql/sources/LogicalRelation;]
LogicalRelation.equals ( Object other ) : boolean
[mangled: org/apache/spark/sql/sources/LogicalRelation.equals:(Ljava/lang/Object;)Z]
LogicalRelation.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/LogicalRelation.hashCode:()I]
LogicalRelation.LogicalRelation ( BaseRelation relation )
[mangled: org/apache/spark/sql/sources/LogicalRelation."<init>":(Lorg/apache/spark/sql/sources/BaseRelation;)V]
LogicalRelation.newInstance ( ) : org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation
[mangled: org/apache/spark/sql/sources/LogicalRelation.newInstance:()Lorg/apache/spark/sql/catalyst/analysis/MultiInstanceRelation;]
LogicalRelation.newInstance ( ) : LogicalRelation
[mangled: org/apache/spark/sql/sources/LogicalRelation.newInstance:()Lorg/apache/spark/sql/sources/LogicalRelation;]
LogicalRelation.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.AttributeReference>
[mangled: org/apache/spark/sql/sources/LogicalRelation.output:()Lscala/collection/Seq;]
LogicalRelation.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/LogicalRelation.productArity:()I]
LogicalRelation.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/LogicalRelation.productElement:(I)Ljava/lang/Object;]
LogicalRelation.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/LogicalRelation.productIterator:()Lscala/collection/Iterator;]
LogicalRelation.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/LogicalRelation.productPrefix:()Ljava/lang/String;]
LogicalRelation.relation ( ) : BaseRelation
[mangled: org/apache/spark/sql/sources/LogicalRelation.relation:()Lorg/apache/spark/sql/sources/BaseRelation;]
LogicalRelation.sameResult ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan otherPlan ) : boolean
[mangled: org/apache/spark/sql/sources/LogicalRelation.sameResult:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Z]
LogicalRelation.simpleString ( ) : String
[mangled: org/apache/spark/sql/sources/LogicalRelation.simpleString:()Ljava/lang/String;]
LogicalRelation.statistics ( ) : org.apache.spark.sql.catalyst.plans.logical.Statistics
[mangled: org/apache/spark/sql/sources/LogicalRelation.statistics:()Lorg/apache/spark/sql/catalyst/plans/logical/Statistics;]
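LogicalRelation is the thin logical-plan adapter around any BaseRelation; a sketch reusing the JSONRelation shape above (all inputs hypothetical):

package org.apache.spark.sql.sources

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.json.JSONRelation
import org.apache.spark.sql.types.{StringType, StructField, StructType}

object LogicalRelationSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("logical-relation"))
    val sqlCtx = new SQLContext(sc)
    val schema = StructType(StructField("name", StringType, nullable = true) :: Nil)
    val plan = LogicalRelation(JSONRelation("/tmp/people.json", 1.0, Some(schema))(sqlCtx))
    println(plan.output)       // one AttributeReference per schema field
    println(plan.simpleString) // the one-line summary used in plan trees
    sc.stop()
  }
}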
spark-sql_2.10-1.3.0.jar, MySQLQuirks.class
package org.apache.spark.sql.jdbc
MySQLQuirks.MySQLQuirks ( )
[mangled: org/apache/spark/sql/jdbc/MySQLQuirks."<init>":()V]
spark-sql_2.10-1.3.0.jar, NanoTime.class
package org.apache.spark.sql.parquet.timestamp
NanoTime.getJulianDay ( ) : int
[mangled: org/apache/spark/sql/parquet/timestamp/NanoTime.getJulianDay:()I]
NanoTime.getTimeOfDayNanos ( ) : long
[mangled: org/apache/spark/sql/parquet/timestamp/NanoTime.getTimeOfDayNanos:()J]
NanoTime.NanoTime ( )
[mangled: org/apache/spark/sql/parquet/timestamp/NanoTime."<init>":()V]
NanoTime.set ( int julianDay, long timeOfDayNanos ) : NanoTime
[mangled: org/apache/spark/sql/parquet/timestamp/NanoTime.set:(IJ)Lorg/apache/spark/sql/parquet/timestamp/NanoTime;]
NanoTime.toBinary ( ) : parquet.io.api.Binary
[mangled: org/apache/spark/sql/parquet/timestamp/NanoTime.toBinary:()Lparquet/io/api/Binary;]
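NanoTime is the mutable carrier for Parquet's INT96 timestamp encoding, a Julian day plus nanoseconds within that day; a minimal sketch (2440588 is the Julian day number for 1970-01-01):

package org.apache.spark.sql.parquet.timestamp

object NanoTimeSketch {
  def main(args: Array[String]): Unit = {
    // set() mutates in place and returns this, so calls chain.
    val nt = new NanoTime().set(2440588, 0L)
    println(nt.getJulianDay)      // 2440588
    println(nt.getTimeOfDayNanos) // 0
    // toBinary packs the pair into the 12-byte INT96 layout Parquet expects.
    println(nt.toBinary.length)   // 12
  }
}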
spark-sql_2.10-1.3.0.jar, NativeColumnType<T>.class
package org.apache.spark.sql.columnar
NativeColumnType<T>.dataType ( ) : T
[mangled: org/apache/spark/sql/columnar/NativeColumnType<T>.dataType:()Lorg/apache/spark/sql/types/NativeType;]
NativeColumnType<T>.NativeColumnType ( T dataType, int typeId, int defaultSize )
[mangled: org/apache/spark/sql/columnar/NativeColumnType<T>.org.apache.spark.sql.columnar.NativeColumnType:(Lorg/apache/spark/sql/types/NativeType;II)V]
spark-sql_2.10-1.3.0.jar, NoQuirks.class
package org.apache.spark.sql.jdbc
NoQuirks.NoQuirks ( )
[mangled: org/apache/spark/sql/jdbc/NoQuirks."<init>":()V]
spark-sql_2.10-1.3.0.jar, NullableColumnBuilder.class
package org.apache.spark.sql.columnar
NullableColumnBuilder.appendFrom ( org.apache.spark.sql.Row p1, int p2 ) [abstract] : void
[mangled: org/apache/spark/sql/columnar/NullableColumnBuilder.appendFrom:(Lorg/apache/spark/sql/Row;I)V]
NullableColumnBuilder.NullableColumnBuilder..super.appendFrom ( org.apache.spark.sql.Row p1, int p2 ) [abstract] : void
[mangled: org/apache/spark/sql/columnar/NullableColumnBuilder.org.apache.spark.sql.columnar.NullableColumnBuilder..super.appendFrom:(Lorg/apache/spark/sql/Row;I)V]
spark-sql_2.10-1.3.0.jar, OutputFaker.class
package org.apache.spark.sql.execution
OutputFaker.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/OutputFaker.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, ParquetRelation.class
package org.apache.spark.sql.parquet
ParquetRelation.attributeMap ( ) : org.apache.spark.sql.catalyst.expressions.AttributeMap<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/parquet/ParquetRelation.attributeMap:()Lorg/apache/spark/sql/catalyst/expressions/AttributeMap;]
ParquetRelation.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetRelation.canEqual:(Ljava/lang/Object;)Z]
ParquetRelation.conf ( ) : scala.Option<org.apache.hadoop.conf.Configuration>
[mangled: org/apache/spark/sql/parquet/ParquetRelation.conf:()Lscala/Option;]
ParquetRelation.copy ( String path, scala.Option<org.apache.hadoop.conf.Configuration> conf, org.apache.spark.sql.SQLContext sqlContext, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> partitioningAttributes ) : ParquetRelation
[mangled: org/apache/spark/sql/parquet/ParquetRelation.copy:(Ljava/lang/String;Lscala/Option;Lorg/apache/spark/sql/SQLContext;Lscala/collection/Seq;)Lorg/apache/spark/sql/parquet/ParquetRelation;]
ParquetRelation.create ( String p1, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan p2, org.apache.hadoop.conf.Configuration p3, org.apache.spark.sql.SQLContext p4 ) [static] : ParquetRelation
[mangled: org/apache/spark/sql/parquet/ParquetRelation.create:(Ljava/lang/String;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;Lorg/apache/hadoop/conf/Configuration;Lorg/apache/spark/sql/SQLContext;)Lorg/apache/spark/sql/parquet/ParquetRelation;]
ParquetRelation.createEmpty ( String p1, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> p2, boolean p3, org.apache.hadoop.conf.Configuration p4, org.apache.spark.sql.SQLContext p5 ) [static] : ParquetRelation
[mangled: org/apache/spark/sql/parquet/ParquetRelation.createEmpty:(Ljava/lang/String;Lscala/collection/Seq;ZLorg/apache/hadoop/conf/Configuration;Lorg/apache/spark/sql/SQLContext;)Lorg/apache/spark/sql/parquet/ParquetRelation;]
ParquetRelation.enableLogForwarding ( ) [static] : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation.enableLogForwarding:()V]
ParquetRelation.equals ( Object other ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetRelation.equals:(Ljava/lang/Object;)Z]
ParquetRelation.hashCode ( ) : int
[mangled: org/apache/spark/sql/parquet/ParquetRelation.hashCode:()I]
ParquetRelation.newInstance ( ) : org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation
[mangled: org/apache/spark/sql/parquet/ParquetRelation.newInstance:()Lorg/apache/spark/sql/catalyst/analysis/MultiInstanceRelation;]
ParquetRelation.newInstance ( ) : ParquetRelation
[mangled: org/apache/spark/sql/parquet/ParquetRelation.newInstance:()Lorg/apache/spark/sql/parquet/ParquetRelation;]
ParquetRelation.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/parquet/ParquetRelation.output:()Lscala/collection/Seq;]
ParquetRelation.ParquetRelation ( String path, scala.Option<org.apache.hadoop.conf.Configuration> conf, org.apache.spark.sql.SQLContext sqlContext, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> partitioningAttributes )
[mangled: org/apache/spark/sql/parquet/ParquetRelation."<init>":(Ljava/lang/String;Lscala/Option;Lorg/apache/spark/sql/SQLContext;Lscala/collection/Seq;)V]
ParquetRelation.parquetSchema ( ) : parquet.schema.MessageType
[mangled: org/apache/spark/sql/parquet/ParquetRelation.parquetSchema:()Lparquet/schema/MessageType;]
ParquetRelation.partitioningAttributes ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/parquet/ParquetRelation.partitioningAttributes:()Lscala/collection/Seq;]
ParquetRelation.path ( ) : String
[mangled: org/apache/spark/sql/parquet/ParquetRelation.path:()Ljava/lang/String;]
ParquetRelation.productArity ( ) : int
[mangled: org/apache/spark/sql/parquet/ParquetRelation.productArity:()I]
ParquetRelation.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/parquet/ParquetRelation.productElement:(I)Ljava/lang/Object;]
ParquetRelation.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/parquet/ParquetRelation.productIterator:()Lscala/collection/Iterator;]
ParquetRelation.productPrefix ( ) : String
[mangled: org/apache/spark/sql/parquet/ParquetRelation.productPrefix:()Ljava/lang/String;]
ParquetRelation.shortParquetCompressionCodecNames ( ) [static] : scala.collection.immutable.Map<String,parquet.hadoop.metadata.CompressionCodecName>
[mangled: org/apache/spark/sql/parquet/ParquetRelation.shortParquetCompressionCodecNames:()Lscala/collection/immutable/Map;]
ParquetRelation.sqlContext ( ) : org.apache.spark.sql.SQLContext
[mangled: org/apache/spark/sql/parquet/ParquetRelation.sqlContext:()Lorg/apache/spark/sql/SQLContext;]
ParquetRelation.statistics ( ) : org.apache.spark.sql.catalyst.plans.logical.Statistics
[mangled: org/apache/spark/sql/parquet/ParquetRelation.statistics:()Lorg/apache/spark/sql/catalyst/plans/logical/Statistics;]
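ParquetRelation's static factories create Parquet tables on disk; a sketch of createEmpty under stated assumptions (the path is hypothetical, and the attribute is built with the 1.3 catalyst constructor):

package org.apache.spark.sql.parquet

import org.apache.hadoop.conf.Configuration
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.catalyst.expressions.AttributeReference
import org.apache.spark.sql.types.IntegerType

object ParquetRelationSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("parquet-relation"))
    val sqlCtx = new SQLContext(sc)
    val attrs = AttributeReference("a", IntegerType, nullable = true)() :: Nil
    // Writes metadata for an empty table; allowExisting = true tolerates a pre-existing directory.
    val rel = ParquetRelation.createEmpty("/tmp/empty-parquet", attrs, true, new Configuration(), sqlCtx)
    println(rel.parquetSchema) // the Parquet MessageType derived from the attributes
    sc.stop()
  }
}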
spark-sql_2.10-1.3.0.jar, ParquetRelation2.class
package org.apache.spark.sql.parquet
ParquetRelation2.buildScan ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> predicates ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.buildScan:(Lscala/collection/Seq;Lscala/collection/Seq;)Lorg/apache/spark/rdd/RDD;]
ParquetRelation2.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.canEqual:(Ljava/lang/Object;)Z]
ParquetRelation2.copy ( scala.collection.Seq<String> paths, scala.collection.immutable.Map<String,String> parameters, scala.Option<org.apache.spark.sql.types.StructType> maybeSchema, scala.Option<PartitionSpec> maybePartitionSpec, org.apache.spark.sql.SQLContext sqlContext ) : ParquetRelation2
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.copy:(Lscala/collection/Seq;Lscala/collection/immutable/Map;Lscala/Option;Lscala/Option;Lorg/apache/spark/sql/SQLContext;)Lorg/apache/spark/sql/parquet/ParquetRelation2;]
ParquetRelation2.DEFAULT_PARTITION_NAME ( ) [static] : String
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.DEFAULT_PARTITION_NAME:()Ljava/lang/String;]
ParquetRelation2.equals ( Object other ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.equals:(Ljava/lang/Object;)Z]
ParquetRelation2.hashCode ( ) : int
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.hashCode:()I]
ParquetRelation2.insert ( org.apache.spark.sql.DataFrame data, boolean overwrite ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.insert:(Lorg/apache/spark/sql/DataFrame;Z)V]
ParquetRelation2.isPartitioned ( ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.isPartitioned:()Z]
ParquetRelation2.isTraceEnabled ( ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.isTraceEnabled:()Z]
ParquetRelation2.log ( ) : org.slf4j.Logger
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.log:()Lorg/slf4j/Logger;]
ParquetRelation2.logDebug ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.logDebug:(Lscala/Function0;)V]
ParquetRelation2.logDebug ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.logDebug:(Lscala/Function0;Ljava/lang/Throwable;)V]
ParquetRelation2.logError ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.logError:(Lscala/Function0;)V]
ParquetRelation2.logError ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.logError:(Lscala/Function0;Ljava/lang/Throwable;)V]
ParquetRelation2.logInfo ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.logInfo:(Lscala/Function0;)V]
ParquetRelation2.logInfo ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.logInfo:(Lscala/Function0;Ljava/lang/Throwable;)V]
ParquetRelation2.logName ( ) : String
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.logName:()Ljava/lang/String;]
ParquetRelation2.logTrace ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.logTrace:(Lscala/Function0;)V]
ParquetRelation2.logTrace ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.logTrace:(Lscala/Function0;Ljava/lang/Throwable;)V]
ParquetRelation2.logWarning ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.logWarning:(Lscala/Function0;)V]
ParquetRelation2.logWarning ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.logWarning:(Lscala/Function0;Ljava/lang/Throwable;)V]
ParquetRelation2.maybePartitionSpec ( ) : scala.Option<PartitionSpec>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.maybePartitionSpec:()Lscala/Option;]
ParquetRelation2.maybeSchema ( ) : scala.Option<org.apache.spark.sql.types.StructType>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.maybeSchema:()Lscala/Option;]
ParquetRelation2.MERGE_SCHEMA ( ) [static] : String
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.MERGE_SCHEMA:()Ljava/lang/String;]
ParquetRelation2.newJobContext ( org.apache.hadoop.conf.Configuration conf, org.apache.hadoop.mapreduce.JobID jobId ) : org.apache.hadoop.mapreduce.JobContext
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.newJobContext:(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/mapreduce/JobID;)Lorg/apache/hadoop/mapreduce/JobContext;]
ParquetRelation2.newTaskAttemptContext ( org.apache.hadoop.conf.Configuration conf, org.apache.hadoop.mapreduce.TaskAttemptID attemptId ) : org.apache.hadoop.mapreduce.TaskAttemptContext
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.newTaskAttemptContext:(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/mapreduce/TaskAttemptID;)Lorg/apache/hadoop/mapreduce/TaskAttemptContext;]
ParquetRelation2.newTaskAttemptID ( String jtIdentifier, int jobId, boolean isMap, int taskId, int attemptId ) : org.apache.hadoop.mapreduce.TaskAttemptID
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.newTaskAttemptID:(Ljava/lang/String;IZII)Lorg/apache/hadoop/mapreduce/TaskAttemptID;]
ParquetRelation2.org.apache.spark.Logging..log_ ( ) : org.slf4j.Logger
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.org.apache.spark.Logging..log_:()Lorg/slf4j/Logger;]
ParquetRelation2.org.apache.spark.Logging..log__.eq ( org.slf4j.Logger p1 ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.org.apache.spark.Logging..log__.eq:(Lorg/slf4j/Logger;)V]
ParquetRelation2.ParquetRelation2..defaultPartitionName ( ) : String
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.org.apache.spark.sql.parquet.ParquetRelation2..defaultPartitionName:()Ljava/lang/String;]
ParquetRelation2.ParquetRelation2..isSummaryFile ( org.apache.hadoop.fs.Path file ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.org.apache.spark.sql.parquet.ParquetRelation2..isSummaryFile:(Lorg/apache/hadoop/fs/Path;)Z]
ParquetRelation2.ParquetRelation2..maybeMetastoreSchema ( ) : scala.Option<org.apache.spark.sql.types.StructType>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.org.apache.spark.sql.parquet.ParquetRelation2..maybeMetastoreSchema:()Lscala/Option;]
ParquetRelation2.ParquetRelation2..metadataCache ( ) : ParquetRelation2.MetadataCache
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.org.apache.spark.sql.parquet.ParquetRelation2..metadataCache:()Lorg/apache/spark/sql/parquet/ParquetRelation2$MetadataCache;]
ParquetRelation2.ParquetRelation2..shouldMergeSchemas ( ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.org.apache.spark.sql.parquet.ParquetRelation2..shouldMergeSchemas:()Z]
ParquetRelation2.parameters ( ) : scala.collection.immutable.Map<String,String>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.parameters:()Lscala/collection/immutable/Map;]
ParquetRelation2.ParquetRelation2 ( scala.collection.Seq<String> paths, scala.collection.immutable.Map<String,String> parameters, scala.Option<org.apache.spark.sql.types.StructType> maybeSchema, scala.Option<PartitionSpec> maybePartitionSpec, org.apache.spark.sql.SQLContext sqlContext )
[mangled: org/apache/spark/sql/parquet/ParquetRelation2."<init>":(Lscala/collection/Seq;Lscala/collection/immutable/Map;Lscala/Option;Lscala/Option;Lorg/apache/spark/sql/SQLContext;)V]
ParquetRelation2.partitionColumns ( ) : org.apache.spark.sql.types.StructType
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.partitionColumns:()Lorg/apache/spark/sql/types/StructType;]
ParquetRelation2.partitions ( ) : scala.collection.Seq<Partition>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.partitions:()Lscala/collection/Seq;]
ParquetRelation2.partitionSpec ( ) : PartitionSpec
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.partitionSpec:()Lorg/apache/spark/sql/parquet/PartitionSpec;]
ParquetRelation2.paths ( ) : scala.collection.Seq<String>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.paths:()Lscala/collection/Seq;]
ParquetRelation2.productArity ( ) : int
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.productArity:()I]
ParquetRelation2.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.productElement:(I)Ljava/lang/Object;]
ParquetRelation2.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.productIterator:()Lscala/collection/Iterator;]
ParquetRelation2.productPrefix ( ) : String
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.productPrefix:()Ljava/lang/String;]
ParquetRelation2.schema ( ) : org.apache.spark.sql.types.StructType
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.schema:()Lorg/apache/spark/sql/types/StructType;]
ParquetRelation2.sizeInBytes ( ) : long
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.sizeInBytes:()J]
ParquetRelation2.sparkContext ( ) : org.apache.spark.SparkContext
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.sparkContext:()Lorg/apache/spark/SparkContext;]
ParquetRelation2.sqlContext ( ) : org.apache.spark.sql.SQLContext
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.sqlContext:()Lorg/apache/spark/sql/SQLContext;]
ParquetRelation2.toString ( ) : String
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.toString:()Ljava/lang/String;]
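ParquetRelation2 is the newer multi-path Parquet source with partition discovery; a sketch of the 1.3.0 shape (directory hypothetical; at source level sqlContext again sits in a second parameter list that the bytecode constructor above flattens):

package org.apache.spark.sql.parquet

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object ParquetRelation2Sketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("parquet-relation2"))
    val sqlCtx = new SQLContext(sc)
    // MERGE_SCHEMA is the option key for reconciling schemas across part-files.
    val params = Map(ParquetRelation2.MERGE_SCHEMA -> "true")
    val rel = ParquetRelation2(Seq("/data/events"), params, None, None)(sqlCtx)
    println(rel.partitionSpec) // discovered from directory names of the form key=value
    println(rel.sizeInBytes)
    sc.stop()
  }
}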
spark-sql_2.10-1.3.0.jar, ParquetTableScan.class
package org.apache.spark.sql.parquet
ParquetTableScan.attributes ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.attributes:()Lscala/collection/Seq;]
ParquetTableScan.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.canEqual:(Ljava/lang/Object;)Z]
ParquetTableScan.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.children:()Lscala/collection/immutable/Nil$;]
ParquetTableScan.children ( ) : scala.collection.Seq
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.children:()Lscala/collection/Seq;]
ParquetTableScan.columnPruningPred ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.columnPruningPred:()Lscala/collection/Seq;]
ParquetTableScan.copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> attributes, ParquetRelation relation, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> columnPruningPred ) : ParquetTableScan
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.copy:(Lscala/collection/Seq;Lorg/apache/spark/sql/parquet/ParquetRelation;Lscala/collection/Seq;)Lorg/apache/spark/sql/parquet/ParquetTableScan;]
ParquetTableScan.curried ( ) [static] : scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>,scala.Function1<ParquetRelation,scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,ParquetTableScan>>>
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.curried:()Lscala/Function1;]
ParquetTableScan.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.equals:(Ljava/lang/Object;)Z]
ParquetTableScan.execute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.execute:()Lorg/apache/spark/rdd/RDD;]
ParquetTableScan.hashCode ( ) : int
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.hashCode:()I]
ParquetTableScan.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.output:()Lscala/collection/Seq;]
ParquetTableScan.ParquetTableScan ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> attributes, ParquetRelation relation, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> columnPruningPred )
[mangled: org/apache/spark/sql/parquet/ParquetTableScan."<init>":(Lscala/collection/Seq;Lorg/apache/spark/sql/parquet/ParquetRelation;Lscala/collection/Seq;)V]
ParquetTableScan.productArity ( ) : int
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.productArity:()I]
ParquetTableScan.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.productElement:(I)Ljava/lang/Object;]
ParquetTableScan.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.productIterator:()Lscala/collection/Iterator;]
ParquetTableScan.productPrefix ( ) : String
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.productPrefix:()Ljava/lang/String;]
ParquetTableScan.pruneColumns ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> prunedAttributes ) : ParquetTableScan
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.pruneColumns:(Lscala/collection/Seq;)Lorg/apache/spark/sql/parquet/ParquetTableScan;]
ParquetTableScan.relation ( ) : ParquetRelation
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.relation:()Lorg/apache/spark/sql/parquet/ParquetRelation;]
ParquetTableScan.requestedPartitionOrdinals ( ) : scala.Tuple2<Object,Object>[ ]
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.requestedPartitionOrdinals:()[Lscala/Tuple2;]
ParquetTableScan.tupled ( ) [static] : scala.Function1<scala.Tuple3<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>,ParquetRelation,scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>>,ParquetTableScan>
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, ParquetTest.class
package org.apache.spark.sql.parquet
ParquetTest.configuration ( ) [abstract] : org.apache.hadoop.conf.Configuration
[mangled: org/apache/spark/sql/parquet/ParquetTest.configuration:()Lorg/apache/hadoop/conf/Configuration;]
ParquetTest.makeParquetFile ( org.apache.spark.sql.DataFrame p1, java.io.File p2, scala.reflect.ClassTag<T> p3, scala.reflect.api.TypeTags.TypeTag<T> p4 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.makeParquetFile:(Lorg/apache/spark/sql/DataFrame;Ljava/io/File;Lscala/reflect/ClassTag;Lscala/reflect/api/TypeTags$TypeTag;)V]
ParquetTest.makeParquetFile ( scala.collection.Seq<T> p1, java.io.File p2, scala.reflect.ClassTag<T> p3, scala.reflect.api.TypeTags.TypeTag<T> p4 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.makeParquetFile:(Lscala/collection/Seq;Ljava/io/File;Lscala/reflect/ClassTag;Lscala/reflect/api/TypeTags$TypeTag;)V]
ParquetTest.makePartitionDir ( java.io.File p1, String p2, scala.collection.Seq<scala.Tuple2<String,Object>> p3 ) [abstract] : java.io.File
[mangled: org/apache/spark/sql/parquet/ParquetTest.makePartitionDir:(Ljava/io/File;Ljava/lang/String;Lscala/collection/Seq;)Ljava/io/File;]
ParquetTest.sqlContext ( ) [abstract] : org.apache.spark.sql.SQLContext
[mangled: org/apache/spark/sql/parquet/ParquetTest.sqlContext:()Lorg/apache/spark/sql/SQLContext;]
ParquetTest.withParquetDataFrame ( scala.collection.Seq<T> p1, scala.Function1<org.apache.spark.sql.DataFrame,scala.runtime.BoxedUnit> p2, scala.reflect.ClassTag<T> p3, scala.reflect.api.TypeTags.TypeTag<T> p4 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withParquetDataFrame:(Lscala/collection/Seq;Lscala/Function1;Lscala/reflect/ClassTag;Lscala/reflect/api/TypeTags$TypeTag;)V]
ParquetTest.withParquetFile ( scala.collection.Seq<T> p1, scala.Function1<String,scala.runtime.BoxedUnit> p2, scala.reflect.ClassTag<T> p3, scala.reflect.api.TypeTags.TypeTag<T> p4 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withParquetFile:(Lscala/collection/Seq;Lscala/Function1;Lscala/reflect/ClassTag;Lscala/reflect/api/TypeTags$TypeTag;)V]
ParquetTest.withParquetTable ( scala.collection.Seq<T> p1, String p2, scala.Function0<scala.runtime.BoxedUnit> p3, scala.reflect.ClassTag<T> p4, scala.reflect.api.TypeTags.TypeTag<T> p5 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withParquetTable:(Lscala/collection/Seq;Ljava/lang/String;Lscala/Function0;Lscala/reflect/ClassTag;Lscala/reflect/api/TypeTags$TypeTag;)V]
ParquetTest.withSQLConf ( scala.collection.Seq<scala.Tuple2<String,String>> p1, scala.Function0<scala.runtime.BoxedUnit> p2 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withSQLConf:(Lscala/collection/Seq;Lscala/Function0;)V]
ParquetTest.withTempDir ( scala.Function1<java.io.File,scala.runtime.BoxedUnit> p1 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withTempDir:(Lscala/Function1;)V]
ParquetTest.withTempPath ( scala.Function1<java.io.File,scala.runtime.BoxedUnit> p1 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withTempPath:(Lscala/Function1;)V]
ParquetTest.withTempTable ( String p1, scala.Function0<scala.runtime.BoxedUnit> p2 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withTempTable:(Ljava/lang/String;Lscala/Function0;)V]
spark-sql_2.10-1.3.0.jar, ParquetTypeInfo.class
package org.apache.spark.sql.parquet
ParquetTypeInfo.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.canEqual:(Ljava/lang/Object;)Z]
ParquetTypeInfo.copy ( parquet.schema.PrimitiveType.PrimitiveTypeName primitiveType, scala.Option<parquet.schema.OriginalType> originalType, scala.Option<parquet.schema.DecimalMetadata> decimalMetadata, scala.Option<Object> length ) : ParquetTypeInfo
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.copy:(Lparquet/schema/PrimitiveType$PrimitiveTypeName;Lscala/Option;Lscala/Option;Lscala/Option;)Lorg/apache/spark/sql/parquet/ParquetTypeInfo;]
ParquetTypeInfo.curried ( ) [static] : scala.Function1<parquet.schema.PrimitiveType.PrimitiveTypeName,scala.Function1<scala.Option<parquet.schema.OriginalType>,scala.Function1<scala.Option<parquet.schema.DecimalMetadata>,scala.Function1<scala.Option<Object>,ParquetTypeInfo>>>>
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.curried:()Lscala/Function1;]
ParquetTypeInfo.decimalMetadata ( ) : scala.Option<parquet.schema.DecimalMetadata>
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.decimalMetadata:()Lscala/Option;]
ParquetTypeInfo.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.equals:(Ljava/lang/Object;)Z]
ParquetTypeInfo.hashCode ( ) : int
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.hashCode:()I]
ParquetTypeInfo.length ( ) : scala.Option<Object>
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.length:()Lscala/Option;]
ParquetTypeInfo.originalType ( ) : scala.Option<parquet.schema.OriginalType>
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.originalType:()Lscala/Option;]
ParquetTypeInfo.ParquetTypeInfo ( parquet.schema.PrimitiveType.PrimitiveTypeName primitiveType, scala.Option<parquet.schema.OriginalType> originalType, scala.Option<parquet.schema.DecimalMetadata> decimalMetadata, scala.Option<Object> length )
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo."<init>":(Lparquet/schema/PrimitiveType$PrimitiveTypeName;Lscala/Option;Lscala/Option;Lscala/Option;)V]
ParquetTypeInfo.primitiveType ( ) : parquet.schema.PrimitiveType.PrimitiveTypeName
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.primitiveType:()Lparquet/schema/PrimitiveType$PrimitiveTypeName;]
ParquetTypeInfo.productArity ( ) : int
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.productArity:()I]
ParquetTypeInfo.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.productElement:(I)Ljava/lang/Object;]
ParquetTypeInfo.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.productIterator:()Lscala/collection/Iterator;]
ParquetTypeInfo.productPrefix ( ) : String
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.productPrefix:()Ljava/lang/String;]
ParquetTypeInfo.toString ( ) : String
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.toString:()Ljava/lang/String;]
ParquetTypeInfo.tupled ( ) [static] : scala.Function1<scala.Tuple4<parquet.schema.PrimitiveType.PrimitiveTypeName,scala.Option<parquet.schema.OriginalType>,scala.Option<parquet.schema.DecimalMetadata>,scala.Option<Object>>,ParquetTypeInfo>
[mangled: org/apache/spark/sql/parquet/ParquetTypeInfo.tupled:()Lscala/Function1;]
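ParquetTypeInfo is a plain holder describing how a Catalyst type maps onto a Parquet primitive; a minimal sketch (the INT32 mapping is just an illustration):

package org.apache.spark.sql.parquet

import parquet.schema.PrimitiveType.PrimitiveTypeName

object ParquetTypeInfoSketch {
  def main(args: Array[String]): Unit = {
    // e.g. a Catalyst IntegerType: plain INT32 with no original type,
    // no decimal metadata and no fixed length.
    val info = ParquetTypeInfo(PrimitiveTypeName.INT32, None, None, None)
    println(info.primitiveType) // INT32
    println(info.length)        // None; only fixed-length types carry one
  }
}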
spark-sql_2.10-1.3.0.jar, Partition.class
package org.apache.spark.sql.parquet
Partition.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/Partition.canEqual:(Ljava/lang/Object;)Z]
Partition.copy ( org.apache.spark.sql.Row values, String path ) : Partition
[mangled: org/apache/spark/sql/parquet/Partition.copy:(Lorg/apache/spark/sql/Row;Ljava/lang/String;)Lorg/apache/spark/sql/parquet/Partition;]
Partition.curried ( ) [static] : scala.Function1<org.apache.spark.sql.Row,scala.Function1<String,Partition>>
[mangled: org/apache/spark/sql/parquet/Partition.curried:()Lscala/Function1;]
Partition.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/Partition.equals:(Ljava/lang/Object;)Z]
Partition.hashCode ( ) : int
[mangled: org/apache/spark/sql/parquet/Partition.hashCode:()I]
Partition.Partition ( org.apache.spark.sql.Row values, String path )
[mangled: org/apache/spark/sql/parquet/Partition."<init>":(Lorg/apache/spark/sql/Row;Ljava/lang/String;)V]
Partition.path ( ) : String
[mangled: org/apache/spark/sql/parquet/Partition.path:()Ljava/lang/String;]
Partition.productArity ( ) : int
[mangled: org/apache/spark/sql/parquet/Partition.productArity:()I]
Partition.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/parquet/Partition.productElement:(I)Ljava/lang/Object;]
Partition.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/parquet/Partition.productIterator:()Lscala/collection/Iterator;]
Partition.productPrefix ( ) : String
[mangled: org/apache/spark/sql/parquet/Partition.productPrefix:()Ljava/lang/String;]
Partition.toString ( ) : String
[mangled: org/apache/spark/sql/parquet/Partition.toString:()Ljava/lang/String;]
Partition.tupled ( ) [static] : scala.Function1<scala.Tuple2<org.apache.spark.sql.Row,String>,Partition>
[mangled: org/apache/spark/sql/parquet/Partition.tupled:()Lscala/Function1;]
Partition.values ( ) : org.apache.spark.sql.Row
[mangled: org/apache/spark/sql/parquet/Partition.values:()Lorg/apache/spark/sql/Row;]
spark-sql_2.10-1.3.0.jar, PartitionSpec.class
package org.apache.spark.sql.parquet
PartitionSpec.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/PartitionSpec.canEqual:(Ljava/lang/Object;)Z]
PartitionSpec.copy ( org.apache.spark.sql.types.StructType partitionColumns, scala.collection.Seq<Partition> partitions ) : PartitionSpec
[mangled: org/apache/spark/sql/parquet/PartitionSpec.copy:(Lorg/apache/spark/sql/types/StructType;Lscala/collection/Seq;)Lorg/apache/spark/sql/parquet/PartitionSpec;]
PartitionSpec.curried ( ) [static] : scala.Function1<org.apache.spark.sql.types.StructType,scala.Function1<scala.collection.Seq<Partition>,PartitionSpec>>
[mangled: org/apache/spark/sql/parquet/PartitionSpec.curried:()Lscala/Function1;]
PartitionSpec.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/PartitionSpec.equals:(Ljava/lang/Object;)Z]
PartitionSpec.hashCode ( ) : int
[mangled: org/apache/spark/sql/parquet/PartitionSpec.hashCode:()I]
PartitionSpec.partitionColumns ( ) : org.apache.spark.sql.types.StructType
[mangled: org/apache/spark/sql/parquet/PartitionSpec.partitionColumns:()Lorg/apache/spark/sql/types/StructType;]
PartitionSpec.partitions ( ) : scala.collection.Seq<Partition>
[mangled: org/apache/spark/sql/parquet/PartitionSpec.partitions:()Lscala/collection/Seq;]
PartitionSpec.PartitionSpec ( org.apache.spark.sql.types.StructType partitionColumns, scala.collection.Seq<Partition> partitions )
[mangled: org/apache/spark/sql/parquet/PartitionSpec."<init>":(Lorg/apache/spark/sql/types/StructType;Lscala/collection/Seq;)V]
PartitionSpec.productArity ( ) : int
[mangled: org/apache/spark/sql/parquet/PartitionSpec.productArity:()I]
PartitionSpec.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/parquet/PartitionSpec.productElement:(I)Ljava/lang/Object;]
PartitionSpec.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/parquet/PartitionSpec.productIterator:()Lscala/collection/Iterator;]
PartitionSpec.productPrefix ( ) : String
[mangled: org/apache/spark/sql/parquet/PartitionSpec.productPrefix:()Ljava/lang/String;]
PartitionSpec.toString ( ) : String
[mangled: org/apache/spark/sql/parquet/PartitionSpec.toString:()Ljava/lang/String;]
PartitionSpec.tupled ( ) [static] : scala.Function1<scala.Tuple2<org.apache.spark.sql.types.StructType,scala.collection.Seq<Partition>>,PartitionSpec>
[mangled: org/apache/spark/sql/parquet/PartitionSpec.tupled:()Lscala/Function1;]
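Partition and PartitionSpec together describe a discovered partition layout: the partition-column schema plus one (values, path) pair per leaf directory; a minimal sketch (paths hypothetical):

package org.apache.spark.sql.parquet

import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{IntegerType, StructField, StructType}

object PartitionSpecSketch {
  def main(args: Array[String]): Unit = {
    val partitionColumns = StructType(StructField("year", IntegerType, nullable = false) :: Nil)
    val partitions = Seq(
      Partition(Row(2014), "/data/year=2014"),
      Partition(Row(2015), "/data/year=2015"))
    val spec = PartitionSpec(partitionColumns, partitions)
    // Each Partition pairs the column values with the directory holding them.
    spec.partitions.foreach(p => println(s"${p.values} -> ${p.path}"))
  }
}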
spark-sql_2.10-1.3.0.jar, PhysicalRDD.class
package org.apache.spark.sql.execution
PhysicalRDD.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/execution/PhysicalRDD.children:()Lscala/collection/immutable/Nil$;]
PhysicalRDD.copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, org.apache.spark.rdd.RDD<org.apache.spark.sql.Row> rdd ) : PhysicalRDD
[mangled: org/apache/spark/sql/execution/PhysicalRDD.copy:(Lscala/collection/Seq;Lorg/apache/spark/rdd/RDD;)Lorg/apache/spark/sql/execution/PhysicalRDD;]
PhysicalRDD.curried ( ) [static] : scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>,scala.Function1<org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>,PhysicalRDD>>
[mangled: org/apache/spark/sql/execution/PhysicalRDD.curried:()Lscala/Function1;]
PhysicalRDD.PhysicalRDD ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, org.apache.spark.rdd.RDD<org.apache.spark.sql.Row> rdd )
[mangled: org/apache/spark/sql/execution/PhysicalRDD."<init>":(Lscala/collection/Seq;Lorg/apache/spark/rdd/RDD;)V]
PhysicalRDD.tupled ( ) [static] : scala.Function1<scala.Tuple2<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>,org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>>,PhysicalRDD>
[mangled: org/apache/spark/sql/execution/PhysicalRDD.tupled:()Lscala/Function1;]
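PhysicalRDD is the leaf physical node that injects an existing RDD of rows into a query plan; a minimal sketch (column name hypothetical):

package org.apache.spark.sql.execution

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.expressions.AttributeReference
import org.apache.spark.sql.types.IntegerType

object PhysicalRDDSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("physical-rdd"))
    val rows = sc.parallelize(Seq(Row(1), Row(2), Row(3)))
    val output = AttributeReference("a", IntegerType, nullable = false)() :: Nil
    val node = PhysicalRDD(output, rows)
    println(node.children) // empty: a leaf node, matching the Nil children() above
    sc.stop()
  }
}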
spark-sql_2.10-1.3.0.jar, PostgresQuirks.class
package org.apache.spark.sql.jdbc
PostgresQuirks.PostgresQuirks ( )
[mangled: org/apache/spark/sql/jdbc/PostgresQuirks."<init>":()V]
spark-sql_2.10-1.3.0.jar, PreWriteCheck.class
package org.apache.spark.sql.sources
PreWriteCheck.andThen ( scala.Function1<scala.runtime.BoxedUnit,A> g ) : scala.Function1<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcDD.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcDD.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcDF.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcDF.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcDI.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcDI.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcDJ.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcDJ.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcFD.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcFD.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcFF.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcFF.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcFI.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcFI.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcFJ.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcFJ.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcID.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcID.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcIF.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcIF.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcII.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcII.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcIJ.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcIJ.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcJD.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcJD.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcJF.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcJF.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcJI.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcJI.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcJJ.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcJJ.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcVD.sp ( scala.Function1<scala.runtime.BoxedUnit,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcVD.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcVF.sp ( scala.Function1<scala.runtime.BoxedUnit,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcVF.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcVI.sp ( scala.Function1<scala.runtime.BoxedUnit,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcVI.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcVJ.sp ( scala.Function1<scala.runtime.BoxedUnit,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcVJ.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcZD.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcZD.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcZF.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcZF.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcZI.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcZI.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.andThen.mcZJ.sp ( scala.Function1<Object,A> g ) : scala.Function1<Object,A>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.andThen.mcZJ.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.apply ( Object v1 ) : Object
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply:(Ljava/lang/Object;)Ljava/lang/Object;]
PreWriteCheck.apply ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan plan ) : void
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)V]
PreWriteCheck.apply.mcDD.sp ( double v1 ) : double
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcDD.sp:(D)D]
PreWriteCheck.apply.mcDF.sp ( float v1 ) : double
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcDF.sp:(F)D]
PreWriteCheck.apply.mcDI.sp ( int v1 ) : double
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcDI.sp:(I)D]
PreWriteCheck.apply.mcDJ.sp ( long v1 ) : double
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcDJ.sp:(J)D]
PreWriteCheck.apply.mcFD.sp ( double v1 ) : float
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcFD.sp:(D)F]
PreWriteCheck.apply.mcFF.sp ( float v1 ) : float
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcFF.sp:(F)F]
PreWriteCheck.apply.mcFI.sp ( int v1 ) : float
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcFI.sp:(I)F]
PreWriteCheck.apply.mcFJ.sp ( long v1 ) : float
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcFJ.sp:(J)F]
PreWriteCheck.apply.mcID.sp ( double v1 ) : int
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcID.sp:(D)I]
PreWriteCheck.apply.mcIF.sp ( float v1 ) : int
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcIF.sp:(F)I]
PreWriteCheck.apply.mcII.sp ( int v1 ) : int
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcII.sp:(I)I]
PreWriteCheck.apply.mcIJ.sp ( long v1 ) : int
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcIJ.sp:(J)I]
PreWriteCheck.apply.mcJD.sp ( double v1 ) : long
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcJD.sp:(D)J]
PreWriteCheck.apply.mcJF.sp ( float v1 ) : long
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcJF.sp:(F)J]
PreWriteCheck.apply.mcJI.sp ( int v1 ) : long
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcJI.sp:(I)J]
PreWriteCheck.apply.mcJJ.sp ( long v1 ) : long
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcJJ.sp:(J)J]
PreWriteCheck.apply.mcVD.sp ( double v1 ) : void
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcVD.sp:(D)V]
PreWriteCheck.apply.mcVF.sp ( float v1 ) : void
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcVF.sp:(F)V]
PreWriteCheck.apply.mcVI.sp ( int v1 ) : void
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcVI.sp:(I)V]
PreWriteCheck.apply.mcVJ.sp ( long v1 ) : void
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcVJ.sp:(J)V]
PreWriteCheck.apply.mcZD.sp ( double v1 ) : boolean
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcZD.sp:(D)Z]
PreWriteCheck.apply.mcZF.sp ( float v1 ) : boolean
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcZF.sp:(F)Z]
PreWriteCheck.apply.mcZI.sp ( int v1 ) : boolean
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcZI.sp:(I)Z]
PreWriteCheck.apply.mcZJ.sp ( long v1 ) : boolean
[mangled: org/apache/spark/sql/sources/PreWriteCheck.apply.mcZJ.sp:(J)Z]
PreWriteCheck.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/PreWriteCheck.canEqual:(Ljava/lang/Object;)Z]
PreWriteCheck.catalog ( ) : org.apache.spark.sql.catalyst.analysis.Catalog
[mangled: org/apache/spark/sql/sources/PreWriteCheck.catalog:()Lorg/apache/spark/sql/catalyst/analysis/Catalog;]
PreWriteCheck.compose ( scala.Function1<A,org.apache.spark.sql.catalyst.plans.logical.LogicalPlan> g ) : scala.Function1<A,scala.runtime.BoxedUnit>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcDD.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcDD.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcDF.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcDF.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcDI.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcDI.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcDJ.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcDJ.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcFD.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcFD.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcFF.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcFF.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcFI.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcFI.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcFJ.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcFJ.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcID.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcID.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcIF.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcIF.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcII.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcII.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcIJ.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcIJ.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcJD.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcJD.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcJF.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcJF.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcJI.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcJI.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcJJ.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcJJ.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcVD.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,scala.runtime.BoxedUnit>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcVD.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcVF.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,scala.runtime.BoxedUnit>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcVF.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcVI.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,scala.runtime.BoxedUnit>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcVI.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcVJ.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,scala.runtime.BoxedUnit>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcVJ.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcZD.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcZD.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcZF.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcZF.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcZI.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcZI.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.compose.mcZJ.sp ( scala.Function1<A,Object> g ) : scala.Function1<A,Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.compose.mcZJ.sp:(Lscala/Function1;)Lscala/Function1;]
PreWriteCheck.copy ( org.apache.spark.sql.catalyst.analysis.Catalog catalog ) : PreWriteCheck
[mangled: org/apache/spark/sql/sources/PreWriteCheck.copy:(Lorg/apache/spark/sql/catalyst/analysis/Catalog;)Lorg/apache/spark/sql/sources/PreWriteCheck;]
PreWriteCheck.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/PreWriteCheck.equals:(Ljava/lang/Object;)Z]
PreWriteCheck.failAnalysis ( String msg ) : scala.runtime.Nothing.
[mangled: org/apache/spark/sql/sources/PreWriteCheck.failAnalysis:(Ljava/lang/String;)Lscala/runtime/Nothing$;]
PreWriteCheck.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/PreWriteCheck.hashCode:()I]
PreWriteCheck.PreWriteCheck ( org.apache.spark.sql.catalyst.analysis.Catalog catalog )
[mangled: org/apache/spark/sql/sources/PreWriteCheck."<init>":(Lorg/apache/spark/sql/catalyst/analysis/Catalog;)V]
PreWriteCheck.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/PreWriteCheck.productArity:()I]
PreWriteCheck.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/PreWriteCheck.productElement:(I)Ljava/lang/Object;]
PreWriteCheck.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/PreWriteCheck.productIterator:()Lscala/collection/Iterator;]
PreWriteCheck.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/PreWriteCheck.productPrefix:()Ljava/lang/String;]
PreWriteCheck.toString ( ) : String
[mangled: org/apache/spark/sql/sources/PreWriteCheck.toString:()Ljava/lang/String;]
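The `andThen.mcDD.sp`-style entries above are not hand-written API either: PreWriteCheck extends `scala.Function1`, whose type parameters are `@specialized`, so the compiler emits one bridge per primitive combination (D=double, F=float, I=int, J=long, V=void, Z=boolean). A minimal sketch, unrelated to Spark, showing that these bridges appear for any Function1 subclass:

```scala
// Any subclass of a specialized function type gets $mc...$sp bridges
// (rendered as .mcII.sp etc. in this report).
class Doubler extends (Int => Int) {
  def apply(v1: Int): Int = v1 * 2
}

object SpecializationDemo extends App {
  val f = new Doubler
  println(f(21))                            // 42
  println(f.andThen((i: Int) => i + 1)(20)) // 41, andThen is inherited
  // List the compiler-generated specialized bridges by name:
  classOf[Doubler].getMethods
    .map(_.getName)
    .filter(_.contains("$mc"))
    .foreach(println)                       // e.g. apply$mcII$sp
}
```

Removing the one class removes every specialized bridge with it, which is why a single Function1 subclass accounts for dozens of lines here.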
spark-sql_2.10-1.3.0.jar, Project.class
package org.apache.spark.sql.execution
Project.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Project.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
Project.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Project.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, PythonUDF.class
package org.apache.spark.sql.execution
PythonUDF.copy ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, org.apache.spark.sql.types.DataType dataType, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> children ) : PythonUDF
[mangled: org/apache/spark/sql/execution/PythonUDF.copy:(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;Lscala/collection/Seq;)Lorg/apache/spark/sql/execution/PythonUDF;]
PythonUDF.eval ( org.apache.spark.sql.Row input ) : Object
[mangled: org/apache/spark/sql/execution/PythonUDF.eval:(Lorg/apache/spark/sql/Row;)Ljava/lang/Object;]
PythonUDF.eval ( org.apache.spark.sql.Row input ) : scala.runtime.Nothing.
[mangled: org/apache/spark/sql/execution/PythonUDF.eval:(Lorg/apache/spark/sql/Row;)Lscala/runtime/Nothing$;]
PythonUDF.PythonUDF ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, org.apache.spark.sql.types.DataType dataType, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> children )
[mangled: org/apache/spark/sql/execution/PythonUDF."<init>":(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;Lscala/collection/Seq;)V]
spark-sql_2.10-1.3.0.jar, RefreshTable.class
package org.apache.spark.sql.sources
RefreshTable.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/RefreshTable.canEqual:(Ljava/lang/Object;)Z]
RefreshTable.copy ( String databaseName, String tableName ) : RefreshTable
[mangled: org/apache/spark/sql/sources/RefreshTable.copy:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/sources/RefreshTable;]
RefreshTable.curried ( ) [static] : scala.Function1<String,scala.Function1<String,RefreshTable>>
[mangled: org/apache/spark/sql/sources/RefreshTable.curried:()Lscala/Function1;]
RefreshTable.databaseName ( ) : String
[mangled: org/apache/spark/sql/sources/RefreshTable.databaseName:()Ljava/lang/String;]
RefreshTable.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/RefreshTable.equals:(Ljava/lang/Object;)Z]
RefreshTable.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/RefreshTable.hashCode:()I]
RefreshTable.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/RefreshTable.productArity:()I]
RefreshTable.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/RefreshTable.productElement:(I)Ljava/lang/Object;]
RefreshTable.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/RefreshTable.productIterator:()Lscala/collection/Iterator;]
RefreshTable.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/RefreshTable.productPrefix:()Ljava/lang/String;]
RefreshTable.RefreshTable ( String databaseName, String tableName )
[mangled: org/apache/spark/sql/sources/RefreshTable."<init>":(Ljava/lang/String;Ljava/lang/String;)V]
RefreshTable.run ( org.apache.spark.sql.SQLContext sqlContext ) : scala.collection.Seq<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/sources/RefreshTable.run:(Lorg/apache/spark/sql/SQLContext;)Lscala/collection/Seq;]
RefreshTable.tableName ( ) : String
[mangled: org/apache/spark/sql/sources/RefreshTable.tableName:()Ljava/lang/String;]
RefreshTable.tupled ( ) [static] : scala.Function1<scala.Tuple2<String,String>,RefreshTable>
[mangled: org/apache/spark/sql/sources/RefreshTable.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, ResolvedDataSource.class
package org.apache.spark.sql.sources
ResolvedDataSource.apply ( org.apache.spark.sql.SQLContext p1, scala.Option<org.apache.spark.sql.types.StructType> p2, String p3, scala.collection.immutable.Map<String,String> p4 ) [static] : ResolvedDataSource
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.apply:(Lorg/apache/spark/sql/SQLContext;Lscala/Option;Ljava/lang/String;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/sources/ResolvedDataSource;]
ResolvedDataSource.apply ( org.apache.spark.sql.SQLContext p1, String p2, org.apache.spark.sql.SaveMode p3, scala.collection.immutable.Map<String,String> p4, org.apache.spark.sql.DataFrame p5 ) [static] : ResolvedDataSource
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.apply:(Lorg/apache/spark/sql/SQLContext;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/sources/ResolvedDataSource;]
ResolvedDataSource.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.canEqual:(Ljava/lang/Object;)Z]
ResolvedDataSource.copy ( Class<?> provider, BaseRelation relation ) : ResolvedDataSource
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.copy:(Ljava/lang/Class;Lorg/apache/spark/sql/sources/BaseRelation;)Lorg/apache/spark/sql/sources/ResolvedDataSource;]
ResolvedDataSource.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.equals:(Ljava/lang/Object;)Z]
ResolvedDataSource.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.hashCode:()I]
ResolvedDataSource.lookupDataSource ( String p1 ) [static] : Class<?>
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.lookupDataSource:(Ljava/lang/String;)Ljava/lang/Class;]
ResolvedDataSource.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.productArity:()I]
ResolvedDataSource.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.productElement:(I)Ljava/lang/Object;]
ResolvedDataSource.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.productIterator:()Lscala/collection/Iterator;]
ResolvedDataSource.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.productPrefix:()Ljava/lang/String;]
ResolvedDataSource.provider ( ) : Class<?>
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.provider:()Ljava/lang/Class;]
ResolvedDataSource.relation ( ) : BaseRelation
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.relation:()Lorg/apache/spark/sql/sources/BaseRelation;]
ResolvedDataSource.ResolvedDataSource ( Class<?> provider, BaseRelation relation )
[mangled: org/apache/spark/sql/sources/ResolvedDataSource."<init>":(Ljava/lang/Class;Lorg/apache/spark/sql/sources/BaseRelation;)V]
ResolvedDataSource.toString ( ) : String
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.toString:()Ljava/lang/String;]
spark-sql_2.10-1.3.0.jar, RowReadSupport.class
package org.apache.spark.sql.parquet
RowReadSupport.RowReadSupport ( )
[mangled: org/apache/spark/sql/parquet/RowReadSupport."<init>":()V]
spark-sql_2.10-1.3.0.jar, RowRecordMaterializer.class
package org.apache.spark.sql.parquet
RowRecordMaterializer.RowRecordMaterializer ( parquet.schema.MessageType parquetSchema, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> attributes )
[mangled: org/apache/spark/sql/parquet/RowRecordMaterializer."<init>":(Lparquet/schema/MessageType;Lscala/collection/Seq;)V]
spark-sql_2.10-1.3.0.jar, RowWriteSupport.class
package org.apache.spark.sql.parquet
RowWriteSupport.attributes ( ) : org.apache.spark.sql.catalyst.expressions.Attribute[ ]
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.attributes:()[Lorg/apache/spark/sql/catalyst/expressions/Attribute;]
RowWriteSupport.attributes_.eq ( org.apache.spark.sql.catalyst.expressions.Attribute[ ] p1 ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.attributes_.eq:([Lorg/apache/spark/sql/catalyst/expressions/Attribute;)V]
RowWriteSupport.getSchema ( org.apache.hadoop.conf.Configuration p1 ) [static] : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.getSchema:(Lorg/apache/hadoop/conf/Configuration;)Lscala/collection/Seq;]
RowWriteSupport.init ( org.apache.hadoop.conf.Configuration configuration ) : parquet.hadoop.api.WriteSupport.WriteContext
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.init:(Lorg/apache/hadoop/conf/Configuration;)Lparquet/hadoop/api/WriteSupport$WriteContext;]
RowWriteSupport.isTraceEnabled ( ) : boolean
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.isTraceEnabled:()Z]
RowWriteSupport.log ( ) : org.slf4j.Logger
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.log:()Lorg/slf4j/Logger;]
RowWriteSupport.logDebug ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.logDebug:(Lscala/Function0;)V]
RowWriteSupport.logDebug ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.logDebug:(Lscala/Function0;Ljava/lang/Throwable;)V]
RowWriteSupport.logError ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.logError:(Lscala/Function0;)V]
RowWriteSupport.logError ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.logError:(Lscala/Function0;Ljava/lang/Throwable;)V]
RowWriteSupport.logInfo ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.logInfo:(Lscala/Function0;)V]
RowWriteSupport.logInfo ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.logInfo:(Lscala/Function0;Ljava/lang/Throwable;)V]
RowWriteSupport.logName ( ) : String
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.logName:()Ljava/lang/String;]
RowWriteSupport.logTrace ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.logTrace:(Lscala/Function0;)V]
RowWriteSupport.logTrace ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.logTrace:(Lscala/Function0;Ljava/lang/Throwable;)V]
RowWriteSupport.logWarning ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.logWarning:(Lscala/Function0;)V]
RowWriteSupport.logWarning ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.logWarning:(Lscala/Function0;Ljava/lang/Throwable;)V]
RowWriteSupport.org.apache.spark.Logging..log_ ( ) : org.slf4j.Logger
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.org.apache.spark.Logging..log_:()Lorg/slf4j/Logger;]
RowWriteSupport.org.apache.spark.Logging..log__.eq ( org.slf4j.Logger p1 ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.org.apache.spark.Logging..log__.eq:(Lorg/slf4j/Logger;)V]
RowWriteSupport.prepareForWrite ( parquet.io.api.RecordConsumer recordConsumer ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.prepareForWrite:(Lparquet/io/api/RecordConsumer;)V]
RowWriteSupport.RowWriteSupport ( )
[mangled: org/apache/spark/sql/parquet/RowWriteSupport."<init>":()V]
RowWriteSupport.setSchema ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> p1, org.apache.hadoop.conf.Configuration p2 ) [static] : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.setSchema:(Lscala/collection/Seq;Lorg/apache/hadoop/conf/Configuration;)V]
RowWriteSupport.SPARK_ROW_SCHEMA ( ) [static] : String
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.SPARK_ROW_SCHEMA:()Ljava/lang/String;]
RowWriteSupport.write ( Object p1 ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.write:(Ljava/lang/Object;)V]
RowWriteSupport.write ( org.apache.spark.sql.Row record ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.write:(Lorg/apache/spark/sql/Row;)V]
RowWriteSupport.writeArray ( org.apache.spark.sql.types.ArrayType schema, scala.collection.Seq<Object> array ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.writeArray:(Lorg/apache/spark/sql/types/ArrayType;Lscala/collection/Seq;)V]
RowWriteSupport.writeDecimal ( org.apache.spark.sql.types.Decimal decimal, int precision ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.writeDecimal:(Lorg/apache/spark/sql/types/Decimal;I)V]
RowWriteSupport.writeMap ( org.apache.spark.sql.types.MapType schema, scala.collection.immutable.Map<?,Object> map ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.writeMap:(Lorg/apache/spark/sql/types/MapType;Lscala/collection/immutable/Map;)V]
RowWriteSupport.writePrimitive ( org.apache.spark.sql.types.DataType schema, Object value ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.writePrimitive:(Lorg/apache/spark/sql/types/DataType;Ljava/lang/Object;)V]
RowWriteSupport.writer ( ) : parquet.io.api.RecordConsumer
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.writer:()Lparquet/io/api/RecordConsumer;]
RowWriteSupport.writer_.eq ( parquet.io.api.RecordConsumer p1 ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.writer_.eq:(Lparquet/io/api/RecordConsumer;)V]
RowWriteSupport.writeStruct ( org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.Row struct ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.writeStruct:(Lorg/apache/spark/sql/types/StructType;Lorg/apache/spark/sql/Row;)V]
RowWriteSupport.writeTimestamp ( java.sql.Timestamp ts ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.writeTimestamp:(Ljava/sql/Timestamp;)V]
RowWriteSupport.writeValue ( org.apache.spark.sql.types.DataType schema, Object value ) : void
[mangled: org/apache/spark/sql/parquet/RowWriteSupport.writeValue:(Lorg/apache/spark/sql/types/DataType;Ljava/lang/Object;)V]
spark-sql_2.10-1.3.0.jar, Sample.class
package org.apache.spark.sql.execution
Sample.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Sample.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
Sample.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Sample.children:()Lscala/collection/immutable/List;]
Sample.copy ( double fraction, boolean withReplacement, long seed, SparkPlan child ) : Sample
[mangled: org/apache/spark/sql/execution/Sample.copy:(DZJLorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/Sample;]
Sample.fraction ( ) : double
[mangled: org/apache/spark/sql/execution/Sample.fraction:()D]
Sample.Sample ( double fraction, boolean withReplacement, long seed, SparkPlan child )
[mangled: org/apache/spark/sql/execution/Sample."<init>":(DZJLorg/apache/spark/sql/execution/SparkPlan;)V]
spark-sql_2.10-1.3.0.jar, SetCommand.class
package org.apache.spark.sql.execution
SetCommand.copy ( scala.Option<scala.Tuple2<String,scala.Option<String>>> kv, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output ) : SetCommand
[mangled: org/apache/spark/sql/execution/SetCommand.copy:(Lscala/Option;Lscala/collection/Seq;)Lorg/apache/spark/sql/execution/SetCommand;]
SetCommand.curried ( ) [static] : scala.Function1<scala.Option<scala.Tuple2<String,scala.Option<String>>>,scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>,SetCommand>>
[mangled: org/apache/spark/sql/execution/SetCommand.curried:()Lscala/Function1;]
SetCommand.SetCommand ( scala.Option<scala.Tuple2<String,scala.Option<String>>> kv, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output )
[mangled: org/apache/spark/sql/execution/SetCommand."<init>":(Lscala/Option;Lscala/collection/Seq;)V]
SetCommand.tupled ( ) [static] : scala.Function1<scala.Tuple2<scala.Option<scala.Tuple2<String,scala.Option<String>>>,scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>>,SetCommand>
[mangled: org/apache/spark/sql/execution/SetCommand.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, ShuffledHashJoin.class
package org.apache.spark.sql.execution.joins
ShuffledHashJoin.hashJoin ( scala.collection.Iterator<org.apache.spark.sql.Row> streamIter, HashedRelation hashedRelation ) : scala.collection.Iterator<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/ShuffledHashJoin.hashJoin:(Lscala/collection/Iterator;Lorg/apache/spark/sql/execution/joins/HashedRelation;)Lscala/collection/Iterator;]
ShuffledHashJoin.left ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/ShuffledHashJoin.left:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
ShuffledHashJoin.right ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/joins/ShuffledHashJoin.right:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
ShuffledHashJoin.streamSideKeyGenerator ( ) : scala.Function0<org.apache.spark.sql.catalyst.expressions.package.MutableProjection>
[mangled: org/apache/spark/sql/execution/joins/ShuffledHashJoin.streamSideKeyGenerator:()Lscala/Function0;]
spark-sql_2.10-1.3.0.jar, Sort.class
package org.apache.spark.sql.execution
Sort.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/Sort.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
Sort.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Sort.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, SparkStrategies.class
package org.apache.spark.sql.execution
SparkStrategies.HashJoin ( ) : SparkStrategies.HashJoin.
[mangled: org/apache/spark/sql/execution/SparkStrategies.HashJoin:()Lorg/apache/spark/sql/execution/SparkStrategies$HashJoin$;]
SparkStrategies.ParquetOperations ( ) : SparkStrategies.ParquetOperations.
[mangled: org/apache/spark/sql/execution/SparkStrategies.ParquetOperations:()Lorg/apache/spark/sql/execution/SparkStrategies$ParquetOperations$;]
SparkStrategies.TakeOrdered ( ) : SparkStrategies.TakeOrdered.
[mangled: org/apache/spark/sql/execution/SparkStrategies.TakeOrdered:()Lorg/apache/spark/sql/execution/SparkStrategies$TakeOrdered$;]
spark-sql_2.10-1.3.0.jar, SQLConf.class
package org.apache.spark.sql
SQLConf.getConf ( String key ) : String
[mangled: org/apache/spark/sql/SQLConf.getConf:(Ljava/lang/String;)Ljava/lang/String;]
SQLConf.getConf ( String key, String defaultValue ) : String
[mangled: org/apache/spark/sql/SQLConf.getConf:(Ljava/lang/String;Ljava/lang/String;)Ljava/lang/String;]
SQLConf.parquetUseDataSourceApi ( ) : boolean
[mangled: org/apache/spark/sql/SQLConf.parquetUseDataSourceApi:()Z]
SQLConf.setConf ( String key, String value ) : void
[mangled: org/apache/spark/sql/SQLConf.setConf:(Ljava/lang/String;Ljava/lang/String;)V]
spark-sql_2.10-1.3.0.jar, SQLContext.class
package org.apache.spark.sql
SQLContext.cacheManager ( ) : CacheManager
[mangled: org/apache/spark/sql/SQLContext.cacheManager:()Lorg/apache/spark/sql/CacheManager;]
SQLContext.checkAnalysis ( ) : catalyst.analysis.CheckAnalysis
[mangled: org/apache/spark/sql/SQLContext.checkAnalysis:()Lorg/apache/spark/sql/catalyst/analysis/CheckAnalysis;]
SQLContext.createDataFrame ( org.apache.spark.api.java.JavaRDD<Row> rowRDD, java.util.List<String> columns ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/api/java/JavaRDD;Ljava/util/List;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.ddlParser ( ) : sources.DDLParser
[mangled: org/apache/spark/sql/SQLContext.ddlParser:()Lorg/apache/spark/sql/sources/DDLParser;]
spark-sql_2.10-1.3.0.jar, TakeOrdered.class
package org.apache.spark.sql.execution
TakeOrdered.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/TakeOrdered.canEqual:(Ljava/lang/Object;)Z]
TakeOrdered.child ( ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/TakeOrdered.child:()Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
TakeOrdered.child ( ) : SparkPlan
[mangled: org/apache/spark/sql/execution/TakeOrdered.child:()Lorg/apache/spark/sql/execution/SparkPlan;]
TakeOrdered.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/TakeOrdered.children:()Lscala/collection/immutable/List;]
TakeOrdered.children ( ) : scala.collection.Seq
[mangled: org/apache/spark/sql/execution/TakeOrdered.children:()Lscala/collection/Seq;]
TakeOrdered.copy ( int limit, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder> sortOrder, SparkPlan child ) : TakeOrdered
[mangled: org/apache/spark/sql/execution/TakeOrdered.copy:(ILscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/TakeOrdered;]
TakeOrdered.curried ( ) [static] : scala.Function1<Object,scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>,scala.Function1<SparkPlan,TakeOrdered>>>
[mangled: org/apache/spark/sql/execution/TakeOrdered.curried:()Lscala/Function1;]
TakeOrdered.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/TakeOrdered.equals:(Ljava/lang/Object;)Z]
TakeOrdered.execute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/TakeOrdered.execute:()Lorg/apache/spark/rdd/RDD;]
TakeOrdered.executeCollect ( ) : org.apache.spark.sql.Row[ ]
[mangled: org/apache/spark/sql/execution/TakeOrdered.executeCollect:()[Lorg/apache/spark/sql/Row;]
TakeOrdered.hashCode ( ) : int
[mangled: org/apache/spark/sql/execution/TakeOrdered.hashCode:()I]
TakeOrdered.limit ( ) : int
[mangled: org/apache/spark/sql/execution/TakeOrdered.limit:()I]
TakeOrdered.ord ( ) : org.apache.spark.sql.catalyst.expressions.RowOrdering
[mangled: org/apache/spark/sql/execution/TakeOrdered.ord:()Lorg/apache/spark/sql/catalyst/expressions/RowOrdering;]
TakeOrdered.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/execution/TakeOrdered.output:()Lscala/collection/Seq;]
TakeOrdered.outputPartitioning ( ) : org.apache.spark.sql.catalyst.plans.physical.Partitioning
[mangled: org/apache/spark/sql/execution/TakeOrdered.outputPartitioning:()Lorg/apache/spark/sql/catalyst/plans/physical/Partitioning;]
TakeOrdered.outputPartitioning ( ) : org.apache.spark.sql.catalyst.plans.physical.SinglePartition.
[mangled: org/apache/spark/sql/execution/TakeOrdered.outputPartitioning:()Lorg/apache/spark/sql/catalyst/plans/physical/SinglePartition$;]
TakeOrdered.productArity ( ) : int
[mangled: org/apache/spark/sql/execution/TakeOrdered.productArity:()I]
TakeOrdered.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/execution/TakeOrdered.productElement:(I)Ljava/lang/Object;]
TakeOrdered.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/execution/TakeOrdered.productIterator:()Lscala/collection/Iterator;]
TakeOrdered.productPrefix ( ) : String
[mangled: org/apache/spark/sql/execution/TakeOrdered.productPrefix:()Ljava/lang/String;]
TakeOrdered.sortOrder ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/TakeOrdered.sortOrder:()Lscala/collection/Seq;]
TakeOrdered.TakeOrdered ( int limit, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder> sortOrder, SparkPlan child )
[mangled: org/apache/spark/sql/execution/TakeOrdered."<init>":(ILscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;)V]
TakeOrdered.tupled ( ) [static] : scala.Function1<scala.Tuple3<Object,scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>,SparkPlan>,TakeOrdered>
[mangled: org/apache/spark/sql/execution/TakeOrdered.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, TestGroupWriteSupport.class
package org.apache.spark.sql.parquet
TestGroupWriteSupport.TestGroupWriteSupport ( parquet.schema.MessageType schema )
[mangled: org/apache/spark/sql/parquet/TestGroupWriteSupport."<init>":(Lparquet/schema/MessageType;)V]
spark-sql_2.10-1.3.0.jar, UserDefinedFunction.class
package org.apache.spark.sql
UserDefinedFunction.copy ( Object f, types.DataType dataType ) : UserDefinedFunction
[mangled: org/apache/spark/sql/UserDefinedFunction.copy:(Ljava/lang/Object;Lorg/apache/spark/sql/types/DataType;)Lorg/apache/spark/sql/UserDefinedFunction;]
UserDefinedFunction.UserDefinedFunction ( Object f, types.DataType dataType )
[mangled: org/apache/spark/sql/UserDefinedFunction."<init>":(Ljava/lang/Object;Lorg/apache/spark/sql/types/DataType;)V]
spark-sql_2.10-1.3.0.jar, UserDefinedPythonFunction.class
package org.apache.spark.sql
UserDefinedPythonFunction.copy ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, types.DataType dataType ) : UserDefinedPythonFunction
[mangled: org/apache/spark/sql/UserDefinedPythonFunction.copy:(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;)Lorg/apache/spark/sql/UserDefinedPythonFunction;]
UserDefinedPythonFunction.UserDefinedPythonFunction ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, types.DataType dataType )
[mangled: org/apache/spark/sql/UserDefinedPythonFunction."<init>":(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;)V]
Problems with Data Types, High Severity (104)
spark-sql_2.10-1.3.0.jar
package org.apache.spark.sql
CachedData (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
CachedData ( catalyst.plans.logical.LogicalPlan, columnar.InMemoryRelation ) This constructor is from 'CachedData' class.
cachedRepresentation ( ) This method is from 'CachedData' class.
canEqual ( java.lang.Object ) This method is from 'CachedData' class.
copy ( catalyst.plans.logical.LogicalPlan, columnar.InMemoryRelation ) This method is from 'CachedData' class.
curried ( ) This method is from 'CachedData' class.
equals ( java.lang.Object ) This method is from 'CachedData' class.
hashCode ( ) This method is from 'CachedData' class.
plan ( ) This method is from 'CachedData' class.
productArity ( ) This method is from 'CachedData' class.
productElement ( int ) This method is from 'CachedData' class.
productIterator ( ) This method is from 'CachedData' class.
...
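Since NoClassDefFoundError is raised lazily, at the first point a code path links against the missing class, a client that must run on both versions can probe for the class by name before using it. A minimal sketch, assuming nothing beyond the fully qualified class name:

```scala
// Probe for a class by name without linking against it.
object ClassProbe {
  def isPresent(fqcn: String): Boolean =
    try { Class.forName(fqcn, false, getClass.getClassLoader); true }
    catch { case _: ClassNotFoundException => false }
}

object ProbeDemo extends App {
  if (ClassProbe.isPresent("org.apache.spark.sql.CachedData"))
    println("1.3.x on the classpath: CachedData is available")
  else
    println("CachedData is gone (1.5.x): take a fallback path")
}
```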
CacheManager (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (9):
CacheManager ( SQLContext ) This constructor is from 'CacheManager' class.
cacheQuery ( DataFrame, scala.Option<java.lang.String>, org.apache.spark.storage.StorageLevel ) This method is from 'CacheManager' class.
cacheTable ( java.lang.String ) This method is from 'CacheManager' class.
clearCache ( ) This method is from 'CacheManager' class.
invalidateCache ( catalyst.plans.logical.LogicalPlan ) This method is from 'CacheManager' class.
isCached ( java.lang.String ) This method is from 'CacheManager' class.
tryUncacheQuery ( DataFrame, boolean ) This method is from 'CacheManager' class.
uncacheTable ( java.lang.String ) This method is from 'CacheManager' class.
useCachedData ( catalyst.plans.logical.LogicalPlan ) This method is from 'CacheManager' class.
DataFrame (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface RDDApi<Row>. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
agg ( java.util.Map<java.lang.String,java.lang.String> ) This method is from 'DataFrame' class.
agg ( Column, Column... ) This method is from 'DataFrame' class.
agg ( Column, scala.collection.Seq<Column> ) This method is from 'DataFrame' class.
agg ( scala.collection.immutable.Map<java.lang.String,java.lang.String> ) This method is from 'DataFrame' class.
agg ( scala.Tuple2<java.lang.String,java.lang.String>, scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.String>> ) This method is from 'DataFrame' class.
apply ( java.lang.String ) This method is from 'DataFrame' class.
as ( java.lang.String ) This method is from 'DataFrame' class.
as ( scala.Symbol ) This method is from 'DataFrame' class.
cache ( ) This method is from 'DataFrame' class.
col ( java.lang.String ) This method is from 'DataFrame' class.
collect ( ) This method is from 'DataFrame' class.
...
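Call sites compiled while DataFrame still implemented RDDApi<Row> invoke these methods through the interface, so they break even though DataFrame keeps equivalent methods of its own. A minimal sketch of the failure mode with stand-in types (Api and Impl are illustrative, not Spark's):

```scala
// Stand-in for the old hierarchy: Impl used to extend Api.
trait Api { def collect(): Array[String] }
class Impl { def collect(): Array[String] = Array("row") } // interface dropped

object InterfaceCheck extends App {
  // Check the hierarchy at runtime before casting to the interface:
  if (classOf[Api].isAssignableFrom(classOf[Impl]))
    println((new Impl).asInstanceOf[Api].collect().mkString(","))
  else
    println((new Impl).collect().mkString(",")) // call the class directly
}
```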
package org.apache.spark.sql.columnar
ColumnBuilder (1)
| # | Change | Effect |
|---|---|---|
| 1 | Abstract method appendFrom ( org.apache.spark.sql.Row, int ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (3):
build ( ) This abstract method is from 'ColumnBuilder' interface.
columnStats ( ) This abstract method is from 'ColumnBuilder' interface.
initialize ( int, java.lang.String, boolean ) This abstract method is from 'ColumnBuilder' interface.
ColumnStats (2)
| # | Change | Effect |
|---|---|---|
| 1 | Abstract method collectedStatistics ( ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
| 2 | Abstract method gatherStats ( org.apache.spark.sql.Row, int ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (7):
columnStats ( ) Return value of this abstract method has type 'ColumnStats'.
count ( ) This abstract method is from 'ColumnStats' interface.
count_.eq ( int ) This abstract method is from 'ColumnStats' interface.
nullCount ( ) This abstract method is from 'ColumnStats' interface.
nullCount_.eq ( int ) This abstract method is from 'ColumnStats' interface.
sizeInBytes ( ) This abstract method is from 'ColumnStats' interface.
sizeInBytes_.eq ( long ) This abstract method is from 'ColumnStats' interface.
InMemoryColumnarTableScan (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
attributes ( ) This method is from 'InMemoryColumnarTableScan' class.
buildFilter ( ) This method is from 'InMemoryColumnarTableScan' class.
canEqual ( java.lang.Object ) This method is from 'InMemoryColumnarTableScan' class.
children ( ) This method is from 'InMemoryColumnarTableScan' class.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, InMemoryRelation ) This method is from 'InMemoryColumnarTableScan' class.
curried ( ) This method is from 'InMemoryColumnarTableScan' class.
equals ( java.lang.Object ) This method is from 'InMemoryColumnarTableScan' class.
hashCode ( ) This method is from 'InMemoryColumnarTableScan' class.
InMemoryColumnarTableScan ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, InMemoryRelation ) This constructor is from 'InMemoryColumnarTableScan' class.
InMemoryColumnarTableScan..inMemoryPartitionPruningEnabled ( ) This method is from 'InMemoryColumnarTableScan' class.
InMemoryColumnarTableScan..statsFor ( org.apache.spark.sql.catalyst.expressions.Attribute ) This method is from 'InMemoryColumnarTableScan' class.
...
InMemoryRelation (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, InMemoryRelation ) 3rd parameter 'relation' of this method has type 'InMemoryRelation'.
InMemoryColumnarTableScan ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, InMemoryRelation ) 3rd parameter 'relation' of this method has type 'InMemoryRelation'.
relation ( ) Return value of this method has type 'InMemoryRelation'.
batchSize ( ) This method is from 'InMemoryRelation' class.
cachedColumnBuffers ( ) This method is from 'InMemoryRelation' class.
canEqual ( java.lang.Object ) This method is from 'InMemoryRelation' class.
child ( ) This method is from 'InMemoryRelation' class.
children ( ) This method is from 'InMemoryRelation' class.
equals ( java.lang.Object ) This method is from 'InMemoryRelation' class.
hashCode ( ) This method is from 'InMemoryRelation' class.
newInstance ( ) This method is from 'InMemoryRelation' class.
...
NullableColumnBuilder (2)
| # | Change | Effect |
|---|---|---|
| 1 | Abstract method appendFrom ( org.apache.spark.sql.Row, int ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
| 2 | Abstract method NullableColumnBuilder..super.appendFrom ( org.apache.spark.sql.Row, int ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
build ( ) This abstract method is from 'NullableColumnBuilder' interface.
buildNonNulls ( ) This abstract method is from 'NullableColumnBuilder' interface.
initialize ( int, java.lang.String, boolean ) This abstract method is from 'NullableColumnBuilder' interface.
nullCount ( ) This abstract method is from 'NullableColumnBuilder' interface.
nullCount_.eq ( int ) This abstract method is from 'NullableColumnBuilder' interface.
nulls ( ) This abstract method is from 'NullableColumnBuilder' interface.
nulls_.eq ( java.nio.ByteBuffer ) This abstract method is from 'NullableColumnBuilder' interface.
NullableColumnBuilder..pos ( ) This abstract method is from 'NullableColumnBuilder' interface.
NullableColumnBuilder..pos_.eq ( int ) This abstract method is from 'NullableColumnBuilder' interface.
NullableColumnBuilder..super.build ( ) This abstract method is from 'NullableColumnBuilder' interface.
NullableColumnBuilder..super.initialize ( int, java.lang.String, boolean ) This abstract method is from 'NullableColumnBuilder' interface.
...
TimestampColumnStats (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface ColumnStats. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (1):
TimestampColumnStats ( ) This constructor is from 'TimestampColumnStats' class.
package org.apache.spark.sql.columnar.compression
Encoder<T> (1)
| # | Change | Effect |
|---|---|---|
| 1 | Abstract method gatherCompressibilityStats ( org.apache.spark.sql.Row, int ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (5):
encoder ( org.apache.spark.sql.columnar.NativeColumnType<T> ) Return value of this abstract method has type 'Encoder<T>'.
compress ( java.nio.ByteBuffer, java.nio.ByteBuffer ) This abstract method is from 'Encoder<T>' interface.
compressedSize ( ) This abstract method is from 'Encoder<T>' interface.
compressionRatio ( ) This abstract method is from 'Encoder<T>' interface.
uncompressedSize ( ) This abstract method is from 'Encoder<T>' interface.
package org.apache.spark.sql.execution
AddExchange (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
AddExchange ( org.apache.spark.sql.SQLContext ) This constructor is from 'AddExchange' class.
andThen ( scala.Function1<AddExchange,A> ) This method is from 'AddExchange' class.
apply ( org.apache.spark.sql.catalyst.trees.TreeNode ) This method is from 'AddExchange' class.
apply ( SparkPlan ) This method is from 'AddExchange' class.
canEqual ( java.lang.Object ) This method is from 'AddExchange' class.
compose ( scala.Function1<A,org.apache.spark.sql.SQLContext> ) This method is from 'AddExchange' class.
copy ( org.apache.spark.sql.SQLContext ) This method is from 'AddExchange' class.
equals ( java.lang.Object ) This method is from 'AddExchange' class.
hashCode ( ) This method is from 'AddExchange' class.
numPartitions ( ) This method is from 'AddExchange' class.
productArity ( ) This method is from 'AddExchange' class.
...
Aggregate (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
Aggregate ( boolean, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>, SparkPlan ) This constructor is from 'Aggregate' class.
aggregateExpressions ( ) This method is from 'Aggregate' class.
canEqual ( java.lang.Object ) This method is from 'Aggregate' class.
child ( ) This method is from 'Aggregate' class.
children ( ) This method is from 'Aggregate' class.
ComputedAggregate ( ) This method is from 'Aggregate' class.
copy ( boolean, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>, SparkPlan ) This method is from 'Aggregate' class.
curried ( ) This method is from 'Aggregate' class.
equals ( java.lang.Object ) This method is from 'Aggregate' class.
groupingExpressions ( ) This method is from 'Aggregate' class.
hashCode ( ) This method is from 'Aggregate' class.
...
AggregateEvaluation (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
AggregateEvaluation ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, org.apache.spark.sql.catalyst.expressions.Expression ) This constructor is from 'AggregateEvaluation' class.
canEqual ( java.lang.Object ) This method is from 'AggregateEvaluation' class.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, org.apache.spark.sql.catalyst.expressions.Expression ) This method is from 'AggregateEvaluation' class.
curried ( ) This method is from 'AggregateEvaluation' class.
equals ( java.lang.Object ) This method is from 'AggregateEvaluation' class.
hashCode ( ) This method is from 'AggregateEvaluation' class.
initialValues ( ) This method is from 'AggregateEvaluation' class.
productArity ( ) This method is from 'AggregateEvaluation' class.
productElement ( int ) This method is from 'AggregateEvaluation' class.
productIterator ( ) This method is from 'AggregateEvaluation' class.
productPrefix ( ) This method is from 'AggregateEvaluation' class.
...
BatchPythonEvaluation (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
BatchPythonEvaluation ( PythonUDF, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan ) This constructor is from 'BatchPythonEvaluation' class.
canEqual ( java.lang.Object ) This method is from 'BatchPythonEvaluation' class.
child ( ) This method is from 'BatchPythonEvaluation' class.
children ( ) This method is from 'BatchPythonEvaluation' class.
copy ( PythonUDF, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan ) This method is from 'BatchPythonEvaluation' class.
curried ( ) This method is from 'BatchPythonEvaluation' class.
equals ( java.lang.Object ) This method is from 'BatchPythonEvaluation' class.
hashCode ( ) This method is from 'BatchPythonEvaluation' class.
output ( ) This method is from 'BatchPythonEvaluation' class.
productArity ( ) This method is from 'BatchPythonEvaluation' class.
productElement ( int ) This method is from 'BatchPythonEvaluation' class.
...
CacheTableCommand (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
CacheTableCommand ( java.lang.String, scala.Option<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>, boolean ) This constructor is from 'CacheTableCommand' class.
canEqual ( java.lang.Object ) This method is from 'CacheTableCommand' class.
copy ( java.lang.String, scala.Option<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>, boolean ) This method is from 'CacheTableCommand' class.
curried ( ) This method is from 'CacheTableCommand' class.
equals ( java.lang.Object ) This method is from 'CacheTableCommand' class.
hashCode ( ) This method is from 'CacheTableCommand' class.
isLazy ( ) This method is from 'CacheTableCommand' class.
output ( ) This method is from 'CacheTableCommand' class.
plan ( ) This method is from 'CacheTableCommand' class.
productArity ( ) This method is from 'CacheTableCommand' class.
productElement ( int ) This method is from 'CacheTableCommand' class.
...
DescribeCommand (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'DescribeCommand' class.
child ( )This method is from 'DescribeCommand' class.
copy ( SparkPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean )This method is from 'DescribeCommand' class.
curried ( )This method is from 'DescribeCommand' class.
DescribeCommand ( SparkPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean )This constructor is from 'DescribeCommand' class.
equals ( java.lang.Object )This method is from 'DescribeCommand' class.
hashCode ( )This method is from 'DescribeCommand' class.
isExtended ( )This method is from 'DescribeCommand' class.
output ( )This method is from 'DescribeCommand' class.
productArity ( )This method is from 'DescribeCommand' class.
productElement ( int )This method is from 'DescribeCommand' class.
...
Distinct (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'Distinct' class.
child ( )This method is from 'Distinct' class.
children ( )This method is from 'Distinct' class.
copy ( boolean, SparkPlan )This method is from 'Distinct' class.
curried ( )This method is from 'Distinct' class.
Distinct ( boolean, SparkPlan )This constructor is from 'Distinct' class.
equals ( java.lang.Object )This method is from 'Distinct' class.
execute ( )This method is from 'Distinct' class.
hashCode ( )This method is from 'Distinct' class.
...
EvaluatePython (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'EvaluatePython' class.
child ( )This method is from 'EvaluatePython' class.
copy ( PythonUDF, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, org.apache.spark.sql.catalyst.expressions.AttributeReference )This method is from 'EvaluatePython' class.
equals ( java.lang.Object )This method is from 'EvaluatePython' class.
EvaluatePython ( PythonUDF, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, org.apache.spark.sql.catalyst.expressions.AttributeReference )This constructor is from 'EvaluatePython' class.
fromJava ( java.lang.Object, org.apache.spark.sql.types.DataType )This method is from 'EvaluatePython' class.
hashCode ( )This method is from 'EvaluatePython' class.
output ( )This method is from 'EvaluatePython' class.
productArity ( )This method is from 'EvaluatePython' class.
productElement ( int )This method is from 'EvaluatePython' class.
productIterator ( )This method is from 'EvaluatePython' class.
...
Except (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'Except' class.
children ( )This method is from 'Except' class.
copy ( SparkPlan, SparkPlan )This method is from 'Except' class.
curried ( )This method is from 'Except' class.
equals ( java.lang.Object )This method is from 'Except' class.
Except ( SparkPlan, SparkPlan )This constructor is from 'Except' class.
hashCode ( )This method is from 'Except' class.
left ( )This method is from 'Except' class.
output ( )This method is from 'Except' class.
productArity ( )This method is from 'Except' class.
productElement ( int )This method is from 'Except' class.
...
Exchange (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'Exchange' class.
child ( )This method is from 'Exchange' class.
children ( )This method is from 'Exchange' class.
copy ( org.apache.spark.sql.catalyst.plans.physical.Partitioning, SparkPlan )This method is from 'Exchange' class.
curried ( )This method is from 'Exchange' class.
equals ( java.lang.Object )This method is from 'Exchange' class.
Exchange ( org.apache.spark.sql.catalyst.plans.physical.Partitioning, SparkPlan )This constructor is from 'Exchange' class.
hashCode ( )This method is from 'Exchange' class.
newPartitioning ( )This method is from 'Exchange' class.
output ( )This method is from 'Exchange' class.
outputPartitioning ( )This method is from 'Exchange' class.
...
ExecutedCommand (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
andThen ( scala.Function1<ExecutedCommand,A> )This method is from 'ExecutedCommand' class.
canEqual ( java.lang.Object )This method is from 'ExecutedCommand' class.
children ( )This method is from 'ExecutedCommand' class.
cmd ( )This method is from 'ExecutedCommand' class.
compose ( scala.Function1<A,RunnableCommand> )This method is from 'ExecutedCommand' class.
copy ( RunnableCommand )This method is from 'ExecutedCommand' class.
equals ( java.lang.Object )This method is from 'ExecutedCommand' class.
executeCollect ( )This method is from 'ExecutedCommand' class.
ExecutedCommand ( RunnableCommand )This constructor is from 'ExecutedCommand' class.
executeTake ( int )This method is from 'ExecutedCommand' class.
hashCode ( )This method is from 'ExecutedCommand' class.
...
Expand (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'Expand' class.
child ( )This method is from 'Expand' class.
children ( )This method is from 'Expand' class.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.GroupExpression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan )This method is from 'Expand' class.
curried ( )This method is from 'Expand' class.
equals ( java.lang.Object )This method is from 'Expand' class.
Expand ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.GroupExpression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan )This constructor is from 'Expand' class.
hashCode ( )This method is from 'Expand' class.
output ( )This method is from 'Expand' class.
outputPartitioning ( )This method is from 'Expand' class.
productArity ( )This method is from 'Expand' class.
...
ExplainCommand (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'ExplainCommand' class.
copy ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean )This method is from 'ExplainCommand' class.
curried ( )This method is from 'ExplainCommand' class.
equals ( java.lang.Object )This method is from 'ExplainCommand' class.
ExplainCommand ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean )This constructor is from 'ExplainCommand' class.
extended ( )This method is from 'ExplainCommand' class.
hashCode ( )This method is from 'ExplainCommand' class.
logicalPlan ( )This method is from 'ExplainCommand' class.
output ( )This method is from 'ExplainCommand' class.
productArity ( )This method is from 'ExplainCommand' class.
productElement ( int )This method is from 'ExplainCommand' class.
...
ExternalSort (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'ExternalSort' class.
child ( )This method is from 'ExternalSort' class.
children ( )This method is from 'ExternalSort' class.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>, boolean, SparkPlan )This method is from 'ExternalSort' class.
curried ( )This method is from 'ExternalSort' class.
equals ( java.lang.Object )This method is from 'ExternalSort' class.
ExternalSort ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>, boolean, SparkPlan )This constructor is from 'ExternalSort' class.
global ( )This method is from 'ExternalSort' class.
hashCode ( )This method is from 'ExternalSort' class.
output ( )This method is from 'ExternalSort' class.
outputPartitioning ( )This method is from 'ExternalSort' class.
...
Filter (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'Filter' class.
child ( )This method is from 'Filter' class.
children ( )This method is from 'Filter' class.
condition ( )This method is from 'Filter' class.
copy ( org.apache.spark.sql.catalyst.expressions.Expression, SparkPlan )This method is from 'Filter' class.
curried ( )This method is from 'Filter' class.
equals ( java.lang.Object )This method is from 'Filter' class.
Filter ( org.apache.spark.sql.catalyst.expressions.Expression, SparkPlan )This constructor is from 'Filter' class.
hashCode ( )This method is from 'Filter' class.
output ( )This method is from 'Filter' class.
outputPartitioning ( )This method is from 'Filter' class.
...
Generate (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
boundGenerator ( )This method is from 'Generate' class.
canEqual ( java.lang.Object )This method is from 'Generate' class.
child ( )This method is from 'Generate' class.
children ( )This method is from 'Generate' class.
curried ( )This method is from 'Generate' class.
equals ( java.lang.Object )This method is from 'Generate' class.
generator ( )This method is from 'Generate' class.
hashCode ( )This method is from 'Generate' class.
join ( )This method is from 'Generate' class.
outer ( )This method is from 'Generate' class.
output ( )This method is from 'Generate' class.
...
GeneratedAggregate (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
aggregateExpressions ( )This method is from 'GeneratedAggregate' class.
canEqual ( java.lang.Object )This method is from 'GeneratedAggregate' class.
child ( )This method is from 'GeneratedAggregate' class.
children ( )This method is from 'GeneratedAggregate' class.
copy ( boolean, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>, SparkPlan )This method is from 'GeneratedAggregate' class.
curried ( )This method is from 'GeneratedAggregate' class.
equals ( java.lang.Object )This method is from 'GeneratedAggregate' class.
execute ( )This method is from 'GeneratedAggregate' class.
GeneratedAggregate ( boolean, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>, SparkPlan )This constructor is from 'GeneratedAggregate' class.
...
Intersect (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'Intersect' class.
children ( )This method is from 'Intersect' class.
copy ( SparkPlan, SparkPlan )This method is from 'Intersect' class.
curried ( )This method is from 'Intersect' class.
equals ( java.lang.Object )This method is from 'Intersect' class.
hashCode ( )This method is from 'Intersect' class.
Intersect ( SparkPlan, SparkPlan )This constructor is from 'Intersect' class.
left ( )This method is from 'Intersect' class.
output ( )This method is from 'Intersect' class.
productArity ( )This method is from 'Intersect' class.
productElement ( int )This method is from 'Intersect' class.
...
Limit (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'Limit' class.
child ( )This method is from 'Limit' class.
children ( )This method is from 'Limit' class.
copy ( int, SparkPlan )This method is from 'Limit' class.
curried ( )This method is from 'Limit' class.
equals ( java.lang.Object )This method is from 'Limit' class.
executeCollect ( )This method is from 'Limit' class.
hashCode ( )This method is from 'Limit' class.
limit ( )This method is from 'Limit' class.
Limit ( int, SparkPlan )This constructor is from 'Limit' class.
output ( )This method is from 'Limit' class.
...
LocalTableScan (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'LocalTableScan' class.
children ( )This method is from 'LocalTableScan' class.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, scala.collection.Seq<org.apache.spark.sql.Row> )This method is from 'LocalTableScan' class.
curried ( )This method is from 'LocalTableScan' class.
equals ( java.lang.Object )This method is from 'LocalTableScan' class.
executeCollect ( )This method is from 'LocalTableScan' class.
executeTake ( int )This method is from 'LocalTableScan' class.
hashCode ( )This method is from 'LocalTableScan' class.
LocalTableScan ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, scala.collection.Seq<org.apache.spark.sql.Row> )This constructor is from 'LocalTableScan' class.
output ( )This method is from 'LocalTableScan' class.
productArity ( )This method is from 'LocalTableScan' class.
...
LogicalLocalTable (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'LogicalLocalTable' class.
children ( )This method is from 'LogicalLocalTable' class.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, scala.collection.Seq<org.apache.spark.sql.Row>, org.apache.spark.sql.SQLContext )This method is from 'LogicalLocalTable' class.
equals ( java.lang.Object )This method is from 'LogicalLocalTable' class.
hashCode ( )This method is from 'LogicalLocalTable' class.
LogicalLocalTable ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, scala.collection.Seq<org.apache.spark.sql.Row>, org.apache.spark.sql.SQLContext )This constructor is from 'LogicalLocalTable' class.
newInstance ( )This method is from 'LogicalLocalTable' class.
output ( )This method is from 'LogicalLocalTable' class.
productArity ( )This method is from 'LogicalLocalTable' class.
productElement ( int )This method is from 'LogicalLocalTable' class.
productIterator ( )This method is from 'LogicalLocalTable' class.
...
LogicalRDD (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'LogicalRDD' class.
children ( )This method is from 'LogicalRDD' class.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>, org.apache.spark.sql.SQLContext )This method is from 'LogicalRDD' class.
equals ( java.lang.Object )This method is from 'LogicalRDD' class.
hashCode ( )This method is from 'LogicalRDD' class.
LogicalRDD ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>, org.apache.spark.sql.SQLContext )This constructor is from 'LogicalRDD' class.
newInstance ( )This method is from 'LogicalRDD' class.
output ( )This method is from 'LogicalRDD' class.
productArity ( )This method is from 'LogicalRDD' class.
productElement ( int )This method is from 'LogicalRDD' class.
productIterator ( )This method is from 'LogicalRDD' class.
...
OutputFaker (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'OutputFaker' class.
child ( )This method is from 'OutputFaker' class.
children ( )This method is from 'OutputFaker' class.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan )This method is from 'OutputFaker' class.
curried ( )This method is from 'OutputFaker' class.
equals ( java.lang.Object )This method is from 'OutputFaker' class.
hashCode ( )This method is from 'OutputFaker' class.
output ( )This method is from 'OutputFaker' class.
OutputFaker ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan )This constructor is from 'OutputFaker' class.
productArity ( )This method is from 'OutputFaker' class.
productElement ( int )This method is from 'OutputFaker' class.
...
PhysicalRDD (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (10):
canEqual ( java.lang.Object )This method is from 'PhysicalRDD' class.
children ( )This method is from 'PhysicalRDD' class.
equals ( java.lang.Object )This method is from 'PhysicalRDD' class.
hashCode ( )This method is from 'PhysicalRDD' class.
output ( )This method is from 'PhysicalRDD' class.
productArity ( )This method is from 'PhysicalRDD' class.
productElement ( int )This method is from 'PhysicalRDD' class.
productIterator ( )This method is from 'PhysicalRDD' class.
productPrefix ( )This method is from 'PhysicalRDD' class.
rdd ( )This method is from 'PhysicalRDD' class.
Project (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
buildProjection ( )This method is from 'Project' class.
canEqual ( java.lang.Object )This method is from 'Project' class.
child ( )This method is from 'Project' class.
children ( )This method is from 'Project' class.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>, SparkPlan )This method is from 'Project' class.
curried ( )This method is from 'Project' class.
equals ( java.lang.Object )This method is from 'Project' class.
hashCode ( )This method is from 'Project' class.
output ( )This method is from 'Project' class.
outputPartitioning ( )This method is from 'Project' class.
productArity ( )This method is from 'Project' class.
...
PythonUDF (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
BatchPythonEvaluation ( PythonUDF, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan )1st parameter 'udf' of this method has type 'PythonUDF'.
copy ( PythonUDF, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan )1st parameter 'udf' of this method has type 'PythonUDF'.
udf ( )Return value of this method has type 'PythonUDF'.
copy ( PythonUDF, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, org.apache.spark.sql.catalyst.expressions.AttributeReference )1st parameter 'udf' of this method has type 'PythonUDF'.
EvaluatePython ( PythonUDF, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, org.apache.spark.sql.catalyst.expressions.AttributeReference )1st parameter 'udf' of this method has type 'PythonUDF'.
udf ( )Return value of this method has type 'PythonUDF'.
accumulator ( )This method is from 'PythonUDF' class.
broadcastVars ( )This method is from 'PythonUDF' class.
canEqual ( java.lang.Object )This method is from 'PythonUDF' class.
children ( )This method is from 'PythonUDF' class.
command ( )This method is from 'PythonUDF' class.
...
Sample (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'Sample' class.
child ( )This method is from 'Sample' class.
children ( )This method is from 'Sample' class.
curried ( )This method is from 'Sample' class.
equals ( java.lang.Object )This method is from 'Sample' class.
hashCode ( )This method is from 'Sample' class.
output ( )This method is from 'Sample' class.
outputPartitioning ( )This method is from 'Sample' class.
productArity ( )This method is from 'Sample' class.
productElement ( int )This method is from 'Sample' class.
productIterator ( )This method is from 'Sample' class.
...
SetCommand (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (10):
canEqual ( java.lang.Object )This method is from 'SetCommand' class.
equals ( java.lang.Object )This method is from 'SetCommand' class.
hashCode ( )This method is from 'SetCommand' class.
kv ( )This method is from 'SetCommand' class.
output ( )This method is from 'SetCommand' class.
productArity ( )This method is from 'SetCommand' class.
productElement ( int )This method is from 'SetCommand' class.
productIterator ( )This method is from 'SetCommand' class.
productPrefix ( )This method is from 'SetCommand' class.
run ( org.apache.spark.sql.SQLContext )This method is from 'SetCommand' class.
ShowTablesCommand (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
andThen ( scala.Function1<ShowTablesCommand,A> )This method is from 'ShowTablesCommand' class.
canEqual ( java.lang.Object )This method is from 'ShowTablesCommand' class.
compose ( scala.Function1<A,scala.Option<java.lang.String>> )This method is from 'ShowTablesCommand' class.
copy ( scala.Option<java.lang.String> )This method is from 'ShowTablesCommand' class.
databaseName ( )This method is from 'ShowTablesCommand' class.
equals ( java.lang.Object )This method is from 'ShowTablesCommand' class.
hashCode ( )This method is from 'ShowTablesCommand' class.
output ( )This method is from 'ShowTablesCommand' class.
productArity ( )This method is from 'ShowTablesCommand' class.
productElement ( int )This method is from 'ShowTablesCommand' class.
productIterator ( )This method is from 'ShowTablesCommand' class.
...
Sort (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'Sort' class.
child ( )This method is from 'Sort' class.
children ( )This method is from 'Sort' class.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>, boolean, SparkPlan )This method is from 'Sort' class.
curried ( )This method is from 'Sort' class.
equals ( java.lang.Object )This method is from 'Sort' class.
global ( )This method is from 'Sort' class.
hashCode ( )This method is from 'Sort' class.
output ( )This method is from 'Sort' class.
outputPartitioning ( )This method is from 'Sort' class.
productArity ( )This method is from 'Sort' class.
...
TakeOrdered (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'TakeOrdered' class.
child ( )This method is from 'TakeOrdered' class.
children ( )This method is from 'TakeOrdered' class.
copy ( int, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>, SparkPlan )This method is from 'TakeOrdered' class.
curried ( )This method is from 'TakeOrdered' class.
equals ( java.lang.Object )This method is from 'TakeOrdered' class.
execute ( )This method is from 'TakeOrdered' class.
executeCollect ( )This method is from 'TakeOrdered' class.
hashCode ( )This method is from 'TakeOrdered' class.
...
UncacheTableCommand (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
andThen ( scala.Function1<UncacheTableCommand,A> )This method is from 'UncacheTableCommand' class.
canEqual ( java.lang.Object )This method is from 'UncacheTableCommand' class.
compose ( scala.Function1<A,java.lang.String> )This method is from 'UncacheTableCommand' class.
copy ( java.lang.String )This method is from 'UncacheTableCommand' class.
equals ( java.lang.Object )This method is from 'UncacheTableCommand' class.
hashCode ( )This method is from 'UncacheTableCommand' class.
output ( )This method is from 'UncacheTableCommand' class.
productArity ( )This method is from 'UncacheTableCommand' class.
productElement ( int )This method is from 'UncacheTableCommand' class.
productIterator ( )This method is from 'UncacheTableCommand' class.
productPrefix ( )This method is from 'UncacheTableCommand' class.
...
Union (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
andThen ( scala.Function1<Union,A> )This method is from 'Union' class.
canEqual ( java.lang.Object )This method is from 'Union' class.
children ( )This method is from 'Union' class.
compose ( scala.Function1<A,scala.collection.Seq<SparkPlan>> )This method is from 'Union' class.
copy ( scala.collection.Seq<SparkPlan> )This method is from 'Union' class.
equals ( java.lang.Object )This method is from 'Union' class.
hashCode ( )This method is from 'Union' class.
output ( )This method is from 'Union' class.
productArity ( )This method is from 'Union' class.
productElement ( int )This method is from 'Union' class.
productIterator ( )This method is from 'Union' class.
...
package org.apache.spark.sql.execution.joins
BroadcastHashJoin (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
BroadcastHashJoin ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, package.BuildSide, org.apache.spark.sql.execution.SparkPlan, org.apache.spark.sql.execution.SparkPlan )This constructor is from 'BroadcastHashJoin' class.
buildKeys ( )This method is from 'BroadcastHashJoin' class.
buildPlan ( )This method is from 'BroadcastHashJoin' class.
buildSide ( )This method is from 'BroadcastHashJoin' class.
buildSideKeyGenerator ( )This method is from 'BroadcastHashJoin' class.
canEqual ( java.lang.Object )This method is from 'BroadcastHashJoin' class.
children ( )This method is from 'BroadcastHashJoin' class.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, package.BuildSide, org.apache.spark.sql.execution.SparkPlan, org.apache.spark.sql.execution.SparkPlan )This method is from 'BroadcastHashJoin' class.
equals ( java.lang.Object )This method is from 'BroadcastHashJoin' class.
hashCode ( )This method is from 'BroadcastHashJoin' class.
left ( )This method is from 'BroadcastHashJoin' class.
...
BroadcastLeftSemiJoinHash (2)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface HashJoin. | A client program may be interrupted by a NoSuchMethodError exception. |
| 2 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'BroadcastLeftSemiJoinHash' class.
children ( )This method is from 'BroadcastLeftSemiJoinHash' class.
curried ( )This method is from 'BroadcastLeftSemiJoinHash' class.
equals ( java.lang.Object )This method is from 'BroadcastLeftSemiJoinHash' class.
hashCode ( )This method is from 'BroadcastLeftSemiJoinHash' class.
left ( )This method is from 'BroadcastLeftSemiJoinHash' class.
leftKeys ( )This method is from 'BroadcastLeftSemiJoinHash' class.
output ( )This method is from 'BroadcastLeftSemiJoinHash' class.
productArity ( )This method is from 'BroadcastLeftSemiJoinHash' class.
productElement ( int )This method is from 'BroadcastLeftSemiJoinHash' class.
productIterator ( )This method is from 'BroadcastLeftSemiJoinHash' class.
...
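Here two supertypes disappear at once, so a 1.3.0 client that upcasts this node to HashJoin (for example to call buildSideKeyGenerator) no longer links. A small diagnostic sketch (hypothetical client code; class names are from the report) dumps the directly implemented interfaces at runtime so they can be compared against what the 1.3.0-compiled client expects:

```scala
// Hypothetical diagnostic: print the direct super-interfaces of the class
// on the current classpath. Under 1.3.0 the list includes HashJoin and
// scala.Product; under 1.5.0 both are gone, so upcasts to either type fail.
object SuperInterfaceDump {
  def main(args: Array[String]): Unit = {
    val cls = Class.forName(
      "org.apache.spark.sql.execution.joins.BroadcastLeftSemiJoinHash")
    cls.getInterfaces.foreach(i => println(i.getName))
  }
}
```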
BroadcastNestedLoopJoin (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
BroadcastNestedLoopJoin ( org.apache.spark.sql.execution.SparkPlan, org.apache.spark.sql.execution.SparkPlan, package.BuildSide, org.apache.spark.sql.catalyst.plans.JoinType, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> )This constructor is from 'BroadcastNestedLoopJoin' class.
buildSide ( )This method is from 'BroadcastNestedLoopJoin' class.
canEqual ( java.lang.Object )This method is from 'BroadcastNestedLoopJoin' class.
children ( )This method is from 'BroadcastNestedLoopJoin' class.
condition ( )This method is from 'BroadcastNestedLoopJoin' class.
copy ( org.apache.spark.sql.execution.SparkPlan, org.apache.spark.sql.execution.SparkPlan, package.BuildSide, org.apache.spark.sql.catalyst.plans.JoinType, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> )This method is from 'BroadcastNestedLoopJoin' class.
curried ( )This method is from 'BroadcastNestedLoopJoin' class.
equals ( java.lang.Object )This method is from 'BroadcastNestedLoopJoin' class.
hashCode ( )This method is from 'BroadcastNestedLoopJoin' class.
joinType ( )This method is from 'BroadcastNestedLoopJoin' class.
left ( )This method is from 'BroadcastNestedLoopJoin' class.
...
CartesianProduct (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'CartesianProduct' class.
CartesianProduct ( org.apache.spark.sql.execution.SparkPlan, org.apache.spark.sql.execution.SparkPlan )This constructor is from 'CartesianProduct' class.
children ( )This method is from 'CartesianProduct' class.
copy ( org.apache.spark.sql.execution.SparkPlan, org.apache.spark.sql.execution.SparkPlan )This method is from 'CartesianProduct' class.
curried ( )This method is from 'CartesianProduct' class.
equals ( java.lang.Object )This method is from 'CartesianProduct' class.
hashCode ( )This method is from 'CartesianProduct' class.
left ( )This method is from 'CartesianProduct' class.
output ( )This method is from 'CartesianProduct' class.
productArity ( )This method is from 'CartesianProduct' class.
productElement ( int )This method is from 'CartesianProduct' class.
...
GeneralHashedRelation (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Serializable. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (1):
GeneralHashedRelation ( java.util.HashMap<org.apache.spark.sql.Row,org.apache.spark.util.collection.CompactBuffer<org.apache.spark.sql.Row>> )This constructor is from 'GeneralHashedRelation' class.
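Since scala.Serializable extends java.io.Serializable, removing it may also change whether instances pass Java serialization checks, in addition to the NoSuchMethodError noted above. A hedged probe (hypothetical client code; only the class name is from the report):

```scala
// Hypothetical probe: does the relation class on the current classpath
// still satisfy java.io.Serializable (the parent of scala.Serializable)?
object SerializableProbe {
  def main(args: Array[String]): Unit = {
    val cls = Class.forName(
      "org.apache.spark.sql.execution.joins.GeneralHashedRelation")
    val ok = classOf[java.io.Serializable].isAssignableFrom(cls)
    println(s"java.io.Serializable still implemented: $ok")
  }
}
```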
HashJoin (2)
| # | Change | Effect |
|---|--------|--------|
| 1 | Abstract method hashJoin ( scala.collection.Iterator&lt;org.apache.spark.sql.Row&gt;, HashedRelation ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
| 2 | Abstract method streamSideKeyGenerator ( ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
buildKeys ( )This abstract method is from 'HashJoin' interface.
buildPlan ( )This abstract method is from 'HashJoin' interface.
buildSide ( )This abstract method is from 'HashJoin' interface.
buildSideKeyGenerator ( )This abstract method is from 'HashJoin' interface.
left ( )This abstract method is from 'HashJoin' interface.
leftKeys ( )This abstract method is from 'HashJoin' interface.
output ( )This abstract method is from 'HashJoin' interface.
right ( )This abstract method is from 'HashJoin' interface.
rightKeys ( )This abstract method is from 'HashJoin' interface.
streamedKeys ( )This abstract method is from 'HashJoin' interface.
streamedPlan ( )This abstract method is from 'HashJoin' interface.
...
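A 1.3.0-compiled client calling either removed method through the HashJoin interface fails to link under 1.5.0. A reflective sketch can detect the removal ahead of time (hypothetical helper; interface and method names are from the report), since Class.getMethod throws a NoSuchMethodException that mirrors the NoSuchMethodError a compiled call site would raise:

```scala
// Hypothetical probe: check whether the interface still declares the
// method a 1.3.0-compiled caller was linked against.
object InterfaceMethodProbe {
  def main(args: Array[String]): Unit = {
    val iface = Class.forName("org.apache.spark.sql.execution.joins.HashJoin")
    try {
      iface.getMethod("streamSideKeyGenerator")
      println("streamSideKeyGenerator still declared (1.3.x behaviour)")
    } catch {
      case _: NoSuchMethodException =>
        println("streamSideKeyGenerator removed: 1.3.0-compiled callers " +
          "will see NoSuchMethodError under 1.5.0")
    }
  }
}
```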
HashOuterJoin (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class became an interface. | A client program may be interrupted by an IncompatibleClassChangeError or InstantiationError exception, depending on how the class is used. |
Affected methods (8):
condition ( )This method is from 'HashOuterJoin' class.
joinType ( )This method is from 'HashOuterJoin' class.
left ( )This method is from 'HashOuterJoin' class.
leftKeys ( )This method is from 'HashOuterJoin' class.
HashOuterJoin..boundCondition ( )This method is from 'HashOuterJoin' class.
output ( )This method is from 'HashOuterJoin' class.
right ( )This method is from 'HashOuterJoin' class.
rightKeys ( )This method is from 'HashOuterJoin' class.
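Turning a class into an interface is binary-incompatible in both directions: 1.3.0-compiled `new HashOuterJoin(...)` sites raise InstantiationError, and virtual call sites raise IncompatibleClassChangeError. A quick runtime check (hypothetical client code; the name is from the report):

```scala
// Hypothetical probe: a client that instantiated or subclassed
// HashOuterJoin as a class breaks once isInterface returns true.
object ClassKindProbe {
  def main(args: Array[String]): Unit = {
    val cls = Class.forName("org.apache.spark.sql.execution.joins.HashOuterJoin")
    if (cls.isInterface)
      println("HashOuterJoin is now an interface: instantiation attempts " +
        "compiled against 1.3.0 raise InstantiationError")
    else
      println("HashOuterJoin is still a class")
  }
}
```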
LeftSemiJoinBNL (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
broadcast ( )This method is from 'LeftSemiJoinBNL' class.
canEqual ( java.lang.Object )This method is from 'LeftSemiJoinBNL' class.
children ( )This method is from 'LeftSemiJoinBNL' class.
condition ( )This method is from 'LeftSemiJoinBNL' class.
copy ( org.apache.spark.sql.execution.SparkPlan, org.apache.spark.sql.execution.SparkPlan, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> )This method is from 'LeftSemiJoinBNL' class.
curried ( )This method is from 'LeftSemiJoinBNL' class.
equals ( java.lang.Object )This method is from 'LeftSemiJoinBNL' class.
hashCode ( )This method is from 'LeftSemiJoinBNL' class.
left ( )This method is from 'LeftSemiJoinBNL' class.
LeftSemiJoinBNL ( org.apache.spark.sql.execution.SparkPlan, org.apache.spark.sql.execution.SparkPlan, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> )This constructor is from 'LeftSemiJoinBNL' class.
LeftSemiJoinBNL..boundCondition ( )This method is from 'LeftSemiJoinBNL' class.
...
LeftSemiJoinHash (2)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface HashJoin. | A client program may be interrupted by a NoSuchMethodError exception. |
| 2 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'LeftSemiJoinHash' class.
children ( )This method is from 'LeftSemiJoinHash' class.
curried ( )This method is from 'LeftSemiJoinHash' class.
equals ( java.lang.Object )This method is from 'LeftSemiJoinHash' class.
hashCode ( )This method is from 'LeftSemiJoinHash' class.
left ( )This method is from 'LeftSemiJoinHash' class.
leftKeys ( )This method is from 'LeftSemiJoinHash' class.
output ( )This method is from 'LeftSemiJoinHash' class.
productArity ( )This method is from 'LeftSemiJoinHash' class.
productElement ( int )This method is from 'LeftSemiJoinHash' class.
productIterator ( )This method is from 'LeftSemiJoinHash' class.
...
ShuffledHashJoin (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Product. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (>10):
buildKeys ( )This method is from 'ShuffledHashJoin' class.
buildPlan ( )This method is from 'ShuffledHashJoin' class.
buildSide ( )This method is from 'ShuffledHashJoin' class.
buildSideKeyGenerator ( )This method is from 'ShuffledHashJoin' class.
canEqual ( java.lang.Object )This method is from 'ShuffledHashJoin' class.
children ( )This method is from 'ShuffledHashJoin' class.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, package.BuildSide, org.apache.spark.sql.execution.SparkPlan, org.apache.spark.sql.execution.SparkPlan )This method is from 'ShuffledHashJoin' class.
curried ( )This method is from 'ShuffledHashJoin' class.
equals ( java.lang.Object )This method is from 'ShuffledHashJoin' class.
hashCode ( )This method is from 'ShuffledHashJoin' class.
left ( )This method is from 'ShuffledHashJoin' class.
...
UniqueKeyHashedRelation (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface scala.Serializable. | A client program may be interrupted by a NoSuchMethodError exception. |
Affected methods (1):
UniqueKeyHashedRelation ( java.util.HashMap<org.apache.spark.sql.Row,org.apache.spark.sql.Row> )This constructor is from 'UniqueKeyHashedRelation' class.
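All of the effects listed in this package are flavours of java.lang.LinkageError (NoClassDefFoundError, NoSuchMethodError, IncompatibleClassChangeError, and InstantiationError all extend it), so a client that must tolerate both library versions can fence the risky code path with a single handler. A minimal sketch, where doJoinWork stands in for the client's own 1.3.0-dependent code:

```scala
// Minimal sketch: LinkageError is the common superclass of all the
// errors named in this report, so one handler fences a 1.3.0-only path.
object VersionTolerantCall {
  def doJoinWork(): Unit = ()   // placeholder for 1.3.0-dependent client code

  def main(args: Array[String]): Unit =
    try doJoinWork()
    catch {
      case e: LinkageError =>
        println(s"spark-sql 1.5.0 on the classpath broke this path: $e")
        // fall back to a version-neutral implementation here
    }
}
```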
package org.apache.spark.sql.jdbc
DriverQuirks (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (4):
DriverQuirks ( )This constructor is from 'DriverQuirks' abstract class.
get ( java.lang.String )This method is from 'DriverQuirks' abstract class.
getCatalystType ( int, java.lang.String, int, org.apache.spark.sql.types.MetadataBuilder )This abstract method is from 'DriverQuirks' abstract class.
getJDBCType ( org.apache.spark.sql.types.DataType )This abstract method is from 'DriverQuirks' abstract class.
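Client code that subclassed DriverQuirks for vendor-specific type mapping has no 1.5.0 class under this name. In later Spark releases the equivalent hook appears to be org.apache.spark.sql.jdbc.JdbcDialects; treat that as an assumption to verify against the Spark documentation, not something this report states. A hedged detection sketch (hypothetical helper):

```scala
// Hypothetical migration guard: pick whichever vendor-mapping entry point
// is actually on the classpath. JdbcDialects as the 1.4+ replacement is
// an assumption to verify, not a claim made by this report.
object QuirksMigrationProbe {
  def present(name: String): Boolean =
    try { Class.forName(name); true }
    catch { case _: ClassNotFoundException => false }

  def main(args: Array[String]): Unit =
    if (present("org.apache.spark.sql.jdbc.DriverQuirks"))
      println("1.3.x API: extend DriverQuirks")
    else if (present("org.apache.spark.sql.jdbc.JdbcDialects"))
      println("1.4+ API: register a JdbcDialect via JdbcDialects")
    else
      println("neither entry point found")
}
```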
JDBCPartition (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'JDBCPartition' class.
copy ( java.lang.String, int )This method is from 'JDBCPartition' class.
curried ( )This method is from 'JDBCPartition' class.
equals ( java.lang.Object )This method is from 'JDBCPartition' class.
hashCode ( )This method is from 'JDBCPartition' class.
idx ( )This method is from 'JDBCPartition' class.
index ( )This method is from 'JDBCPartition' class.
JDBCPartition ( java.lang.String, int )This constructor is from 'JDBCPartition' class.
productArity ( )This method is from 'JDBCPartition' class.
productElement ( int )This method is from 'JDBCPartition' class.
productIterator ( )This method is from 'JDBCPartition' class.
...
JDBCPartitioningInfo (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'JDBCPartitioningInfo' class.
column ( )This method is from 'JDBCPartitioningInfo' class.
copy ( java.lang.String, long, long, int )This method is from 'JDBCPartitioningInfo' class.
curried ( )This method is from 'JDBCPartitioningInfo' class.
equals ( java.lang.Object )This method is from 'JDBCPartitioningInfo' class.
hashCode ( )This method is from 'JDBCPartitioningInfo' class.
JDBCPartitioningInfo ( java.lang.String, long, long, int )This constructor is from 'JDBCPartitioningInfo' class.
lowerBound ( )This method is from 'JDBCPartitioningInfo' class.
numPartitions ( )This method is from 'JDBCPartitioningInfo' class.
productArity ( )This method is from 'JDBCPartitioningInfo' class.
productElement ( int )This method is from 'JDBCPartitioningInfo' class.
...
JDBCRDD (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
BinaryConversion ( )This method is from 'JDBCRDD' class.
BinaryLongConversion ( )This method is from 'JDBCRDD' class.
BooleanConversion ( )This method is from 'JDBCRDD' class.
compute ( org.apache.spark.Partition, org.apache.spark.TaskContext )This method is from 'JDBCRDD' class.
DateConversion ( )This method is from 'JDBCRDD' class.
DecimalConversion ( )This method is from 'JDBCRDD' class.
DoubleConversion ( )This method is from 'JDBCRDD' class.
FloatConversion ( )This method is from 'JDBCRDD' class.
getConnector ( java.lang.String, java.lang.String )This method is from 'JDBCRDD' class.
getConversions ( org.apache.spark.sql.types.StructType )This method is from 'JDBCRDD' class.
getPartitions ( )This method is from 'JDBCRDD' class.
...
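These JDBC classes were internal plumbing, so the durable fix for client code is to go through the public reader API rather than re-linking against internals. A hedged sketch of that route (the URL and table name are placeholders; DataFrameReader.jdbc exists from Spark 1.4 onward, so verify the exact signature for your version):

```scala
// Hedged sketch: read a JDBC table through the public DataFrameReader API
// instead of the removed internal JDBCRDD/JDBCRelation classes.
import java.util.Properties
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object PublicJdbcRead {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("jdbc-read").setMaster("local[1]"))
    val sqlContext = new SQLContext(sc)
    val props = new Properties()
    // url and table are placeholders for the client's own connection details
    val df = sqlContext.read.jdbc("jdbc:postgresql://host/db", "my_table", props)
    df.printSchema()
    sc.stop()
  }
}
```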
JDBCRelation (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
buildScan ( java.lang.String[ ], org.apache.spark.sql.sources.Filter[ ] )This method is from 'JDBCRelation' class.
canEqual ( java.lang.Object )This method is from 'JDBCRelation' class.
columnPartition ( JDBCPartitioningInfo )This method is from 'JDBCRelation' class.
copy ( java.lang.String, java.lang.String, org.apache.spark.Partition[ ], org.apache.spark.sql.SQLContext )This method is from 'JDBCRelation' class.
equals ( java.lang.Object )This method is from 'JDBCRelation' class.
hashCode ( )This method is from 'JDBCRelation' class.
JDBCRelation ( java.lang.String, java.lang.String, org.apache.spark.Partition[ ], org.apache.spark.sql.SQLContext )This constructor is from 'JDBCRelation' class.
parts ( )This method is from 'JDBCRelation' class.
productArity ( )This method is from 'JDBCRelation' class.
productElement ( int )This method is from 'JDBCRelation' class.
productIterator ( )This method is from 'JDBCRelation' class.
...
MySQLQuirks (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (1):
MySQLQuirks ( )This constructor is from 'MySQLQuirks' class.
NoQuirks (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (1):
NoQuirks ( )This constructor is from 'NoQuirks' class.
PostgresQuirks (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (1):
PostgresQuirks ( )This constructor is from 'PostgresQuirks' class.
package org.apache.spark.sql.json
JSONRelation (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
buildScan ( )This method is from 'JSONRelation' class.
canEqual ( java.lang.Object )This method is from 'JSONRelation' class.
copy ( java.lang.String, double, scala.Option<org.apache.spark.sql.types.StructType>, org.apache.spark.sql.SQLContext )This method is from 'JSONRelation' class.
equals ( java.lang.Object )This method is from 'JSONRelation' class.
hashCode ( )This method is from 'JSONRelation' class.
insert ( org.apache.spark.sql.DataFrame, boolean )This method is from 'JSONRelation' class.
JSONRelation ( java.lang.String, double, scala.Option<org.apache.spark.sql.types.StructType>, org.apache.spark.sql.SQLContext )This constructor is from 'JSONRelation' class.
JSONRelation..baseRDD ( )This method is from 'JSONRelation' class.
path ( )This method is from 'JSONRelation' class.
productArity ( )This method is from 'JSONRelation' class.
productElement ( int )This method is from 'JSONRelation' class.
...
package org.apache.spark.sql.parquet
AppendingParquetOutputFormat (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (1):
AppendingParquetOutputFormat ( int )This constructor is from 'AppendingParquetOutputFormat' class.
CatalystArrayContainsNullConverter (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (1):
CatalystArrayContainsNullConverter ( org.apache.spark.sql.types.DataType, int, CatalystConverter )This constructor is from 'CatalystArrayContainsNullConverter' class.
CatalystArrayConverter (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (1):
CatalystArrayConverter ( org.apache.spark.sql.types.DataType, int, CatalystConverter )This constructor is from 'CatalystArrayConverter' class.
CatalystConverter (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
ARRAY_CONTAINS_NULL_BAG_SCHEMA_NAME ( )This method is from 'CatalystConverter' abstract class.
ARRAY_ELEMENTS_SCHEMA_NAME ( )This method is from 'CatalystConverter' abstract class.
CatalystConverter ( )This constructor is from 'CatalystConverter' abstract class.
clearBuffer ( )This abstract method is from 'CatalystConverter' abstract class.
getCurrentRecord ( )This method is from 'CatalystConverter' abstract class.
index ( )This abstract method is from 'CatalystConverter' abstract class.
isRootConverter ( )This method is from 'CatalystConverter' abstract class.
MAP_KEY_SCHEMA_NAME ( )This method is from 'CatalystConverter' abstract class.
MAP_SCHEMA_NAME ( )This method is from 'CatalystConverter' abstract class.
MAP_VALUE_SCHEMA_NAME ( )This method is from 'CatalystConverter' abstract class.
parent ( )This abstract method is from 'CatalystConverter' abstract class.
...
CatalystGroupConverter (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
buffer ( )This method is from 'CatalystGroupConverter' class.
buffer_.eq ( scala.collection.mutable.ArrayBuffer<org.apache.spark.sql.Row> )This method is from 'CatalystGroupConverter' class.
CatalystGroupConverter ( org.apache.spark.sql.catalyst.expressions.Attribute[ ] )This constructor is from 'CatalystGroupConverter' class.
CatalystGroupConverter ( org.apache.spark.sql.types.StructField[ ], int, CatalystConverter )This constructor is from 'CatalystGroupConverter' class.
CatalystGroupConverter ( org.apache.spark.sql.types.StructField[ ], int, CatalystConverter, scala.collection.mutable.ArrayBuffer<java.lang.Object>, scala.collection.mutable.ArrayBuffer<org.apache.spark.sql.Row> )This constructor is from 'CatalystGroupConverter' class.
clearBuffer ( )This method is from 'CatalystGroupConverter' class.
converters ( )This method is from 'CatalystGroupConverter' class.
current ( )This method is from 'CatalystGroupConverter' class.
current_.eq ( scala.collection.mutable.ArrayBuffer<java.lang.Object> )This method is from 'CatalystGroupConverter' class.
end ( )This method is from 'CatalystGroupConverter' class.
getConverter ( int )This method is from 'CatalystGroupConverter' class.
...
CatalystMapConverter (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (1):
CatalystMapConverter ( org.apache.spark.sql.types.StructField[ ], int, CatalystConverter )This constructor is from 'CatalystMapConverter' class.
CatalystNativeArrayConverter (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (1):
CatalystNativeArrayConverter ( org.apache.spark.sql.types.NativeType, int, CatalystConverter, int )This constructor is from 'CatalystNativeArrayConverter' class.
CatalystPrimitiveConverter (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (7):
addBinary ( parquet.io.api.Binary )This method is from 'CatalystPrimitiveConverter' class.
addBoolean ( boolean )This method is from 'CatalystPrimitiveConverter' class.
addDouble ( double )This method is from 'CatalystPrimitiveConverter' class.
addFloat ( float )This method is from 'CatalystPrimitiveConverter' class.
addInt ( int )This method is from 'CatalystPrimitiveConverter' class.
addLong ( long )This method is from 'CatalystPrimitiveConverter' class.
CatalystPrimitiveConverter ( CatalystConverter, int )This constructor is from 'CatalystPrimitiveConverter' class.
CatalystPrimitiveRowConverter (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (1):
CatalystPrimitiveRowConverter ( org.apache.spark.sql.catalyst.expressions.Attribute[ ] )This constructor is from 'CatalystPrimitiveRowConverter' class.
CatalystPrimitiveStringConverter (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (1):
CatalystPrimitiveStringConverter ( CatalystConverter, int )This constructor is from 'CatalystPrimitiveStringConverter' class.
CatalystStructConverter (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (1):
CatalystStructConverter ( org.apache.spark.sql.types.StructField[ ], int, CatalystConverter )This constructor is from 'CatalystStructConverter' class.
InsertIntoParquetTable (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
canEqual ( java.lang.Object )This method is from 'InsertIntoParquetTable' class.
child ( )This method is from 'InsertIntoParquetTable' class.
child ( )This method is from 'InsertIntoParquetTable' class.
children ( )This method is from 'InsertIntoParquetTable' class.
children ( )This method is from 'InsertIntoParquetTable' class.
copy ( ParquetRelation, org.apache.spark.sql.execution.SparkPlan, boolean )This method is from 'InsertIntoParquetTable' class.
curried ( )This method is from 'InsertIntoParquetTable' class.
equals ( java.lang.Object )This method is from 'InsertIntoParquetTable' class.
execute ( )This method is from 'InsertIntoParquetTable' class.
hashCode ( )This method is from 'InsertIntoParquetTable' class.
InsertIntoParquetTable ( ParquetRelation, org.apache.spark.sql.execution.SparkPlan, boolean )This constructor is from 'InsertIntoParquetTable' class.
...
ParquetRelation (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
attributeMap ( )This method is from 'ParquetRelation' class.
canEqual ( java.lang.Object )This method is from 'ParquetRelation' class.
conf ( )This method is from 'ParquetRelation' class.
copy ( java.lang.String, scala.Option<org.apache.hadoop.conf.Configuration>, org.apache.spark.sql.SQLContext, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> )This method is from 'ParquetRelation' class.
create ( java.lang.String, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, org.apache.hadoop.conf.Configuration, org.apache.spark.sql.SQLContext )This method is from 'ParquetRelation' class.
createEmpty ( java.lang.String, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean, org.apache.hadoop.conf.Configuration, org.apache.spark.sql.SQLContext )This method is from 'ParquetRelation' class.
enableLogForwarding ( )This method is from 'ParquetRelation' class.
equals ( java.lang.Object )This method is from 'ParquetRelation' class.
hashCode ( )This method is from 'ParquetRelation' class.
newInstance ( )This method is from 'ParquetRelation' class.
...
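As with the JDBC internals above, the durable route for parquet access is the public reader API rather than the removed internal relation classes. A hedged sketch (the path is a placeholder; sqlContext.read.parquet is available from Spark 1.4 onward):

```scala
// Hedged sketch: read parquet data through the public DataFrameReader API
// instead of the removed internal ParquetRelation classes.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object PublicParquetRead {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("parquet-read").setMaster("local[1]"))
    val sqlContext = new SQLContext(sc)
    // the path is a placeholder for the client's own data location
    val df = sqlContext.read.parquet("/tmp/example.parquet")
    df.show()
    sc.stop()
  }
}
```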
ParquetRelation2 (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
buildScan ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> )This method is from 'ParquetRelation2' class.
canEqual ( java.lang.Object )This method is from 'ParquetRelation2' class.
copy ( scala.collection.Seq<java.lang.String>, scala.collection.immutable.Map<java.lang.String,java.lang.String>, scala.Option<org.apache.spark.sql.types.StructType>, scala.Option<PartitionSpec>, org.apache.spark.sql.SQLContext )This method is from 'ParquetRelation2' class.
DEFAULT_PARTITION_NAME ( )This method is from 'ParquetRelation2' class.
equals ( java.lang.Object )This method is from 'ParquetRelation2' class.
hashCode ( )This method is from 'ParquetRelation2' class.
insert ( org.apache.spark.sql.DataFrame, boolean )This method is from 'ParquetRelation2' class.
isPartitioned ( )This method is from 'ParquetRelation2' class.
isTraceEnabled ( )This method is from 'ParquetRelation2' class.
log ( )This method is from 'ParquetRelation2' class.
logDebug ( scala.Function0<java.lang.String> )This method is from 'ParquetRelation2' class.
...
ParquetTableScan (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
attributes ( )This method is from 'ParquetTableScan' class.
canEqual ( java.lang.Object )This method is from 'ParquetTableScan' class.
children ( )This method is from 'ParquetTableScan' class.
columnPruningPred ( )This method is from 'ParquetTableScan' class.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, ParquetRelation, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> )This method is from 'ParquetTableScan' class.
curried ( )This method is from 'ParquetTableScan' class.
equals ( java.lang.Object )This method is from 'ParquetTableScan' class.
execute ( )This method is from 'ParquetTableScan' class.
hashCode ( )This method is from 'ParquetTableScan' class.
output ( )This method is from 'ParquetTableScan' class.
...
ParquetTest (1)
| # | Change | Effect |
|---|--------|--------|
| 1 | This interface has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (>10):
configuration ( )This abstract method is from 'ParquetTest' interface.
makeParquetFile ( org.apache.spark.sql.DataFrame, java.io.File, scala.reflect.ClassTag<T>, scala.reflect.api.TypeTags.TypeTag<T> )This abstract method is from 'ParquetTest' interface.
makeParquetFile ( scala.collection.Seq<T>, java.io.File, scala.reflect.ClassTag<T>, scala.reflect.api.TypeTags.TypeTag<T> )This abstract method is from 'ParquetTest' interface.
makePartitionDir ( java.io.File, java.lang.String, scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.Object>> )This abstract method is from 'ParquetTest' interface.
sqlContext ( )This abstract method is from 'ParquetTest' interface.
withParquetDataFrame ( scala.collection.Seq<T>, scala.Function1<org.apache.spark.sql.DataFrame,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<T>, scala.reflect.api.TypeTags.TypeTag<T> )This abstract method is from 'ParquetTest' interface.
withParquetFile ( scala.collection.Seq<T>, scala.Function1<java.lang.String,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<T>, scala.reflect.api.TypeTags.TypeTag<T> )This abstract method is from 'ParquetTest' interface.
withParquetTable ( scala.collection.Seq<T>, java.lang.String, scala.Function0<scala.runtime.BoxedUnit>, scala.reflect.ClassTag<T>, scala.reflect.api.TypeTags.TypeTag<T> )This abstract method is from 'ParquetTest' interface.
withSQLConf ( scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.String>>, scala.Function0<scala.runtime.BoxedUnit> )This abstract method is from 'ParquetTest' interface.
withTempDir ( scala.Function1<java.io.File,scala.runtime.BoxedUnit> )This abstract method is from 'ParquetTest' interface.
withTempPath ( scala.Function1<java.io.File,scala.runtime.BoxedUnit> )This abstract method is from 'ParquetTest' interface.
...
[+] ParquetTypeInfo (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
canEqual ( java.lang.Object )This method is from 'ParquetTypeInfo' class.
copy ( parquet.schema.PrimitiveType.PrimitiveTypeName, scala.Option<parquet.schema.OriginalType>, scala.Option<parquet.schema.DecimalMetadata>, scala.Option<java.lang.Object> )This method is from 'ParquetTypeInfo' class.
curried ( )This method is from 'ParquetTypeInfo' class.
decimalMetadata ( )This method is from 'ParquetTypeInfo' class.
equals ( java.lang.Object )This method is from 'ParquetTypeInfo' class.
hashCode ( )This method is from 'ParquetTypeInfo' class.
length ( )This method is from 'ParquetTypeInfo' class.
originalType ( )This method is from 'ParquetTypeInfo' class.
ParquetTypeInfo ( parquet.schema.PrimitiveType.PrimitiveTypeName, scala.Option<parquet.schema.OriginalType>, scala.Option<parquet.schema.DecimalMetadata>, scala.Option<java.lang.Object> )This constructor is from 'ParquetTypeInfo' class.
primitiveType ( )This method is from 'ParquetTypeInfo' class.
productArity ( )This method is from 'ParquetTypeInfo' class.
...
[+] Partition (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
canEqual ( java.lang.Object )This method is from 'Partition' class.
copy ( org.apache.spark.sql.Row, java.lang.String )This method is from 'Partition' class.
curried ( )This method is from 'Partition' class.
equals ( java.lang.Object )This method is from 'Partition' class.
hashCode ( )This method is from 'Partition' class.
Partition ( org.apache.spark.sql.Row, java.lang.String )This constructor is from 'Partition' class.
path ( )This method is from 'Partition' class.
productArity ( )This method is from 'Partition' class.
productElement ( int )This method is from 'Partition' class.
productIterator ( )This method is from 'Partition' class.
productPrefix ( )This method is from 'Partition' class.
...
[+] PartitionSpec (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
canEqual ( java.lang.Object )This method is from 'PartitionSpec' class.
copy ( org.apache.spark.sql.types.StructType, scala.collection.Seq<Partition> )This method is from 'PartitionSpec' class.
curried ( )This method is from 'PartitionSpec' class.
equals ( java.lang.Object )This method is from 'PartitionSpec' class.
hashCode ( )This method is from 'PartitionSpec' class.
partitionColumns ( )This method is from 'PartitionSpec' class.
partitions ( )This method is from 'PartitionSpec' class.
PartitionSpec ( org.apache.spark.sql.types.StructType, scala.collection.Seq<Partition> )This constructor is from 'PartitionSpec' class.
productArity ( )This method is from 'PartitionSpec' class.
productElement ( int )This method is from 'PartitionSpec' class.
productIterator ( )This method is from 'PartitionSpec' class.
...
[+] RowReadSupport (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (1)
RowReadSupport ( )This constructor is from 'RowReadSupport' class.
[+] RowRecordMaterializer (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (1)
RowRecordMaterializer ( parquet.schema.MessageType, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> )This constructor is from 'RowRecordMaterializer' class.
[+] RowWriteSupport (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
attributes ( )This method is from 'RowWriteSupport' class.
attributes_.eq ( org.apache.spark.sql.catalyst.expressions.Attribute[ ] )This method is from 'RowWriteSupport' class.
getSchema ( org.apache.hadoop.conf.Configuration )This method is from 'RowWriteSupport' class.
init ( org.apache.hadoop.conf.Configuration )This method is from 'RowWriteSupport' class.
isTraceEnabled ( )This method is from 'RowWriteSupport' class.
log ( )This method is from 'RowWriteSupport' class.
logDebug ( scala.Function0<java.lang.String> )This method is from 'RowWriteSupport' class.
logDebug ( scala.Function0<java.lang.String>, java.lang.Throwable )This method is from 'RowWriteSupport' class.
logError ( scala.Function0<java.lang.String> )This method is from 'RowWriteSupport' class.
logError ( scala.Function0<java.lang.String>, java.lang.Throwable )This method is from 'RowWriteSupport' class.
logInfo ( scala.Function0<java.lang.String> )This method is from 'RowWriteSupport' class.
...
[+] TestGroupWriteSupport (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (1)
TestGroupWriteSupport ( parquet.schema.MessageType )This constructor is from 'TestGroupWriteSupport' class.
package org.apache.spark.sql.parquet.timestamp
[+] NanoTime (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (5)
getJulianDay ( )This method is from 'NanoTime' class.
getTimeOfDayNanos ( )This method is from 'NanoTime' class.
NanoTime ( )This constructor is from 'NanoTime' class.
set ( int, long )This method is from 'NanoTime' class.
toBinary ( )This method is from 'NanoTime' class.
package org.apache.spark.sql.sources
[+] CaseInsensitiveMap (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (1)
CaseInsensitiveMap ( scala.collection.immutable.Map<java.lang.String,java.lang.String> )This constructor is from 'CaseInsensitiveMap' class.
[+] CreateTableUsing (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
allowExisting ( )This method is from 'CreateTableUsing' class.
canEqual ( java.lang.Object )This method is from 'CreateTableUsing' class.
copy ( java.lang.String, scala.Option<org.apache.spark.sql.types.StructType>, java.lang.String, boolean, scala.collection.immutable.Map<java.lang.String,java.lang.String>, boolean, boolean )This method is from 'CreateTableUsing' class.
CreateTableUsing ( java.lang.String, scala.Option<org.apache.spark.sql.types.StructType>, java.lang.String, boolean, scala.collection.immutable.Map<java.lang.String,java.lang.String>, boolean, boolean )This constructor is from 'CreateTableUsing' class.
curried ( )This method is from 'CreateTableUsing' class.
equals ( java.lang.Object )This method is from 'CreateTableUsing' class.
hashCode ( )This method is from 'CreateTableUsing' class.
managedIfNoPath ( )This method is from 'CreateTableUsing' class.
options ( )This method is from 'CreateTableUsing' class.
productArity ( )This method is from 'CreateTableUsing' class.
productElement ( int )This method is from 'CreateTableUsing' class.
...
[+] CreateTableUsingAsSelect (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
canEqual ( java.lang.Object )This method is from 'CreateTableUsingAsSelect' class.
child ( )This method is from 'CreateTableUsingAsSelect' class.
copy ( java.lang.String, java.lang.String, boolean, org.apache.spark.sql.SaveMode, scala.collection.immutable.Map<java.lang.String,java.lang.String>, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan )This method is from 'CreateTableUsingAsSelect' class.
CreateTableUsingAsSelect ( java.lang.String, java.lang.String, boolean, org.apache.spark.sql.SaveMode, scala.collection.immutable.Map<java.lang.String,java.lang.String>, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan )This constructor is from 'CreateTableUsingAsSelect' class.
curried ( )This method is from 'CreateTableUsingAsSelect' class.
equals ( java.lang.Object )This method is from 'CreateTableUsingAsSelect' class.
hashCode ( )This method is from 'CreateTableUsingAsSelect' class.
mode ( )This method is from 'CreateTableUsingAsSelect' class.
options ( )This method is from 'CreateTableUsingAsSelect' class.
output ( )This method is from 'CreateTableUsingAsSelect' class.
...
[+] CreateTempTableUsing (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
canEqual ( java.lang.Object )This method is from 'CreateTempTableUsing' class.
copy ( java.lang.String, scala.Option<org.apache.spark.sql.types.StructType>, java.lang.String, scala.collection.immutable.Map<java.lang.String,java.lang.String> )This method is from 'CreateTempTableUsing' class.
CreateTempTableUsing ( java.lang.String, scala.Option<org.apache.spark.sql.types.StructType>, java.lang.String, scala.collection.immutable.Map<java.lang.String,java.lang.String> )This constructor is from 'CreateTempTableUsing' class.
curried ( )This method is from 'CreateTempTableUsing' class.
equals ( java.lang.Object )This method is from 'CreateTempTableUsing' class.
hashCode ( )This method is from 'CreateTempTableUsing' class.
options ( )This method is from 'CreateTempTableUsing' class.
productArity ( )This method is from 'CreateTempTableUsing' class.
productElement ( int )This method is from 'CreateTempTableUsing' class.
productIterator ( )This method is from 'CreateTempTableUsing' class.
productPrefix ( )This method is from 'CreateTempTableUsing' class.
...
[+] CreateTempTableUsingAsSelect (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
canEqual ( java.lang.Object )This method is from 'CreateTempTableUsingAsSelect' class.
copy ( java.lang.String, java.lang.String, org.apache.spark.sql.SaveMode, scala.collection.immutable.Map<java.lang.String,java.lang.String>, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan )This method is from 'CreateTempTableUsingAsSelect' class.
CreateTempTableUsingAsSelect ( java.lang.String, java.lang.String, org.apache.spark.sql.SaveMode, scala.collection.immutable.Map<java.lang.String,java.lang.String>, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan )This constructor is from 'CreateTempTableUsingAsSelect' class.
curried ( )This method is from 'CreateTempTableUsingAsSelect' class.
equals ( java.lang.Object )This method is from 'CreateTempTableUsingAsSelect' class.
hashCode ( )This method is from 'CreateTempTableUsingAsSelect' class.
mode ( )This method is from 'CreateTempTableUsingAsSelect' class.
options ( )This method is from 'CreateTempTableUsingAsSelect' class.
productArity ( )This method is from 'CreateTempTableUsingAsSelect' class.
productElement ( int )This method is from 'CreateTempTableUsingAsSelect' class.
productIterator ( )This method is from 'CreateTempTableUsingAsSelect' class.
...
[+] DDLParser (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (2)
apply ( java.lang.String, boolean )This method is from 'DDLParser' class.
DDLParser ( scala.Function1<java.lang.String,org.apache.spark.sql.catalyst.plans.logical.LogicalPlan> )This constructor is from 'DDLParser' class.
[+] DescribeCommand (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
canEqual ( java.lang.Object )This method is from 'DescribeCommand' class.
copy ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, boolean )This method is from 'DescribeCommand' class.
curried ( )This method is from 'DescribeCommand' class.
DescribeCommand ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, boolean )This constructor is from 'DescribeCommand' class.
equals ( java.lang.Object )This method is from 'DescribeCommand' class.
hashCode ( )This method is from 'DescribeCommand' class.
isExtended ( )This method is from 'DescribeCommand' class.
output ( )This method is from 'DescribeCommand' class.
productArity ( )This method is from 'DescribeCommand' class.
productElement ( int )This method is from 'DescribeCommand' class.
productIterator ( )This method is from 'DescribeCommand' class.
...
[+] InsertIntoDataSource (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
canEqual ( java.lang.Object )This method is from 'InsertIntoDataSource' class.
copy ( LogicalRelation, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, boolean )This method is from 'InsertIntoDataSource' class.
curried ( )This method is from 'InsertIntoDataSource' class.
equals ( java.lang.Object )This method is from 'InsertIntoDataSource' class.
hashCode ( )This method is from 'InsertIntoDataSource' class.
InsertIntoDataSource ( LogicalRelation, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, boolean )This constructor is from 'InsertIntoDataSource' class.
logicalRelation ( )This method is from 'InsertIntoDataSource' class.
overwrite ( )This method is from 'InsertIntoDataSource' class.
productArity ( )This method is from 'InsertIntoDataSource' class.
productElement ( int )This method is from 'InsertIntoDataSource' class.
productIterator ( )This method is from 'InsertIntoDataSource' class.
...
[+] LogicalRelation (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
andThen ( scala.Function1<LogicalRelation,A> )This method is from 'LogicalRelation' class.
attributeMap ( )This method is from 'LogicalRelation' class.
canEqual ( java.lang.Object )This method is from 'LogicalRelation' class.
compose ( scala.Function1<A,BaseRelation> )This method is from 'LogicalRelation' class.
copy ( BaseRelation )This method is from 'LogicalRelation' class.
equals ( java.lang.Object )This method is from 'LogicalRelation' class.
hashCode ( )This method is from 'LogicalRelation' class.
LogicalRelation ( BaseRelation )This constructor is from 'LogicalRelation' class.
newInstance ( )This method is from 'LogicalRelation' class.
output ( )This method is from 'LogicalRelation' class.
...
[+] PreWriteCheck (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
andThen ( scala.Function1<scala.runtime.BoxedUnit,A> )This method is from 'PreWriteCheck' class.
andThen.mcDD.sp ( scala.Function1<java.lang.Object,A> )This method is from 'PreWriteCheck' class.
andThen.mcDF.sp ( scala.Function1<java.lang.Object,A> )This method is from 'PreWriteCheck' class.
andThen.mcDI.sp ( scala.Function1<java.lang.Object,A> )This method is from 'PreWriteCheck' class.
andThen.mcDJ.sp ( scala.Function1<java.lang.Object,A> )This method is from 'PreWriteCheck' class.
andThen.mcFD.sp ( scala.Function1<java.lang.Object,A> )This method is from 'PreWriteCheck' class.
andThen.mcFF.sp ( scala.Function1<java.lang.Object,A> )This method is from 'PreWriteCheck' class.
andThen.mcFI.sp ( scala.Function1<java.lang.Object,A> )This method is from 'PreWriteCheck' class.
andThen.mcFJ.sp ( scala.Function1<java.lang.Object,A> )This method is from 'PreWriteCheck' class.
andThen.mcID.sp ( scala.Function1<java.lang.Object,A> )This method is from 'PreWriteCheck' class.
andThen.mcIF.sp ( scala.Function1<java.lang.Object,A> )This method is from 'PreWriteCheck' class.
...
[+] RefreshTable (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
canEqual ( java.lang.Object )This method is from 'RefreshTable' class.
copy ( java.lang.String, java.lang.String )This method is from 'RefreshTable' class.
curried ( )This method is from 'RefreshTable' class.
databaseName ( )This method is from 'RefreshTable' class.
equals ( java.lang.Object )This method is from 'RefreshTable' class.
hashCode ( )This method is from 'RefreshTable' class.
productArity ( )This method is from 'RefreshTable' class.
productElement ( int )This method is from 'RefreshTable' class.
productIterator ( )This method is from 'RefreshTable' class.
productPrefix ( )This method is from 'RefreshTable' class.
RefreshTable ( java.lang.String, java.lang.String )This constructor is from 'RefreshTable' class.
...
[+] ResolvedDataSource (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (>10)
apply ( org.apache.spark.sql.SQLContext, java.lang.String, org.apache.spark.sql.SaveMode, scala.collection.immutable.Map<java.lang.String,java.lang.String>, org.apache.spark.sql.DataFrame )This method is from 'ResolvedDataSource' class.
apply ( org.apache.spark.sql.SQLContext, scala.Option<org.apache.spark.sql.types.StructType>, java.lang.String, scala.collection.immutable.Map<java.lang.String,java.lang.String> )This method is from 'ResolvedDataSource' class.
canEqual ( java.lang.Object )This method is from 'ResolvedDataSource' class.
copy ( java.lang.Class<?>, BaseRelation )This method is from 'ResolvedDataSource' class.
equals ( java.lang.Object )This method is from 'ResolvedDataSource' class.
hashCode ( )This method is from 'ResolvedDataSource' class.
lookupDataSource ( java.lang.String )This method is from 'ResolvedDataSource' class.
productArity ( )This method is from 'ResolvedDataSource' class.
productElement ( int )This method is from 'ResolvedDataSource' class.
productIterator ( )This method is from 'ResolvedDataSource' class.
productPrefix ( )This method is from 'ResolvedDataSource' class.
...
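Clients that must run against both library versions can probe for a removed class by name before touching it; a minimal sketch (RemovedClassProbe is a hypothetical helper; the class name is taken from the entry above):

    public class RemovedClassProbe {
        public static void main(String[] args) {
            String name = "org.apache.spark.sql.sources.ResolvedDataSource";
            try {
                // Class.forName needs only the string, so this compiles and
                // runs against either library version.
                Class.forName(name);
                System.out.println(name + " present (1.3.0 behavior)");
            } catch (ClassNotFoundException e) {
                System.out.println(name + " absent (removed by 1.5.0)");
            }
        }
    }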
Problems with Data Types, Medium Severity (13)
spark-sql_2.10-1.3.0.jar
package org.apache.spark.sql.columnar
[+] BinaryColumnAccessor (1)
| Change | Effect |
---|
1 | Superclass has been changed from BasicColumnAccessor<org.apache.spark.sql.types.BinaryType.,byte[]> to BasicColumnAccessor<byte[]>. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |
[+] affected methods (1)
BinaryColumnAccessor ( java.nio.ByteBuffer )This constructor is from 'BinaryColumnAccessor' class.
[+] BinaryColumnBuilder (1)
| Change | Effect |
---|
1 | Superclass has been changed from ComplexColumnBuilder<org.apache.spark.sql.types.BinaryType.,byte[]> to ComplexColumnBuilder<byte[]>. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |
[+] affected methods (1)
BinaryColumnBuilder ( )This constructor is from 'BinaryColumnBuilder' class.
[+] NativeColumnType<T> (1)
| Change | Effect |
---|
1 | Superclass has been changed from ColumnType<T,java.lang.Object> to ColumnType<java.lang.Object>. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |
[+] affected methods (3)
decoder ( java.nio.ByteBuffer, NativeColumnType<T> )2nd parameter 'p2' of this abstract method has type 'NativeColumnType<T>'.
encoder ( NativeColumnType<T> )1st parameter 'p1' of this abstract method has type 'NativeColumnType<T>'.
scalaTag ( )This method is from 'NativeColumnType<T>' abstract class.
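The superclass changes above share one failure mode: call sites compiled against 1.3.0 may reference members that only the old superclass supplied. A deliberately simplified, hypothetical sketch (OldBase, Child, and sizeInBytes are invented stand-ins, not Spark API):

    // "Version 1" shape: Child inherits sizeInBytes() from OldBase.
    class OldBase { long sizeInBytes() { return 0L; } }
    class Child extends OldBase { }

    public class SuperclassChangeDemo {
        public static void main(String[] args) {
            // Compiles and runs cleanly against version 1.
            System.out.println(new Child().sizeInBytes());
            // If version 2 re-parents Child onto a superclass without
            // sizeInBytes(), this already-compiled call site still holds the
            // symbolic reference and fails with java.lang.NoSuchMethodError.
        }
    }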
package org.apache.spark.sql.execution
[+] Aggregate.ComputedAggregate. (1)
| Change | Effect |
---|
1 | Superclass has been changed from scala.runtime.AbstractFunction3<org.apache.spark.sql.catalyst.expressions.AggregateExpression,org.apache.spark.sql.catalyst.expressions.AggregateExpression,org.apache.spark.sql.catalyst.expressions.AttributeReference,Aggregate.ComputedAggregate> to scala.runtime.AbstractFunction3<org.apache.spark.sql.catalyst.expressions.AggregateExpression1,org.apache.spark.sql.catalyst.expressions.AggregateExpression1,org.apache.spark.sql.catalyst.expressions.AttributeReference,Aggregate.ComputedAggregate>. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |
[+] affected methods (1)
ComputedAggregate ( )Return value of this method has type 'Aggregate.ComputedAggregate.'.
[+] CacheTableCommand (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |
[+] affected methods (>10)
CacheTableCommand ( java.lang.String, scala.Option<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>, boolean )This constructor is from 'CacheTableCommand' class.
canEqual ( java.lang.Object )This method is from 'CacheTableCommand' class.
copy ( java.lang.String, scala.Option<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>, boolean )Return value of this method has type 'CacheTableCommand'.
curried ( )This method is from 'CacheTableCommand' class.
equals ( java.lang.Object )This method is from 'CacheTableCommand' class.
hashCode ( )This method is from 'CacheTableCommand' class.
isLazy ( )This method is from 'CacheTableCommand' class.
output ( )This method is from 'CacheTableCommand' class.
plan ( )This method is from 'CacheTableCommand' class.
productArity ( )This method is from 'CacheTableCommand' class.
productElement ( int )This method is from 'CacheTableCommand' class.
...
[+] DescribeCommand (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |
[+] affected methods (>10)
canEqual ( java.lang.Object )This method is from 'DescribeCommand' class.
child ( )This method is from 'DescribeCommand' class.
copy ( SparkPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean )Return value of this method has type 'DescribeCommand'.
curried ( )This method is from 'DescribeCommand' class.
DescribeCommand ( SparkPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean )This constructor is from 'DescribeCommand' class.
equals ( java.lang.Object )This method is from 'DescribeCommand' class.
hashCode ( )This method is from 'DescribeCommand' class.
isExtended ( )This method is from 'DescribeCommand' class.
output ( )This method is from 'DescribeCommand' class.
productArity ( )This method is from 'DescribeCommand' class.
productElement ( int )This method is from 'DescribeCommand' class.
...
[+] ExplainCommand (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |
[+] affected methods (>10)
canEqual ( java.lang.Object )This method is from 'ExplainCommand' class.
copy ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean )Return value of this method has type 'ExplainCommand'.
curried ( )This method is from 'ExplainCommand' class.
equals ( java.lang.Object )This method is from 'ExplainCommand' class.
ExplainCommand ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean )This constructor is from 'ExplainCommand' class.
extended ( )This method is from 'ExplainCommand' class.
hashCode ( )This method is from 'ExplainCommand' class.
logicalPlan ( )This method is from 'ExplainCommand' class.
output ( )This method is from 'ExplainCommand' class.
productArity ( )This method is from 'ExplainCommand' class.
productElement ( int )This method is from 'ExplainCommand' class.
...
[+] RunnableCommand (1)
| Change | Effect |
---|
1 | Abstract method output ( ) has been added to this interface. | A client program may be interrupted by an AbstractMethodError exception. The added abstract method is called in the 2nd library version by the method output ( ) and may not be implemented by old clients. |
[+] affected methods (4)
cmd ( )Return value of this method has type 'RunnableCommand'.
copy ( RunnableCommand )1st parameter 'cmd' of this method has type 'RunnableCommand'.
ExecutedCommand ( RunnableCommand )1st parameter 'cmd' of this method has type 'RunnableCommand'.
run ( org.apache.spark.sql.SQLContext )This abstract method is from 'RunnableCommand' interface.
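A hypothetical sketch of the AbstractMethodError case (MiniCommand and OldClientCommand are invented stand-ins for RunnableCommand and a 1.3.0-era client implementation):

    // "Version 1" of the interface has only run(); "version 2" also
    // declares the abstract output() noted above.
    interface MiniCommand {
        void run();
        // void output();  // added in version 2
    }

    // Client implementation compiled against version 1:
    class OldClientCommand implements MiniCommand {
        public void run() { System.out.println("running"); }
    }

    public class AbstractMethodDemo {
        public static void main(String[] args) {
            MiniCommand cmd = new OldClientCommand();
            cmd.run();
            // Under version 2 the library calls cmd.output(); because
            // OldClientCommand never implemented it, the JVM raises
            // java.lang.AbstractMethodError at that call site.
        }
    }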
[+] SetCommand (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |
[+] affected methods (10)
canEqual ( java.lang.Object )This method is from 'SetCommand' class.
equals ( java.lang.Object )This method is from 'SetCommand' class.
hashCode ( )This method is from 'SetCommand' class.
kv ( )This method is from 'SetCommand' class.
output ( )This method is from 'SetCommand' class.
productArity ( )This method is from 'SetCommand' class.
productElement ( int )This method is from 'SetCommand' class.
productIterator ( )This method is from 'SetCommand' class.
productPrefix ( )This method is from 'SetCommand' class.
run ( org.apache.spark.sql.SQLContext )This method is from 'SetCommand' class.
[+] ShowTablesCommand (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |
[+] affected methods (>10)
andThen ( scala.Function1<ShowTablesCommand,A> )This method is from 'ShowTablesCommand' class.
canEqual ( java.lang.Object )This method is from 'ShowTablesCommand' class.
compose ( scala.Function1<A,scala.Option<java.lang.String>> )This method is from 'ShowTablesCommand' class.
copy ( scala.Option<java.lang.String> )Return value of this method has type 'ShowTablesCommand'.
databaseName ( )This method is from 'ShowTablesCommand' class.
equals ( java.lang.Object )This method is from 'ShowTablesCommand' class.
hashCode ( )This method is from 'ShowTablesCommand' class.
output ( )This method is from 'ShowTablesCommand' class.
productArity ( )This method is from 'ShowTablesCommand' class.
productElement ( int )This method is from 'ShowTablesCommand' class.
productIterator ( )This method is from 'ShowTablesCommand' class.
...
[+] UnaryNode (1)
| Change | Effect |
---|
1 | Abstract method child ( ) has been added to this interface. | A client program may be interrupted by an AbstractMethodError exception. The added abstract method is called in the 2nd library version by the method outputPartitioning ( UnaryNode ) and may not be implemented by old clients. |
[+] affected methods (1)
outputPartitioning ( )This abstract method is from 'UnaryNode' interface.
[+] UncacheTableCommand (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |
[+] affected methods (>10)
andThen ( scala.Function1<UncacheTableCommand,A> )This method is from 'UncacheTableCommand' class.
canEqual ( java.lang.Object )This method is from 'UncacheTableCommand' class.
compose ( scala.Function1<A,java.lang.String> )This method is from 'UncacheTableCommand' class.
copy ( java.lang.String )Return value of this method has type 'UncacheTableCommand'.
equals ( java.lang.Object )This method is from 'UncacheTableCommand' class.
hashCode ( )This method is from 'UncacheTableCommand' class.
output ( )This method is from 'UncacheTableCommand' class.
productArity ( )This method is from 'UncacheTableCommand' class.
productElement ( int )This method is from 'UncacheTableCommand' class.
productIterator ( )This method is from 'UncacheTableCommand' class.
productPrefix ( )This method is from 'UncacheTableCommand' class.
...
package org.apache.spark.sql.execution.joins
[+] HashJoin (1)
| Change | Effect |
---|
1 | Abstract method isUnsafeMode ( ) has been added to this interface. | A client program may be interrupted by an AbstractMethodError exception. The added abstract method is called in the 2nd library version by the method buildSideKeyGenerator ( org.apache.spark.sql.execution.SparkPlan ) and may not be implemented by old clients. |
[+] affected methods (>10)
buildKeys ( )This abstract method is from 'HashJoin' interface.
buildPlan ( )This abstract method is from 'HashJoin' interface.
buildSide ( )This abstract method is from 'HashJoin' interface.
buildSideKeyGenerator ( )This abstract method is from 'HashJoin' interface.
left ( )This abstract method is from 'HashJoin' interface.
leftKeys ( )This abstract method is from 'HashJoin' interface.
output ( )This abstract method is from 'HashJoin' interface.
right ( )This abstract method is from 'HashJoin' interface.
rightKeys ( )This abstract method is from 'HashJoin' interface.
streamedKeys ( )This abstract method is from 'HashJoin' interface.
streamedPlan ( )This abstract method is from 'HashJoin' interface.
...
Problems with Methods, Medium Severity (1)
spark-sql_2.10-1.3.0.jar, SparkPlan
package org.apache.spark.sql.execution
[+] SparkPlan.execute ( ) [abstract] : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row> (1)
[mangled: org/apache/spark/sql/execution/SparkPlan.execute:()Lorg/apache/spark/rdd/RDD;]
| Change | Effect |
---|
1 | Method became final. | A client program trying to reimplement this method may be interrupted by a VerifyError exception. |
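The JVM enforces this at class-loading time: the verifier rejects any class that overrides a final method before a single line of client code runs. A simplified, hypothetical sketch (Plan and MyPlan are invented stand-ins for SparkPlan and a client plan):

    // "Version 1": execute() is abstract, so clients must override it.
    abstract class Plan { public abstract String execute(); }

    class MyPlan extends Plan {
        @Override public String execute() { return "rows"; }  // legal in v1
    }

    public class FinalMethodDemo {
        public static void main(String[] args) {
            System.out.println(new MyPlan().execute());
            // In "version 2", execute() becomes final (as SparkPlan.execute()
            // did in 1.5.0). Loading the already-compiled MyPlan then fails:
            // java.lang.VerifyError: ... overrides final method execute.
        }
    }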
Problems with Data Types, Low Severity (36)
spark-sql_2.10-1.3.0.jar
package org.apache.spark.sql
[+] DataFrame (1)
| Change | Effect |
---|
1 | A super-class has been added. | A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |
[+] affected methods (>10)
agg ( java.util.Map<java.lang.String,java.lang.String> )Return value of this method has type 'DataFrame'.
agg ( Column, Column... )Return value of this method has type 'DataFrame'.
agg ( Column, scala.collection.Seq<Column> )Return value of this method has type 'DataFrame'.
agg ( scala.collection.immutable.Map<java.lang.String,java.lang.String> )Return value of this method has type 'DataFrame'.
agg ( scala.Tuple2<java.lang.String,java.lang.String>, scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.String>> )Return value of this method has type 'DataFrame'.
apply ( java.lang.String )This method is from 'DataFrame' class.
as ( java.lang.String )Return value of this method has type 'DataFrame'.
as ( scala.Symbol )Return value of this method has type 'DataFrame'.
cache ( )Return value of this method has type 'DataFrame'.
col ( java.lang.String )This method is from 'DataFrame' class.
collect ( )This method is from 'DataFrame' class.
...
[+] SQLContext.implicits. (1)
| Change | Effect |
---|
1 | Super-class SQLImplicits has been added. | A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |
[+] affected methods (1)
implicits ( )Return value of this method has type 'SQLContext.implicits.'.
package org.apache.spark.sql.columnar
[+] InMemoryColumnarTableScan (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] TimestampColumnStats (1)
| Change | Effect |
---|
1 | Super-class LongColumnStats has been added. | A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |
[+] affected methods (1)
TimestampColumnStats ( )This constructor is from 'TimestampColumnStats' class.
package org.apache.spark.sql.execution
[+] Aggregate (2)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
2 | Method requiredChildDistribution ( ) has been overridden by requiredChildDistribution ( ). | The overriding requiredChildDistribution ( ) will be called instead of the original requiredChildDistribution ( ) in a client program. |
[+] affected methods (2)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
requiredChildDistribution ( )Method 'requiredChildDistribution ( )' will be called instead of this method in a client program.
[+] BatchPythonEvaluation (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Except (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Exchange (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] ExecutedCommand (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Expand (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] ExternalSort (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Filter (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Generate (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Intersect (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Limit (2)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
2 | Method outputPartitioning ( ) has been moved up the type hierarchy to outputPartitioning ( ). | The inherited outputPartitioning ( ) will be called instead of this class's outputPartitioning ( ) in a client program. |
[+] affected methods (2)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
outputPartitioning ( )Method 'outputPartitioning ( )' will be called instead of this method in a client program.
[+] LocalTableScan (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] OutputFaker (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] PhysicalRDD (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Project (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Sample (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Sort (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] SparkPlan (1)
| Change | Effect |
---|
1 | Abstract method execute ( ) became non-abstract. | Some methods in this class may change behavior. |
[+] affected methods (1)
execute ( )This abstract method is from 'SparkPlan' abstract class.
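All of the "moved up" entries in this section describe the same mechanism: in 1.3.0 each operator overrode execute(), while in 1.5.0 the override is gone and dispatch lands on the (now concrete) SparkPlan.execute(). A hypothetical sketch of the dispatch change (Base and Derived are invented stand-ins):

    // "Version 1" shape: Derived supplies its own execute().
    class Base { String execute() { return "base implementation"; } }
    class Derived extends Base {
        @Override String execute() { return "derived implementation"; }
    }

    public class MovedUpDemo {
        public static void main(String[] args) {
            Base plan = new Derived();
            // Prints the derived result against version 1. Once version 2
            // deletes the override, the same virtual call resolves to
            // Base.execute(): a silent behavior change, not an exception.
            System.out.println(plan.execute());
        }
    }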
[+] Union (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
package org.apache.spark.sql.execution.joins
[+] BroadcastHashJoin (2)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
2 | Method requiredChildDistribution ( ) has been moved up the type hierarchy to requiredChildDistribution ( ). | The inherited requiredChildDistribution ( ) will be called instead of this class's requiredChildDistribution ( ) in a client program. |
[+] affected methods (2)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
requiredChildDistribution ( )Method 'requiredChildDistribution ( )' will be called instead of this method in a client program.
[+] BroadcastLeftSemiJoinHash (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] BroadcastNestedLoopJoin (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] CartesianProduct (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] LeftSemiJoinBNL (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] LeftSemiJoinHash (3)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
2 | Method requiredChildDistribution ( ) has been moved up the type hierarchy to requiredChildDistribution ( ). | The inherited requiredChildDistribution ( ) will be called instead of this class's requiredChildDistribution ( ) in a client program. |
3 | Method outputPartitioning ( ) has been overridden by outputPartitioning ( ). | The overriding outputPartitioning ( ) will be called instead of the original outputPartitioning ( ) in a client program. |
[+] affected methods (3)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
requiredChildDistribution ( )Method 'requiredChildDistribution ( )' will be called instead of this method in a client program.
outputPartitioning ( )Method 'outputPartitioning ( )' will be called instead of this method in a client program.
[+] ShuffledHashJoin (2)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up the type hierarchy to execute ( ). | The inherited execute ( ) will be called instead of this class's execute ( ) in a client program. |
2 | Method requiredChildDistribution ( ) has been moved up the type hierarchy to requiredChildDistribution ( ). | The inherited requiredChildDistribution ( ) will be called instead of this class's requiredChildDistribution ( ) in a client program. |
[+] affected methods (2)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
requiredChildDistribution ( )Method 'requiredChildDistribution ( )' will be called instead of this method in a client program.
Problems with Methods, Low Severity (1)
spark-sql_2.10-1.3.0.jar, SparkPlan
package org.apache.spark.sql.execution
[+] SparkPlan.execute ( ) [abstract] : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row> (1)
[mangled: org/apache/spark/sql/execution/SparkPlan.execute:()Lorg/apache/spark/rdd/RDD;]
| Change | Effect |
---|
1 | Method became non-abstract. | A client program may change behavior. |
Other Changes in Data Types (14)
spark-sql_2.10-1.3.0.jar
package org.apache.spark.sql.columnar
[+] ColumnBuilder (1)
| Change | Effect |
---|
1 | Abstract method appendFrom ( org.apache.spark.sql.catalyst.InternalRow, int ) has been added to this interface. | No effect. |
[+] affected methods (3)
build ( )This abstract method is from 'ColumnBuilder' interface.
columnStats ( )This abstract method is from 'ColumnBuilder' interface.
initialize ( int, java.lang.String, boolean )This abstract method is from 'ColumnBuilder' interface.
[+] ColumnStats (2)
| Change | Effect |
---|
1 | Abstract method collectedStatistics ( ) has been added to this interface. | No effect. |
2 | Abstract method gatherStats ( org.apache.spark.sql.catalyst.InternalRow, int ) has been added to this interface. | No effect. |
[+] affected methods (7)
columnStats ( )Return value of this abstract method has type 'ColumnStats'.
count ( )This abstract method is from 'ColumnStats' interface.
count_.eq ( int )This abstract method is from 'ColumnStats' interface.
nullCount ( )This abstract method is from 'ColumnStats' interface.
nullCount_.eq ( int )This abstract method is from 'ColumnStats' interface.
sizeInBytes ( )This abstract method is from 'ColumnStats' interface.
sizeInBytes_.eq ( long )This abstract method is from 'ColumnStats' interface.
[+] NullableColumnBuilder (2)
| Change | Effect |
---|
1 | Abstract method appendFrom ( org.apache.spark.sql.catalyst.InternalRow, int ) has been added to this interface. | No effect. |
2 | Abstract method NullableColumnBuilder..super.appendFrom ( org.apache.spark.sql.catalyst.InternalRow, int ) has been added to this interface. | No effect. |
[+] affected methods (>10)
build ( )This abstract method is from 'NullableColumnBuilder' interface.
buildNonNulls ( )This abstract method is from 'NullableColumnBuilder' interface.
initialize ( int, java.lang.String, boolean )This abstract method is from 'NullableColumnBuilder' interface.
nullCount ( )This abstract method is from 'NullableColumnBuilder' interface.
nullCount_.eq ( int )This abstract method is from 'NullableColumnBuilder' interface.
nulls ( )This abstract method is from 'NullableColumnBuilder' interface.
nulls_.eq ( java.nio.ByteBuffer )This abstract method is from 'NullableColumnBuilder' interface.
NullableColumnBuilder..pos ( )This abstract method is from 'NullableColumnBuilder' interface.
NullableColumnBuilder..pos_.eq ( int )This abstract method is from 'NullableColumnBuilder' interface.
NullableColumnBuilder..super.build ( )This abstract method is from 'NullableColumnBuilder' interface.
NullableColumnBuilder..super.initialize ( int, java.lang.String, boolean )This abstract method is from 'NullableColumnBuilder' interface.
...
package org.apache.spark.sql.columnar.compression
[+] Encoder<T> (1)
| Change | Effect |
---|
1 | Abstract method gatherCompressibilityStats ( org.apache.spark.sql.catalyst.InternalRow, int ) has been added to this interface. | No effect. |
[+] affected methods (5)
encoder ( org.apache.spark.sql.columnar.NativeColumnType<T> )Return value of this abstract method has type 'Encoder<T>'.
compress ( java.nio.ByteBuffer, java.nio.ByteBuffer )This abstract method is from 'Encoder<T>' interface.
compressedSize ( )This abstract method is from 'Encoder<T>' interface.
compressionRatio ( )This abstract method is from 'Encoder<T>' interface.
uncompressedSize ( )This abstract method is from 'Encoder<T>' interface.
package org.apache.spark.sql.execution
[+] RunnableCommand (1)
| Change | Effect |
---|
1 | Abstract method children ( ) has been added to this interface. | No effect. |
[+] affected methods (4)
cmd ( )Return value of this method has type 'RunnableCommand'.
copy ( RunnableCommand )1st parameter 'cmd' of this method has type 'RunnableCommand'.
ExecutedCommand ( RunnableCommand )1st parameter 'cmd' of this method has type 'RunnableCommand'.
run ( org.apache.spark.sql.SQLContext )This abstract method is from 'RunnableCommand' interface.
[+] SparkPlan (1)
| Change | Effect |
---|
1 | Abstract method doExecute ( ) has been added to this class. | No effect. |
[+] affected methods (>10)
child ( )Return value of this method has type 'SparkPlan'.
Aggregate ( boolean, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>, SparkPlan )4th parameter 'child' of this method has type 'SparkPlan'.
child ( )Return value of this method has type 'SparkPlan'.
copy ( boolean, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>, SparkPlan )4th parameter 'child' of this method has type 'SparkPlan'.
BatchPythonEvaluation ( PythonUDF, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan )3rd parameter 'child' of this method has type 'SparkPlan'.
child ( )Return value of this method has type 'SparkPlan'.
copy ( PythonUDF, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan )3rd parameter 'child' of this method has type 'SparkPlan'.
child ( )Return value of this method has type 'SparkPlan'.
copy ( SparkPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean )1st parameter 'child' of this method has type 'SparkPlan'.
DescribeCommand ( SparkPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean )1st parameter 'child' of this method has type 'SparkPlan'.
copy ( SparkPlan, SparkPlan )2nd parameter 'right' of this method has type 'SparkPlan'.
...
[+] UnaryNode (1)
| Change | Effect |
---|
1 | Abstract method children ( ) has been added to this interface. | No effect. |
[+] affected methods (1)
outputPartitioning ( )This abstract method is from 'UnaryNode' interface.
package org.apache.spark.sql.execution.joins
[+] HashJoin (5)
| Change | Effect |
---|
1 | Abstract method canProcessSafeRows ( ) has been added to this interface. | No effect. |
2 | Abstract method canProcessUnsafeRows ( ) has been added to this interface. | No effect. |
3 | Abstract method hashJoin ( scala.collection.Iterator<org.apache.spark.sql.catalyst.InternalRow>, org.apache.spark.sql.execution.metric.LongSQLMetric, HashedRelation, org.apache.spark.sql.execution.metric.LongSQLMetric ) has been added to this interface. | No effect. |
4 | Abstract method outputsUnsafeRows ( ) has been added to this interface. | No effect. |
5 | Abstract method streamSideKeyGenerator ( ) has been added to this interface. | No effect. |
[+] affected methods (>10)
buildKeys ( )This abstract method is from 'HashJoin' interface.
buildPlan ( )This abstract method is from 'HashJoin' interface.
buildSide ( )This abstract method is from 'HashJoin' interface.
buildSideKeyGenerator ( )This abstract method is from 'HashJoin' interface.
left ( )This abstract method is from 'HashJoin' interface.
leftKeys ( )This abstract method is from 'HashJoin' interface.
output ( )This abstract method is from 'HashJoin' interface.
right ( )This abstract method is from 'HashJoin' interface.
rightKeys ( )This abstract method is from 'HashJoin' interface.
streamedKeys ( )This abstract method is from 'HashJoin' interface.
streamedPlan ( )This abstract method is from 'HashJoin' interface.
...
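Unlike the medium-severity additions earlier in this report, the checker marks these added abstract methods "No effect", apparently because it does not see the 1.5.0 library invoking them against client implementations; pure callers are safe in either case. A hypothetical sketch of the caller-only situation (Stats is an invented stand-in, not a Spark interface):

    // The client is a pure caller of the interface.
    interface Stats { long sizeInBytes(); }  // version 2 may add more methods

    public class CallerOnlyDemo {
        static void report(Stats s) {
            // This call site names only sizeInBytes(); abstract methods added
            // in version 2 are invisible to it, hence "No effect".
            System.out.println(s.sizeInBytes());
        }
        public static void main(String[] args) {
            report(() -> 42L);  // trivial stand-in implementation
        }
    }

A client that implements one of these interfaces and is then handed to library code calling a new method would instead hit the AbstractMethodError case described in the medium-severity section above.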
Java ARchives (1)
spark-sql_2.10-1.3.0.jar
Generated on Wed Oct 28 11:09:44 2015 for succinct-0.1.2 by Java API Compliance Checker 1.4.1
A tool for checking backward compatibility of a Java library API