Binary compatibility report for the succinct-0.1.2 library between versions 1.3.0 and 1.4.0 (relating to the portability of the client application succinct-0.1.2.jar)
Test Info

| Library Name | succinct-0.1.2 |
|---|---|
| Version #1 | 1.3.0 |
| Version #2 | 1.4.0 |
| Java Version | 1.7.0_75 |
Test Results

| Total Java Archives | 1 |
|---|---|
| Total Methods / Classes | 2480 / 463 |
| Verdict | Incompatible (13.1%) |
Problem Summary

| | Severity | Count |
|---|---|---|
| Added Methods | - | 246 |
| Removed Methods | High | 198 |
| Problems with Data Types | High | 21 |
| | Medium | 15 |
| | Low | 44 |
| Problems with Methods | High | 3 |
| | Medium | 1 |
| | Low | 1 |
| Other Changes in Data Types | - | 4 |
Added Methods (246)
spark-sql_2.10-1.4.0.jar, Aggregate.class
package org.apache.spark.sql.execution
Aggregate.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Aggregate.doExecute:()Lorg/apache/spark/rdd/RDD;]
Aggregate.requiredChildDistribution ( ) : scala.collection.immutable.List<org.apache.spark.sql.catalyst.plans.physical.Distribution>
[mangled: org/apache/spark/sql/execution/Aggregate.requiredChildDistribution:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.4.0.jar, AppendingParquetOutputFormat.class
package org.apache.spark.sql.parquet
AppendingParquetOutputFormat.getOutputCommitter ( org.apache.hadoop.mapreduce.TaskAttemptContext context ) : org.apache.hadoop.mapreduce.OutputCommitter
[mangled: org/apache/spark/sql/parquet/AppendingParquetOutputFormat.getOutputCommitter:(Lorg/apache/hadoop/mapreduce/TaskAttemptContext;)Lorg/apache/hadoop/mapreduce/OutputCommitter;]
spark-sql_2.10-1.4.0.jar, BaseRelation.class
package org.apache.spark.sql.sources
BaseRelation.needConversion ( ) : boolean
[mangled: org/apache/spark/sql/sources/BaseRelation.needConversion:()Z]
spark-sql_2.10-1.4.0.jar, BatchPythonEvaluation.class
package org.apache.spark.sql.execution
BatchPythonEvaluation.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/BatchPythonEvaluation.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, BroadcastHashJoin.class
package org.apache.spark.sql.execution.joins
BroadcastHashJoin.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, BroadcastLeftSemiJoinHash.class
package org.apache.spark.sql.execution.joins
BroadcastLeftSemiJoinHash.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, BroadcastNestedLoopJoin.class
package org.apache.spark.sql.execution.joins
BroadcastNestedLoopJoin.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/BroadcastNestedLoopJoin.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, CacheTableCommand.class
package org.apache.spark.sql.execution
CacheTableCommand.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/CacheTableCommand.children:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, CartesianProduct.class
package org.apache.spark.sql.execution.joins
CartesianProduct.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/CartesianProduct.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, CatalystConverter.class
package org.apache.spark.sql.parquet
CatalystConverter.readDecimal ( org.apache.spark.sql.types.Decimal dest, parquet.io.api.Binary value, org.apache.spark.sql.types.DecimalType ctype ) : org.apache.spark.sql.types.Decimal
[mangled: org/apache/spark/sql/parquet/CatalystConverter.readDecimal:(Lorg/apache/spark/sql/types/Decimal;Lparquet/io/api/Binary;Lorg/apache/spark/sql/types/DecimalType;)Lorg/apache/spark/sql/types/Decimal;]
CatalystConverter.updateDate ( int fieldIndex, int value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateDate:(II)V]
CatalystConverter.updateString ( int fieldIndex, byte[ ] value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateString:(I[B)V]
spark-sql_2.10-1.4.0.jar, Column.class
package org.apache.spark.sql
Column.alias ( String alias ) : Column
[mangled: org/apache/spark/sql/Column.alias:(Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
Column.apply ( Object extraction ) : Column
[mangled: org/apache/spark/sql/Column.apply:(Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.as ( scala.collection.Seq<String> aliases ) : Column
[mangled: org/apache/spark/sql/Column.as:(Lscala/collection/Seq;)Lorg/apache/spark/sql/Column;]
Column.as ( String alias, types.Metadata metadata ) : Column
[mangled: org/apache/spark/sql/Column.as:(Ljava/lang/String;Lorg/apache/spark/sql/types/Metadata;)Lorg/apache/spark/sql/Column;]
Column.as ( String[ ] aliases ) : Column
[mangled: org/apache/spark/sql/Column.as:([Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
Column.between ( Object lowerBound, Object upperBound ) : Column
[mangled: org/apache/spark/sql/Column.between:(Ljava/lang/Object;Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.bitwiseAND ( Object other ) : Column
[mangled: org/apache/spark/sql/Column.bitwiseAND:(Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.bitwiseOR ( Object other ) : Column
[mangled: org/apache/spark/sql/Column.bitwiseOR:(Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.bitwiseXOR ( Object other ) : Column
[mangled: org/apache/spark/sql/Column.bitwiseXOR:(Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.equals ( Object that ) : boolean
[mangled: org/apache/spark/sql/Column.equals:(Ljava/lang/Object;)Z]
Column.getItem ( Object key ) : Column
[mangled: org/apache/spark/sql/Column.getItem:(Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.hashCode ( ) : int
[mangled: org/apache/spark/sql/Column.hashCode:()I]
Column.isTraceEnabled ( ) : boolean
[mangled: org/apache/spark/sql/Column.isTraceEnabled:()Z]
Column.log ( ) : org.slf4j.Logger
[mangled: org/apache/spark/sql/Column.log:()Lorg/slf4j/Logger;]
Column.logDebug ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/Column.logDebug:(Lscala/Function0;)V]
Column.logDebug ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/Column.logDebug:(Lscala/Function0;Ljava/lang/Throwable;)V]
Column.logError ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/Column.logError:(Lscala/Function0;)V]
Column.logError ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/Column.logError:(Lscala/Function0;Ljava/lang/Throwable;)V]
Column.logInfo ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/Column.logInfo:(Lscala/Function0;)V]
Column.logInfo ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/Column.logInfo:(Lscala/Function0;Ljava/lang/Throwable;)V]
Column.logName ( ) : String
[mangled: org/apache/spark/sql/Column.logName:()Ljava/lang/String;]
Column.logTrace ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/Column.logTrace:(Lscala/Function0;)V]
Column.logTrace ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/Column.logTrace:(Lscala/Function0;Ljava/lang/Throwable;)V]
Column.logWarning ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/Column.logWarning:(Lscala/Function0;)V]
Column.logWarning ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/Column.logWarning:(Lscala/Function0;Ljava/lang/Throwable;)V]
Column.org.apache.spark.Logging..log_ ( ) : org.slf4j.Logger
[mangled: org/apache/spark/sql/Column.org.apache.spark.Logging..log_:()Lorg/slf4j/Logger;]
Column.org.apache.spark.Logging..log__.eq ( org.slf4j.Logger p1 ) : void
[mangled: org/apache/spark/sql/Column.org.apache.spark.Logging..log__.eq:(Lorg/slf4j/Logger;)V]
Column.otherwise ( Object value ) : Column
[mangled: org/apache/spark/sql/Column.otherwise:(Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
Column.over ( expressions.WindowSpec window ) : Column
[mangled: org/apache/spark/sql/Column.over:(Lorg/apache/spark/sql/expressions/WindowSpec;)Lorg/apache/spark/sql/Column;]
Column.when ( Column condition, Object value ) : Column
[mangled: org/apache/spark/sql/Column.when:(Lorg/apache/spark/sql/Column;Ljava/lang/Object;)Lorg/apache/spark/sql/Column;]
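Several of the `Column` methods added above (`when`/`otherwise`, `between`, and the bitwise operators) belong to the public expression DSL. A minimal sketch of how a client might combine them, assuming `df` is any pre-existing DataFrame and the column names (`age`, `flags`, `name`) are hypothetical:

```scala
import org.apache.spark.sql.functions._

// Conditional expression: functions.when starts the chain,
// Column.when adds further branches, Column.otherwise the default.
val bracket = when(col("age") < 18, "minor")
  .when(col("age") < 65, "adult")
  .otherwise("senior")

// Range and bitwise predicates added on Column in 1.4.0.
val working = col("age").between(18, 65)
val flagged = col("flags").bitwiseAND(4) !== 0

df.select(col("name"), bracket.as("bracket")).filter(working && flagged)
```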
spark-sql_2.10-1.4.0.jar, CreateTableUsing.class
package org.apache.spark.sql.sources
CreateTableUsing.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/sources/CreateTableUsing.children:()Lscala/collection/Seq;]
CreateTableUsing.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/sources/CreateTableUsing.output:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, CreateTableUsingAsSelect.class
package org.apache.spark.sql.sources
CreateTableUsingAsSelect.copy ( String tableName, String provider, boolean temporary, String[ ] partitionColumns, org.apache.spark.sql.SaveMode mode, scala.collection.immutable.Map<String,String> options, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan child ) : CreateTableUsingAsSelect
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.copy:(Ljava/lang/String;Ljava/lang/String;Z[Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/sources/CreateTableUsingAsSelect;]
CreateTableUsingAsSelect.CreateTableUsingAsSelect ( String tableName, String provider, boolean temporary, String[ ] partitionColumns, org.apache.spark.sql.SaveMode mode, scala.collection.immutable.Map<String,String> options, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan child )
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect."<init>":(Ljava/lang/String;Ljava/lang/String;Z[Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)V]
CreateTableUsingAsSelect.partitionColumns ( ) : String[ ]
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.partitionColumns:()[Ljava/lang/String;]
spark-sql_2.10-1.4.0.jar, CreateTempTableUsing.class
package org.apache.spark.sql.sources
CreateTempTableUsing.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.children:()Lscala/collection/Seq;]
CreateTempTableUsing.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsing.output:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, CreateTempTableUsingAsSelect.class
package org.apache.spark.sql.sources
CreateTempTableUsingAsSelect.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.children:()Lscala/collection/Seq;]
CreateTempTableUsingAsSelect.copy ( String tableName, String provider, String[ ] partitionColumns, org.apache.spark.sql.SaveMode mode, scala.collection.immutable.Map<String,String> options, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan query ) : CreateTempTableUsingAsSelect
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.copy:(Ljava/lang/String;Ljava/lang/String;[Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/sources/CreateTempTableUsingAsSelect;]
CreateTempTableUsingAsSelect.CreateTempTableUsingAsSelect ( String tableName, String provider, String[ ] partitionColumns, org.apache.spark.sql.SaveMode mode, scala.collection.immutable.Map<String,String> options, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan query )
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect."<init>":(Ljava/lang/String;Ljava/lang/String;[Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)V]
CreateTempTableUsingAsSelect.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.output:()Lscala/collection/Seq;]
CreateTempTableUsingAsSelect.partitionColumns ( ) : String[ ]
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.partitionColumns:()[Ljava/lang/String;]
spark-sql_2.10-1.4.0.jar, DataFrame.class
package org.apache.spark.sql
DataFrame.coalesce ( int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.coalesce:(I)Lorg/apache/spark/sql/DataFrame;]
DataFrame.cube ( Column... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.cube ( scala.collection.Seq<Column> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:(Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.cube ( String col1, scala.collection.Seq<String> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.cube ( String col1, String... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.describe ( scala.collection.Seq<String> cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.describe:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.describe ( String... cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.describe:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.drop ( String colName ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.drop:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.dropDuplicates ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.dropDuplicates:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.dropDuplicates ( scala.collection.Seq<String> colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.dropDuplicates:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.dropDuplicates ( String[ ] colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.dropDuplicates:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, String usingColumn ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.na ( ) : DataFrameNaFunctions
[mangled: org/apache/spark/sql/DataFrame.na:()Lorg/apache/spark/sql/DataFrameNaFunctions;]
DataFrame.DataFrame..logicalPlanToDataFrame ( catalyst.plans.logical.LogicalPlan logicalPlan ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.org.apache.spark.sql.DataFrame..logicalPlanToDataFrame:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.randomSplit ( double[ ] weights ) : DataFrame[ ]
[mangled: org/apache/spark/sql/DataFrame.randomSplit:([D)[Lorg/apache/spark/sql/DataFrame;]
DataFrame.randomSplit ( double[ ] weights, long seed ) : DataFrame[ ]
[mangled: org/apache/spark/sql/DataFrame.randomSplit:([DJ)[Lorg/apache/spark/sql/DataFrame;]
DataFrame.randomSplit ( scala.collection.immutable.List<Object> weights, long seed ) : DataFrame[ ]
[mangled: org/apache/spark/sql/DataFrame.randomSplit:(Lscala/collection/immutable/List;J)[Lorg/apache/spark/sql/DataFrame;]
DataFrame.rollup ( Column... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.rollup ( scala.collection.Seq<Column> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:(Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.rollup ( String col1, scala.collection.Seq<String> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.rollup ( String col1, String... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.stat ( ) : DataFrameStatFunctions
[mangled: org/apache/spark/sql/DataFrame.stat:()Lorg/apache/spark/sql/DataFrameStatFunctions;]
DataFrame.write ( ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrame.write:()Lorg/apache/spark/sql/DataFrameWriter;]
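The `DataFrame` additions above cover OLAP-style grouping (`cube`, `rollup`), deduplication (`dropDuplicates`), and entry points into the new helper classes (`na`, `stat`, `write`). A hedged sketch, assuming `df` is any existing DataFrame and all column names are hypothetical:

```scala
// Multi-dimensional aggregates new in 1.4.0.
val byCube   = df.cube("dept", "gender").count()
val byRollup = df.rollup("year", "month").count()

// Deduplicate on a subset of columns.
val distinctIds = df.dropDuplicates(Seq("id"))

// Entry points into the new helper classes.
df.na.fill(0)                   // DataFrameNaFunctions
df.stat.freqItems(Seq("dept"))  // DataFrameStatFunctions

// Reproducible random split (weights, seed).
val Array(train, test) = df.randomSplit(Array(0.8, 0.2), 42L)
```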
spark-sql_2.10-1.4.0.jar, DataFrameNaFunctions.class
package org.apache.spark.sql
DataFrameNaFunctions.DataFrameNaFunctions ( DataFrame df )
[mangled: org/apache/spark/sql/DataFrameNaFunctions."<init>":(Lorg/apache/spark/sql/DataFrame;)V]
spark-sql_2.10-1.4.0.jar, DataFrameWriter.class
package org.apache.spark.sql
DataFrameWriter.format ( String source ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrameWriter.format:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrameWriter;]
DataFrameWriter.save ( String path ) : void
[mangled: org/apache/spark/sql/DataFrameWriter.save:(Ljava/lang/String;)V]
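`DataFrame.write` returns the new `DataFrameWriter`, whose `format` and `save` methods are listed above. A minimal sketch of the writer-based save path, assuming `df` is an existing DataFrame and the output path is hypothetical:

```scala
// Fluent writer API introduced in 1.4.0, intended to replace the
// older DataFrame.save(...) overloads.
df.write
  .format("parquet")
  .save("/tmp/output")  // hypothetical path
```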
spark-sql_2.10-1.4.0.jar, DescribeCommand.class
package org.apache.spark.sql.execution
DescribeCommand.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/DescribeCommand.children:()Lscala/collection/Seq;]
package org.apache.spark.sql.sources
DescribeCommand.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/sources/DescribeCommand.children:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, Distinct.class
package org.apache.spark.sql.execution
Distinct.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Distinct.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, Except.class
package org.apache.spark.sql.execution
Except.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Except.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, Exchange.class
package org.apache.spark.sql.execution
Exchange.canSortWithShuffle ( org.apache.spark.sql.catalyst.plans.physical.Partitioning p1, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder> p2 ) [static] : boolean
[mangled: org/apache/spark/sql/execution/Exchange.canSortWithShuffle:(Lorg/apache/spark/sql/catalyst/plans/physical/Partitioning;Lscala/collection/Seq;)Z]
Exchange.copy ( org.apache.spark.sql.catalyst.plans.physical.Partitioning newPartitioning, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder> newOrdering, SparkPlan child ) : Exchange
[mangled: org/apache/spark/sql/execution/Exchange.copy:(Lorg/apache/spark/sql/catalyst/plans/physical/Partitioning;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/Exchange;]
Exchange.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Exchange.doExecute:()Lorg/apache/spark/rdd/RDD;]
Exchange.Exchange ( org.apache.spark.sql.catalyst.plans.physical.Partitioning newPartitioning, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder> newOrdering, SparkPlan child )
[mangled: org/apache/spark/sql/execution/Exchange."<init>":(Lorg/apache/spark/sql/catalyst/plans/physical/Partitioning;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;)V]
Exchange.newOrdering ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/Exchange.newOrdering:()Lscala/collection/Seq;]
Exchange.Exchange..getSerializer ( org.apache.spark.sql.types.DataType[ ] keySchema, org.apache.spark.sql.types.DataType[ ] valueSchema, boolean hasKeyOrdering, int numPartitions ) : org.apache.spark.serializer.Serializer
[mangled: org/apache/spark/sql/execution/Exchange.org.apache.spark.sql.execution.Exchange..getSerializer:([Lorg/apache/spark/sql/types/DataType;[Lorg/apache/spark/sql/types/DataType;ZI)Lorg/apache/spark/serializer/Serializer;]
Exchange.Exchange..keyOrdering ( ) : org.apache.spark.sql.catalyst.expressions.RowOrdering
[mangled: org/apache/spark/sql/execution/Exchange.org.apache.spark.sql.execution.Exchange..keyOrdering:()Lorg/apache/spark/sql/catalyst/expressions/RowOrdering;]
Exchange.Exchange..needToCopyObjectsBeforeShuffle ( org.apache.spark.Partitioner partitioner, org.apache.spark.serializer.Serializer serializer ) : boolean
[mangled: org/apache/spark/sql/execution/Exchange.org.apache.spark.sql.execution.Exchange..needToCopyObjectsBeforeShuffle:(Lorg/apache/spark/Partitioner;Lorg/apache/spark/serializer/Serializer;)Z]
Exchange.outputOrdering ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/Exchange.outputOrdering:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, ExecutedCommand.class
package org.apache.spark.sql.execution
ExecutedCommand.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/ExecutedCommand.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, Expand.class
package org.apache.spark.sql.execution
Expand.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Expand.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, ExplainCommand.class
package org.apache.spark.sql.execution
ExplainCommand.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/ExplainCommand.children:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, ExternalSort.class
package org.apache.spark.sql.execution
ExternalSort.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/ExternalSort.doExecute:()Lorg/apache/spark/rdd/RDD;]
ExternalSort.outputOrdering ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/ExternalSort.outputOrdering:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, Filter.class
package org.apache.spark.sql.execution
Filter.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Filter.doExecute:()Lorg/apache/spark/rdd/RDD;]
Filter.outputOrdering ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/Filter.outputOrdering:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, Generate.class
package org.apache.spark.sql.execution
Generate.copy ( org.apache.spark.sql.catalyst.expressions.Generator generator, boolean join, boolean outer, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, SparkPlan child ) : Generate
[mangled: org/apache/spark/sql/execution/Generate.copy:(Lorg/apache/spark/sql/catalyst/expressions/Generator;ZZLscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/Generate;]
Generate.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Generate.doExecute:()Lorg/apache/spark/rdd/RDD;]
Generate.Generate ( org.apache.spark.sql.catalyst.expressions.Generator generator, boolean join, boolean outer, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, SparkPlan child )
[mangled: org/apache/spark/sql/execution/Generate."<init>":(Lorg/apache/spark/sql/catalyst/expressions/Generator;ZZLscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;)V]
spark-sql_2.10-1.4.0.jar, GeneratedAggregate.class
package org.apache.spark.sql.execution
GeneratedAggregate.copy ( boolean partial, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> groupingExpressions, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression> aggregateExpressions, boolean unsafeEnabled, SparkPlan child ) : GeneratedAggregate
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.copy:(ZLscala/collection/Seq;Lscala/collection/Seq;ZLorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/GeneratedAggregate;]
GeneratedAggregate.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.doExecute:()Lorg/apache/spark/rdd/RDD;]
GeneratedAggregate.GeneratedAggregate ( boolean partial, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> groupingExpressions, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression> aggregateExpressions, boolean unsafeEnabled, SparkPlan child )
[mangled: org/apache/spark/sql/execution/GeneratedAggregate."<init>":(ZLscala/collection/Seq;Lscala/collection/Seq;ZLorg/apache/spark/sql/execution/SparkPlan;)V]
GeneratedAggregate.unsafeEnabled ( ) : boolean
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.unsafeEnabled:()Z]
spark-sql_2.10-1.4.0.jar, HashedRelation.class
package org.apache.spark.sql.execution.joins
HashedRelation.readBytes ( java.io.ObjectInput p1 ) [abstract] : byte[ ]
[mangled: org/apache/spark/sql/execution/joins/HashedRelation.readBytes:(Ljava/io/ObjectInput;)[B]
HashedRelation.writeBytes ( java.io.ObjectOutput p1, byte[ ] p2 ) [abstract] : void
[mangled: org/apache/spark/sql/execution/joins/HashedRelation.writeBytes:(Ljava/io/ObjectOutput;[B)V]
spark-sql_2.10-1.4.0.jar, HashOuterJoin.class
package org.apache.spark.sql.execution.joins
HashOuterJoin.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/HashOuterJoin.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, InMemoryColumnarTableScan.class
package org.apache.spark.sql.columnar
InMemoryColumnarTableScan.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/columnar/InMemoryColumnarTableScan.doExecute:()Lorg/apache/spark/rdd/RDD;]
InMemoryColumnarTableScan.enableAccumulators ( ) : boolean
[mangled: org/apache/spark/sql/columnar/InMemoryColumnarTableScan.enableAccumulators:()Z]
spark-sql_2.10-1.4.0.jar, InMemoryRelation.class
package org.apache.spark.sql.columnar
InMemoryRelation.copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, boolean useCompression, int batchSize, org.apache.spark.storage.StorageLevel storageLevel, org.apache.spark.sql.execution.SparkPlan child, scala.Option<String> tableName, org.apache.spark.rdd.RDD<CachedBatch> _cachedColumnBuffers, org.apache.spark.sql.catalyst.plans.logical.Statistics _statistics, org.apache.spark.Accumulable<scala.collection.mutable.ArrayBuffer<org.apache.spark.sql.Row>,org.apache.spark.sql.Row> _batchStats ) : InMemoryRelation
[mangled: org/apache/spark/sql/columnar/InMemoryRelation.copy:(Lscala/collection/Seq;ZILorg/apache/spark/storage/StorageLevel;Lorg/apache/spark/sql/execution/SparkPlan;Lscala/Option;Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/catalyst/plans/logical/Statistics;Lorg/apache/spark/Accumulable;)Lorg/apache/spark/sql/columnar/InMemoryRelation;]
InMemoryRelation.InMemoryRelation ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, boolean useCompression, int batchSize, org.apache.spark.storage.StorageLevel storageLevel, org.apache.spark.sql.execution.SparkPlan child, scala.Option<String> tableName, org.apache.spark.rdd.RDD<CachedBatch> _cachedColumnBuffers, org.apache.spark.sql.catalyst.plans.logical.Statistics _statistics, org.apache.spark.Accumulable<scala.collection.mutable.ArrayBuffer<org.apache.spark.sql.Row>,org.apache.spark.sql.Row> _batchStats )
[mangled: org/apache/spark/sql/columnar/InMemoryRelation."<init>":(Lscala/collection/Seq;ZILorg/apache/spark/storage/StorageLevel;Lorg/apache/spark/sql/execution/SparkPlan;Lscala/Option;Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/catalyst/plans/logical/Statistics;Lorg/apache/spark/Accumulable;)V]
InMemoryRelation.newInstance ( ) : org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/columnar/InMemoryRelation.newInstance:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
InMemoryRelation.uncache ( boolean blocking ) : void
[mangled: org/apache/spark/sql/columnar/InMemoryRelation.uncache:(Z)V]
spark-sql_2.10-1.4.0.jar, InsertIntoDataSource.class
package org.apache.spark.sql.sources
InsertIntoDataSource.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.children:()Lscala/collection/Seq;]
InsertIntoDataSource.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/sources/InsertIntoDataSource.output:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, InsertIntoParquetTable.class
package org.apache.spark.sql.parquet
InsertIntoParquetTable.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, Intersect.class
package org.apache.spark.sql.execution
Intersect.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Intersect.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, JDBCRDD.class
package org.apache.spark.sql.jdbc
JDBCRDD.getConnector ( String p1, String p2, java.util.Properties p3 ) [static] : scala.Function0<java.sql.Connection>
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.getConnector:(Ljava/lang/String;Ljava/lang/String;Ljava/util/Properties;)Lscala/Function0;]
JDBCRDD.JDBCRDD ( org.apache.spark.SparkContext sc, scala.Function0<java.sql.Connection> getConnection, org.apache.spark.sql.types.StructType schema, String fqTable, String[ ] columns, org.apache.spark.sql.sources.Filter[ ] filters, org.apache.spark.Partition[ ] partitions, java.util.Properties properties )
[mangled: org/apache/spark/sql/jdbc/JDBCRDD."<init>":(Lorg/apache/spark/SparkContext;Lscala/Function0;Lorg/apache/spark/sql/types/StructType;Ljava/lang/String;[Ljava/lang/String;[Lorg/apache/spark/sql/sources/Filter;[Lorg/apache/spark/Partition;Ljava/util/Properties;)V]
JDBCRDD.resolveTable ( String p1, String p2, java.util.Properties p3 ) [static] : org.apache.spark.sql.types.StructType
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.resolveTable:(Ljava/lang/String;Ljava/lang/String;Ljava/util/Properties;)Lorg/apache/spark/sql/types/StructType;]
JDBCRDD.scanTable ( org.apache.spark.SparkContext p1, org.apache.spark.sql.types.StructType p2, String p3, String p4, java.util.Properties p5, String p6, String[ ] p7, org.apache.spark.sql.sources.Filter[ ] p8, org.apache.spark.Partition[ ] p9 ) [static] : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.scanTable:(Lorg/apache/spark/SparkContext;Lorg/apache/spark/sql/types/StructType;Ljava/lang/String;Ljava/lang/String;Ljava/util/Properties;Ljava/lang/String;[Ljava/lang/String;[Lorg/apache/spark/sql/sources/Filter;[Lorg/apache/spark/Partition;)Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, JDBCRelation.class
package org.apache.spark.sql.jdbc
JDBCRelation.copy ( String url, String table, org.apache.spark.Partition[ ] parts, java.util.Properties properties, org.apache.spark.sql.SQLContext sqlContext ) : JDBCRelation
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.copy:(Ljava/lang/String;Ljava/lang/String;[Lorg/apache/spark/Partition;Ljava/util/Properties;Lorg/apache/spark/sql/SQLContext;)Lorg/apache/spark/sql/jdbc/JDBCRelation;]
JDBCRelation.insert ( org.apache.spark.sql.DataFrame data, boolean overwrite ) : void
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.insert:(Lorg/apache/spark/sql/DataFrame;Z)V]
JDBCRelation.JDBCRelation ( String url, String table, org.apache.spark.Partition[ ] parts, java.util.Properties properties, org.apache.spark.sql.SQLContext sqlContext )
[mangled: org/apache/spark/sql/jdbc/JDBCRelation."<init>":(Ljava/lang/String;Ljava/lang/String;[Lorg/apache/spark/Partition;Ljava/util/Properties;Lorg/apache/spark/sql/SQLContext;)V]
JDBCRelation.needConversion ( ) : boolean
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.needConversion:()Z]
JDBCRelation.properties ( ) : java.util.Properties
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.properties:()Ljava/util/Properties;]
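The needConversion ( ) member that JDBCRelation (and the other relations below) gained comes from the new BaseRelation.needConversion hook in 1.4.0: a relation whose scan already produces converted rows can override it to skip Spark SQL's row conversion. A minimal sketch of a custom relation using this hook, assuming the 1.4.0 spark-sql_2.10 jar on the classpath (the class name and schema are illustrative, not from the report):

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{Row, SQLContext}
import org.apache.spark.sql.sources.{BaseRelation, TableScan}
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Hypothetical relation: its buildScan already emits rows in the form
// Spark SQL expects, so it opts out of the extra conversion pass.
class PreconvertedRelation(val sqlContext: SQLContext)
    extends BaseRelation with TableScan {

  override def schema: StructType =
    StructType(StructField("value", StringType) :: Nil)

  // New in 1.4.0: tells Spark SQL not to re-convert the scan's rows.
  override def needConversion: Boolean = false

  override def buildScan(): RDD[Row] =
    sqlContext.sparkContext.parallelize(Seq(Row("a"), Row("b")))
}
```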
spark-sql_2.10-1.4.0.jar, JSONRelation.class
package org.apache.spark.sql.json
JSONRelation.buildScan ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> requiredColumns, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> filters ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/json/JSONRelation.buildScan:(Lscala/collection/Seq;Lscala/collection/Seq;)Lorg/apache/spark/rdd/RDD;]
JSONRelation.JSONRelation ( scala.Function0<org.apache.spark.rdd.RDD<String>> baseRDD, scala.Option<String> path, double samplingRatio, scala.Option<org.apache.spark.sql.types.StructType> userSpecifiedSchema, org.apache.spark.sql.SQLContext sqlContext )
[mangled: org/apache/spark/sql/json/JSONRelation."<init>":(Lscala/Function0;Lscala/Option;DLscala/Option;Lorg/apache/spark/sql/SQLContext;)V]
JSONRelation.needConversion ( ) : boolean
[mangled: org/apache/spark/sql/json/JSONRelation.needConversion:()Z]
JSONRelation.JSONRelation..useJacksonStreamingAPI ( ) : boolean
[mangled: org/apache/spark/sql/json/JSONRelation.org.apache.spark.sql.json.JSONRelation..useJacksonStreamingAPI:()Z]
JSONRelation.path ( ) : scala.Option<String>
[mangled: org/apache/spark/sql/json/JSONRelation.path:()Lscala/Option;]
spark-sql_2.10-1.4.0.jar, LeftSemiJoinBNL.class
package org.apache.spark.sql.execution.joins
LeftSemiJoinBNL.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinBNL.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, LeftSemiJoinHash.class
package org.apache.spark.sql.execution.joins
LeftSemiJoinHash.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, Limit.class
package org.apache.spark.sql.execution
Limit.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Limit.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, LocalTableScan.class
package org.apache.spark.sql.execution
LocalTableScan.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/LocalTableScan.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, LogicalLocalTable.class
package org.apache.spark.sql.execution
LogicalLocalTable.newInstance ( ) : org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/execution/LogicalLocalTable.newInstance:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
spark-sql_2.10-1.4.0.jar, LogicalRDD.class
package org.apache.spark.sql.execution
LogicalRDD.newInstance ( ) : org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/execution/LogicalRDD.newInstance:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
spark-sql_2.10-1.4.0.jar, LogicalRelation.class
package org.apache.spark.sql.sources
LogicalRelation.newInstance ( ) : org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/sources/LogicalRelation.newInstance:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
spark-sql_2.10-1.4.0.jar, NativeColumnType<T>.class
package org.apache.spark.sql.columnar
NativeColumnType<T>.dataType ( ) : T
[mangled: org/apache/spark/sql/columnar/NativeColumnType<T>.dataType:()Lorg/apache/spark/sql/types/AtomicType;]
NativeColumnType<T>.NativeColumnType ( T dataType, int typeId, int defaultSize )
[mangled: org/apache/spark/sql/columnar/NativeColumnType<T>.org.apache.spark.sql.columnar.NativeColumnType:(Lorg/apache/spark/sql/types/AtomicType;II)V]
spark-sql_2.10-1.4.0.jar, OutputFaker.class
package org.apache.spark.sql.execution
OutputFaker.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/OutputFaker.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, ParquetRelation.class
package org.apache.spark.sql.parquet
ParquetRelation.newInstance ( ) : org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/parquet/ParquetRelation.newInstance:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
spark-sql_2.10-1.4.0.jar, ParquetRelation2.class
package org.apache.spark.sql.parquet
ParquetRelation2.buildScan ( String[ ] requiredColumns, org.apache.spark.sql.sources.Filter[ ] filters, org.apache.hadoop.fs.FileStatus[ ] inputFiles, org.apache.spark.broadcast.Broadcast<org.apache.spark.SerializableWritable<org.apache.hadoop.conf.Configuration>> broadcastedConf ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.buildScan:([Ljava/lang/String;[Lorg/apache/spark/sql/sources/Filter;[Lorg/apache/hadoop/fs/FileStatus;Lorg/apache/spark/broadcast/Broadcast;)Lorg/apache/spark/rdd/RDD;]
ParquetRelation2.dataSchema ( ) : org.apache.spark.sql.types.StructType
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.dataSchema:()Lorg/apache/spark/sql/types/StructType;]
ParquetRelation2.needConversion ( ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.needConversion:()Z]
ParquetRelation2.ParquetRelation2..maybeDataSchema ( ) : scala.Option<org.apache.spark.sql.types.StructType>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.org.apache.spark.sql.parquet.ParquetRelation2..maybeDataSchema:()Lscala/Option;]
ParquetRelation2.ParquetRelation2 ( String[ ] paths, scala.Option<org.apache.spark.sql.types.StructType> maybeDataSchema, scala.Option<org.apache.spark.sql.sources.PartitionSpec> maybePartitionSpec, scala.collection.immutable.Map<String,String> parameters, org.apache.spark.sql.SQLContext sqlContext )
[mangled: org/apache/spark/sql/parquet/ParquetRelation2."<init>":([Ljava/lang/String;Lscala/Option;Lscala/Option;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/SQLContext;)V]
ParquetRelation2.ParquetRelation2 ( String[ ] paths, scala.Option<org.apache.spark.sql.types.StructType> maybeDataSchema, scala.Option<org.apache.spark.sql.sources.PartitionSpec> maybePartitionSpec, scala.Option<org.apache.spark.sql.types.StructType> userDefinedPartitionColumns, scala.collection.immutable.Map<String,String> parameters, org.apache.spark.sql.SQLContext sqlContext )
[mangled: org/apache/spark/sql/parquet/ParquetRelation2."<init>":([Ljava/lang/String;Lscala/Option;Lscala/Option;Lscala/Option;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/SQLContext;)V]
ParquetRelation2.paths ( ) : String[ ]
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.paths:()[Ljava/lang/String;]
ParquetRelation2.prepareJobForWrite ( org.apache.hadoop.mapreduce.Job job ) : org.apache.spark.sql.sources.OutputWriterFactory
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.prepareJobForWrite:(Lorg/apache/hadoop/mapreduce/Job;)Lorg/apache/spark/sql/sources/OutputWriterFactory;]
ParquetRelation2.refresh ( ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.refresh:()V]
ParquetRelation2.userDefinedPartitionColumns ( ) : scala.Option<org.apache.spark.sql.types.StructType>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.userDefinedPartitionColumns:()Lscala/Option;]
spark-sql_2.10-1.4.0.jar, ParquetTableScan.class
package org.apache.spark.sql.parquet
ParquetTableScan.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, PhysicalRDD.class
package org.apache.spark.sql.execution
PhysicalRDD.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/PhysicalRDD.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, PreWriteCheck.class
package org.apache.spark.sql.sources
PreWriteCheck.failAnalysis ( String msg ) : void
[mangled: org/apache/spark/sql/sources/PreWriteCheck.failAnalysis:(Ljava/lang/String;)V]
spark-sql_2.10-1.4.0.jar, Project.class
package org.apache.spark.sql.execution
Project.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Project.doExecute:()Lorg/apache/spark/rdd/RDD;]
Project.outputOrdering ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/Project.outputOrdering:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, PythonUDF.class
package org.apache.spark.sql.execution
PythonUDF.copy ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, String pythonVer, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, org.apache.spark.sql.types.DataType dataType, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> children ) : PythonUDF
[mangled: org/apache/spark/sql/execution/PythonUDF.copy:(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;Lscala/collection/Seq;)Lorg/apache/spark/sql/execution/PythonUDF;]
PythonUDF.PythonUDF ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, String pythonVer, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, org.apache.spark.sql.types.DataType dataType, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> children )
[mangled: org/apache/spark/sql/execution/PythonUDF."<init>":(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;Lscala/collection/Seq;)V]
PythonUDF.pythonVer ( ) : String
[mangled: org/apache/spark/sql/execution/PythonUDF.pythonVer:()Ljava/lang/String;]
spark-sql_2.10-1.4.0.jar, RefreshTable.class
package org.apache.spark.sql.sources
RefreshTable.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/sources/RefreshTable.children:()Lscala/collection/Seq;]
RefreshTable.output ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/sources/RefreshTable.output:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, ResolvedDataSource.class
package org.apache.spark.sql.sources
ResolvedDataSource.apply ( org.apache.spark.sql.SQLContext p1, scala.Option<org.apache.spark.sql.types.StructType> p2, String[ ] p3, String p4, scala.collection.immutable.Map<String,String> p5 ) [static] : ResolvedDataSource
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.apply:(Lorg/apache/spark/sql/SQLContext;Lscala/Option;[Ljava/lang/String;Ljava/lang/String;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/sources/ResolvedDataSource;]
ResolvedDataSource.apply ( org.apache.spark.sql.SQLContext p1, String p2, String[ ] p3, org.apache.spark.sql.SaveMode p4, scala.collection.immutable.Map<String,String> p5, org.apache.spark.sql.DataFrame p6 ) [static] : ResolvedDataSource
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.apply:(Lorg/apache/spark/sql/SQLContext;Ljava/lang/String;[Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/sources/ResolvedDataSource;]
spark-sql_2.10-1.4.0.jar, RunnableCommand.class
package org.apache.spark.sql.execution
RunnableCommand.children ( ) [abstract] : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/RunnableCommand.children:()Lscala/collection/Seq;]
RunnableCommand.output ( ) [abstract] : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/execution/RunnableCommand.output:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, Sample.class
package org.apache.spark.sql.execution
Sample.copy ( double lowerBound, double upperBound, boolean withReplacement, long seed, SparkPlan child ) : Sample
[mangled: org/apache/spark/sql/execution/Sample.copy:(DDZJLorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/Sample;]
Sample.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Sample.doExecute:()Lorg/apache/spark/rdd/RDD;]
Sample.lowerBound ( ) : double
[mangled: org/apache/spark/sql/execution/Sample.lowerBound:()D]
Sample.Sample ( double lowerBound, double upperBound, boolean withReplacement, long seed, SparkPlan child )
[mangled: org/apache/spark/sql/execution/Sample."<init>":(DDZJLorg/apache/spark/sql/execution/SparkPlan;)V]
Sample.upperBound ( ) : double
[mangled: org/apache/spark/sql/execution/Sample.upperBound:()D]
spark-sql_2.10-1.4.0.jar, SetCommand.class
package org.apache.spark.sql.execution
SetCommand.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/SetCommand.children:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, ShowTablesCommand.class
package org.apache.spark.sql.execution
ShowTablesCommand.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/ShowTablesCommand.children:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, ShuffledHashJoin.class
package org.apache.spark.sql.execution.joins
ShuffledHashJoin.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/joins/ShuffledHashJoin.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, Sort.class
package org.apache.spark.sql.execution
Sort.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Sort.doExecute:()Lorg/apache/spark/rdd/RDD;]
Sort.outputOrdering ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/Sort.outputOrdering:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, SparkPlan.class
package org.apache.spark.sql.execution
SparkPlan.doExecute ( ) [abstract] : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/SparkPlan.doExecute:()Lorg/apache/spark/rdd/RDD;]
SparkPlan.outputOrdering ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/SparkPlan.outputOrdering:()Lscala/collection/Seq;]
SparkPlan.requiredChildOrdering ( ) : scala.collection.Seq<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>>
[mangled: org/apache/spark/sql/execution/SparkPlan.requiredChildOrdering:()Lscala/collection/Seq;]
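The abstract SparkPlan.doExecute ( ) entry explains the many doExecute additions listed above: in 1.4.0, physical operators implement the protected doExecute() template method rather than overriding execute() directly. A hedged sketch of how a custom operator compiled against 1.3.0 would be ported (the operator name and body are illustrative):

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.expressions.Attribute
import org.apache.spark.sql.execution.SparkPlan

// Hypothetical unary operator ported to 1.4.0: the body that previously
// lived in execute() moves into the new abstract doExecute().
case class PassThrough(child: SparkPlan) extends SparkPlan {
  override def output: Seq[Attribute] = child.output
  override def children: Seq[SparkPlan] = child :: Nil

  // 1.3.0 equivalent: override def execute(): RDD[Row] = child.execute()
  protected override def doExecute(): RDD[Row] = child.execute()
}
```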
spark-sql_2.10-1.4.0.jar, SQLContext.class
package org.apache.spark.sql
SQLContext.cacheManager ( ) : execution.CacheManager
[mangled: org/apache/spark/sql/SQLContext.cacheManager:()Lorg/apache/spark/sql/execution/CacheManager;]
SQLContext.createDataFrame ( org.apache.spark.rdd.RDD<Row> rowRDD, types.StructType schema, boolean needsConversion ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;Z)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.createSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.currentSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.currentSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.defaultSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.defaultSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.detachSession ( ) : void
[mangled: org/apache/spark/sql/SQLContext.detachSession:()V]
SQLContext.dialectClassName ( ) : String
[mangled: org/apache/spark/sql/SQLContext.dialectClassName:()Ljava/lang/String;]
SQLContext.getOrCreate ( org.apache.spark.SparkContext p1 ) [static] : SQLContext
[mangled: org/apache/spark/sql/SQLContext.getOrCreate:(Lorg/apache/spark/SparkContext;)Lorg/apache/spark/sql/SQLContext;]
SQLContext.getSQLDialect ( ) : catalyst.ParserDialect
[mangled: org/apache/spark/sql/SQLContext.getSQLDialect:()Lorg/apache/spark/sql/catalyst/ParserDialect;]
SQLContext.openSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.openSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.range ( long start, long end ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.range:(JJ)Lorg/apache/spark/sql/DataFrame;]
SQLContext.range ( long start, long end, long step, int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.range:(JJJI)Lorg/apache/spark/sql/DataFrame;]
SQLContext.read ( ) : DataFrameReader
[mangled: org/apache/spark/sql/SQLContext.read:()Lorg/apache/spark/sql/DataFrameReader;]
SQLContext.tlSession ( ) : ThreadLocal<SQLContext.SQLSession>
[mangled: org/apache/spark/sql/SQLContext.tlSession:()Ljava/lang/ThreadLocal;]
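Several of the new SQLContext members are public entry points rather than internals. A brief usage sketch against the 1.4.0 API (assumes an existing SparkContext named sc; the file path is illustrative):

```scala
import org.apache.spark.sql.SQLContext

// getOrCreate is new in 1.4.0: returns the shared SQLContext instance.
val sqlContext = SQLContext.getOrCreate(sc)

// range is new in 1.4.0: a single-column DataFrame of longs.
val ids = sqlContext.range(0L, 1000L)

// read is new in 1.4.0: the DataFrameReader entry point for loading data.
val people = sqlContext.read.json("people.json")
```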
spark-sql_2.10-1.4.0.jar, StringContains.class
package org.apache.spark.sql.sources
StringContains.attribute ( ) : String
[mangled: org/apache/spark/sql/sources/StringContains.attribute:()Ljava/lang/String;]
StringContains.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/StringContains.canEqual:(Ljava/lang/Object;)Z]
StringContains.copy ( String attribute, String value ) : StringContains
[mangled: org/apache/spark/sql/sources/StringContains.copy:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/sources/StringContains;]
StringContains.curried ( ) [static] : scala.Function1<String,scala.Function1<String,StringContains>>
[mangled: org/apache/spark/sql/sources/StringContains.curried:()Lscala/Function1;]
StringContains.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/StringContains.equals:(Ljava/lang/Object;)Z]
StringContains.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/StringContains.hashCode:()I]
StringContains.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/StringContains.productArity:()I]
StringContains.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/StringContains.productElement:(I)Ljava/lang/Object;]
StringContains.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/StringContains.productIterator:()Lscala/collection/Iterator;]
StringContains.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/StringContains.productPrefix:()Ljava/lang/String;]
StringContains.StringContains ( String attribute, String value )
[mangled: org/apache/spark/sql/sources/StringContains."<init>":(Ljava/lang/String;Ljava/lang/String;)V]
StringContains.toString ( ) : String
[mangled: org/apache/spark/sql/sources/StringContains.toString:()Ljava/lang/String;]
StringContains.tupled ( ) [static] : scala.Function1<scala.Tuple2<String,String>,StringContains>
[mangled: org/apache/spark/sql/sources/StringContains.tupled:()Lscala/Function1;]
StringContains.value ( ) : String
[mangled: org/apache/spark/sql/sources/StringContains.value:()Ljava/lang/String;]
spark-sql_2.10-1.4.0.jar, StringEndsWith.class
package org.apache.spark.sql.sources
StringEndsWith.attribute ( ) : String
[mangled: org/apache/spark/sql/sources/StringEndsWith.attribute:()Ljava/lang/String;]
StringEndsWith.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/StringEndsWith.canEqual:(Ljava/lang/Object;)Z]
StringEndsWith.copy ( String attribute, String value ) : StringEndsWith
[mangled: org/apache/spark/sql/sources/StringEndsWith.copy:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/sources/StringEndsWith;]
StringEndsWith.curried ( ) [static] : scala.Function1<String,scala.Function1<String,StringEndsWith>>
[mangled: org/apache/spark/sql/sources/StringEndsWith.curried:()Lscala/Function1;]
StringEndsWith.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/StringEndsWith.equals:(Ljava/lang/Object;)Z]
StringEndsWith.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/StringEndsWith.hashCode:()I]
StringEndsWith.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/StringEndsWith.productArity:()I]
StringEndsWith.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/StringEndsWith.productElement:(I)Ljava/lang/Object;]
StringEndsWith.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/StringEndsWith.productIterator:()Lscala/collection/Iterator;]
StringEndsWith.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/StringEndsWith.productPrefix:()Ljava/lang/String;]
StringEndsWith.StringEndsWith ( String attribute, String value )
[mangled: org/apache/spark/sql/sources/StringEndsWith."<init>":(Ljava/lang/String;Ljava/lang/String;)V]
StringEndsWith.toString ( ) : String
[mangled: org/apache/spark/sql/sources/StringEndsWith.toString:()Ljava/lang/String;]
StringEndsWith.tupled ( ) [static] : scala.Function1<scala.Tuple2<String,String>,StringEndsWith>
[mangled: org/apache/spark/sql/sources/StringEndsWith.tupled:()Lscala/Function1;]
StringEndsWith.value ( ) : String
[mangled: org/apache/spark/sql/sources/StringEndsWith.value:()Ljava/lang/String;]
spark-sql_2.10-1.4.0.jar, StringStartsWith.class
package org.apache.spark.sql.sources
StringStartsWith.attribute ( ) : String
[mangled: org/apache/spark/sql/sources/StringStartsWith.attribute:()Ljava/lang/String;]
StringStartsWith.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/StringStartsWith.canEqual:(Ljava/lang/Object;)Z]
StringStartsWith.copy ( String attribute, String value ) : StringStartsWith
[mangled: org/apache/spark/sql/sources/StringStartsWith.copy:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/sources/StringStartsWith;]
StringStartsWith.curried ( ) [static] : scala.Function1<String,scala.Function1<String,StringStartsWith>>
[mangled: org/apache/spark/sql/sources/StringStartsWith.curried:()Lscala/Function1;]
StringStartsWith.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/sources/StringStartsWith.equals:(Ljava/lang/Object;)Z]
StringStartsWith.hashCode ( ) : int
[mangled: org/apache/spark/sql/sources/StringStartsWith.hashCode:()I]
StringStartsWith.productArity ( ) : int
[mangled: org/apache/spark/sql/sources/StringStartsWith.productArity:()I]
StringStartsWith.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/sources/StringStartsWith.productElement:(I)Ljava/lang/Object;]
StringStartsWith.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/sources/StringStartsWith.productIterator:()Lscala/collection/Iterator;]
StringStartsWith.productPrefix ( ) : String
[mangled: org/apache/spark/sql/sources/StringStartsWith.productPrefix:()Ljava/lang/String;]
StringStartsWith.StringStartsWith ( String attribute, String value )
[mangled: org/apache/spark/sql/sources/StringStartsWith."<init>":(Ljava/lang/String;Ljava/lang/String;)V]
StringStartsWith.toString ( ) : String
[mangled: org/apache/spark/sql/sources/StringStartsWith.toString:()Ljava/lang/String;]
StringStartsWith.tupled ( ) [static] : scala.Function1<scala.Tuple2<String,String>,StringStartsWith>
[mangled: org/apache/spark/sql/sources/StringStartsWith.tupled:()Lscala/Function1;]
StringStartsWith.value ( ) : String
[mangled: org/apache/spark/sql/sources/StringStartsWith.value:()Ljava/lang/String;]
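StringContains, StringEndsWith, and StringStartsWith are new data-source filters that a PrunedFilteredScan implementation may now receive in its filters array. A hedged sketch of pattern-matching on them (the translation to SQL LIKE patterns is illustrative of a hypothetical JDBC-style source, not taken from the report):

```scala
import org.apache.spark.sql.sources.{Filter, StringContains, StringEndsWith, StringStartsWith}

// Translate the three new string filters into SQL LIKE patterns;
// returns None for filters this hypothetical source cannot push down.
def compileStringFilter(f: Filter): Option[String] = f match {
  case StringStartsWith(attr, v) => Some(s"$attr LIKE '$v%'")
  case StringEndsWith(attr, v)   => Some(s"$attr LIKE '%$v'")
  case StringContains(attr, v)   => Some(s"$attr LIKE '%$v%'")
  case _                         => None
}
```

A source compiled against 1.3.0 that matches exhaustively on Filter subclasses will not handle these three cases; the wildcard fallback above keeps such code safe.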
spark-sql_2.10-1.4.0.jar, TakeOrdered.class
package org.apache.spark.sql.execution
TakeOrdered.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/TakeOrdered.doExecute:()Lorg/apache/spark/rdd/RDD;]
TakeOrdered.outputOrdering ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>
[mangled: org/apache/spark/sql/execution/TakeOrdered.outputOrdering:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, UncacheTableCommand.class
package org.apache.spark.sql.execution
UncacheTableCommand.children ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/execution/UncacheTableCommand.children:()Lscala/collection/Seq;]
spark-sql_2.10-1.4.0.jar, Union.class
package org.apache.spark.sql.execution
Union.doExecute ( ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/execution/Union.doExecute:()Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, UserDefinedPythonFunction.class
package org.apache.spark.sql
UserDefinedPythonFunction.copy ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, String pythonVer, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, types.DataType dataType ) : UserDefinedPythonFunction
[mangled: org/apache/spark/sql/UserDefinedPythonFunction.copy:(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;)Lorg/apache/spark/sql/UserDefinedPythonFunction;]
UserDefinedPythonFunction.pythonVer ( ) : String
[mangled: org/apache/spark/sql/UserDefinedPythonFunction.pythonVer:()Ljava/lang/String;]
UserDefinedPythonFunction.UserDefinedPythonFunction ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, String pythonVer, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, types.DataType dataType )
[mangled: org/apache/spark/sql/UserDefinedPythonFunction."<init>":(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;)V]
Removed Methods (198)
spark-sql_2.10-1.3.0.jar, AddExchange.class
package org.apache.spark.sql.execution
AddExchange.AddExchange ( org.apache.spark.sql.SQLContext sqlContext )
[mangled: org/apache/spark/sql/execution/AddExchange."<init>":(Lorg/apache/spark/sql/SQLContext;)V]
AddExchange.andThen ( scala.Function1<AddExchange,A> p1 ) [static] : scala.Function1<org.apache.spark.sql.SQLContext,A>
[mangled: org/apache/spark/sql/execution/AddExchange.andThen:(Lscala/Function1;)Lscala/Function1;]
AddExchange.apply ( org.apache.spark.sql.catalyst.trees.TreeNode plan ) : org.apache.spark.sql.catalyst.trees.TreeNode
[mangled: org/apache/spark/sql/execution/AddExchange.apply:(Lorg/apache/spark/sql/catalyst/trees/TreeNode;)Lorg/apache/spark/sql/catalyst/trees/TreeNode;]
AddExchange.apply ( SparkPlan plan ) : SparkPlan
[mangled: org/apache/spark/sql/execution/AddExchange.apply:(Lorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/SparkPlan;]
AddExchange.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/AddExchange.canEqual:(Ljava/lang/Object;)Z]
AddExchange.compose ( scala.Function1<A,org.apache.spark.sql.SQLContext> p1 ) [static] : scala.Function1<A,AddExchange>
[mangled: org/apache/spark/sql/execution/AddExchange.compose:(Lscala/Function1;)Lscala/Function1;]
AddExchange.copy ( org.apache.spark.sql.SQLContext sqlContext ) : AddExchange
[mangled: org/apache/spark/sql/execution/AddExchange.copy:(Lorg/apache/spark/sql/SQLContext;)Lorg/apache/spark/sql/execution/AddExchange;]
AddExchange.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/execution/AddExchange.equals:(Ljava/lang/Object;)Z]
AddExchange.hashCode ( ) : int
[mangled: org/apache/spark/sql/execution/AddExchange.hashCode:()I]
AddExchange.numPartitions ( ) : int
[mangled: org/apache/spark/sql/execution/AddExchange.numPartitions:()I]
AddExchange.productArity ( ) : int
[mangled: org/apache/spark/sql/execution/AddExchange.productArity:()I]
AddExchange.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/execution/AddExchange.productElement:(I)Ljava/lang/Object;]
AddExchange.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/execution/AddExchange.productIterator:()Lscala/collection/Iterator;]
AddExchange.productPrefix ( ) : String
[mangled: org/apache/spark/sql/execution/AddExchange.productPrefix:()Ljava/lang/String;]
AddExchange.sqlContext ( ) : org.apache.spark.sql.SQLContext
[mangled: org/apache/spark/sql/execution/AddExchange.sqlContext:()Lorg/apache/spark/sql/SQLContext;]
AddExchange.toString ( ) : String
[mangled: org/apache/spark/sql/execution/AddExchange.toString:()Ljava/lang/String;]
spark-sql_2.10-1.3.0.jar, Aggregate.class
package org.apache.spark.sql.execution
Aggregate.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Aggregate.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, BatchPythonEvaluation.class
package org.apache.spark.sql.execution
BatchPythonEvaluation.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/BatchPythonEvaluation.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, BroadcastHashJoin.class
package org.apache.spark.sql.execution.joins
BroadcastHashJoin.curried ( ) [static] : scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.Function1<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.Function1<package.BuildSide,scala.Function1<org.apache.spark.sql.execution.SparkPlan,scala.Function1<org.apache.spark.sql.execution.SparkPlan,BroadcastHashJoin>>>>>
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.curried:()Lscala/Function1;]
BroadcastHashJoin.tupled ( ) [static] : scala.Function1<scala.Tuple5<scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>,package.BuildSide,org.apache.spark.sql.execution.SparkPlan,org.apache.spark.sql.execution.SparkPlan>,BroadcastHashJoin>
[mangled: org/apache/spark/sql/execution/joins/BroadcastHashJoin.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, BroadcastLeftSemiJoinHash.class
package org.apache.spark.sql.execution.joins
BroadcastLeftSemiJoinHash.buildSide ( ) : package.BuildRight.
[mangled: org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.buildSide:()Lorg/apache/spark/sql/execution/joins/package$BuildRight$;]
spark-sql_2.10-1.3.0.jar, CachedData.class
package org.apache.spark.sql
CachedData.CachedData ( catalyst.plans.logical.LogicalPlan plan, columnar.InMemoryRelation cachedRepresentation )
[mangled: org/apache/spark/sql/CachedData."<init>":(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;Lorg/apache/spark/sql/columnar/InMemoryRelation;)V]
CachedData.cachedRepresentation ( ) : columnar.InMemoryRelation
[mangled: org/apache/spark/sql/CachedData.cachedRepresentation:()Lorg/apache/spark/sql/columnar/InMemoryRelation;]
CachedData.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/CachedData.canEqual:(Ljava/lang/Object;)Z]
CachedData.copy ( catalyst.plans.logical.LogicalPlan plan, columnar.InMemoryRelation cachedRepresentation ) : CachedData
[mangled: org/apache/spark/sql/CachedData.copy:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;Lorg/apache/spark/sql/columnar/InMemoryRelation;)Lorg/apache/spark/sql/CachedData;]
CachedData.curried ( ) [static] : scala.Function1<catalyst.plans.logical.LogicalPlan,scala.Function1<columnar.InMemoryRelation,CachedData>>
[mangled: org/apache/spark/sql/CachedData.curried:()Lscala/Function1;]
CachedData.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/CachedData.equals:(Ljava/lang/Object;)Z]
CachedData.hashCode ( ) : int
[mangled: org/apache/spark/sql/CachedData.hashCode:()I]
CachedData.plan ( ) : catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/CachedData.plan:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
CachedData.productArity ( ) : int
[mangled: org/apache/spark/sql/CachedData.productArity:()I]
CachedData.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/CachedData.productElement:(I)Ljava/lang/Object;]
CachedData.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/CachedData.productIterator:()Lscala/collection/Iterator;]
CachedData.productPrefix ( ) : String
[mangled: org/apache/spark/sql/CachedData.productPrefix:()Ljava/lang/String;]
CachedData.toString ( ) : String
[mangled: org/apache/spark/sql/CachedData.toString:()Ljava/lang/String;]
CachedData.tupled ( ) [static] : scala.Function1<scala.Tuple2<catalyst.plans.logical.LogicalPlan,columnar.InMemoryRelation>,CachedData>
[mangled: org/apache/spark/sql/CachedData.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, CacheManager.class
package org.apache.spark.sql
CacheManager.CacheManager ( SQLContext sqlContext )
[mangled: org/apache/spark/sql/CacheManager."<init>":(Lorg/apache/spark/sql/SQLContext;)V]
CacheManager.cacheQuery ( DataFrame query, scala.Option<String> tableName, org.apache.spark.storage.StorageLevel storageLevel ) : void
[mangled: org/apache/spark/sql/CacheManager.cacheQuery:(Lorg/apache/spark/sql/DataFrame;Lscala/Option;Lorg/apache/spark/storage/StorageLevel;)V]
CacheManager.cacheTable ( String tableName ) : void
[mangled: org/apache/spark/sql/CacheManager.cacheTable:(Ljava/lang/String;)V]
CacheManager.clearCache ( ) : void
[mangled: org/apache/spark/sql/CacheManager.clearCache:()V]
CacheManager.invalidateCache ( catalyst.plans.logical.LogicalPlan plan ) : void
[mangled: org/apache/spark/sql/CacheManager.invalidateCache:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)V]
CacheManager.isCached ( String tableName ) : boolean
[mangled: org/apache/spark/sql/CacheManager.isCached:(Ljava/lang/String;)Z]
CacheManager.tryUncacheQuery ( DataFrame query, boolean blocking ) : boolean
[mangled: org/apache/spark/sql/CacheManager.tryUncacheQuery:(Lorg/apache/spark/sql/DataFrame;Z)Z]
CacheManager.uncacheTable ( String tableName ) : void
[mangled: org/apache/spark/sql/CacheManager.uncacheTable:(Ljava/lang/String;)V]
CacheManager.useCachedData ( catalyst.plans.logical.LogicalPlan plan ) : catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/CacheManager.useCachedData:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
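The `CacheManager` methods listed above were removed between these versions, so client code compiled against 1.3.0 that still invokes them fails at link time with a `NoSuchMethodError` or `NoClassDefFoundError`. A minimal, hypothetical guard using plain Java reflection (the `ApiProbe` and `hasMethod` names are illustrative and belong to neither library) probes for a method before relying on it:

```java
// Probe for a class/method pair reflectively so a missing API is detected
// up front with a clear result, instead of a LinkageError mid-job.
public class ApiProbe {
    public static boolean hasMethod(String className, String method, Class<?>... params) {
        try {
            Class.forName(className).getMethod(method, params);
            return true;
        } catch (ClassNotFoundException | NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // String.isEmpty() exists on every modern JRE, so this prints true.
        System.out.println(hasMethod("java.lang.String", "isEmpty"));
        // The Spark class is not on this demo classpath, so this prints false;
        // against a real 1.4.0 classpath the removed method would also report false.
        System.out.println(hasMethod("org.apache.spark.sql.CacheManager", "cacheTable", String.class));
    }
}
```

This only checks presence by name and parameter types; it cannot detect the subtler return-type changes (e.g. `List` vs. `Seq`) that also appear in this report, since those still resolve under reflection but break bytecode-level linkage.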
spark-sql_2.10-1.3.0.jar, CatalystConverter.class
package org.apache.spark.sql.parquet
CatalystConverter.readDecimal ( org.apache.spark.sql.types.Decimal dest, parquet.io.api.Binary value, org.apache.spark.sql.types.DecimalType ctype ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.readDecimal:(Lorg/apache/spark/sql/types/Decimal;Lparquet/io/api/Binary;Lorg/apache/spark/sql/types/DecimalType;)V]
CatalystConverter.updateString ( int fieldIndex, String value ) : void
[mangled: org/apache/spark/sql/parquet/CatalystConverter.updateString:(ILjava/lang/String;)V]
spark-sql_2.10-1.3.0.jar, CatalystNativeArrayConverter.class
package org.apache.spark.sql.parquet
CatalystNativeArrayConverter.CatalystNativeArrayConverter ( org.apache.spark.sql.types.NativeType elementType, int index, CatalystConverter parent, int capacity )
[mangled: org/apache/spark/sql/parquet/CatalystNativeArrayConverter."<init>":(Lorg/apache/spark/sql/types/NativeType;ILorg/apache/spark/sql/parquet/CatalystConverter;I)V]
spark-sql_2.10-1.3.0.jar, Column.class
package org.apache.spark.sql
Column.apply ( catalyst.expressions.Expression p1 ) [static] : Column
[mangled: org/apache/spark/sql/Column.apply:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/Column;]
Column.apply ( String p1 ) [static] : Column
[mangled: org/apache/spark/sql/Column.apply:(Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
Column.getItem ( int ordinal ) : Column
[mangled: org/apache/spark/sql/Column.getItem:(I)Lorg/apache/spark/sql/Column;]
spark-sql_2.10-1.3.0.jar, CreateTableUsingAsSelect.class
package org.apache.spark.sql.sources
CreateTableUsingAsSelect.copy ( String tableName, String provider, boolean temporary, org.apache.spark.sql.SaveMode mode, scala.collection.immutable.Map<String,String> options, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan child ) : CreateTableUsingAsSelect
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect.copy:(Ljava/lang/String;Ljava/lang/String;ZLorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/sources/CreateTableUsingAsSelect;]
CreateTableUsingAsSelect.CreateTableUsingAsSelect ( String tableName, String provider, boolean temporary, org.apache.spark.sql.SaveMode mode, scala.collection.immutable.Map<String,String> options, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan child )
[mangled: org/apache/spark/sql/sources/CreateTableUsingAsSelect."<init>":(Ljava/lang/String;Ljava/lang/String;ZLorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)V]
spark-sql_2.10-1.3.0.jar, CreateTempTableUsingAsSelect.class
package org.apache.spark.sql.sources
CreateTempTableUsingAsSelect.copy ( String tableName, String provider, org.apache.spark.sql.SaveMode mode, scala.collection.immutable.Map<String,String> options, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan query ) : CreateTempTableUsingAsSelect
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect.copy:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/sources/CreateTempTableUsingAsSelect;]
CreateTempTableUsingAsSelect.CreateTempTableUsingAsSelect ( String tableName, String provider, org.apache.spark.sql.SaveMode mode, scala.collection.immutable.Map<String,String> options, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan query )
[mangled: org/apache/spark/sql/sources/CreateTempTableUsingAsSelect."<init>":(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)V]
spark-sql_2.10-1.3.0.jar, DDLParser.class
package org.apache.spark.sql.sources
DDLParser.apply ( String input, boolean exceptionOnError ) : scala.Option<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
[mangled: org/apache/spark/sql/sources/DDLParser.apply:(Ljava/lang/String;Z)Lscala/Option;]
spark-sql_2.10-1.3.0.jar, Distinct.class
package org.apache.spark.sql.execution
Distinct.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Distinct.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, DriverQuirks.class
package org.apache.spark.sql.jdbc
DriverQuirks.DriverQuirks ( )
[mangled: org/apache/spark/sql/jdbc/DriverQuirks."<init>":()V]
DriverQuirks.get ( String p1 ) [static] : DriverQuirks
[mangled: org/apache/spark/sql/jdbc/DriverQuirks.get:(Ljava/lang/String;)Lorg/apache/spark/sql/jdbc/DriverQuirks;]
DriverQuirks.getCatalystType ( int p1, String p2, int p3, org.apache.spark.sql.types.MetadataBuilder p4 ) [abstract] : org.apache.spark.sql.types.DataType
[mangled: org/apache/spark/sql/jdbc/DriverQuirks.getCatalystType:(ILjava/lang/String;ILorg/apache/spark/sql/types/MetadataBuilder;)Lorg/apache/spark/sql/types/DataType;]
DriverQuirks.getJDBCType ( org.apache.spark.sql.types.DataType p1 ) [abstract] : scala.Tuple2<String,scala.Option<Object>>
[mangled: org/apache/spark/sql/jdbc/DriverQuirks.getJDBCType:(Lorg/apache/spark/sql/types/DataType;)Lscala/Tuple2;]
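When an entire type such as `DriverQuirks` is removed, a client compiled against 1.3.0 only discovers the breakage when the JVM first links the call, raising a `LinkageError` subtype (`NoClassDefFoundError` or `NoSuchMethodError`). A hedged sketch of containing that failure at an integration boundary (the `SafeCall` wrapper is illustrative, not part of either library):

```java
// Wrap a call into version-sensitive client code and convert bytecode-level
// linkage failures into a descriptive result instead of crashing the caller.
public class SafeCall {
    public static String describe(Runnable legacyCall) {
        try {
            legacyCall.run();
            return "ok";
        } catch (LinkageError e) {
            // NoSuchMethodError, NoClassDefFoundError, etc. all extend LinkageError.
            return "incompatible: " + e.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        // Simulate what the JVM would throw for a call to the removed type.
        System.out.println(describe(() -> {
            throw new NoSuchMethodError("org.apache.spark.sql.jdbc.DriverQuirks.get");
        }));
    }
}
```

Catching `LinkageError` this way is only appropriate at a coarse boundary (e.g. feature detection at startup); swallowing it deep inside application logic would hide the real incompatibility.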
spark-sql_2.10-1.3.0.jar, Exchange.class
package org.apache.spark.sql.execution
Exchange.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Exchange.children:()Lscala/collection/immutable/List;]
Exchange.copy ( org.apache.spark.sql.catalyst.plans.physical.Partitioning newPartitioning, SparkPlan child ) : Exchange
[mangled: org/apache/spark/sql/execution/Exchange.copy:(Lorg/apache/spark/sql/catalyst/plans/physical/Partitioning;Lorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/Exchange;]
Exchange.curried ( ) [static] : scala.Function1<org.apache.spark.sql.catalyst.plans.physical.Partitioning,scala.Function1<SparkPlan,Exchange>>
[mangled: org/apache/spark/sql/execution/Exchange.curried:()Lscala/Function1;]
Exchange.Exchange ( org.apache.spark.sql.catalyst.plans.physical.Partitioning newPartitioning, SparkPlan child )
[mangled: org/apache/spark/sql/execution/Exchange."<init>":(Lorg/apache/spark/sql/catalyst/plans/physical/Partitioning;Lorg/apache/spark/sql/execution/SparkPlan;)V]
Exchange.Exchange..bypassMergeThreshold ( ) : int
[mangled: org/apache/spark/sql/execution/Exchange.org.apache.spark.sql.execution.Exchange..bypassMergeThreshold:()I]
Exchange.sortBasedShuffleOn ( ) : boolean
[mangled: org/apache/spark/sql/execution/Exchange.sortBasedShuffleOn:()Z]
Exchange.tupled ( ) [static] : scala.Function1<scala.Tuple2<org.apache.spark.sql.catalyst.plans.physical.Partitioning,SparkPlan>,Exchange>
[mangled: org/apache/spark/sql/execution/Exchange.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, ExecutedCommand.class
package org.apache.spark.sql.execution
ExecutedCommand.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/execution/ExecutedCommand.children:()Lscala/collection/immutable/Nil$;]
spark-sql_2.10-1.3.0.jar, Expand.class
package org.apache.spark.sql.execution
Expand.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Expand.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, ExternalSort.class
package org.apache.spark.sql.execution
ExternalSort.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/ExternalSort.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, Filter.class
package org.apache.spark.sql.execution
Filter.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Filter.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, Generate.class
package org.apache.spark.sql.execution
Generate.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Generate.children:()Lscala/collection/immutable/List;]
Generate.copy ( org.apache.spark.sql.catalyst.expressions.Generator generator, boolean join, boolean outer, SparkPlan child ) : Generate
[mangled: org/apache/spark/sql/execution/Generate.copy:(Lorg/apache/spark/sql/catalyst/expressions/Generator;ZZLorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/Generate;]
Generate.Generate ( org.apache.spark.sql.catalyst.expressions.Generator generator, boolean join, boolean outer, SparkPlan child )
[mangled: org/apache/spark/sql/execution/Generate."<init>":(Lorg/apache/spark/sql/catalyst/expressions/Generator;ZZLorg/apache/spark/sql/execution/SparkPlan;)V]
Generate.generatorOutput ( ) : scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>
[mangled: org/apache/spark/sql/execution/Generate.generatorOutput:()Lscala/collection/Seq;]
spark-sql_2.10-1.3.0.jar, GeneratedAggregate.class
package org.apache.spark.sql.execution
GeneratedAggregate.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.children:()Lscala/collection/immutable/List;]
GeneratedAggregate.copy ( boolean partial, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> groupingExpressions, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression> aggregateExpressions, SparkPlan child ) : GeneratedAggregate
[mangled: org/apache/spark/sql/execution/GeneratedAggregate.copy:(ZLscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/GeneratedAggregate;]
GeneratedAggregate.GeneratedAggregate ( boolean partial, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> groupingExpressions, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression> aggregateExpressions, SparkPlan child )
[mangled: org/apache/spark/sql/execution/GeneratedAggregate."<init>":(ZLscala/collection/Seq;Lscala/collection/Seq;Lorg/apache/spark/sql/execution/SparkPlan;)V]
spark-sql_2.10-1.3.0.jar, GroupedData.class
package org.apache.spark.sql
GroupedData.GroupedData ( DataFrame df, scala.collection.Seq<catalyst.expressions.Expression> groupingExprs )
[mangled: org/apache/spark/sql/GroupedData."<init>":(Lorg/apache/spark/sql/DataFrame;Lscala/collection/Seq;)V]
spark-sql_2.10-1.3.0.jar, InMemoryColumnarTableScan.class
package org.apache.spark.sql.columnar
InMemoryColumnarTableScan.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/columnar/InMemoryColumnarTableScan.children:()Lscala/collection/immutable/Nil$;]
spark-sql_2.10-1.3.0.jar, InMemoryRelation.class
package org.apache.spark.sql.columnar
InMemoryRelation.copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, boolean useCompression, int batchSize, org.apache.spark.storage.StorageLevel storageLevel, org.apache.spark.sql.execution.SparkPlan child, scala.Option<String> tableName, org.apache.spark.rdd.RDD<CachedBatch> _cachedColumnBuffers, org.apache.spark.sql.catalyst.plans.logical.Statistics _statistics ) : InMemoryRelation
[mangled: org/apache/spark/sql/columnar/InMemoryRelation.copy:(Lscala/collection/Seq;ZILorg/apache/spark/storage/StorageLevel;Lorg/apache/spark/sql/execution/SparkPlan;Lscala/Option;Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/catalyst/plans/logical/Statistics;)Lorg/apache/spark/sql/columnar/InMemoryRelation;]
InMemoryRelation.InMemoryRelation ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, boolean useCompression, int batchSize, org.apache.spark.storage.StorageLevel storageLevel, org.apache.spark.sql.execution.SparkPlan child, scala.Option<String> tableName, org.apache.spark.rdd.RDD<CachedBatch> _cachedColumnBuffers, org.apache.spark.sql.catalyst.plans.logical.Statistics _statistics )
[mangled: org/apache/spark/sql/columnar/InMemoryRelation."<init>":(Lscala/collection/Seq;ZILorg/apache/spark/storage/StorageLevel;Lorg/apache/spark/sql/execution/SparkPlan;Lscala/Option;Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/catalyst/plans/logical/Statistics;)V]
InMemoryRelation.newInstance ( ) : org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation
[mangled: org/apache/spark/sql/columnar/InMemoryRelation.newInstance:()Lorg/apache/spark/sql/catalyst/analysis/MultiInstanceRelation;]
spark-sql_2.10-1.3.0.jar, InsertIntoParquetTable.class
package org.apache.spark.sql.parquet
InsertIntoParquetTable.children ( ) : scala.collection.immutable.List<org.apache.spark.sql.execution.SparkPlan>
[mangled: org/apache/spark/sql/parquet/InsertIntoParquetTable.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, JDBCRDD.class
package org.apache.spark.sql.jdbc
JDBCRDD.getConnector ( String p1, String p2 ) [static] : scala.Function0<java.sql.Connection>
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.getConnector:(Ljava/lang/String;Ljava/lang/String;)Lscala/Function0;]
JDBCRDD.JDBCRDD ( org.apache.spark.SparkContext sc, scala.Function0<java.sql.Connection> getConnection, org.apache.spark.sql.types.StructType schema, String fqTable, String[ ] columns, org.apache.spark.sql.sources.Filter[ ] filters, org.apache.spark.Partition[ ] partitions )
[mangled: org/apache/spark/sql/jdbc/JDBCRDD."<init>":(Lorg/apache/spark/SparkContext;Lscala/Function0;Lorg/apache/spark/sql/types/StructType;Ljava/lang/String;[Ljava/lang/String;[Lorg/apache/spark/sql/sources/Filter;[Lorg/apache/spark/Partition;)V]
JDBCRDD.resolveTable ( String p1, String p2 ) [static] : org.apache.spark.sql.types.StructType
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.resolveTable:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/types/StructType;]
JDBCRDD.scanTable ( org.apache.spark.SparkContext p1, org.apache.spark.sql.types.StructType p2, String p3, String p4, String p5, String[ ] p6, org.apache.spark.sql.sources.Filter[ ] p7, org.apache.spark.Partition[ ] p8 ) [static] : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/jdbc/JDBCRDD.scanTable:(Lorg/apache/spark/SparkContext;Lorg/apache/spark/sql/types/StructType;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;[Ljava/lang/String;[Lorg/apache/spark/sql/sources/Filter;[Lorg/apache/spark/Partition;)Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.3.0.jar, JDBCRelation.class
package org.apache.spark.sql.jdbc
JDBCRelation.copy ( String url, String table, org.apache.spark.Partition[ ] parts, org.apache.spark.sql.SQLContext sqlContext ) : JDBCRelation
[mangled: org/apache/spark/sql/jdbc/JDBCRelation.copy:(Ljava/lang/String;Ljava/lang/String;[Lorg/apache/spark/Partition;Lorg/apache/spark/sql/SQLContext;)Lorg/apache/spark/sql/jdbc/JDBCRelation;]
JDBCRelation.JDBCRelation ( String url, String table, org.apache.spark.Partition[ ] parts, org.apache.spark.sql.SQLContext sqlContext )
[mangled: org/apache/spark/sql/jdbc/JDBCRelation."<init>":(Ljava/lang/String;Ljava/lang/String;[Lorg/apache/spark/Partition;Lorg/apache/spark/sql/SQLContext;)V]
spark-sql_2.10-1.3.0.jar, JSONRelation.class
package org.apache.spark.sql.json
JSONRelation.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/json/JSONRelation.canEqual:(Ljava/lang/Object;)Z]
JSONRelation.copy ( String path, double samplingRatio, scala.Option<org.apache.spark.sql.types.StructType> userSpecifiedSchema, org.apache.spark.sql.SQLContext sqlContext ) : JSONRelation
[mangled: org/apache/spark/sql/json/JSONRelation.copy:(Ljava/lang/String;DLscala/Option;Lorg/apache/spark/sql/SQLContext;)Lorg/apache/spark/sql/json/JSONRelation;]
JSONRelation.JSONRelation..baseRDD ( ) : org.apache.spark.rdd.RDD<String>
[mangled: org/apache/spark/sql/json/JSONRelation.org.apache.spark.sql.json.JSONRelation..baseRDD:()Lorg/apache/spark/rdd/RDD;]
JSONRelation.path ( ) : String
[mangled: org/apache/spark/sql/json/JSONRelation.path:()Ljava/lang/String;]
JSONRelation.productArity ( ) : int
[mangled: org/apache/spark/sql/json/JSONRelation.productArity:()I]
JSONRelation.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/json/JSONRelation.productElement:(I)Ljava/lang/Object;]
JSONRelation.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/json/JSONRelation.productIterator:()Lscala/collection/Iterator;]
JSONRelation.productPrefix ( ) : String
[mangled: org/apache/spark/sql/json/JSONRelation.productPrefix:()Ljava/lang/String;]
JSONRelation.toString ( ) : String
[mangled: org/apache/spark/sql/json/JSONRelation.toString:()Ljava/lang/String;]
JSONRelation.userSpecifiedSchema ( ) : scala.Option<org.apache.spark.sql.types.StructType>
[mangled: org/apache/spark/sql/json/JSONRelation.userSpecifiedSchema:()Lscala/Option;]
spark-sql_2.10-1.3.0.jar, LeftSemiJoinHash.class
package org.apache.spark.sql.execution.joins
LeftSemiJoinHash.buildSide ( ) : package.BuildRight.
[mangled: org/apache/spark/sql/execution/joins/LeftSemiJoinHash.buildSide:()Lorg/apache/spark/sql/execution/joins/package$BuildRight$;]
spark-sql_2.10-1.3.0.jar, Limit.class
package org.apache.spark.sql.execution
Limit.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Limit.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, LocalTableScan.class
package org.apache.spark.sql.execution
LocalTableScan.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/execution/LocalTableScan.children:()Lscala/collection/immutable/Nil$;]
spark-sql_2.10-1.3.0.jar, LogicalLocalTable.class
package org.apache.spark.sql.execution
LogicalLocalTable.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/execution/LogicalLocalTable.children:()Lscala/collection/immutable/Nil$;]
LogicalLocalTable.newInstance ( ) : org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation
[mangled: org/apache/spark/sql/execution/LogicalLocalTable.newInstance:()Lorg/apache/spark/sql/catalyst/analysis/MultiInstanceRelation;]
spark-sql_2.10-1.3.0.jar, LogicalRDD.class
package org.apache.spark.sql.execution
LogicalRDD.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/execution/LogicalRDD.children:()Lscala/collection/immutable/Nil$;]
LogicalRDD.newInstance ( ) : org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation
[mangled: org/apache/spark/sql/execution/LogicalRDD.newInstance:()Lorg/apache/spark/sql/catalyst/analysis/MultiInstanceRelation;]
spark-sql_2.10-1.3.0.jar, LogicalRelation.class
package org.apache.spark.sql.sources
LogicalRelation.newInstance ( ) : org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation
[mangled: org/apache/spark/sql/sources/LogicalRelation.newInstance:()Lorg/apache/spark/sql/catalyst/analysis/MultiInstanceRelation;]
spark-sql_2.10-1.3.0.jar, MySQLQuirks.class
package org.apache.spark.sql.jdbc
MySQLQuirks.MySQLQuirks ( )
[mangled: org/apache/spark/sql/jdbc/MySQLQuirks."<init>":()V]
spark-sql_2.10-1.3.0.jar, NativeColumnType<T>.class
package org.apache.spark.sql.columnar
NativeColumnType<T>.dataType ( ) : T
[mangled: org/apache/spark/sql/columnar/NativeColumnType<T>.dataType:()Lorg/apache/spark/sql/types/NativeType;]
NativeColumnType<T>.NativeColumnType ( T dataType, int typeId, int defaultSize ) : public
[mangled: org/apache/spark/sql/columnar/NativeColumnType<T>.org.apache.spark.sql.columnar.NativeColumnType:(Lorg/apache/spark/sql/types/NativeType;II)V]
spark-sql_2.10-1.3.0.jar, NoQuirks.class
package org.apache.spark.sql.jdbc
NoQuirks.NoQuirks ( )
[mangled: org/apache/spark/sql/jdbc/NoQuirks."<init>":()V]
spark-sql_2.10-1.3.0.jar, OutputFaker.class
package org.apache.spark.sql.execution
OutputFaker.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/OutputFaker.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, ParquetRelation.class
package org.apache.spark.sql.parquet
ParquetRelation.newInstance ( ) : org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation
[mangled: org/apache/spark/sql/parquet/ParquetRelation.newInstance:()Lorg/apache/spark/sql/catalyst/analysis/MultiInstanceRelation;]
spark-sql_2.10-1.3.0.jar, ParquetRelation2.class
package org.apache.spark.sql.parquet
ParquetRelation2.buildScan ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> predicates ) : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.buildScan:(Lscala/collection/Seq;Lscala/collection/Seq;)Lorg/apache/spark/rdd/RDD;]
ParquetRelation2.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.canEqual:(Ljava/lang/Object;)Z]
ParquetRelation2.copy ( scala.collection.Seq<String> paths, scala.collection.immutable.Map<String,String> parameters, scala.Option<org.apache.spark.sql.types.StructType> maybeSchema, scala.Option<PartitionSpec> maybePartitionSpec, org.apache.spark.sql.SQLContext sqlContext ) : ParquetRelation2
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.copy:(Lscala/collection/Seq;Lscala/collection/immutable/Map;Lscala/Option;Lscala/Option;Lorg/apache/spark/sql/SQLContext;)Lorg/apache/spark/sql/parquet/ParquetRelation2;]
ParquetRelation2.DEFAULT_PARTITION_NAME ( ) [static] : String
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.DEFAULT_PARTITION_NAME:()Ljava/lang/String;]
ParquetRelation2.insert ( org.apache.spark.sql.DataFrame data, boolean overwrite ) : void
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.insert:(Lorg/apache/spark/sql/DataFrame;Z)V]
ParquetRelation2.isPartitioned ( ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.isPartitioned:()Z]
ParquetRelation2.maybeSchema ( ) : scala.Option<org.apache.spark.sql.types.StructType>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.maybeSchema:()Lscala/Option;]
ParquetRelation2.MERGE_SCHEMA ( ) [static] : String
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.MERGE_SCHEMA:()Ljava/lang/String;]
ParquetRelation2.newJobContext ( org.apache.hadoop.conf.Configuration conf, org.apache.hadoop.mapreduce.JobID jobId ) : org.apache.hadoop.mapreduce.JobContext
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.newJobContext:(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/mapreduce/JobID;)Lorg/apache/hadoop/mapreduce/JobContext;]
ParquetRelation2.newTaskAttemptContext ( org.apache.hadoop.conf.Configuration conf, org.apache.hadoop.mapreduce.TaskAttemptID attemptId ) : org.apache.hadoop.mapreduce.TaskAttemptContext
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.newTaskAttemptContext:(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/mapreduce/TaskAttemptID;)Lorg/apache/hadoop/mapreduce/TaskAttemptContext;]
ParquetRelation2.newTaskAttemptID ( String jtIdentifier, int jobId, boolean isMap, int taskId, int attemptId ) : org.apache.hadoop.mapreduce.TaskAttemptID
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.newTaskAttemptID:(Ljava/lang/String;IZII)Lorg/apache/hadoop/mapreduce/TaskAttemptID;]
ParquetRelation2.ParquetRelation2..defaultPartitionName ( ) : String
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.org.apache.spark.sql.parquet.ParquetRelation2..defaultPartitionName:()Ljava/lang/String;]
ParquetRelation2.ParquetRelation2..isSummaryFile ( org.apache.hadoop.fs.Path file ) : boolean
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.org.apache.spark.sql.parquet.ParquetRelation2..isSummaryFile:(Lorg/apache/hadoop/fs/Path;)Z]
ParquetRelation2.parameters ( ) : scala.collection.immutable.Map<String,String>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.parameters:()Lscala/collection/immutable/Map;]
ParquetRelation2.ParquetRelation2 ( scala.collection.Seq<String> paths, scala.collection.immutable.Map<String,String> parameters, scala.Option<org.apache.spark.sql.types.StructType> maybeSchema, scala.Option<PartitionSpec> maybePartitionSpec, org.apache.spark.sql.SQLContext sqlContext )
[mangled: org/apache/spark/sql/parquet/ParquetRelation2."<init>":(Lscala/collection/Seq;Lscala/collection/immutable/Map;Lscala/Option;Lscala/Option;Lorg/apache/spark/sql/SQLContext;)V]
ParquetRelation2.partitions ( ) : scala.collection.Seq<Partition>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.partitions:()Lscala/collection/Seq;]
ParquetRelation2.productArity ( ) : int
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.productArity:()I]
ParquetRelation2.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.productElement:(I)Ljava/lang/Object;]
ParquetRelation2.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.productIterator:()Lscala/collection/Iterator;]
ParquetRelation2.productPrefix ( ) : String
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.productPrefix:()Ljava/lang/String;]
ParquetRelation2.sparkContext ( ) : org.apache.spark.SparkContext
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.sparkContext:()Lorg/apache/spark/SparkContext;]
ParquetRelation2.toString ( ) : String
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.toString:()Ljava/lang/String;]
spark-sql_2.10-1.3.0.jar, ParquetTableScan.class
package org.apache.spark.sql.parquet
ParquetTableScan.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/parquet/ParquetTableScan.children:()Lscala/collection/immutable/Nil$;]
spark-sql_2.10-1.3.0.jar, ParquetTest.class
package org.apache.spark.sql.parquet
ParquetTest.configuration ( ) [abstract] : org.apache.hadoop.conf.Configuration
[mangled: org/apache/spark/sql/parquet/ParquetTest.configuration:()Lorg/apache/hadoop/conf/Configuration;]
ParquetTest.makeParquetFile ( org.apache.spark.sql.DataFrame p1, java.io.File p2, scala.reflect.ClassTag<T> p3, scala.reflect.api.TypeTags.TypeTag<T> p4 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.makeParquetFile:(Lorg/apache/spark/sql/DataFrame;Ljava/io/File;Lscala/reflect/ClassTag;Lscala/reflect/api/TypeTags$TypeTag;)V]
ParquetTest.makeParquetFile ( scala.collection.Seq<T> p1, java.io.File p2, scala.reflect.ClassTag<T> p3, scala.reflect.api.TypeTags.TypeTag<T> p4 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.makeParquetFile:(Lscala/collection/Seq;Ljava/io/File;Lscala/reflect/ClassTag;Lscala/reflect/api/TypeTags$TypeTag;)V]
ParquetTest.makePartitionDir ( java.io.File p1, String p2, scala.collection.Seq<scala.Tuple2<String,Object>> p3 ) [abstract] : java.io.File
[mangled: org/apache/spark/sql/parquet/ParquetTest.makePartitionDir:(Ljava/io/File;Ljava/lang/String;Lscala/collection/Seq;)Ljava/io/File;]
ParquetTest.sqlContext ( ) [abstract] : org.apache.spark.sql.SQLContext
[mangled: org/apache/spark/sql/parquet/ParquetTest.sqlContext:()Lorg/apache/spark/sql/SQLContext;]
ParquetTest.withParquetDataFrame ( scala.collection.Seq<T> p1, scala.Function1<org.apache.spark.sql.DataFrame,scala.runtime.BoxedUnit> p2, scala.reflect.ClassTag<T> p3, scala.reflect.api.TypeTags.TypeTag<T> p4 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withParquetDataFrame:(Lscala/collection/Seq;Lscala/Function1;Lscala/reflect/ClassTag;Lscala/reflect/api/TypeTags$TypeTag;)V]
ParquetTest.withParquetFile ( scala.collection.Seq<T> p1, scala.Function1<String,scala.runtime.BoxedUnit> p2, scala.reflect.ClassTag<T> p3, scala.reflect.api.TypeTags.TypeTag<T> p4 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withParquetFile:(Lscala/collection/Seq;Lscala/Function1;Lscala/reflect/ClassTag;Lscala/reflect/api/TypeTags$TypeTag;)V]
ParquetTest.withParquetTable ( scala.collection.Seq<T> p1, String p2, scala.Function0<scala.runtime.BoxedUnit> p3, scala.reflect.ClassTag<T> p4, scala.reflect.api.TypeTags.TypeTag<T> p5 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withParquetTable:(Lscala/collection/Seq;Ljava/lang/String;Lscala/Function0;Lscala/reflect/ClassTag;Lscala/reflect/api/TypeTags$TypeTag;)V]
ParquetTest.withSQLConf ( scala.collection.Seq<scala.Tuple2<String,String>> p1, scala.Function0<scala.runtime.BoxedUnit> p2 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withSQLConf:(Lscala/collection/Seq;Lscala/Function0;)V]
ParquetTest.withTempDir ( scala.Function1<java.io.File,scala.runtime.BoxedUnit> p1 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withTempDir:(Lscala/Function1;)V]
ParquetTest.withTempPath ( scala.Function1<java.io.File,scala.runtime.BoxedUnit> p1 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withTempPath:(Lscala/Function1;)V]
ParquetTest.withTempTable ( String p1, scala.Function0<scala.runtime.BoxedUnit> p2 ) [abstract] : void
[mangled: org/apache/spark/sql/parquet/ParquetTest.withTempTable:(Ljava/lang/String;Lscala/Function0;)V]
spark-sql_2.10-1.3.0.jar, Partition.class
package org.apache.spark.sql.parquet
Partition.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/Partition.canEqual:(Ljava/lang/Object;)Z]
Partition.copy ( org.apache.spark.sql.Row values, String path ) : Partition
[mangled: org/apache/spark/sql/parquet/Partition.copy:(Lorg/apache/spark/sql/Row;Ljava/lang/String;)Lorg/apache/spark/sql/parquet/Partition;]
Partition.curried ( ) [static] : scala.Function1<org.apache.spark.sql.Row,scala.Function1<String,Partition>>
[mangled: org/apache/spark/sql/parquet/Partition.curried:()Lscala/Function1;]
Partition.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/Partition.equals:(Ljava/lang/Object;)Z]
Partition.hashCode ( ) : int
[mangled: org/apache/spark/sql/parquet/Partition.hashCode:()I]
Partition.Partition ( org.apache.spark.sql.Row values, String path )
[mangled: org/apache/spark/sql/parquet/Partition."<init>":(Lorg/apache/spark/sql/Row;Ljava/lang/String;)V]
Partition.path ( ) : String
[mangled: org/apache/spark/sql/parquet/Partition.path:()Ljava/lang/String;]
Partition.productArity ( ) : int
[mangled: org/apache/spark/sql/parquet/Partition.productArity:()I]
Partition.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/parquet/Partition.productElement:(I)Ljava/lang/Object;]
Partition.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/parquet/Partition.productIterator:()Lscala/collection/Iterator;]
Partition.productPrefix ( ) : String
[mangled: org/apache/spark/sql/parquet/Partition.productPrefix:()Ljava/lang/String;]
Partition.toString ( ) : String
[mangled: org/apache/spark/sql/parquet/Partition.toString:()Ljava/lang/String;]
Partition.tupled ( ) [static] : scala.Function1<scala.Tuple2<org.apache.spark.sql.Row,String>,Partition>
[mangled: org/apache/spark/sql/parquet/Partition.tupled:()Lscala/Function1;]
Partition.values ( ) : org.apache.spark.sql.Row
[mangled: org/apache/spark/sql/parquet/Partition.values:()Lorg/apache/spark/sql/Row;]
spark-sql_2.10-1.3.0.jar, PartitionSpec.class
package org.apache.spark.sql.parquet
PartitionSpec.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/PartitionSpec.canEqual:(Ljava/lang/Object;)Z]
PartitionSpec.copy ( org.apache.spark.sql.types.StructType partitionColumns, scala.collection.Seq<Partition> partitions ) : PartitionSpec
[mangled: org/apache/spark/sql/parquet/PartitionSpec.copy:(Lorg/apache/spark/sql/types/StructType;Lscala/collection/Seq;)Lorg/apache/spark/sql/parquet/PartitionSpec;]
PartitionSpec.curried ( ) [static] : scala.Function1<org.apache.spark.sql.types.StructType,scala.Function1<scala.collection.Seq<Partition>,PartitionSpec>>
[mangled: org/apache/spark/sql/parquet/PartitionSpec.curried:()Lscala/Function1;]
PartitionSpec.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/sql/parquet/PartitionSpec.equals:(Ljava/lang/Object;)Z]
PartitionSpec.hashCode ( ) : int
[mangled: org/apache/spark/sql/parquet/PartitionSpec.hashCode:()I]
PartitionSpec.partitionColumns ( ) : org.apache.spark.sql.types.StructType
[mangled: org/apache/spark/sql/parquet/PartitionSpec.partitionColumns:()Lorg/apache/spark/sql/types/StructType;]
PartitionSpec.partitions ( ) : scala.collection.Seq<Partition>
[mangled: org/apache/spark/sql/parquet/PartitionSpec.partitions:()Lscala/collection/Seq;]
PartitionSpec.PartitionSpec ( org.apache.spark.sql.types.StructType partitionColumns, scala.collection.Seq<Partition> partitions )
[mangled: org/apache/spark/sql/parquet/PartitionSpec."<init>":(Lorg/apache/spark/sql/types/StructType;Lscala/collection/Seq;)V]
PartitionSpec.productArity ( ) : int
[mangled: org/apache/spark/sql/parquet/PartitionSpec.productArity:()I]
PartitionSpec.productElement ( int p1 ) : Object
[mangled: org/apache/spark/sql/parquet/PartitionSpec.productElement:(I)Ljava/lang/Object;]
PartitionSpec.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/sql/parquet/PartitionSpec.productIterator:()Lscala/collection/Iterator;]
PartitionSpec.productPrefix ( ) : String
[mangled: org/apache/spark/sql/parquet/PartitionSpec.productPrefix:()Ljava/lang/String;]
PartitionSpec.toString ( ) : String
[mangled: org/apache/spark/sql/parquet/PartitionSpec.toString:()Ljava/lang/String;]
PartitionSpec.tupled ( ) [static] : scala.Function1<scala.Tuple2<org.apache.spark.sql.types.StructType,scala.collection.Seq<Partition>>,PartitionSpec>
[mangled: org/apache/spark/sql/parquet/PartitionSpec.tupled:()Lscala/Function1;]
spark-sql_2.10-1.3.0.jar, PhysicalRDD.class
package org.apache.spark.sql.execution
PhysicalRDD.children ( ) : scala.collection.immutable.Nil.
[mangled: org/apache/spark/sql/execution/PhysicalRDD.children:()Lscala/collection/immutable/Nil$;]
spark-sql_2.10-1.3.0.jar, PostgresQuirks.class
package org.apache.spark.sql.jdbc
PostgresQuirks.PostgresQuirks ( )
[mangled: org/apache/spark/sql/jdbc/PostgresQuirks."<init>":()V]
spark-sql_2.10-1.3.0.jar, PreWriteCheck.class
package org.apache.spark.sql.sources
PreWriteCheck.failAnalysis ( String msg ) : scala.runtime.Nothing.
[mangled: org/apache/spark/sql/sources/PreWriteCheck.failAnalysis:(Ljava/lang/String;)Lscala/runtime/Nothing$;]
spark-sql_2.10-1.3.0.jar, Project.class
package org.apache.spark.sql.execution
Project.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Project.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, PythonUDF.class
package org.apache.spark.sql.execution
PythonUDF.copy ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, org.apache.spark.sql.types.DataType dataType, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> children ) : PythonUDF
[mangled: org/apache/spark/sql/execution/PythonUDF.copy:(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;Lscala/collection/Seq;)Lorg/apache/spark/sql/execution/PythonUDF;]
PythonUDF.eval ( org.apache.spark.sql.Row input ) : scala.runtime.Nothing.
[mangled: org/apache/spark/sql/execution/PythonUDF.eval:(Lorg/apache/spark/sql/Row;)Lscala/runtime/Nothing$;]
PythonUDF.PythonUDF ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, org.apache.spark.sql.types.DataType dataType, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> children )
[mangled: org/apache/spark/sql/execution/PythonUDF."<init>":(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;Lscala/collection/Seq;)V]
spark-sql_2.10-1.3.0.jar, ResolvedDataSource.class
package org.apache.spark.sql.sources
ResolvedDataSource.apply ( org.apache.spark.sql.SQLContext p1, scala.Option<org.apache.spark.sql.types.StructType> p2, String p3, scala.collection.immutable.Map<String,String> p4 ) [static] : ResolvedDataSource
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.apply:(Lorg/apache/spark/sql/SQLContext;Lscala/Option;Ljava/lang/String;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/sources/ResolvedDataSource;]
ResolvedDataSource.apply ( org.apache.spark.sql.SQLContext p1, String p2, org.apache.spark.sql.SaveMode p3, scala.collection.immutable.Map<String,String> p4, org.apache.spark.sql.DataFrame p5 ) [static] : ResolvedDataSource
[mangled: org/apache/spark/sql/sources/ResolvedDataSource.apply:(Lorg/apache/spark/sql/SQLContext;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/sources/ResolvedDataSource;]
spark-sql_2.10-1.3.0.jar, Sample.class
package org.apache.spark.sql.execution
Sample.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Sample.children:()Lscala/collection/immutable/List;]
Sample.copy ( double fraction, boolean withReplacement, long seed, SparkPlan child ) : Sample
[mangled: org/apache/spark/sql/execution/Sample.copy:(DZJLorg/apache/spark/sql/execution/SparkPlan;)Lorg/apache/spark/sql/execution/Sample;]
Sample.fraction ( ) : double
[mangled: org/apache/spark/sql/execution/Sample.fraction:()D]
Sample.Sample ( double fraction, boolean withReplacement, long seed, SparkPlan child )
[mangled: org/apache/spark/sql/execution/Sample."<init>":(DZJLorg/apache/spark/sql/execution/SparkPlan;)V]
spark-sql_2.10-1.3.0.jar, Sort.class
package org.apache.spark.sql.execution
Sort.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/Sort.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, SQLContext.class
package org.apache.spark.sql
SQLContext.cacheManager ( ) : CacheManager
[mangled: org/apache/spark/sql/SQLContext.cacheManager:()Lorg/apache/spark/sql/CacheManager;]
SQLContext.checkAnalysis ( ) : catalyst.analysis.CheckAnalysis
[mangled: org/apache/spark/sql/SQLContext.checkAnalysis:()Lorg/apache/spark/sql/catalyst/analysis/CheckAnalysis;]
SQLContext.createDataFrame ( org.apache.spark.api.java.JavaRDD<Row> rowRDD, java.util.List<String> columns ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/api/java/JavaRDD;Ljava/util/List;)Lorg/apache/spark/sql/DataFrame;]
spark-sql_2.10-1.3.0.jar, TakeOrdered.class
package org.apache.spark.sql.execution
TakeOrdered.children ( ) : scala.collection.immutable.List<SparkPlan>
[mangled: org/apache/spark/sql/execution/TakeOrdered.children:()Lscala/collection/immutable/List;]
spark-sql_2.10-1.3.0.jar, TestGroupWriteSupport.class
package org.apache.spark.sql.parquet
TestGroupWriteSupport.TestGroupWriteSupport ( parquet.schema.MessageType schema )
[mangled: org/apache/spark/sql/parquet/TestGroupWriteSupport."<init>":(Lparquet/schema/MessageType;)V]
spark-sql_2.10-1.3.0.jar, UserDefinedPythonFunction.class
package org.apache.spark.sql
UserDefinedPythonFunction.copy ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, types.DataType dataType ) : UserDefinedPythonFunction
[mangled: org/apache/spark/sql/UserDefinedPythonFunction.copy:(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;)Lorg/apache/spark/sql/UserDefinedPythonFunction;]
UserDefinedPythonFunction.UserDefinedPythonFunction ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, java.util.List<org.apache.spark.broadcast.Broadcast<org.apache.spark.api.python.PythonBroadcast>> broadcastVars, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, types.DataType dataType )
[mangled: org/apache/spark/sql/UserDefinedPythonFunction."<init>":(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Ljava/util/List;Lorg/apache/spark/Accumulator;Lorg/apache/spark/sql/types/DataType;)V]
Problems with Data Types, High Severity (21)
spark-sql_2.10-1.3.0.jar
package org.apache.spark.sql
[+] CachedData (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by NoClassDefFoundError exception. |
[+] affected methods (14)
CachedData ( catalyst.plans.logical.LogicalPlan, columnar.InMemoryRelation ) This constructor is from 'CachedData' class.
cachedRepresentation ( ) This method is from 'CachedData' class.
canEqual ( java.lang.Object ) This method is from 'CachedData' class.
copy ( catalyst.plans.logical.LogicalPlan, columnar.InMemoryRelation ) This method is from 'CachedData' class.
curried ( ) This method is from 'CachedData' class.
equals ( java.lang.Object ) This method is from 'CachedData' class.
hashCode ( ) This method is from 'CachedData' class.
plan ( ) This method is from 'CachedData' class.
productArity ( ) This method is from 'CachedData' class.
productElement ( int ) This method is from 'CachedData' class.
productIterator ( ) This method is from 'CachedData' class.
productPrefix ( ) This method is from 'CachedData' class.
toString ( ) This method is from 'CachedData' class.
tupled ( ) This method is from 'CachedData' class.
[+] CacheManager (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by NoClassDefFoundError exception. |
[+] affected methods (9)
CacheManager ( SQLContext ) This constructor is from 'CacheManager' class.
cacheQuery ( DataFrame, scala.Option<java.lang.String>, org.apache.spark.storage.StorageLevel ) This method is from 'CacheManager' class.
cacheTable ( java.lang.String ) This method is from 'CacheManager' class.
clearCache ( ) This method is from 'CacheManager' class.
invalidateCache ( catalyst.plans.logical.LogicalPlan ) This method is from 'CacheManager' class.
isCached ( java.lang.String ) This method is from 'CacheManager' class.
tryUncacheQuery ( DataFrame, boolean ) This method is from 'CacheManager' class.
uncacheTable ( java.lang.String ) This method is from 'CacheManager' class.
useCachedData ( catalyst.plans.logical.LogicalPlan ) This method is from 'CacheManager' class.
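A minimal sketch (not part of the report tool) of how a client can probe for classes this report flags as removed, such as org.apache.spark.sql.CacheManager, before touching them. Linking against a removed class directly fails with NoClassDefFoundError at runtime; a forName probe degrades gracefully instead:

```java
// Probe whether a class is present on the classpath before using it.
// Useful when a dependency (here, Spark SQL 1.4.0) has removed classes
// that existed in the version the client was compiled against (1.3.0).
public class ClassProbe {
    static boolean isPresent(String className) {
        try {
            // initialize = false: we only ask whether the class can be found,
            // without running its static initializers.
            Class.forName(className, false, ClassProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isPresent("java.util.HashMap"));                 // true
        // false unless a Spark 1.3.x jar is on the classpath:
        System.out.println(isPresent("org.apache.spark.sql.CacheManager"));
    }
}
```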
package org.apache.spark.sql.execution
[+] AddExchange (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by NoClassDefFoundError exception. |
[+] affected methods (16)
AddExchange ( org.apache.spark.sql.SQLContext ) This constructor is from 'AddExchange' class.
andThen ( scala.Function1<AddExchange,A> ) This method is from 'AddExchange' class.
apply ( org.apache.spark.sql.catalyst.trees.TreeNode ) This method is from 'AddExchange' class.
apply ( SparkPlan ) This method is from 'AddExchange' class.
canEqual ( java.lang.Object ) This method is from 'AddExchange' class.
compose ( scala.Function1<A,org.apache.spark.sql.SQLContext> ) This method is from 'AddExchange' class.
copy ( org.apache.spark.sql.SQLContext ) This method is from 'AddExchange' class.
equals ( java.lang.Object ) This method is from 'AddExchange' class.
hashCode ( ) This method is from 'AddExchange' class.
numPartitions ( ) This method is from 'AddExchange' class.
productArity ( ) This method is from 'AddExchange' class.
productElement ( int ) This method is from 'AddExchange' class.
productIterator ( ) This method is from 'AddExchange' class.
productPrefix ( ) This method is from 'AddExchange' class.
sqlContext ( ) This method is from 'AddExchange' class.
toString ( ) This method is from 'AddExchange' class.
package org.apache.spark.sql.execution.joins
[+] GeneralHashedRelation (1)
| Change | Effect |
---|
1 | Removed super-interface scala.Serializable. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (1)
GeneralHashedRelation ( java.util.HashMap<org.apache.spark.sql.Row,org.apache.spark.util.collection.CompactBuffer<org.apache.spark.sql.Row>> ) This constructor is from 'GeneralHashedRelation' class.
[+] UniqueKeyHashedRelation (1)
| Change | Effect |
---|
1 | Removed super-interface scala.Serializable. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (1)
UniqueKeyHashedRelation ( java.util.HashMap<org.apache.spark.sql.Row,org.apache.spark.sql.Row> ) This constructor is from 'UniqueKeyHashedRelation' class.
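A minimal sketch of why a dropped super-interface matters: code compiled against the 1.3.0 hierarchy may cast these relations to scala.Serializable, and that cast fails at runtime once the interface is removed. Checking instanceof first degrades gracefully. The classes below are stand-ins, not Spark's:

```java
import java.io.Serializable;

public class InterfaceCheck {
    static class OldStyle implements Serializable {}  // 1.3.0-like: implements the interface
    static class NewStyle {}                          // 1.4.0-like: interface silently dropped

    // Guard before casting or serializing, instead of assuming the
    // compile-time hierarchy still holds at runtime.
    static boolean safeSerializableCast(Object o) {
        return o instanceof Serializable;
    }

    public static void main(String[] args) {
        System.out.println(safeSerializableCast(new OldStyle())); // true
        System.out.println(safeSerializableCast(new NewStyle())); // false
    }
}
```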
package org.apache.spark.sql.jdbc
[+] DriverQuirks (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by NoClassDefFoundError exception. |
[+] affected methods (4)
DriverQuirks ( ) This constructor is from 'DriverQuirks' abstract class.
get ( java.lang.String ) This method is from 'DriverQuirks' abstract class.
getCatalystType ( int, java.lang.String, int, org.apache.spark.sql.types.MetadataBuilder ) This abstract method is from 'DriverQuirks' abstract class.
getJDBCType ( org.apache.spark.sql.types.DataType ) This abstract method is from 'DriverQuirks' abstract class.
[+] JDBCRDD.DecimalConversion. (1)
| Change | Effect |
---|
1 | Removed super-interface scala.Product. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (1)
DecimalConversion ( ) Return value of this method has type 'JDBCRDD.DecimalConversion.'.
[+] MySQLQuirks (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by NoClassDefFoundError exception. |
[+] affected methods (1)
MySQLQuirks ( ) This constructor is from 'MySQLQuirks' class.
[+] NoQuirks (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by NoClassDefFoundError exception. |
[+] affected methods (1)
NoQuirks ( ) This constructor is from 'NoQuirks' class.
[+] PostgresQuirks (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by NoClassDefFoundError exception. |
[+] affected methods (1)
PostgresQuirks ( ) This constructor is from 'PostgresQuirks' class.
package org.apache.spark.sql.json
[+] JSONRelation (2)
| Change | Effect |
---|
1 | Removed super-interface scala.Product. | A client program may be interrupted by NoSuchMethodError exception. |
2 | Removed super-interface scala.Serializable. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (8)
buildScan ( ) This method is from 'JSONRelation' class.
equals ( java.lang.Object ) This method is from 'JSONRelation' class.
hashCode ( ) This method is from 'JSONRelation' class.
insert ( org.apache.spark.sql.DataFrame, boolean ) This method is from 'JSONRelation' class.
JSONRelation ( java.lang.String, double, scala.Option<org.apache.spark.sql.types.StructType>, org.apache.spark.sql.SQLContext ) This constructor is from 'JSONRelation' class.
samplingRatio ( ) This method is from 'JSONRelation' class.
schema ( ) This method is from 'JSONRelation' class.
sqlContext ( ) This method is from 'JSONRelation' class.
package org.apache.spark.sql.parquet
[+] ParquetRelation2 (5)
| Change | Effect |
---|
1 | Removed super-interface org.apache.spark.mapreduce.SparkHadoopMapReduceUtil. | A client program may be interrupted by NoSuchMethodError exception. |
2 | Removed super-interface org.apache.spark.sql.sources.CatalystScan. | A client program may be interrupted by NoSuchMethodError exception. |
3 | Removed super-interface org.apache.spark.sql.sources.InsertableRelation. | A client program may be interrupted by NoSuchMethodError exception. |
4 | Removed super-interface scala.Product. | A client program may be interrupted by NoSuchMethodError exception. |
5 | Removed super-interface scala.Serializable. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (23)
equals ( java.lang.Object ) This method is from 'ParquetRelation2' class.
hashCode ( ) This method is from 'ParquetRelation2' class.
isTraceEnabled ( ) This method is from 'ParquetRelation2' class.
log ( ) This method is from 'ParquetRelation2' class.
logDebug ( scala.Function0<java.lang.String> ) This method is from 'ParquetRelation2' class.
logDebug ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'ParquetRelation2' class.
logError ( scala.Function0<java.lang.String> ) This method is from 'ParquetRelation2' class.
logError ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'ParquetRelation2' class.
logInfo ( scala.Function0<java.lang.String> ) This method is from 'ParquetRelation2' class.
logInfo ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'ParquetRelation2' class.
logName ( ) This method is from 'ParquetRelation2' class.
logTrace ( scala.Function0<java.lang.String> ) This method is from 'ParquetRelation2' class.
logTrace ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'ParquetRelation2' class.
logWarning ( scala.Function0<java.lang.String> ) This method is from 'ParquetRelation2' class.
logWarning ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'ParquetRelation2' class.
maybePartitionSpec ( ) This method is from 'ParquetRelation2' class.
org.apache.spark.Logging..log_ ( ) This method is from 'ParquetRelation2' class.
org.apache.spark.Logging..log__.eq ( org.slf4j.Logger ) This method is from 'ParquetRelation2' class.
ParquetRelation2..maybeMetastoreSchema ( ) This method is from 'ParquetRelation2' class.
ParquetRelation2..metadataCache ( ) This method is from 'ParquetRelation2' class.
ParquetRelation2..shouldMergeSchemas ( ) This method is from 'ParquetRelation2' class.
sizeInBytes ( ) This method is from 'ParquetRelation2' class.
sqlContext ( ) This method is from 'ParquetRelation2' class.
[+] ParquetTest (1)
| Change | Effect |
---|
1 | This interface has been removed. | A client program may be interrupted by NoClassDefFoundError exception. |
[+] affected methods (12)
configuration ( ) This abstract method is from 'ParquetTest' interface.
makeParquetFile ( org.apache.spark.sql.DataFrame, java.io.File, scala.reflect.ClassTag<T>, scala.reflect.api.TypeTags.TypeTag<T> ) This abstract method is from 'ParquetTest' interface.
makeParquetFile ( scala.collection.Seq<T>, java.io.File, scala.reflect.ClassTag<T>, scala.reflect.api.TypeTags.TypeTag<T> ) This abstract method is from 'ParquetTest' interface.
makePartitionDir ( java.io.File, java.lang.String, scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.Object>> ) This abstract method is from 'ParquetTest' interface.
sqlContext ( ) This abstract method is from 'ParquetTest' interface.
withParquetDataFrame ( scala.collection.Seq<T>, scala.Function1<org.apache.spark.sql.DataFrame,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<T>, scala.reflect.api.TypeTags.TypeTag<T> ) This abstract method is from 'ParquetTest' interface.
withParquetFile ( scala.collection.Seq<T>, scala.Function1<java.lang.String,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<T>, scala.reflect.api.TypeTags.TypeTag<T> ) This abstract method is from 'ParquetTest' interface.
withParquetTable ( scala.collection.Seq<T>, java.lang.String, scala.Function0<scala.runtime.BoxedUnit>, scala.reflect.ClassTag<T>, scala.reflect.api.TypeTags.TypeTag<T> ) This abstract method is from 'ParquetTest' interface.
withSQLConf ( scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.String>>, scala.Function0<scala.runtime.BoxedUnit> ) This abstract method is from 'ParquetTest' interface.
withTempDir ( scala.Function1<java.io.File,scala.runtime.BoxedUnit> ) This abstract method is from 'ParquetTest' interface.
withTempPath ( scala.Function1<java.io.File,scala.runtime.BoxedUnit> ) This abstract method is from 'ParquetTest' interface.
withTempTable ( java.lang.String, scala.Function0<scala.runtime.BoxedUnit> ) This abstract method is from 'ParquetTest' interface.
[+] Partition (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by NoClassDefFoundError exception. |
[+] affected methods (14)
canEqual ( java.lang.Object ) This method is from 'Partition' class.
copy ( org.apache.spark.sql.Row, java.lang.String ) This method is from 'Partition' class.
curried ( ) This method is from 'Partition' class.
equals ( java.lang.Object ) This method is from 'Partition' class.
hashCode ( ) This method is from 'Partition' class.
Partition ( org.apache.spark.sql.Row, java.lang.String ) This constructor is from 'Partition' class.
path ( ) This method is from 'Partition' class.
productArity ( ) This method is from 'Partition' class.
productElement ( int ) This method is from 'Partition' class.
productIterator ( ) This method is from 'Partition' class.
productPrefix ( ) This method is from 'Partition' class.
toString ( ) This method is from 'Partition' class.
tupled ( ) This method is from 'Partition' class.
values ( ) This method is from 'Partition' class.
[+] PartitionSpec (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by NoClassDefFoundError exception. |
[+] affected methods (14)
canEqual ( java.lang.Object ) This method is from 'PartitionSpec' class.
copy ( org.apache.spark.sql.types.StructType, scala.collection.Seq<Partition> ) This method is from 'PartitionSpec' class.
curried ( ) This method is from 'PartitionSpec' class.
equals ( java.lang.Object ) This method is from 'PartitionSpec' class.
hashCode ( ) This method is from 'PartitionSpec' class.
partitionColumns ( ) This method is from 'PartitionSpec' class.
partitions ( ) This method is from 'PartitionSpec' class.
PartitionSpec ( org.apache.spark.sql.types.StructType, scala.collection.Seq<Partition> ) This constructor is from 'PartitionSpec' class.
productArity ( ) This method is from 'PartitionSpec' class.
productElement ( int ) This method is from 'PartitionSpec' class.
productIterator ( ) This method is from 'PartitionSpec' class.
productPrefix ( ) This method is from 'PartitionSpec' class.
toString ( ) This method is from 'PartitionSpec' class.
tupled ( ) This method is from 'PartitionSpec' class.
[+] TestGroupWriteSupport (1)
| Change | Effect |
---|
1 | This class has been removed. | A client program may be interrupted by NoClassDefFoundError exception. |
[+] affected methods (1)
TestGroupWriteSupport ( parquet.schema.MessageType ) This constructor is from 'TestGroupWriteSupport' class.
Problems with Methods, High Severity (3)
spark-sql_2.10-1.3.0.jar, CatalystConverter
package org.apache.spark.sql.parquet
[+] CatalystConverter.readDecimal ( org.apache.spark.sql.types.Decimal dest, parquet.io.api.Binary value, org.apache.spark.sql.types.DecimalType ctype ) : void (1)
[mangled: org/apache/spark/sql/parquet/CatalystConverter.readDecimal:(Lorg/apache/spark/sql/types/Decimal;Lparquet/io/api/Binary;Lorg/apache/spark/sql/types/DecimalType;)V]
| Change | Effect |
---|
1 | Return value type has been changed from void to org.apache.spark.sql.types.Decimal. | This method has been removed because the return type is part of the method signature. |
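A minimal sketch of why a return-type change removes the method from a binary client's point of view: the JVM links call sites by name plus the full descriptor, return type included, whereas reflection lookups match only name and parameter types and keep working across such a change. The class below is a stand-in, not Spark's CatalystConverter:

```java
import java.lang.reflect.Method;

public class DescriptorDemo {
    // Imagine this return type changed between versions (void -> Decimal in
    // the report's case). Bytecode compiled against the old descriptor would
    // no longer resolve; a reflective lookup below is unaffected.
    static class V1 { static int parse(String s) { return s.length(); } }

    public static void main(String[] args) throws Exception {
        // getDeclaredMethod matches name + parameter types only.
        Method m = V1.class.getDeclaredMethod("parse", String.class);
        System.out.println(m.getReturnType().getSimpleName()); // int
        System.out.println(m.invoke(null, "abc"));             // 3
    }
}
```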
spark-sql_2.10-1.3.0.jar, ParquetRelation2
package org.apache.spark.sql.parquet
[+] ParquetRelation2.maybePartitionSpec ( ) : scala.Option<PartitionSpec> (1)
[mangled: org/apache/spark/sql/parquet/ParquetRelation2.maybePartitionSpec:()Lscala/Option;]
| Change | Effect |
---|
1 | Access level has been changed from public to private. | A client program may be interrupted by IllegalAccessError exception. |
spark-sql_2.10-1.3.0.jar, TakeOrdered
package org.apache.spark.sql.execution
[+] TakeOrdered.ord ( ) : org.apache.spark.sql.catalyst.expressions.RowOrdering (1)
[mangled: org/apache/spark/sql/execution/TakeOrdered.ord:()Lorg/apache/spark/sql/catalyst/expressions/RowOrdering;]
| Change | Effect |
---|
1 | Access level has been changed from public to private. | A client program may be interrupted by IllegalAccessError exception. |
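A minimal sketch of the failure mode above: when a member's access level drops from public to private (as with TakeOrdered.ord() and ParquetRelation2.maybePartitionSpec() here), code compiled against the public version fails at link time with IllegalAccessError. Reflection with setAccessible(true) can still reach the member, at the cost of bypassing the new encapsulation. The class below is a stand-in, not Spark's:

```java
import java.lang.reflect.Method;

public class AccessDemo {
    // Stand-in for a member that became private in the newer version.
    static class Plan { private String ord() { return "RowOrdering"; } }

    public static void main(String[] args) throws Exception {
        Method m = Plan.class.getDeclaredMethod("ord");
        m.setAccessible(true); // without this, invoke() throws IllegalAccessException
        System.out.println(m.invoke(new Plan())); // RowOrdering
    }
}
```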
Problems with Data Types, Medium Severity (15)
spark-sql_2.10-1.3.0.jar
package org.apache.spark.sql.execution
[+] CacheTableCommand (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (16)
CacheTableCommand ( java.lang.String, scala.Option<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>, boolean ) This constructor is from 'CacheTableCommand' class.
canEqual ( java.lang.Object ) This method is from 'CacheTableCommand' class.
copy ( java.lang.String, scala.Option<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>, boolean ) Return value of this method has type 'CacheTableCommand'.
curried ( ) This method is from 'CacheTableCommand' class.
equals ( java.lang.Object ) This method is from 'CacheTableCommand' class.
hashCode ( ) This method is from 'CacheTableCommand' class.
isLazy ( ) This method is from 'CacheTableCommand' class.
output ( ) This method is from 'CacheTableCommand' class.
plan ( ) This method is from 'CacheTableCommand' class.
productArity ( ) This method is from 'CacheTableCommand' class.
productElement ( int ) This method is from 'CacheTableCommand' class.
productIterator ( ) This method is from 'CacheTableCommand' class.
productPrefix ( ) This method is from 'CacheTableCommand' class.
run ( org.apache.spark.sql.SQLContext ) This method is from 'CacheTableCommand' class.
tableName ( ) This method is from 'CacheTableCommand' class.
tupled ( ) This method is from 'CacheTableCommand' class.
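A minimal sketch of a defensive check for the superclass changes listed in this section: a client can verify at runtime that the hierarchy it was compiled against still holds before relying on inherited members. The classes below are stand-ins mirroring the Command to LogicalPlan change, not Spark's:

```java
public class HierarchyCheck {
    static class Command {}
    static class LogicalPlan {}
    // 1.4.0-like: the superclass changed from Command to LogicalPlan.
    static class CacheTableCommand extends LogicalPlan {}

    public static void main(String[] args) {
        Class<?> parent = CacheTableCommand.class.getSuperclass();
        System.out.println(parent.getSimpleName());  // LogicalPlan
        // Members inherited from Command are gone; relying on them would
        // raise NoSuchFieldError / NoSuchMethodError at link time.
        System.out.println(parent == Command.class); // false
    }
}
```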
[+] DescribeCommand (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (15)
canEqual ( java.lang.Object ) This method is from 'DescribeCommand' class.
child ( ) This method is from 'DescribeCommand' class.
copy ( SparkPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean ) Return value of this method has type 'DescribeCommand'.
curried ( ) This method is from 'DescribeCommand' class.
DescribeCommand ( SparkPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean ) This constructor is from 'DescribeCommand' class.
equals ( java.lang.Object ) This method is from 'DescribeCommand' class.
hashCode ( ) This method is from 'DescribeCommand' class.
isExtended ( ) This method is from 'DescribeCommand' class.
output ( ) This method is from 'DescribeCommand' class.
productArity ( ) This method is from 'DescribeCommand' class.
productElement ( int ) This method is from 'DescribeCommand' class.
productIterator ( ) This method is from 'DescribeCommand' class.
productPrefix ( ) This method is from 'DescribeCommand' class.
run ( org.apache.spark.sql.SQLContext ) This method is from 'DescribeCommand' class.
tupled ( ) This method is from 'DescribeCommand' class.
[+] ExplainCommand (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (15)
canEqual ( java.lang.Object )This method is from 'ExplainCommand' class.
copy ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean )Return value of this method has type 'ExplainCommand'.
curried ( )This method is from 'ExplainCommand' class.
equals ( java.lang.Object )This method is from 'ExplainCommand' class.
ExplainCommand ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean )This constructor is from 'ExplainCommand' class.
extended ( )This method is from 'ExplainCommand' class.
hashCode ( )This method is from 'ExplainCommand' class.
logicalPlan ( )This method is from 'ExplainCommand' class.
output ( )This method is from 'ExplainCommand' class.
productArity ( )This method is from 'ExplainCommand' class.
productElement ( int )This method is from 'ExplainCommand' class.
productIterator ( )This method is from 'ExplainCommand' class.
productPrefix ( )This method is from 'ExplainCommand' class.
run ( org.apache.spark.sql.SQLContext )This method is from 'ExplainCommand' class.
tupled ( )This method is from 'ExplainCommand' class.
[+] RunnableCommand (1)
| Change | Effect |
---|
1 | Abstract method output ( ) has been added to this interface. | A client program may be interrupted by AbstractMethodError exception. Added abstract method is called in 2nd library version by the method output ( ) and may not be implemented by old clients. |
[+] affected methods (4)
cmd ( )Return value of this method has type 'RunnableCommand'.
copy ( RunnableCommand )1st parameter 'cmd' of this method has type 'RunnableCommand'.
ExecutedCommand ( RunnableCommand )1st parameter 'cmd' of this method has type 'RunnableCommand'.
run ( org.apache.spark.sql.SQLContext )This abstract method is from 'RunnableCommand' interface.
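Adding an abstract method to an interface breaks old implementors in exactly the way the Effect column describes. A hedged sketch with hypothetical `V13`/`V14` stand-ins (not the real `RunnableCommand` signatures) illustrates the two interface shapes; a class compiled against the 1.3 shape still loads against the 1.4 shape, but the first invocation of the new `output()` on it throws `AbstractMethodError`:

```java
import java.util.List;

// Hypothetical stand-ins for the two shapes of the interface.
interface RunnableCommandV13 {
    Object run(Object sqlContext);
}

interface RunnableCommandV14 {
    Object run(Object sqlContext);
    List<String> output(); // the newly added abstract method
}

public class AddedMethodDemo {
    // A client recompiled against the 1.4 shape must supply output();
    // an old .class file implementing only run() would load fine but
    // throw AbstractMethodError when output() is first invoked on it.
    static class NewClient implements RunnableCommandV14 {
        public Object run(Object ctx) { return "ran"; }
        public List<String> output() { return List.of(); }
    }

    public static void main(String[] args) {
        RunnableCommandV14 c = new NewClient();
        System.out.println(c.run(null) + " / " + c.output());
    }
}
```

This is why the report calls it out as high severity: the failure is deferred to the first call site that exercises the added method, not to load time.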
[+] SetCommand (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (14)
canEqual ( java.lang.Object )This method is from 'SetCommand' class.
copy ( scala.Option<scala.Tuple2<java.lang.String,scala.Option<java.lang.String>>>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> )Return value of this method has type 'SetCommand'.
curried ( )This method is from 'SetCommand' class.
equals ( java.lang.Object )This method is from 'SetCommand' class.
hashCode ( )This method is from 'SetCommand' class.
kv ( )This method is from 'SetCommand' class.
output ( )This method is from 'SetCommand' class.
productArity ( )This method is from 'SetCommand' class.
productElement ( int )This method is from 'SetCommand' class.
productIterator ( )This method is from 'SetCommand' class.
productPrefix ( )This method is from 'SetCommand' class.
run ( org.apache.spark.sql.SQLContext )This method is from 'SetCommand' class.
SetCommand ( scala.Option<scala.Tuple2<java.lang.String,scala.Option<java.lang.String>>>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> )This constructor is from 'SetCommand' class.
tupled ( )This method is from 'SetCommand' class.
[+] ShowTablesCommand (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (14)
andThen ( scala.Function1<ShowTablesCommand,A> )This method is from 'ShowTablesCommand' class.
canEqual ( java.lang.Object )This method is from 'ShowTablesCommand' class.
compose ( scala.Function1<A,scala.Option<java.lang.String>> )This method is from 'ShowTablesCommand' class.
copy ( scala.Option<java.lang.String> )Return value of this method has type 'ShowTablesCommand'.
databaseName ( )This method is from 'ShowTablesCommand' class.
equals ( java.lang.Object )This method is from 'ShowTablesCommand' class.
hashCode ( )This method is from 'ShowTablesCommand' class.
output ( )This method is from 'ShowTablesCommand' class.
productArity ( )This method is from 'ShowTablesCommand' class.
productElement ( int )This method is from 'ShowTablesCommand' class.
productIterator ( )This method is from 'ShowTablesCommand' class.
productPrefix ( )This method is from 'ShowTablesCommand' class.
run ( org.apache.spark.sql.SQLContext )This method is from 'ShowTablesCommand' class.
ShowTablesCommand ( scala.Option<java.lang.String> )This constructor is from 'ShowTablesCommand' class.
[+] UncacheTableCommand (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (14)
andThen ( scala.Function1<UncacheTableCommand,A> )This method is from 'UncacheTableCommand' class.
canEqual ( java.lang.Object )This method is from 'UncacheTableCommand' class.
compose ( scala.Function1<A,java.lang.String> )This method is from 'UncacheTableCommand' class.
copy ( java.lang.String )Return value of this method has type 'UncacheTableCommand'.
equals ( java.lang.Object )This method is from 'UncacheTableCommand' class.
hashCode ( )This method is from 'UncacheTableCommand' class.
output ( )This method is from 'UncacheTableCommand' class.
productArity ( )This method is from 'UncacheTableCommand' class.
productElement ( int )This method is from 'UncacheTableCommand' class.
productIterator ( )This method is from 'UncacheTableCommand' class.
productPrefix ( )This method is from 'UncacheTableCommand' class.
run ( org.apache.spark.sql.SQLContext )This method is from 'UncacheTableCommand' class.
tableName ( )This method is from 'UncacheTableCommand' class.
UncacheTableCommand ( java.lang.String )This constructor is from 'UncacheTableCommand' class.
package org.apache.spark.sql.jdbc
[+] JDBCRDD.DecimalConversion. (1)
| Change | Effect |
---|
1 | Superclass has been changed from JDBCRDD.JDBCConversion to scala.runtime.AbstractFunction1<scala.Option<scala.Tuple2<java.lang.Object,java.lang.Object>>,JDBCRDD.DecimalConversion>. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (1)
DecimalConversion ( )Return value of this method has type 'JDBCRDD.DecimalConversion.'.
package org.apache.spark.sql.parquet
[+] ParquetRelation2 (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.sources.BaseRelation to org.apache.spark.sql.sources.HadoopFsRelation. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (23)
equals ( java.lang.Object )This method is from 'ParquetRelation2' class.
hashCode ( )This method is from 'ParquetRelation2' class.
isTraceEnabled ( )This method is from 'ParquetRelation2' class.
log ( )This method is from 'ParquetRelation2' class.
logDebug ( scala.Function0<java.lang.String> )This method is from 'ParquetRelation2' class.
logDebug ( scala.Function0<java.lang.String>, java.lang.Throwable )This method is from 'ParquetRelation2' class.
logError ( scala.Function0<java.lang.String> )This method is from 'ParquetRelation2' class.
logError ( scala.Function0<java.lang.String>, java.lang.Throwable )This method is from 'ParquetRelation2' class.
logInfo ( scala.Function0<java.lang.String> )This method is from 'ParquetRelation2' class.
logInfo ( scala.Function0<java.lang.String>, java.lang.Throwable )This method is from 'ParquetRelation2' class.
logName ( )This method is from 'ParquetRelation2' class.
logTrace ( scala.Function0<java.lang.String> )This method is from 'ParquetRelation2' class.
logTrace ( scala.Function0<java.lang.String>, java.lang.Throwable )This method is from 'ParquetRelation2' class.
logWarning ( scala.Function0<java.lang.String> )This method is from 'ParquetRelation2' class.
logWarning ( scala.Function0<java.lang.String>, java.lang.Throwable )This method is from 'ParquetRelation2' class.
maybePartitionSpec ( )This method is from 'ParquetRelation2' class.
org.apache.spark.Logging..log_ ( )This method is from 'ParquetRelation2' class.
org.apache.spark.Logging..log__.eq ( org.slf4j.Logger )This method is from 'ParquetRelation2' class.
ParquetRelation2..maybeMetastoreSchema ( )This method is from 'ParquetRelation2' class.
ParquetRelation2..metadataCache ( )This method is from 'ParquetRelation2' class.
ParquetRelation2..shouldMergeSchemas ( )This method is from 'ParquetRelation2' class.
sizeInBytes ( )This method is from 'ParquetRelation2' class.
sqlContext ( )This method is from 'ParquetRelation2' class.
package org.apache.spark.sql.sources
[+] CreateTableUsing (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (18)
allowExisting ( )This method is from 'CreateTableUsing' class.
canEqual ( java.lang.Object )This method is from 'CreateTableUsing' class.
copy ( java.lang.String, scala.Option<org.apache.spark.sql.types.StructType>, java.lang.String, boolean, scala.collection.immutable.Map<java.lang.String,java.lang.String>, boolean, boolean )Return value of this method has type 'CreateTableUsing'.
CreateTableUsing ( java.lang.String, scala.Option<org.apache.spark.sql.types.StructType>, java.lang.String, boolean, scala.collection.immutable.Map<java.lang.String,java.lang.String>, boolean, boolean )This constructor is from 'CreateTableUsing' class.
curried ( )This method is from 'CreateTableUsing' class.
equals ( java.lang.Object )This method is from 'CreateTableUsing' class.
hashCode ( )This method is from 'CreateTableUsing' class.
managedIfNoPath ( )This method is from 'CreateTableUsing' class.
options ( )This method is from 'CreateTableUsing' class.
productArity ( )This method is from 'CreateTableUsing' class.
productElement ( int )This method is from 'CreateTableUsing' class.
productIterator ( )This method is from 'CreateTableUsing' class.
productPrefix ( )This method is from 'CreateTableUsing' class.
provider ( )This method is from 'CreateTableUsing' class.
tableName ( )This method is from 'CreateTableUsing' class.
temporary ( )This method is from 'CreateTableUsing' class.
tupled ( )This method is from 'CreateTableUsing' class.
userSpecifiedSchema ( )This method is from 'CreateTableUsing' class.
[+] CreateTempTableUsing (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (16)
canEqual ( java.lang.Object )This method is from 'CreateTempTableUsing' class.
copy ( java.lang.String, scala.Option<org.apache.spark.sql.types.StructType>, java.lang.String, scala.collection.immutable.Map<java.lang.String,java.lang.String> )Return value of this method has type 'CreateTempTableUsing'.
CreateTempTableUsing ( java.lang.String, scala.Option<org.apache.spark.sql.types.StructType>, java.lang.String, scala.collection.immutable.Map<java.lang.String,java.lang.String> )This constructor is from 'CreateTempTableUsing' class.
curried ( )This method is from 'CreateTempTableUsing' class.
equals ( java.lang.Object )This method is from 'CreateTempTableUsing' class.
hashCode ( )This method is from 'CreateTempTableUsing' class.
options ( )This method is from 'CreateTempTableUsing' class.
productArity ( )This method is from 'CreateTempTableUsing' class.
productElement ( int )This method is from 'CreateTempTableUsing' class.
productIterator ( )This method is from 'CreateTempTableUsing' class.
productPrefix ( )This method is from 'CreateTempTableUsing' class.
provider ( )This method is from 'CreateTempTableUsing' class.
run ( org.apache.spark.sql.SQLContext )This method is from 'CreateTempTableUsing' class.
tableName ( )This method is from 'CreateTempTableUsing' class.
tupled ( )This method is from 'CreateTempTableUsing' class.
userSpecifiedSchema ( )This method is from 'CreateTempTableUsing' class.
[+] CreateTempTableUsingAsSelect (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (15)
canEqual ( java.lang.Object )This method is from 'CreateTempTableUsingAsSelect' class.
curried ( )This method is from 'CreateTempTableUsingAsSelect' class.
equals ( java.lang.Object )This method is from 'CreateTempTableUsingAsSelect' class.
hashCode ( )This method is from 'CreateTempTableUsingAsSelect' class.
mode ( )This method is from 'CreateTempTableUsingAsSelect' class.
options ( )This method is from 'CreateTempTableUsingAsSelect' class.
productArity ( )This method is from 'CreateTempTableUsingAsSelect' class.
productElement ( int )This method is from 'CreateTempTableUsingAsSelect' class.
productIterator ( )This method is from 'CreateTempTableUsingAsSelect' class.
productPrefix ( )This method is from 'CreateTempTableUsingAsSelect' class.
provider ( )This method is from 'CreateTempTableUsingAsSelect' class.
query ( )This method is from 'CreateTempTableUsingAsSelect' class.
run ( org.apache.spark.sql.SQLContext )This method is from 'CreateTempTableUsingAsSelect' class.
tableName ( )This method is from 'CreateTempTableUsingAsSelect' class.
tupled ( )This method is from 'CreateTempTableUsingAsSelect' class.
[+] DescribeCommand (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (14)
canEqual ( java.lang.Object )This method is from 'DescribeCommand' class.
copy ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, boolean )Return value of this method has type 'DescribeCommand'.
curried ( )This method is from 'DescribeCommand' class.
DescribeCommand ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, boolean )This constructor is from 'DescribeCommand' class.
equals ( java.lang.Object )This method is from 'DescribeCommand' class.
hashCode ( )This method is from 'DescribeCommand' class.
isExtended ( )This method is from 'DescribeCommand' class.
output ( )This method is from 'DescribeCommand' class.
productArity ( )This method is from 'DescribeCommand' class.
productElement ( int )This method is from 'DescribeCommand' class.
productIterator ( )This method is from 'DescribeCommand' class.
productPrefix ( )This method is from 'DescribeCommand' class.
table ( )This method is from 'DescribeCommand' class.
tupled ( )This method is from 'DescribeCommand' class.
[+] InsertIntoDataSource (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (15)
canEqual ( java.lang.Object )This method is from 'InsertIntoDataSource' class.
copy ( LogicalRelation, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, boolean )Return value of this method has type 'InsertIntoDataSource'.
curried ( )This method is from 'InsertIntoDataSource' class.
equals ( java.lang.Object )This method is from 'InsertIntoDataSource' class.
hashCode ( )This method is from 'InsertIntoDataSource' class.
InsertIntoDataSource ( LogicalRelation, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan, boolean )This constructor is from 'InsertIntoDataSource' class.
logicalRelation ( )This method is from 'InsertIntoDataSource' class.
overwrite ( )This method is from 'InsertIntoDataSource' class.
productArity ( )This method is from 'InsertIntoDataSource' class.
productElement ( int )This method is from 'InsertIntoDataSource' class.
productIterator ( )This method is from 'InsertIntoDataSource' class.
productPrefix ( )This method is from 'InsertIntoDataSource' class.
query ( )This method is from 'InsertIntoDataSource' class.
run ( org.apache.spark.sql.SQLContext )This method is from 'InsertIntoDataSource' class.
tupled ( )This method is from 'InsertIntoDataSource' class.
[+] RefreshTable (1)
| Change | Effect |
---|
1 | Superclass has been changed from org.apache.spark.sql.catalyst.plans.logical.Command to org.apache.spark.sql.catalyst.plans.logical.LogicalPlan. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (14)
canEqual ( java.lang.Object )This method is from 'RefreshTable' class.
copy ( java.lang.String, java.lang.String )Return value of this method has type 'RefreshTable'.
curried ( )This method is from 'RefreshTable' class.
databaseName ( )This method is from 'RefreshTable' class.
equals ( java.lang.Object )This method is from 'RefreshTable' class.
hashCode ( )This method is from 'RefreshTable' class.
productArity ( )This method is from 'RefreshTable' class.
productElement ( int )This method is from 'RefreshTable' class.
productIterator ( )This method is from 'RefreshTable' class.
productPrefix ( )This method is from 'RefreshTable' class.
RefreshTable ( java.lang.String, java.lang.String )This constructor is from 'RefreshTable' class.
run ( org.apache.spark.sql.SQLContext )This method is from 'RefreshTable' class.
tableName ( )This method is from 'RefreshTable' class.
tupled ( )This method is from 'RefreshTable' class.
Problems with Methods, Medium Severity (1)
spark-sql_2.10-1.3.0.jar, SparkPlan
package org.apache.spark.sql.execution
[+] SparkPlan.execute ( ) [abstract] : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row> (1)
[mangled: org/apache/spark/sql/execution/SparkPlan.execute:()Lorg/apache/spark/rdd/RDD;]
| Change | Effect |
---|
1 | Method became final. | A client program trying to reimplement this method may be interrupted by VerifyError exception. |
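Read together with the `doExecute ( )` entries in the Added Methods list, this looks like a template-method refactor: `execute ( )` became a final wrapper and subclasses now override `doExecute ( )` instead. A hedged stand-in sketch (hypothetical `*Like` classes, not the real SparkPlan API) shows the resulting shape and why an old subclass that still declares `execute ( )` fails verification against the final upstream method:

```java
// Hypothetical stand-ins mirroring the apparent refactor.
abstract class SparkPlanLike {
    // Now final: an old client subclass whose bytecode still overrides
    // execute() is rejected by the verifier (VerifyError) when loaded
    // against this version, because overriding a final method is illegal.
    public final String execute() {
        // shared preparation logic would live here in the real class
        return doExecute();
    }

    // Subclasses migrate their logic into this hook instead.
    protected abstract String doExecute();
}

class AggregateLike extends SparkPlanLike {
    @Override
    protected String doExecute() { return "aggregated rows"; }
}

public class FinalExecuteDemo {
    public static void main(String[] args) {
        System.out.println(new AggregateLike().execute());
    }
}
```

The same refactor explains the many "moved up type hierarchy" entries below: subclasses dropped their own `execute ( )` overrides, so calls now resolve to the single inherited implementation.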
Problems with Data Types, Low Severity (44)
spark-sql_2.10-1.3.0.jar
package org.apache.spark.sql.columnar
[+] InMemoryColumnarTableScan (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
package org.apache.spark.sql.execution
[+] Aggregate (2)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
2 | Method requiredChildDistribution ( ) has been overridden by requiredChildDistribution ( ) | Method requiredChildDistribution ( ) will be called instead of requiredChildDistribution ( ) in a client program. |
[+] affected methods (2)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
requiredChildDistribution ( )Method 'requiredChildDistribution ( )' will be called instead of this method in a client program.
[+] BatchPythonEvaluation (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Distinct (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Except (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Exchange (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] ExecutedCommand (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Expand (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] ExternalSort (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Filter (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Generate (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] GeneratedAggregate (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Intersect (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Limit (2)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
2 | Method outputPartitioning ( ) has been moved up type hierarchy to outputPartitioning ( ) | Method outputPartitioning ( ) will be called instead of outputPartitioning ( ) in a client program. |
[+] affected methods (2)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
outputPartitioning ( )Method 'outputPartitioning ( )' will be called instead of this method in a client program.
[+] LocalTableScan (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] OutputFaker (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] PhysicalRDD (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Project (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Sample (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] Sort (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] SparkPlan (1)
| Change | Effect |
---|
1 | Abstract method execute ( ) became non-abstract. | Some methods in this class may change behavior. |
[+] affected methods (1)
execute ( )This abstract method is from 'SparkPlan' abstract class.
[+] TakeOrdered (2)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
2 | Method outputPartitioning ( ) has been moved up type hierarchy to outputPartitioning ( ) | Method outputPartitioning ( ) will be called instead of outputPartitioning ( ) in a client program. |
[+] affected methods (2)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
outputPartitioning ( )Method 'outputPartitioning ( )' will be called instead of this method in a client program.
[+] Union (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
package org.apache.spark.sql.execution.joins
[+] BroadcastHashJoin (2)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
2 | Method requiredChildDistribution ( ) has been moved up type hierarchy to requiredChildDistribution ( ) | Method requiredChildDistribution ( ) will be called instead of requiredChildDistribution ( ) in a client program. |
[+] affected methods (2)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
requiredChildDistribution ( )Method 'requiredChildDistribution ( )' will be called instead of this method in a client program.
[+] BroadcastLeftSemiJoinHash (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] BroadcastNestedLoopJoin (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] CartesianProduct (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] HashOuterJoin (2)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
2 | Method requiredChildDistribution ( ) has been moved up type hierarchy to requiredChildDistribution ( ) | Method requiredChildDistribution ( ) will be called instead of requiredChildDistribution ( ) in a client program. |
[+] affected methods (2)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
requiredChildDistribution ( )Method 'requiredChildDistribution ( )' will be called instead of this method in a client program.
[+] LeftSemiJoinBNL (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( )Method 'execute ( )' will be called instead of this method in a client program.
[+] LeftSemiJoinHash (2)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
2 | Method requiredChildDistribution ( ) has been moved up type hierarchy to requiredChildDistribution ( ) | Method requiredChildDistribution ( ) will be called instead of requiredChildDistribution ( ) in a client program. |
[+] affected methods (2)
execute ( ): Method 'execute ( )' will be called instead of this method in a client program.
requiredChildDistribution ( ): Method 'requiredChildDistribution ( )' will be called instead of this method in a client program.
[+] ShuffledHashJoin (2)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
2 | Method requiredChildDistribution ( ) has been moved up type hierarchy to requiredChildDistribution ( ) | Method requiredChildDistribution ( ) will be called instead of requiredChildDistribution ( ) in a client program. |
[+] affected methods (2)
execute ( ): Method 'execute ( )' will be called instead of this method in a client program.
requiredChildDistribution ( ): Method 'requiredChildDistribution ( )' will be called instead of this method in a client program.
package org.apache.spark.sql.parquet
[+] InsertIntoParquetTable (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( ): Method 'execute ( )' will be called instead of this method in a client program.
[+] ParquetRelation2 (4)
| Change | Effect |
---|
1 | Method partitionColumns ( ) has been moved up type hierarchy to partitionColumns ( ) | Method partitionColumns ( ) will be called instead of partitionColumns ( ) in a client program. |
2 | Method partitionSpec ( ) has been moved up type hierarchy to partitionSpec ( ) | Method partitionSpec ( ) will be called instead of partitionSpec ( ) in a client program. |
3 | Method paths ( ) has been moved up type hierarchy to paths ( ) | Method paths ( ) will be called instead of paths ( ) in a client program. |
4 | Method schema ( ) has been moved up type hierarchy to schema ( ) | Method schema ( ) will be called instead of schema ( ) in a client program. |
[+] affected methods (4)
partitionColumns ( ): Method 'partitionColumns ( )' will be called instead of this method in a client program.
partitionSpec ( ): Method 'partitionSpec ( )' will be called instead of this method in a client program.
paths ( ): Method 'paths ( )' will be called instead of this method in a client program.
schema ( ): Method 'schema ( )' will be called instead of this method in a client program.
[+] ParquetTableScan (1)
| Change | Effect |
---|
1 | Method execute ( ) has been moved up type hierarchy to execute ( ) | Method execute ( ) will be called instead of execute ( ) in a client program. |
[+] affected methods (1)
execute ( ): Method 'execute ( )' will be called instead of this method in a client program.
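The "moved up type hierarchy" entries above all describe the same mechanism: the override was removed from the subclass, so a client compiled against 1.3.0 still links, but JVM virtual-method resolution walks up to the inherited implementation. A minimal, hypothetical sketch (simplified stand-in classes, not Spark's real ones):

```java
// Hypothetical illustration of "method moved up type hierarchy".
// In "version 1" Child declared its own run(); in "version 2" only the
// Parent implementation remains. The client's call site is unchanged.
class Parent {
    String run() {
        return "Parent.run";
    }
}

class Child extends Parent {
    // run() override removed in "version 2": calls now resolve to Parent.
}

class MovedUpDemo {
    public static void main(String[] args) {
        // JVM method resolution still finds run() by walking the class
        // hierarchy, so the program links and runs, but Parent's body
        // executes, which is the behavior change the report warns about.
        System.out.println(new Child().run());
    }
}
```

This is why the checker flags these as high severity: nothing fails to link, but a different method body silently runs in the client program.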
Problems with Methods, Low Severity (1)
spark-sql_2.10-1.3.0.jar, SparkPlan
package org.apache.spark.sql.execution
[+] SparkPlan.execute ( ) [abstract] : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row> (1)
[mangled: org/apache/spark/sql/execution/SparkPlan.execute:()Lorg/apache/spark/rdd/RDD;]
| Change | Effect |
---|
1 | Method became non-abstract. | A client program may change behavior. |
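For context, the change behind this entry is a template-method refactor: in 1.4.0, SparkPlan.execute() gained a concrete body that delegates to the newly added abstract doExecute() (visible in the "Added Methods" list). A hypothetical sketch of that pattern, with simplified types rather than Spark's real signatures (the real execute() returns an RDD<Row>):

```java
// Hypothetical sketch of the template-method refactor reported above.
abstract class SparkPlanSketch {
    // Abstract in "1.3.0"; concrete in "1.4.0", delegating to doExecute().
    public final String execute() {
        return doExecute();
    }

    // New abstract hook that operators like Aggregate now implement.
    protected abstract String doExecute();
}

class AggregateSketch extends SparkPlanSketch {
    @Override
    protected String doExecute() {
        return "rows from Aggregate.doExecute()";
    }
}

class TemplateDemo {
    public static void main(String[] args) {
        SparkPlanSketch plan = new AggregateSketch();
        // Old client code calling execute() still links, but the new
        // concrete base-class body runs and dispatches to doExecute().
        System.out.println(plan.execute());
    }
}
```

Client code that merely calls execute() keeps working; subclasses written against 1.3.0 that overrode execute() directly are the ones whose behavior may change.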
Other Changes in Data Types (4)
spark-sql_2.10-1.3.0.jar
package org.apache.spark.sql.execution
[+] RunnableCommand (1)
| Change | Effect |
---|
1 | Abstract method children ( ) has been added to this interface. | No effect. |
[+] affected methods (4)
cmd ( ): Return value of this method has type 'RunnableCommand'.
copy ( RunnableCommand ): 1st parameter 'cmd' of this method has type 'RunnableCommand'.
ExecutedCommand ( RunnableCommand ): 1st parameter 'cmd' of this method has type 'RunnableCommand'.
run ( org.apache.spark.sql.SQLContext ): This abstract method is from 'RunnableCommand' interface.
[+] SparkPlan (1)
| Change | Effect |
---|
1 | Abstract method doExecute ( ) has been added to this class. | No effect. |
[+] affected methods (128)
child ( ): Return value of this method has type 'SparkPlan'.
Aggregate ( boolean, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>, SparkPlan ): 4th parameter 'child' of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( boolean, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>, SparkPlan ): 4th parameter 'child' of this method has type 'SparkPlan'.
BatchPythonEvaluation ( PythonUDF, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan ): 3rd parameter 'child' of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( PythonUDF, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan ): 3rd parameter 'child' of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( SparkPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean ): 1st parameter 'child' of this method has type 'SparkPlan'.
DescribeCommand ( SparkPlan, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, boolean ): 1st parameter 'child' of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( boolean, SparkPlan ): 2nd parameter 'child' of this method has type 'SparkPlan'.
Distinct ( boolean, SparkPlan ): 2nd parameter 'child' of this method has type 'SparkPlan'.
copy ( SparkPlan, SparkPlan ): 2nd parameter 'right' of this method has type 'SparkPlan'.
Except ( SparkPlan, SparkPlan ): 2nd parameter 'right' of this method has type 'SparkPlan'.
left ( ): Return value of this method has type 'SparkPlan'.
right ( ): Return value of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.GroupExpression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan ): 3rd parameter 'child' of this method has type 'SparkPlan'.
Expand ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.GroupExpression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan ): 3rd parameter 'child' of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>, boolean, SparkPlan ): 3rd parameter 'child' of this method has type 'SparkPlan'.
ExternalSort ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>, boolean, SparkPlan ): 3rd parameter 'child' of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( org.apache.spark.sql.catalyst.expressions.Expression, SparkPlan ): 2nd parameter 'child' of this method has type 'SparkPlan'.
Filter ( org.apache.spark.sql.catalyst.expressions.Expression, SparkPlan ): 2nd parameter 'child' of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( SparkPlan, SparkPlan ): 2nd parameter 'right' of this method has type 'SparkPlan'.
Intersect ( SparkPlan, SparkPlan ): 2nd parameter 'right' of this method has type 'SparkPlan'.
left ( ): Return value of this method has type 'SparkPlan'.
right ( ): Return value of this method has type 'SparkPlan'.
BroadcastHashJoin ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, joins.package.BuildSide, SparkPlan, SparkPlan ): 5th parameter 'right' of this method has type 'SparkPlan'.
buildPlan ( ): Return value of this method has type 'SparkPlan'.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, joins.package.BuildSide, SparkPlan, SparkPlan ): 5th parameter 'right' of this method has type 'SparkPlan'.
left ( ): Return value of this method has type 'SparkPlan'.
right ( ): Return value of this method has type 'SparkPlan'.
streamedPlan ( ): Return value of this method has type 'SparkPlan'.
BroadcastLeftSemiJoinHash ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, SparkPlan, SparkPlan ): 4th parameter 'right' of this method has type 'SparkPlan'.
buildPlan ( ): Return value of this method has type 'SparkPlan'.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, SparkPlan, SparkPlan ): 4th parameter 'right' of this method has type 'SparkPlan'.
left ( ): Return value of this method has type 'SparkPlan'.
right ( ): Return value of this method has type 'SparkPlan'.
streamedPlan ( ): Return value of this method has type 'SparkPlan'.
BroadcastNestedLoopJoin ( SparkPlan, SparkPlan, joins.package.BuildSide, org.apache.spark.sql.catalyst.plans.JoinType, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> ): 2nd parameter 'right' of this method has type 'SparkPlan'.
copy ( SparkPlan, SparkPlan, joins.package.BuildSide, org.apache.spark.sql.catalyst.plans.JoinType, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> ): 2nd parameter 'right' of this method has type 'SparkPlan'.
left ( ): Return value of this method has type 'SparkPlan'.
right ( ): Return value of this method has type 'SparkPlan'.
CartesianProduct ( SparkPlan, SparkPlan ): 2nd parameter 'right' of this method has type 'SparkPlan'.
copy ( SparkPlan, SparkPlan ): 2nd parameter 'right' of this method has type 'SparkPlan'.
left ( ): Return value of this method has type 'SparkPlan'.
right ( ): Return value of this method has type 'SparkPlan'.
buildPlan ( ): Return value of this abstract method has type 'SparkPlan'.
left ( ): Return value of this abstract method has type 'SparkPlan'.
right ( ): Return value of this abstract method has type 'SparkPlan'.
streamedPlan ( ): Return value of this abstract method has type 'SparkPlan'.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, org.apache.spark.sql.catalyst.plans.JoinType, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression>, SparkPlan, SparkPlan ): 6th parameter 'right' of this method has type 'SparkPlan'.
HashOuterJoin ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, org.apache.spark.sql.catalyst.plans.JoinType, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression>, SparkPlan, SparkPlan ): 6th parameter 'right' of this method has type 'SparkPlan'.
left ( ): Return value of this method has type 'SparkPlan'.
right ( ): Return value of this method has type 'SparkPlan'.
broadcast ( ): Return value of this method has type 'SparkPlan'.
copy ( SparkPlan, SparkPlan, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> ): 1st parameter 'streamed' of this method has type 'SparkPlan'.
left ( ): Return value of this method has type 'SparkPlan'.
LeftSemiJoinBNL ( SparkPlan, SparkPlan, scala.Option<org.apache.spark.sql.catalyst.expressions.Expression> ): 1st parameter 'streamed' of this method has type 'SparkPlan'.
right ( ): Return value of this method has type 'SparkPlan'.
streamed ( ): Return value of this method has type 'SparkPlan'.
buildPlan ( ): Return value of this method has type 'SparkPlan'.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, SparkPlan, SparkPlan ): 4th parameter 'right' of this method has type 'SparkPlan'.
left ( ): Return value of this method has type 'SparkPlan'.
LeftSemiJoinHash ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, SparkPlan, SparkPlan ): 4th parameter 'right' of this method has type 'SparkPlan'.
right ( ): Return value of this method has type 'SparkPlan'.
streamedPlan ( ): Return value of this method has type 'SparkPlan'.
buildPlan ( ): Return value of this method has type 'SparkPlan'.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, joins.package.BuildSide, SparkPlan, SparkPlan ): 5th parameter 'right' of this method has type 'SparkPlan'.
left ( ): Return value of this method has type 'SparkPlan'.
right ( ): Return value of this method has type 'SparkPlan'.
ShuffledHashJoin ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, joins.package.BuildSide, SparkPlan, SparkPlan ): 5th parameter 'right' of this method has type 'SparkPlan'.
streamedPlan ( ): Return value of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( int, SparkPlan ): 2nd parameter 'child' of this method has type 'SparkPlan'.
Limit ( int, SparkPlan ): 2nd parameter 'child' of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan ): 2nd parameter 'child' of this method has type 'SparkPlan'.
OutputFaker ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, SparkPlan ): 2nd parameter 'child' of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>, SparkPlan ): 2nd parameter 'child' of this method has type 'SparkPlan'.
Project ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression>, SparkPlan ): 2nd parameter 'child' of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>, boolean, SparkPlan ): 3rd parameter 'child' of this method has type 'SparkPlan'.
Sort ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>, boolean, SparkPlan ): 3rd parameter 'child' of this method has type 'SparkPlan'.
codegenEnabled ( ): This method is from 'SparkPlan' abstract class.
execute ( ): This abstract method is from 'SparkPlan' abstract class.
executeCollect ( ): This method is from 'SparkPlan' abstract class.
executeTake ( int ): This method is from 'SparkPlan' abstract class.
isTraceEnabled ( ): This method is from 'SparkPlan' abstract class.
log ( ): This method is from 'SparkPlan' abstract class.
logDebug ( scala.Function0<java.lang.String> ): This method is from 'SparkPlan' abstract class.
logDebug ( scala.Function0<java.lang.String>, java.lang.Throwable ): This method is from 'SparkPlan' abstract class.
logError ( scala.Function0<java.lang.String> ): This method is from 'SparkPlan' abstract class.
logError ( scala.Function0<java.lang.String>, java.lang.Throwable ): This method is from 'SparkPlan' abstract class.
logInfo ( scala.Function0<java.lang.String> ): This method is from 'SparkPlan' abstract class.
logInfo ( scala.Function0<java.lang.String>, java.lang.Throwable ): This method is from 'SparkPlan' abstract class.
logName ( ): This method is from 'SparkPlan' abstract class.
logTrace ( scala.Function0<java.lang.String> ): This method is from 'SparkPlan' abstract class.
logTrace ( scala.Function0<java.lang.String>, java.lang.Throwable ): This method is from 'SparkPlan' abstract class.
logWarning ( scala.Function0<java.lang.String> ): This method is from 'SparkPlan' abstract class.
logWarning ( scala.Function0<java.lang.String>, java.lang.Throwable ): This method is from 'SparkPlan' abstract class.
makeCopy ( java.lang.Object[ ] ): This method is from 'SparkPlan' abstract class.
makeCopy ( java.lang.Object[ ] ): This method is from 'SparkPlan' abstract class.
newMutableProjection ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> ): This method is from 'SparkPlan' abstract class.
newOrdering ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> ): This method is from 'SparkPlan' abstract class.
newPredicate ( org.apache.spark.sql.catalyst.expressions.Expression, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> ): This method is from 'SparkPlan' abstract class.
newProjection ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression>, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> ): This method is from 'SparkPlan' abstract class.
org.apache.spark.Logging..log_ ( ): This method is from 'SparkPlan' abstract class.
org.apache.spark.Logging..log__.eq ( org.slf4j.Logger ): This method is from 'SparkPlan' abstract class.
outputPartitioning ( ): This method is from 'SparkPlan' abstract class.
requiredChildDistribution ( ): This method is from 'SparkPlan' abstract class.
sparkContext ( ): This method is from 'SparkPlan' abstract class.
SparkPlan ( ): This constructor is from 'SparkPlan' abstract class.
sqlContext ( ): This method is from 'SparkPlan' abstract class.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( int, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>, SparkPlan ): 3rd parameter 'child' of this method has type 'SparkPlan'.
TakeOrdered ( int, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.SortOrder>, SparkPlan ): 3rd parameter 'child' of this method has type 'SparkPlan'.
child ( ): Return value of this method has type 'SparkPlan'.
copy ( org.apache.spark.sql.parquet.ParquetRelation, SparkPlan, boolean ): 2nd parameter 'child' of this method has type 'SparkPlan'.
InsertIntoParquetTable ( org.apache.spark.sql.parquet.ParquetRelation, SparkPlan, boolean ): 2nd parameter 'child' of this method has type 'SparkPlan'.
package org.apache.spark.sql.execution.joins
[+] HashedRelation (2)
| Change | Effect |
---|
1 | Abstract method readBytes ( java.io.ObjectInput ) has been added to this interface. | No effect. |
2 | Abstract method writeBytes ( java.io.ObjectOutput, byte[ ] ) has been added to this interface. | No effect. |
[+] affected methods (6)
hashJoin ( scala.collection.Iterator<org.apache.spark.sql.Row>, HashedRelation ): 2nd parameter 'hashedRelation' of this method has type 'HashedRelation'.
hashJoin ( scala.collection.Iterator<org.apache.spark.sql.Row>, HashedRelation ): 2nd parameter 'hashedRelation' of this method has type 'HashedRelation'.
get ( org.apache.spark.sql.Row ): This abstract method is from 'HashedRelation' interface.
hashJoin ( scala.collection.Iterator<org.apache.spark.sql.Row>, HashedRelation ): 2nd parameter 'p2' of this abstract method has type 'HashedRelation'.
hashJoin ( scala.collection.Iterator<org.apache.spark.sql.Row>, HashedRelation ): 2nd parameter 'hashedRelation' of this method has type 'HashedRelation'.
hashJoin ( scala.collection.Iterator<org.apache.spark.sql.Row>, HashedRelation ): 2nd parameter 'hashedRelation' of this method has type 'HashedRelation'.
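The "No effect" verdict here reflects how interface evolution plays out at the binary level: existing call sites only invoke methods that already existed, so they keep linking, while third-party classes that implement the interface would need to add the new methods before they compile against 1.4.0. A hypothetical sketch (a stand-in interface, not Spark's real HashedRelation):

```java
// Hypothetical stand-in for an interface that gained methods (as
// HashedRelation gained readBytes/writeBytes). Old call sites only
// reference get(), so they are unaffected; only implementers must change.
interface RelationSketch {
    String get(String key);

    // Added in the "new version"; implementers must now provide it.
    byte[] readBytes();
}

class MapRelationSketch implements RelationSketch {
    public String get(String key) {
        return "value-for-" + key;
    }

    public byte[] readBytes() {
        return new byte[0];
    }
}

class InterfaceDemo {
    public static void main(String[] args) {
        RelationSketch r = new MapRelationSketch();
        // A pre-existing call site: links and runs exactly as before.
        System.out.println(r.get("k"));
    }
}
```

Since HashedRelation is an internal Spark SQL type that clients rarely implement themselves, the checker classifies these additions as "Other Changes" rather than incompatibilities.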
Java ARchives (1)
spark-sql_2.10-1.3.0.jar
Generated on Wed Oct 28 11:09:12 2015 for succinct-0.1.2 by Java API Compliance Checker 1.4.1
A tool for checking backward compatibility of a Java library API