Binary compatibility report for the spark-testing-base_2.10-1.3.1_0.3.1 library between versions 1.3.0 and 1.1.0 (relating to the portability of the client application spark-testing-base_2.10-1.3.1_0.3.1.jar)
Test Info

| Library Name | spark-testing-base_2.10-1.3.1_0.3.1 |
| Version #1 | 1.3.0 |
| Version #2 | 1.1.0 |
| Java Version | 1.7.0_85 |
Test Results

| Total Java ARchives | 8 |
| Total Methods / Classes | 1361 / 3513 |
| Verdict | Incompatible (38.3%) |
Problem Summary

| | Severity | Count |
|---|---|---|
| Added Methods | - | 109 |
| Removed Methods | High | 339 |
| Problems with Data Types | High | 15 |
| | Medium | 1 |
| | Low | 1 |
| Problems with Methods | High | 0 |
| | Medium | 0 |
| | Low | 1 |
Added Methods (109)
spark-core_2.10-1.1.0.jar, Clock.class
package org.apache.spark.util
Clock.getTime ( ) [abstract] : long
[mangled: org/apache/spark/util/Clock.getTime:()J]
spark-core_2.10-1.1.0.jar, SparkConf.class
package org.apache.spark
SparkConf.settings ( ) : scala.collection.mutable.HashMap<String,String>
[mangled: org/apache/spark/SparkConf.settings:()Lscala/collection/mutable/HashMap;]
spark-core_2.10-1.1.0.jar, SparkContext.class
package org.apache.spark
SparkContext.ui ( ) : ui.SparkUI
[mangled: org/apache/spark/SparkContext.ui:()Lorg/apache/spark/ui/SparkUI;]
spark-core_2.10-1.1.0.jar, SparkListenerApplicationStart.class
package org.apache.spark.scheduler
SparkListenerApplicationStart.copy ( String appName, long time, String sparkUser ) : SparkListenerApplicationStart
[mangled: org/apache/spark/scheduler/SparkListenerApplicationStart.copy:(Ljava/lang/String;JLjava/lang/String;)Lorg/apache/spark/scheduler/SparkListenerApplicationStart;]
SparkListenerApplicationStart.SparkListenerApplicationStart ( String appName, long time, String sparkUser )
[mangled: org/apache/spark/scheduler/SparkListenerApplicationStart."<init>":(Ljava/lang/String;JLjava/lang/String;)V]
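Note: the Removed Methods section below shows the 1.3.0 counterpart of this constructor, which inserts an appId ( scala.Option<String> ) parameter. A minimal sketch of the arity change, using only the classes listed in this report (startV110 is an illustrative name):

```scala
import org.apache.spark.scheduler.SparkListenerApplicationStart

// Compiles against spark-core 1.1.0: the 3-argument constructor above.
val startV110 = SparkListenerApplicationStart("app", System.currentTimeMillis, "user")

// Compiles only against spark-core 1.3.0, where appId was inserted:
// val startV130 = SparkListenerApplicationStart("app", Some("app-0001"), System.currentTimeMillis, "user")
// A client jar built with the 4-argument call fails with NoSuchMethodError on 1.1.0.
```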
spark-core_2.10-1.1.0.jar, SparkListenerBlockManagerAdded.class
package org.apache.spark.scheduler
SparkListenerBlockManagerAdded.copy ( org.apache.spark.storage.BlockManagerId blockManagerId, long maxMem ) : SparkListenerBlockManagerAdded
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded.copy:(Lorg/apache/spark/storage/BlockManagerId;J)Lorg/apache/spark/scheduler/SparkListenerBlockManagerAdded;]
SparkListenerBlockManagerAdded.SparkListenerBlockManagerAdded ( org.apache.spark.storage.BlockManagerId blockManagerId, long maxMem )
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded."<init>":(Lorg/apache/spark/storage/BlockManagerId;J)V]
spark-core_2.10-1.1.0.jar, SparkListenerBlockManagerRemoved.class
package org.apache.spark.scheduler
SparkListenerBlockManagerRemoved.andThen ( scala.Function1<SparkListenerBlockManagerRemoved,A> p1 ) [static] : scala.Function1<org.apache.spark.storage.BlockManagerId,A>
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.andThen:(Lscala/Function1;)Lscala/Function1;]
SparkListenerBlockManagerRemoved.compose ( scala.Function1<A,org.apache.spark.storage.BlockManagerId> p1 ) [static] : scala.Function1<A,SparkListenerBlockManagerRemoved>
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.compose:(Lscala/Function1;)Lscala/Function1;]
SparkListenerBlockManagerRemoved.copy ( org.apache.spark.storage.BlockManagerId blockManagerId ) : SparkListenerBlockManagerRemoved
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.copy:(Lorg/apache/spark/storage/BlockManagerId;)Lorg/apache/spark/scheduler/SparkListenerBlockManagerRemoved;]
SparkListenerBlockManagerRemoved.SparkListenerBlockManagerRemoved ( org.apache.spark.storage.BlockManagerId blockManagerId )
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved."<init>":(Lorg/apache/spark/storage/BlockManagerId;)V]
spark-core_2.10-1.1.0.jar, SparkListenerJobEnd.class
package org.apache.spark.scheduler
SparkListenerJobEnd.copy ( int jobId, JobResult jobResult ) : SparkListenerJobEnd
[mangled: org/apache/spark/scheduler/SparkListenerJobEnd.copy:(ILorg/apache/spark/scheduler/JobResult;)Lorg/apache/spark/scheduler/SparkListenerJobEnd;]
SparkListenerJobEnd.SparkListenerJobEnd ( int jobId, JobResult jobResult )
[mangled: org/apache/spark/scheduler/SparkListenerJobEnd."<init>":(ILorg/apache/spark/scheduler/JobResult;)V]
spark-core_2.10-1.1.0.jar, SparkListenerJobStart.class
package org.apache.spark.scheduler
SparkListenerJobStart.copy ( int jobId, scala.collection.Seq<Object> stageIds, java.util.Properties properties ) : SparkListenerJobStart
[mangled: org/apache/spark/scheduler/SparkListenerJobStart.copy:(ILscala/collection/Seq;Ljava/util/Properties;)Lorg/apache/spark/scheduler/SparkListenerJobStart;]
SparkListenerJobStart.SparkListenerJobStart ( int jobId, scala.collection.Seq<Object> stageIds, java.util.Properties properties )
[mangled: org/apache/spark/scheduler/SparkListenerJobStart."<init>":(ILscala/collection/Seq;Ljava/util/Properties;)V]
spark-core_2.10-1.1.0.jar, TaskMetrics.class
package org.apache.spark.executor
TaskMetrics.diskBytesSpilled_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.diskBytesSpilled_.eq:(J)V]
TaskMetrics.executorDeserializeTime_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.executorDeserializeTime_.eq:(J)V]
TaskMetrics.executorRunTime_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.executorRunTime_.eq:(J)V]
TaskMetrics.hostname_.eq ( String p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.hostname_.eq:(Ljava/lang/String;)V]
TaskMetrics.inputMetrics_.eq ( scala.Option<InputMetrics> p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.inputMetrics_.eq:(Lscala/Option;)V]
TaskMetrics.jvmGCTime_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.jvmGCTime_.eq:(J)V]
TaskMetrics.memoryBytesSpilled_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.memoryBytesSpilled_.eq:(J)V]
TaskMetrics.resultSerializationTime_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.resultSerializationTime_.eq:(J)V]
TaskMetrics.resultSize_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.resultSize_.eq:(J)V]
spark-hive_2.10-1.1.0.jar, HiveContext.class
package org.apache.spark.sql.hive
HiveContext.createTable ( String tableName, boolean allowExisting, scala.reflect.api.TypeTags.TypeTag<A> p3 ) : void
[mangled: org/apache/spark/sql/hive/HiveContext.createTable:(Ljava/lang/String;ZLscala/reflect/api/TypeTags$TypeTag;)V]
HiveContext.dialect ( ) : String
[mangled: org/apache/spark/sql/hive/HiveContext.dialect:()Ljava/lang/String;]
HiveContext.sql ( String sqlText ) : org.apache.spark.sql.SchemaRDD
[mangled: org/apache/spark/sql/hive/HiveContext.sql:(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;]
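In 1.3.0 the same method returns a DataFrame, and the return type is part of the JVM method descriptor, so even an unchanged call site links differently per version. A minimal sketch, assuming local-mode setup boilerplate and an illustrative query:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("demo"))
val hive = new HiveContext(sc)

// 1.1.0 links sql(String): SchemaRDD; 1.3.0 links sql(String): DataFrame.
// Identical source, different descriptor, hence the removed/added pair.
val result = hive.sql("SELECT key FROM src")
```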
spark-sql_2.10-1.1.0.jar, SQLContext.class
package org.apache.spark.sql
SQLContext.applySchema ( org.apache.spark.rdd.RDD<catalyst.expressions.Row> rowRDD, catalyst.types.StructType schema ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.applySchema:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/catalyst/types/StructType;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.applySchemaToPythonRDD ( org.apache.spark.rdd.RDD<Object[ ]> rdd, catalyst.types.StructType schema ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.applySchemaToPythonRDD:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/catalyst/types/StructType;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.applySchemaToPythonRDD ( org.apache.spark.rdd.RDD<Object[ ]> rdd, String schemaString ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.applySchemaToPythonRDD:(Lorg/apache/spark/rdd/RDD;Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.approxCountDistinct ( catalyst.expressions.Expression e, double rsd ) : catalyst.expressions.ApproxCountDistinct
[mangled: org/apache/spark/sql/SQLContext.approxCountDistinct:(Lorg/apache/spark/sql/catalyst/expressions/Expression;D)Lorg/apache/spark/sql/catalyst/expressions/ApproxCountDistinct;]
SQLContext.autoBroadcastJoinThreshold ( ) : int
[mangled: org/apache/spark/sql/SQLContext.autoBroadcastJoinThreshold:()I]
SQLContext.avg ( catalyst.expressions.Expression e ) : catalyst.expressions.Average
[mangled: org/apache/spark/sql/SQLContext.avg:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/catalyst/expressions/Average;]
SQLContext.binaryToLiteral ( byte[ ] a ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.binaryToLiteral:([B)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.booleanToLiteral ( boolean b ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.booleanToLiteral:(Z)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.byteToLiteral ( byte b ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.byteToLiteral:(B)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.clear ( ) : void
[mangled: org/apache/spark/sql/SQLContext.clear:()V]
SQLContext.codegenEnabled ( ) : boolean
[mangled: org/apache/spark/sql/SQLContext.codegenEnabled:()Z]
SQLContext.columnBatchSize ( ) : int
[mangled: org/apache/spark/sql/SQLContext.columnBatchSize:()I]
SQLContext.count ( catalyst.expressions.Expression e ) : catalyst.expressions.Count
[mangled: org/apache/spark/sql/SQLContext.count:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/catalyst/expressions/Count;]
SQLContext.countDistinct ( scala.collection.Seq<catalyst.expressions.Expression> e ) : catalyst.expressions.CountDistinct
[mangled: org/apache/spark/sql/SQLContext.countDistinct:(Lscala/collection/Seq;)Lorg/apache/spark/sql/catalyst/expressions/CountDistinct;]
SQLContext.createParquetFile ( String path, boolean allowExisting, org.apache.hadoop.conf.Configuration conf, scala.reflect.api.TypeTags.TypeTag<A> p4 ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.createParquetFile:(Ljava/lang/String;ZLorg/apache/hadoop/conf/Configuration;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.createSchemaRDD ( org.apache.spark.rdd.RDD<A> rdd, scala.reflect.api.TypeTags.TypeTag<A> p2 ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.createSchemaRDD:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.decimalToLiteral ( scala.math.BigDecimal d ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.decimalToLiteral:(Lscala/math/BigDecimal;)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.defaultSizeInBytes ( ) : long
[mangled: org/apache/spark/sql/SQLContext.defaultSizeInBytes:()J]
SQLContext.dialect ( ) : String
[mangled: org/apache/spark/sql/SQLContext.dialect:()Ljava/lang/String;]
SQLContext.doubleToLiteral ( double d ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.doubleToLiteral:(D)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.DslAttribute ( catalyst.expressions.AttributeReference a ) : catalyst.dsl.package.ExpressionConversions.DslAttribute
[mangled: org/apache/spark/sql/SQLContext.DslAttribute:(Lorg/apache/spark/sql/catalyst/expressions/AttributeReference;)Lorg/apache/spark/sql/catalyst/dsl/package$ExpressionConversions$DslAttribute;]
SQLContext.DslExpression ( catalyst.expressions.Expression e ) : catalyst.dsl.package.ExpressionConversions.DslExpression
[mangled: org/apache/spark/sql/SQLContext.DslExpression:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/catalyst/dsl/package$ExpressionConversions$DslExpression;]
SQLContext.DslString ( String s ) : catalyst.dsl.package.ExpressionConversions.DslString
[mangled: org/apache/spark/sql/SQLContext.DslString:(Ljava/lang/String;)Lorg/apache/spark/sql/catalyst/dsl/package$ExpressionConversions$DslString;]
SQLContext.DslSymbol ( scala.Symbol sym ) : catalyst.dsl.package.ExpressionConversions.DslSymbol
[mangled: org/apache/spark/sql/SQLContext.DslSymbol:(Lscala/Symbol;)Lorg/apache/spark/sql/catalyst/dsl/package$ExpressionConversions$DslSymbol;]
SQLContext.first ( catalyst.expressions.Expression e ) : catalyst.expressions.First
[mangled: org/apache/spark/sql/SQLContext.first:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/catalyst/expressions/First;]
SQLContext.floatToLiteral ( float f ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.floatToLiteral:(F)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.intToLiteral ( int i ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.intToLiteral:(I)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.isParquetBinaryAsString ( ) : boolean
[mangled: org/apache/spark/sql/SQLContext.isParquetBinaryAsString:()Z]
SQLContext.jsonFile ( String path ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.jsonFile:(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.jsonFile ( String path, double samplingRatio ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.jsonFile:(Ljava/lang/String;D)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.jsonFile ( String path, catalyst.types.StructType schema ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.jsonFile:(Ljava/lang/String;Lorg/apache/spark/sql/catalyst/types/StructType;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.jsonRDD ( org.apache.spark.rdd.RDD<String> json ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/rdd/RDD;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.jsonRDD ( org.apache.spark.rdd.RDD<String> json, double samplingRatio ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/rdd/RDD;D)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.jsonRDD ( org.apache.spark.rdd.RDD<String> json, catalyst.types.StructType schema ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/catalyst/types/StructType;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.logicalPlanToSparkQuery ( catalyst.plans.logical.LogicalPlan plan ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.logicalPlanToSparkQuery:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.longToLiteral ( long l ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.longToLiteral:(J)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.lower ( catalyst.expressions.Expression e ) : catalyst.expressions.Lower
[mangled: org/apache/spark/sql/SQLContext.lower:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/catalyst/expressions/Lower;]
SQLContext.max ( catalyst.expressions.Expression e ) : catalyst.expressions.Max
[mangled: org/apache/spark/sql/SQLContext.max:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/catalyst/expressions/Max;]
SQLContext.min ( catalyst.expressions.Expression e ) : catalyst.expressions.Min
[mangled: org/apache/spark/sql/SQLContext.min:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/catalyst/expressions/Min;]
SQLContext.numShufflePartitions ( ) : int
[mangled: org/apache/spark/sql/SQLContext.numShufflePartitions:()I]
SQLContext.optimizer ( ) : catalyst.optimizer.Optimizer.
[mangled: org/apache/spark/sql/SQLContext.optimizer:()Lorg/apache/spark/sql/catalyst/optimizer/Optimizer$;]
SQLContext.SQLConf._setter_.settings_.eq ( java.util.Map p1 ) : void
[mangled: org/apache/spark/sql/SQLContext.org.apache.spark.sql.SQLConf._setter_.settings_.eq:(Ljava/util/Map;)V]
SQLContext.parquetCompressionCodec ( ) : String
[mangled: org/apache/spark/sql/SQLContext.parquetCompressionCodec:()Ljava/lang/String;]
SQLContext.parquetFile ( String path ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.parquetFile:(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.parseDataType ( String dataTypeString ) : catalyst.types.DataType
[mangled: org/apache/spark/sql/SQLContext.parseDataType:(Ljava/lang/String;)Lorg/apache/spark/sql/catalyst/types/DataType;]
SQLContext.parser ( ) : catalyst.SqlParser
[mangled: org/apache/spark/sql/SQLContext.parser:()Lorg/apache/spark/sql/catalyst/SqlParser;]
SQLContext.registerFunction ( String name, scala.Function10<?,?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function10;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function11<?,?,?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function11;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function12<?,?,?,?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function12;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function13<?,?,?,?,?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function13;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function14<?,?,?,?,?,?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function14;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function15<?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function15;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function16<?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function16;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function17<?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function17;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function18<?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function18;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function19<?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function19;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function1<?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function1;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function20<?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function20;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function21<?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function21;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function22<?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function22;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function2<?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function2;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function3<?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function3;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function4<?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function4;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function5<?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function5;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function6<?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function6;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function7<?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function7;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function8<?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function8;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerFunction ( String name, scala.Function9<?,?,?,?,?,?,?,?,?,T> func, scala.reflect.api.TypeTags.TypeTag<T> p3 ) : void
[mangled: org/apache/spark/sql/SQLContext.registerFunction:(Ljava/lang/String;Lscala/Function9;Lscala/reflect/api/TypeTags$TypeTag;)V]
SQLContext.registerPython ( String name, byte[ ] command, java.util.Map<String,String> envVars, java.util.List<String> pythonIncludes, String pythonExec, org.apache.spark.Accumulator<java.util.List<byte[ ]>> accumulator, String stringDataType ) : void
[mangled: org/apache/spark/sql/SQLContext.registerPython:(Ljava/lang/String;[BLjava/util/Map;Ljava/util/List;Ljava/lang/String;Lorg/apache/spark/Accumulator;Ljava/lang/String;)V]
SQLContext.registerRDDAsTable ( SchemaRDD rdd, String tableName ) : void
[mangled: org/apache/spark/sql/SQLContext.registerRDDAsTable:(Lorg/apache/spark/sql/SchemaRDD;Ljava/lang/String;)V]
SQLContext.settings ( ) : java.util.Map<String,String>
[mangled: org/apache/spark/sql/SQLContext.settings:()Ljava/util/Map;]
SQLContext.shortToLiteral ( short s ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.shortToLiteral:(S)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.sql ( String sqlText ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.sql:(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.stringToLiteral ( String s ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.stringToLiteral:(Ljava/lang/String;)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.sum ( catalyst.expressions.Expression e ) : catalyst.expressions.Sum
[mangled: org/apache/spark/sql/SQLContext.sum:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/catalyst/expressions/Sum;]
SQLContext.sumDistinct ( catalyst.expressions.Expression e ) : catalyst.expressions.SumDistinct
[mangled: org/apache/spark/sql/SQLContext.sumDistinct:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/catalyst/expressions/SumDistinct;]
SQLContext.symbolToUnresolvedAttribute ( scala.Symbol s ) : catalyst.analysis.UnresolvedAttribute
[mangled: org/apache/spark/sql/SQLContext.symbolToUnresolvedAttribute:(Lscala/Symbol;)Lorg/apache/spark/sql/catalyst/analysis/UnresolvedAttribute;]
SQLContext.table ( String tableName ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.table:(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.timestampToLiteral ( java.sql.Timestamp t ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.timestampToLiteral:(Ljava/sql/Timestamp;)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.upper ( catalyst.expressions.Expression e ) : catalyst.expressions.Upper
[mangled: org/apache/spark/sql/SQLContext.upper:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/catalyst/expressions/Upper;]
SQLContext.useCompression ( ) : boolean
[mangled: org/apache/spark/sql/SQLContext.useCompression:()Z]
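The registerFunction overloads above (one per Scala function arity, Function1 through Function22) form the 1.1.0 UDF registration surface; in 1.3.0 registration moved to sqlContext.udf.register, which is why these entries appear as added here. A minimal usage sketch against 1.1.0, with the context setup assumed:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("udf"))
val sqlContext = new SQLContext(sc)

// Resolves to the Function2 overload listed above; the TypeTag for the
// result type Int is supplied implicitly by the compiler.
sqlContext.registerFunction("plus", (a: Int, b: Int) => a + b)
// The function is then callable from SQL text, e.g. SELECT plus(a, b) FROM t.
```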
spark-streaming_2.10-1.1.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.uiTab ( ) : ui.StreamingTab
[mangled: org/apache/spark/streaming/StreamingContext.uiTab:()Lorg/apache/spark/streaming/ui/StreamingTab;]
Removed Methods (339)
spark-catalyst_2.10-1.3.0.jar, Row.class
package org.apache.spark.sql
Row.anyNull ( ) [abstract] : boolean
[mangled: org/apache/spark/sql/Row.anyNull:()Z]
Row.apply ( int p1 ) [abstract] : Object
[mangled: org/apache/spark/sql/Row.apply:(I)Ljava/lang/Object;]
Row.copy ( ) [abstract] : Row
[mangled: org/apache/spark/sql/Row.copy:()Lorg/apache/spark/sql/Row;]
Row.equals ( Object p1 ) [abstract] : boolean
[mangled: org/apache/spark/sql/Row.equals:(Ljava/lang/Object;)Z]
Row.get ( int p1 ) [abstract] : Object
[mangled: org/apache/spark/sql/Row.get:(I)Ljava/lang/Object;]
Row.getAs ( int p1 ) [abstract] : T
[mangled: org/apache/spark/sql/Row.getAs:(I)Ljava/lang/Object;]
Row.getBoolean ( int p1 ) [abstract] : boolean
[mangled: org/apache/spark/sql/Row.getBoolean:(I)Z]
Row.getByte ( int p1 ) [abstract] : byte
[mangled: org/apache/spark/sql/Row.getByte:(I)B]
Row.getDate ( int p1 ) [abstract] : java.sql.Date
[mangled: org/apache/spark/sql/Row.getDate:(I)Ljava/sql/Date;]
Row.getDecimal ( int p1 ) [abstract] : java.math.BigDecimal
[mangled: org/apache/spark/sql/Row.getDecimal:(I)Ljava/math/BigDecimal;]
Row.getDouble ( int p1 ) [abstract] : double
[mangled: org/apache/spark/sql/Row.getDouble:(I)D]
Row.getFloat ( int p1 ) [abstract] : float
[mangled: org/apache/spark/sql/Row.getFloat:(I)F]
Row.getInt ( int p1 ) [abstract] : int
[mangled: org/apache/spark/sql/Row.getInt:(I)I]
Row.getJavaMap ( int p1 ) [abstract] : java.util.Map<K,V>
[mangled: org/apache/spark/sql/Row.getJavaMap:(I)Ljava/util/Map;]
Row.getList ( int p1 ) [abstract] : java.util.List<T>
[mangled: org/apache/spark/sql/Row.getList:(I)Ljava/util/List;]
Row.getLong ( int p1 ) [abstract] : long
[mangled: org/apache/spark/sql/Row.getLong:(I)J]
Row.getMap ( int p1 ) [abstract] : scala.collection.Map<K,V>
[mangled: org/apache/spark/sql/Row.getMap:(I)Lscala/collection/Map;]
Row.getSeq ( int p1 ) [abstract] : scala.collection.Seq<T>
[mangled: org/apache/spark/sql/Row.getSeq:(I)Lscala/collection/Seq;]
Row.getShort ( int p1 ) [abstract] : short
[mangled: org/apache/spark/sql/Row.getShort:(I)S]
Row.getString ( int p1 ) [abstract] : String
[mangled: org/apache/spark/sql/Row.getString:(I)Ljava/lang/String;]
Row.getStruct ( int p1 ) [abstract] : Row
[mangled: org/apache/spark/sql/Row.getStruct:(I)Lorg/apache/spark/sql/Row;]
Row.hashCode ( ) [abstract] : int
[mangled: org/apache/spark/sql/Row.hashCode:()I]
Row.isNullAt ( int p1 ) [abstract] : boolean
[mangled: org/apache/spark/sql/Row.isNullAt:(I)Z]
Row.length ( ) [abstract] : int
[mangled: org/apache/spark/sql/Row.length:()I]
Row.mkString ( ) [abstract] : String
[mangled: org/apache/spark/sql/Row.mkString:()Ljava/lang/String;]
Row.mkString ( String p1 ) [abstract] : String
[mangled: org/apache/spark/sql/Row.mkString:(Ljava/lang/String;)Ljava/lang/String;]
Row.mkString ( String p1, String p2, String p3 ) [abstract] : String
[mangled: org/apache/spark/sql/Row.mkString:(Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;)Ljava/lang/String;]
Row.schema ( ) [abstract] : types.StructType
[mangled: org/apache/spark/sql/Row.schema:()Lorg/apache/spark/sql/types/StructType;]
Row.size ( ) [abstract] : int
[mangled: org/apache/spark/sql/Row.size:()I]
Row.toSeq ( ) [abstract] : scala.collection.Seq<Object>
[mangled: org/apache/spark/sql/Row.toSeq:()Lscala/collection/Seq;]
Row.toString ( ) [abstract] : String
[mangled: org/apache/spark/sql/Row.toString:()Ljava/lang/String;]
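All of the Row members above are abstract methods of the org.apache.spark.sql.Row trait as shipped in spark-catalyst 1.3.0; in 1.1.0 rows live in catalyst.expressions instead, so client code compiled against these accessors cannot link on 1.1.0. A minimal sketch of 1.3.0-style typed access (describe is an illustrative name):

```scala
import org.apache.spark.sql.Row

// Uses only accessors listed above; links against spark-catalyst 1.3.0.
def describe(row: Row): String =
  if (row.isNullAt(0)) "null"
  else s"${row.getString(0)}: ${row.getInt(1)}"
```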
spark-core_2.10-1.3.0.jar, Clock.class
package org.apache.spark.util
Clock.getTimeMillis ( ) [abstract] : long
[mangled: org/apache/spark/util/Clock.getTimeMillis:()J]
Clock.waitTillTime ( long p1 ) [abstract] : long
[mangled: org/apache/spark/util/Clock.waitTillTime:(J)J]
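Clock thus changed shape entirely: 1.1.0 has getTime() (see Added Methods above), while 1.3.0 has getTimeMillis() and waitTillTime(long). Clock is private[spark] in source, so implementations, as in spark-testing-base, must sit under the org.apache.spark package; a minimal per-version sketch (ManualClockV110 is an illustrative name):

```scala
package org.apache.spark.util

// Against spark-core 1.1.0: one abstract member to implement.
class ManualClockV110(var now: Long) extends Clock {
  override def getTime(): Long = now
}

// Against spark-core 1.3.0 the trait instead requires:
// class ManualClockV130(var now: Long) extends Clock {
//   override def getTimeMillis(): Long = now
//   override def waitTillTime(targetTime: Long): Long = { now = targetTime; now }
// }
```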
spark-core_2.10-1.3.0.jar, InputMetrics.class
package org.apache.spark.executor
InputMetrics.recordsRead ( ) : long
[mangled: org/apache/spark/executor/InputMetrics.recordsRead:()J]
spark-core_2.10-1.3.0.jar, OutputMetrics.class
package org.apache.spark.executor
OutputMetrics.recordsWritten ( ) : long
[mangled: org/apache/spark/executor/OutputMetrics.recordsWritten:()J]
spark-core_2.10-1.3.0.jar, RDD<T>.class
package org.apache.spark.rdd
RDD<T>.doubleRDDToDoubleRDDFunctions ( RDD<Object> p1 ) [static] : DoubleRDDFunctions
[mangled: org/apache/spark/rdd/RDD<T>.doubleRDDToDoubleRDDFunctions:(Lorg/apache/spark/rdd/RDD;)Lorg/apache/spark/rdd/DoubleRDDFunctions;]
RDD<T>.isEmpty ( ) : boolean
[mangled: org/apache/spark/rdd/RDD<T>.isEmpty:()Z]
RDD<T>.numericRDDToDoubleRDDFunctions ( RDD<T> p1, scala.math.Numeric<T> p2 ) [static] : DoubleRDDFunctions
[mangled: org/apache/spark/rdd/RDD<T>.numericRDDToDoubleRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/math/Numeric;)Lorg/apache/spark/rdd/DoubleRDDFunctions;]
RDD<T>.parent ( int j, scala.reflect.ClassTag<U> p2 ) : RDD<U>
[mangled: org/apache/spark/rdd/RDD<T>.parent:(ILscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
RDD<T>.rddToAsyncRDDActions ( RDD<T> p1, scala.reflect.ClassTag<T> p2 ) [static] : AsyncRDDActions<T>
[mangled: org/apache/spark/rdd/RDD<T>.rddToAsyncRDDActions:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/AsyncRDDActions;]
RDD<T>.rddToOrderedRDDFunctions ( RDD<scala.Tuple2<K,V>> p1, scala.math.Ordering<K> p2, scala.reflect.ClassTag<K> p3, scala.reflect.ClassTag<V> p4 ) [static] : OrderedRDDFunctions<K,V,scala.Tuple2<K,V>>
[mangled: org/apache/spark/rdd/RDD<T>.rddToOrderedRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/math/Ordering;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/OrderedRDDFunctions;]
RDD<T>.rddToPairRDDFunctions ( RDD<scala.Tuple2<K,V>> p1, scala.reflect.ClassTag<K> p2, scala.reflect.ClassTag<V> p3, scala.math.Ordering<K> p4 ) [static] : PairRDDFunctions<K,V>
[mangled: org/apache/spark/rdd/RDD<T>.rddToPairRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/rdd/PairRDDFunctions;]
RDD<T>.rddToSequenceFileRDDFunctions ( RDD<scala.Tuple2<K,V>> p1, scala.reflect.ClassTag<K> p2, scala.reflect.ClassTag<V> p3, org.apache.spark.WritableFactory<K> p4, org.apache.spark.WritableFactory<V> p5 ) [static] : SequenceFileRDDFunctions<K,V>
[mangled: org/apache/spark/rdd/RDD<T>.rddToSequenceFileRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lorg/apache/spark/WritableFactory;Lorg/apache/spark/WritableFactory;)Lorg/apache/spark/rdd/SequenceFileRDDFunctions;]
RDD<T>.treeAggregate ( U zeroValue, scala.Function2<U,T,U> seqOp, scala.Function2<U,U,U> combOp, int depth, scala.reflect.ClassTag<U> p5 ) : U
[mangled: org/apache/spark/rdd/RDD<T>.treeAggregate:(Ljava/lang/Object;Lscala/Function2;Lscala/Function2;ILscala/reflect/ClassTag;)Ljava/lang/Object;]
RDD<T>.treeReduce ( scala.Function2<T,T,T> f, int depth ) : T
[mangled: org/apache/spark/rdd/RDD<T>.treeReduce:(Lscala/Function2;I)Ljava/lang/Object;]
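Several of these RDD members (isEmpty, treeAggregate, treeReduce) exist only in 1.3.0, so a client that must run on both versions needs a fallback built from methods both versions share. A minimal sketch (isEmptyPortable is an illustrative name):

```scala
import org.apache.spark.rdd.RDD

// rdd.isEmpty links only against 1.3.0; take(1) exists in both versions
// compared by this report, so this helper is binary-portable.
def isEmptyPortable[T](rdd: RDD[T]): Boolean = rdd.take(1).isEmpty
```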
spark-core_2.10-1.3.0.jar, SparkConf.class
package org.apache.spark
SparkConf.getAppId ( ) : String
[mangled: org/apache/spark/SparkConf.getAppId:()Ljava/lang/String;]
SparkConf.registerKryoClasses ( Class<?>[ ] classes ) : SparkConf
[mangled: org/apache/spark/SparkConf.registerKryoClasses:([Ljava/lang/Class;)Lorg/apache/spark/SparkConf;]
SparkConf.translateConfKey ( String p1, boolean p2 ) [static] : String
[mangled: org/apache/spark/SparkConf.translateConfKey:(Ljava/lang/String;Z)Ljava/lang/String;]
spark-core_2.10-1.3.0.jar, SparkContext.class
package org.apache.spark
SparkContext.addFile ( String path, boolean recursive ) : void
[mangled: org/apache/spark/SparkContext.addFile:(Ljava/lang/String;Z)V]
SparkContext.applicationId ( ) : String
[mangled: org/apache/spark/SparkContext.applicationId:()Ljava/lang/String;]
SparkContext.binaryFiles ( String path, int minPartitions ) : rdd.RDD<scala.Tuple2<String,input.PortableDataStream>>
[mangled: org/apache/spark/SparkContext.binaryFiles:(Ljava/lang/String;I)Lorg/apache/spark/rdd/RDD;]
SparkContext.binaryRecords ( String path, int recordLength, org.apache.hadoop.conf.Configuration conf ) : rdd.RDD<byte[ ]>
[mangled: org/apache/spark/SparkContext.binaryRecords:(Ljava/lang/String;ILorg/apache/hadoop/conf/Configuration;)Lorg/apache/spark/rdd/RDD;]
SparkContext.createSparkEnv ( SparkConf conf, boolean isLocal, scheduler.LiveListenerBus listenerBus ) : SparkEnv
[mangled: org/apache/spark/SparkContext.createSparkEnv:(Lorg/apache/spark/SparkConf;ZLorg/apache/spark/scheduler/LiveListenerBus;)Lorg/apache/spark/SparkEnv;]
SparkContext.eventLogCodec ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.eventLogCodec:()Lscala/Option;]
SparkContext.eventLogDir ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.eventLogDir:()Lscala/Option;]
SparkContext.executorAllocationManager ( ) : scala.Option<ExecutorAllocationManager>
[mangled: org/apache/spark/SparkContext.executorAllocationManager:()Lscala/Option;]
SparkContext.getExecutorThreadDump ( String executorId ) : scala.Option<util.ThreadStackTrace[ ]>
[mangled: org/apache/spark/SparkContext.getExecutorThreadDump:(Ljava/lang/String;)Lscala/Option;]
SparkContext.isEventLogEnabled ( ) : boolean
[mangled: org/apache/spark/SparkContext.isEventLogEnabled:()Z]
SparkContext.jobProgressListener ( ) : ui.jobs.JobProgressListener
[mangled: org/apache/spark/SparkContext.jobProgressListener:()Lorg/apache/spark/ui/jobs/JobProgressListener;]
SparkContext.killExecutor ( String executorId ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutor:(Ljava/lang/String;)Z]
SparkContext.killExecutors ( scala.collection.Seq<String> executorIds ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutors:(Lscala/collection/Seq;)Z]
SparkContext.metricsSystem ( ) : metrics.MetricsSystem
[mangled: org/apache/spark/SparkContext.metricsSystem:()Lorg/apache/spark/metrics/MetricsSystem;]
SparkContext.SparkContext..creationSite ( ) : util.CallSite
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext..creationSite:()Lorg/apache/spark/util/CallSite;]
SparkContext.progressBar ( ) : scala.Option<ui.ConsoleProgressBar>
[mangled: org/apache/spark/SparkContext.progressBar:()Lscala/Option;]
SparkContext.requestExecutors ( int numAdditionalExecutors ) : boolean
[mangled: org/apache/spark/SparkContext.requestExecutors:(I)Z]
SparkContext.requestTotalExecutors ( int numExecutors ) : boolean
[mangled: org/apache/spark/SparkContext.requestTotalExecutors:(I)Z]
SparkContext.schedulerBackend ( ) : scheduler.SchedulerBackend
[mangled: org/apache/spark/SparkContext.schedulerBackend:()Lorg/apache/spark/scheduler/SchedulerBackend;]
SparkContext.schedulerBackend_.eq ( scheduler.SchedulerBackend p1 ) : void
[mangled: org/apache/spark/SparkContext.schedulerBackend_.eq:(Lorg/apache/spark/scheduler/SchedulerBackend;)V]
SparkContext.setCallSite ( util.CallSite callSite ) : void
[mangled: org/apache/spark/SparkContext.setCallSite:(Lorg/apache/spark/util/CallSite;)V]
SparkContext.statusTracker ( ) : SparkStatusTracker
[mangled: org/apache/spark/SparkContext.statusTracker:()Lorg/apache/spark/SparkStatusTracker;]
SparkContext.ui ( ) : scala.Option<ui.SparkUI>
[mangled: org/apache/spark/SparkContext.ui:()Lscala/Option;]
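The last entry pairs with SparkContext.ui in the Added Methods section: 1.1.0 returns ui.SparkUI directly, 1.3.0 wraps it in scala.Option, and the changed descriptor breaks linkage in both directions. The accessor is private[spark] in source (the checker reports the public bytecode), so the sketch below assumes helper code compiled under org.apache.spark, as spark-testing-base does; UiProbe is an illustrative name:

```scala
package org.apache.spark

object UiProbe {
  // 1.1.0: def ui: org.apache.spark.ui.SparkUI          -> sc.ui != null
  // 1.3.0: def ui: Option[org.apache.spark.ui.SparkUI]  -> sc.ui.isDefined
  def hasUi(sc: SparkContext): Boolean = sc.ui.isDefined // links only against 1.3.0
}
```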
spark-core_2.10-1.3.0.jar, SparkListener.class
package org.apache.spark.scheduler
SparkListener.onExecutorAdded ( SparkListenerExecutorAdded p1 ) [abstract] : void
[mangled: org/apache/spark/scheduler/SparkListener.onExecutorAdded:(Lorg/apache/spark/scheduler/SparkListenerExecutorAdded;)V]
SparkListener.onExecutorRemoved ( SparkListenerExecutorRemoved p1 ) [abstract] : void
[mangled: org/apache/spark/scheduler/SparkListener.onExecutorRemoved:(Lorg/apache/spark/scheduler/SparkListenerExecutorRemoved;)V]
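These two callbacks, and the SparkListenerExecutorAdded/Removed event classes listed further down, do not exist in 1.1.0 at all, so a listener that must load on both versions should avoid referencing them. A minimal sketch using a callback both versions share (JobCountListener is an illustrative name):

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd}

class JobCountListener extends SparkListener {
  @volatile var jobsEnded = 0
  // onJobEnd exists in both 1.1.0 and 1.3.0, so this overrides cleanly.
  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = jobsEnded += 1
  // Overriding onExecutorAdded/onExecutorRemoved would compile only
  // against 1.3.0, where those callbacks and their event types exist.
}
```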
spark-core_2.10-1.3.0.jar, SparkListenerApplicationStart.class
package org.apache.spark.scheduler
SparkListenerApplicationStart.appId ( ) : scala.Option<String>
[mangled: org/apache/spark/scheduler/SparkListenerApplicationStart.appId:()Lscala/Option;]
SparkListenerApplicationStart.copy ( String appName, scala.Option<String> appId, long time, String sparkUser ) : SparkListenerApplicationStart
[mangled: org/apache/spark/scheduler/SparkListenerApplicationStart.copy:(Ljava/lang/String;Lscala/Option;JLjava/lang/String;)Lorg/apache/spark/scheduler/SparkListenerApplicationStart;]
SparkListenerApplicationStart.SparkListenerApplicationStart ( String appName, scala.Option<String> appId, long time, String sparkUser )
[mangled: org/apache/spark/scheduler/SparkListenerApplicationStart."<init>":(Ljava/lang/String;Lscala/Option;JLjava/lang/String;)V]
spark-core_2.10-1.3.0.jar, SparkListenerBlockManagerAdded.class
package org.apache.spark.scheduler
SparkListenerBlockManagerAdded.copy ( long time, org.apache.spark.storage.BlockManagerId blockManagerId, long maxMem ) : SparkListenerBlockManagerAdded
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded.copy:(JLorg/apache/spark/storage/BlockManagerId;J)Lorg/apache/spark/scheduler/SparkListenerBlockManagerAdded;]
SparkListenerBlockManagerAdded.SparkListenerBlockManagerAdded ( long time, org.apache.spark.storage.BlockManagerId blockManagerId, long maxMem )
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded."<init>":(JLorg/apache/spark/storage/BlockManagerId;J)V]
SparkListenerBlockManagerAdded.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded.time:()J]
spark-core_2.10-1.3.0.jar, SparkListenerBlockManagerRemoved.class
package org.apache.spark.scheduler
SparkListenerBlockManagerRemoved.copy ( long time, org.apache.spark.storage.BlockManagerId blockManagerId ) : SparkListenerBlockManagerRemoved
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.copy:(JLorg/apache/spark/storage/BlockManagerId;)Lorg/apache/spark/scheduler/SparkListenerBlockManagerRemoved;]
SparkListenerBlockManagerRemoved.curried ( ) [static] : scala.Function1<Object,scala.Function1<org.apache.spark.storage.BlockManagerId,SparkListenerBlockManagerRemoved>>
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.curried:()Lscala/Function1;]
SparkListenerBlockManagerRemoved.SparkListenerBlockManagerRemoved ( long time, org.apache.spark.storage.BlockManagerId blockManagerId )
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved."<init>":(JLorg/apache/spark/storage/BlockManagerId;)V]
SparkListenerBlockManagerRemoved.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.time:()J]
SparkListenerBlockManagerRemoved.tupled ( ) [static] : scala.Function1<scala.Tuple2<Object,org.apache.spark.storage.BlockManagerId>,SparkListenerBlockManagerRemoved>
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.tupled:()Lscala/Function1;]
spark-core_2.10-1.3.0.jar, SparkListenerExecutorAdded.class
package org.apache.spark.scheduler
SparkListenerExecutorAdded.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.canEqual:(Ljava/lang/Object;)Z]
SparkListenerExecutorAdded.copy ( long time, String executorId, cluster.ExecutorInfo executorInfo ) : SparkListenerExecutorAdded
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.copy:(JLjava/lang/String;Lorg/apache/spark/scheduler/cluster/ExecutorInfo;)Lorg/apache/spark/scheduler/SparkListenerExecutorAdded;]
SparkListenerExecutorAdded.curried ( ) [static] : scala.Function1<Object,scala.Function1<String,scala.Function1<cluster.ExecutorInfo,SparkListenerExecutorAdded>>>
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.curried:()Lscala/Function1;]
SparkListenerExecutorAdded.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.equals:(Ljava/lang/Object;)Z]
SparkListenerExecutorAdded.executorId ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.executorId:()Ljava/lang/String;]
SparkListenerExecutorAdded.executorInfo ( ) : cluster.ExecutorInfo
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.executorInfo:()Lorg/apache/spark/scheduler/cluster/ExecutorInfo;]
SparkListenerExecutorAdded.hashCode ( ) : int
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.hashCode:()I]
SparkListenerExecutorAdded.productArity ( ) : int
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.productArity:()I]
SparkListenerExecutorAdded.productElement ( int p1 ) : Object
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.productElement:(I)Ljava/lang/Object;]
SparkListenerExecutorAdded.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.productIterator:()Lscala/collection/Iterator;]
SparkListenerExecutorAdded.productPrefix ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.productPrefix:()Ljava/lang/String;]
SparkListenerExecutorAdded.SparkListenerExecutorAdded ( long time, String executorId, cluster.ExecutorInfo executorInfo )
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded."<init>":(JLjava/lang/String;Lorg/apache/spark/scheduler/cluster/ExecutorInfo;)V]
SparkListenerExecutorAdded.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.time:()J]
SparkListenerExecutorAdded.toString ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.toString:()Ljava/lang/String;]
SparkListenerExecutorAdded.tupled ( ) [static] : scala.Function1<scala.Tuple3<Object,String,cluster.ExecutorInfo>,SparkListenerExecutorAdded>
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.tupled:()Lscala/Function1;]
spark-core_2.10-1.3.0.jar, SparkListenerExecutorRemoved.class
package org.apache.spark.scheduler
SparkListenerExecutorRemoved.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.canEqual:(Ljava/lang/Object;)Z]
SparkListenerExecutorRemoved.copy ( long time, String executorId, String reason ) : SparkListenerExecutorRemoved
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.copy:(JLjava/lang/String;Ljava/lang/String;)Lorg/apache/spark/scheduler/SparkListenerExecutorRemoved;]
SparkListenerExecutorRemoved.curried ( ) [static] : scala.Function1<Object,scala.Function1<String,scala.Function1<String,SparkListenerExecutorRemoved>>>
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.curried:()Lscala/Function1;]
SparkListenerExecutorRemoved.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.equals:(Ljava/lang/Object;)Z]
SparkListenerExecutorRemoved.executorId ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.executorId:()Ljava/lang/String;]
SparkListenerExecutorRemoved.hashCode ( ) : int
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.hashCode:()I]
SparkListenerExecutorRemoved.productArity ( ) : int
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.productArity:()I]
SparkListenerExecutorRemoved.productElement ( int p1 ) : Object
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.productElement:(I)Ljava/lang/Object;]
SparkListenerExecutorRemoved.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.productIterator:()Lscala/collection/Iterator;]
SparkListenerExecutorRemoved.productPrefix ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.productPrefix:()Ljava/lang/String;]
SparkListenerExecutorRemoved.reason ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.reason:()Ljava/lang/String;]
SparkListenerExecutorRemoved.SparkListenerExecutorRemoved ( long time, String executorId, String reason )
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved."<init>":(JLjava/lang/String;Ljava/lang/String;)V]
SparkListenerExecutorRemoved.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.time:()J]
SparkListenerExecutorRemoved.toString ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.toString:()Ljava/lang/String;]
SparkListenerExecutorRemoved.tupled ( ) [static] : scala.Function1<scala.Tuple3<Object,String,String>,SparkListenerExecutorRemoved>
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.tupled:()Lscala/Function1;]
spark-core_2.10-1.3.0.jar, SparkListenerJobEnd.class
package org.apache.spark.scheduler
SparkListenerJobEnd.copy ( int jobId, long time, JobResult jobResult ) : SparkListenerJobEnd
[mangled: org/apache/spark/scheduler/SparkListenerJobEnd.copy:(IJLorg/apache/spark/scheduler/JobResult;)Lorg/apache/spark/scheduler/SparkListenerJobEnd;]
SparkListenerJobEnd.SparkListenerJobEnd ( int jobId, long time, JobResult jobResult )
[mangled: org/apache/spark/scheduler/SparkListenerJobEnd."<init>":(IJLorg/apache/spark/scheduler/JobResult;)V]
SparkListenerJobEnd.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerJobEnd.time:()J]
spark-core_2.10-1.3.0.jar, SparkListenerJobStart.class
package org.apache.spark.scheduler
SparkListenerJobStart.copy ( int jobId, long time, scala.collection.Seq<StageInfo> stageInfos, java.util.Properties properties ) : SparkListenerJobStart
[mangled: org/apache/spark/scheduler/SparkListenerJobStart.copy:(IJLscala/collection/Seq;Ljava/util/Properties;)Lorg/apache/spark/scheduler/SparkListenerJobStart;]
SparkListenerJobStart.SparkListenerJobStart ( int jobId, long time, scala.collection.Seq<StageInfo> stageInfos, java.util.Properties properties )
[mangled: org/apache/spark/scheduler/SparkListenerJobStart."<init>":(IJLscala/collection/Seq;Ljava/util/Properties;)V]
SparkListenerJobStart.stageInfos ( ) : scala.collection.Seq<StageInfo>
[mangled: org/apache/spark/scheduler/SparkListenerJobStart.stageInfos:()Lscala/collection/Seq;]
SparkListenerJobStart.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerJobStart.time:()J]
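SparkListenerJobStart gained time and replaced the raw stageIds constructor parameter with stageInfos. The stageIds accessor itself appears in neither the added nor the removed list, so it is present in both versions and is the portable way to read this event. A minimal sketch (stageCount is an illustrative name):

```scala
import org.apache.spark.scheduler.SparkListenerJobStart

// stageIds is absent from both change lists above, so it is present in
// both versions and safe for portable listener code.
def stageCount(jobStart: SparkListenerJobStart): Int = jobStart.stageIds.size
```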
spark-core_2.10-1.3.0.jar, TaskMetrics.class
package org.apache.spark.executor
TaskMetrics.decDiskBytesSpilled ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.decDiskBytesSpilled:(J)V]
TaskMetrics.decMemoryBytesSpilled ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.decMemoryBytesSpilled:(J)V]
TaskMetrics.getInputMetricsForReadMethod ( scala.Enumeration.Value readMethod ) : InputMetrics
[mangled: org/apache/spark/executor/TaskMetrics.getInputMetricsForReadMethod:(Lscala/Enumeration$Value;)Lorg/apache/spark/executor/InputMetrics;]
TaskMetrics.incDiskBytesSpilled ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.incDiskBytesSpilled:(J)V]
TaskMetrics.incMemoryBytesSpilled ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.incMemoryBytesSpilled:(J)V]
TaskMetrics.outputMetrics ( ) : scala.Option<OutputMetrics>
[mangled: org/apache/spark/executor/TaskMetrics.outputMetrics:()Lscala/Option;]
TaskMetrics.outputMetrics_.eq ( scala.Option<OutputMetrics> p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.outputMetrics_.eq:(Lscala/Option;)V]
TaskMetrics.setExecutorDeserializeTime ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setExecutorDeserializeTime:(J)V]
TaskMetrics.setExecutorRunTime ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setExecutorRunTime:(J)V]
TaskMetrics.setHostname ( String value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setHostname:(Ljava/lang/String;)V]
TaskMetrics.setInputMetrics ( scala.Option<InputMetrics> inputMetrics ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setInputMetrics:(Lscala/Option;)V]
TaskMetrics.setJvmGCTime ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setJvmGCTime:(J)V]
TaskMetrics.setResultSerializationTime ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setResultSerializationTime:(J)V]
TaskMetrics.setResultSize ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setResultSize:(J)V]
TaskMetrics.updateInputMetrics ( ) : void
[mangled: org/apache/spark/executor/TaskMetrics.updateInputMetrics:()V]
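Taken together with the TaskMetrics entries in Added Methods, this shows the mutation API flipping style: 1.1.0 exposes public vars whose compiled setters are the field_.eq entries above (Scala's field_= setter, demangled by the checker), while 1.3.0 hides the vars behind explicit set.../inc.../dec... methods. A minimal sketch of the two styles (recordRunTime is an illustrative name; obtaining a TaskMetrics instance is assumed):

```scala
import org.apache.spark.executor.TaskMetrics

def recordRunTime(metrics: TaskMetrics, millis: Long): Unit = {
  // Against 1.1.0: a public var, compiled to executorRunTime_$eq(J)V.
  metrics.executorRunTime = millis
  // Against 1.3.0 the var is no longer assignable from client code;
  // the explicit mutator replaces it:
  // metrics.setExecutorRunTime(millis)
}
```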
spark-hive_2.10-1.3.0.jar, HiveContext.class
package org.apache.spark.sql.hive
HiveContext.conf ( ) : org.apache.spark.sql.SQLConf
[mangled: org/apache/spark/sql/hive/HiveContext.conf:()Lorg/apache/spark/sql/SQLConf;]
HiveContext.convertCTAS ( ) : boolean
[mangled: org/apache/spark/sql/hive/HiveContext.convertCTAS:()Z]
HiveContext.ddlParserWithHiveQL ( ) : org.apache.spark.sql.sources.DDLParser
[mangled: org/apache/spark/sql/hive/HiveContext.ddlParserWithHiveQL:()Lorg/apache/spark/sql/sources/DDLParser;]
HiveContext.invalidateTable ( String tableName ) : void
[mangled: org/apache/spark/sql/hive/HiveContext.invalidateTable:(Ljava/lang/String;)V]
HiveContext.refreshTable ( String tableName ) : void
[mangled: org/apache/spark/sql/hive/HiveContext.refreshTable:(Ljava/lang/String;)V]
HiveContext.toHiveStructString ( scala.Tuple2<Object,org.apache.spark.sql.types.DataType> p1 ) [static] : String
[mangled: org/apache/spark/sql/hive/HiveContext.toHiveStructString:(Lscala/Tuple2;)Ljava/lang/String;]
spark-sql_2.10-1.3.0.jar, DataFrame.class
package org.apache.spark.sql
DataFrame.agg ( java.util.Map<String,String> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( Column expr, Column... exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lorg/apache/spark/sql/Column;[Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( Column expr, scala.collection.Seq<Column> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lorg/apache/spark/sql/Column;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( scala.collection.immutable.Map<String,String> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( scala.Tuple2<String,String> aggExpr, scala.collection.Seq<scala.Tuple2<String,String>> aggExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lscala/Tuple2;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.apply ( String colName ) : Column
[mangled: org/apache/spark/sql/DataFrame.apply:(Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
DataFrame.as ( scala.Symbol alias ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.as:(Lscala/Symbol;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.as ( String alias ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.as:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.cache ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.cache:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.cache ( ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.cache:()Lorg/apache/spark/sql/RDDApi;]
DataFrame.col ( String colName ) : Column
[mangled: org/apache/spark/sql/DataFrame.col:(Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
DataFrame.collect ( ) : Object
[mangled: org/apache/spark/sql/DataFrame.collect:()Ljava/lang/Object;]
DataFrame.collect ( ) : Row[ ]
[mangled: org/apache/spark/sql/DataFrame.collect:()[Lorg/apache/spark/sql/Row;]
DataFrame.collectAsList ( ) : java.util.List<Row>
[mangled: org/apache/spark/sql/DataFrame.collectAsList:()Ljava/util/List;]
DataFrame.columns ( ) : String[ ]
[mangled: org/apache/spark/sql/DataFrame.columns:()[Ljava/lang/String;]
DataFrame.count ( ) : long
[mangled: org/apache/spark/sql/DataFrame.count:()J]
DataFrame.createJDBCTable ( String url, String table, boolean allowExisting ) : void
[mangled: org/apache/spark/sql/DataFrame.createJDBCTable:(Ljava/lang/String;Ljava/lang/String;Z)V]
DataFrame.DataFrame ( SQLContext sqlContext, catalyst.plans.logical.LogicalPlan logicalPlan )
[mangled: org/apache/spark/sql/DataFrame."<init>":(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)V]
DataFrame.DataFrame ( SQLContext sqlContext, SQLContext.QueryExecution queryExecution )
[mangled: org/apache/spark/sql/DataFrame."<init>":(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/SQLContext$QueryExecution;)V]
DataFrame.distinct ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.distinct:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.dtypes ( ) : scala.Tuple2<String,String>[ ]
[mangled: org/apache/spark/sql/DataFrame.dtypes:()[Lscala/Tuple2;]
DataFrame.except ( DataFrame other ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.except:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.explain ( ) : void
[mangled: org/apache/spark/sql/DataFrame.explain:()V]
DataFrame.explain ( boolean extended ) : void
[mangled: org/apache/spark/sql/DataFrame.explain:(Z)V]
DataFrame.explode ( scala.collection.Seq<Column> input, scala.Function1<Row,scala.collection.TraversableOnce<A>> f, scala.reflect.api.TypeTags.TypeTag<A> p3 ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.explode:(Lscala/collection/Seq;Lscala/Function1;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.explode ( String inputColumn, String outputColumn, scala.Function1<A,scala.collection.TraversableOnce<B>> f, scala.reflect.api.TypeTags.TypeTag<B> p4 ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.explode:(Ljava/lang/String;Ljava/lang/String;Lscala/Function1;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.filter ( Column condition ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.filter:(Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.filter ( String conditionExpr ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.filter:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.first ( ) : Object
[mangled: org/apache/spark/sql/DataFrame.first:()Ljava/lang/Object;]
DataFrame.first ( ) : Row
[mangled: org/apache/spark/sql/DataFrame.first:()Lorg/apache/spark/sql/Row;]
DataFrame.flatMap ( scala.Function1<Row,scala.collection.TraversableOnce<R>> f, scala.reflect.ClassTag<R> p2 ) : org.apache.spark.rdd.RDD<R>
[mangled: org/apache/spark/sql/DataFrame.flatMap:(Lscala/Function1;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
DataFrame.foreach ( scala.Function1<Row,scala.runtime.BoxedUnit> f ) : void
[mangled: org/apache/spark/sql/DataFrame.foreach:(Lscala/Function1;)V]
DataFrame.foreachPartition ( scala.Function1<scala.collection.Iterator<Row>,scala.runtime.BoxedUnit> f ) : void
[mangled: org/apache/spark/sql/DataFrame.foreachPartition:(Lscala/Function1;)V]
DataFrame.groupBy ( Column... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.groupBy ( scala.collection.Seq<Column> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:(Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.groupBy ( String col1, scala.collection.Seq<String> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.groupBy ( String col1, String... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.head ( ) : Row
[mangled: org/apache/spark/sql/DataFrame.head:()Lorg/apache/spark/sql/Row;]
DataFrame.head ( int n ) : Row[ ]
[mangled: org/apache/spark/sql/DataFrame.head:(I)[Lorg/apache/spark/sql/Row;]
DataFrame.insertInto ( String tableName ) : void
[mangled: org/apache/spark/sql/DataFrame.insertInto:(Ljava/lang/String;)V]
DataFrame.insertInto ( String tableName, boolean overwrite ) : void
[mangled: org/apache/spark/sql/DataFrame.insertInto:(Ljava/lang/String;Z)V]
DataFrame.insertIntoJDBC ( String url, String table, boolean overwrite ) : void
[mangled: org/apache/spark/sql/DataFrame.insertIntoJDBC:(Ljava/lang/String;Ljava/lang/String;Z)V]
DataFrame.intersect ( DataFrame other ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.intersect:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.isLocal ( ) : boolean
[mangled: org/apache/spark/sql/DataFrame.isLocal:()Z]
DataFrame.javaRDD ( ) : org.apache.spark.api.java.JavaRDD<Row>
[mangled: org/apache/spark/sql/DataFrame.javaRDD:()Lorg/apache/spark/api/java/JavaRDD;]
DataFrame.javaToPython ( ) : org.apache.spark.api.java.JavaRDD<byte[ ]>
[mangled: org/apache/spark/sql/DataFrame.javaToPython:()Lorg/apache/spark/api/java/JavaRDD;]
DataFrame.join ( DataFrame right ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, Column joinExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, Column joinExprs, String joinType ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Lorg/apache/spark/sql/Column;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.limit ( int n ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.limit:(I)Lorg/apache/spark/sql/DataFrame;]
DataFrame.logicalPlan ( ) : catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/DataFrame.logicalPlan:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
DataFrame.map ( scala.Function1<Row,R> f, scala.reflect.ClassTag<R> p2 ) : org.apache.spark.rdd.RDD<R>
[mangled: org/apache/spark/sql/DataFrame.map:(Lscala/Function1;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
DataFrame.mapPartitions ( scala.Function1<scala.collection.Iterator<Row>,scala.collection.Iterator<R>> f, scala.reflect.ClassTag<R> p2 ) : org.apache.spark.rdd.RDD<R>
[mangled: org/apache/spark/sql/DataFrame.mapPartitions:(Lscala/Function1;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
DataFrame.numericColumns ( ) : scala.collection.Seq<catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/DataFrame.numericColumns:()Lscala/collection/Seq;]
DataFrame.orderBy ( Column... sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.orderBy ( scala.collection.Seq<Column> sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.orderBy ( String sortCol, scala.collection.Seq<String> sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.orderBy ( String sortCol, String... sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.persist ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.persist:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.persist ( ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.persist:()Lorg/apache/spark/sql/RDDApi;]
DataFrame.persist ( org.apache.spark.storage.StorageLevel newLevel ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.persist:(Lorg/apache/spark/storage/StorageLevel;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.persist ( org.apache.spark.storage.StorageLevel newLevel ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.persist:(Lorg/apache/spark/storage/StorageLevel;)Lorg/apache/spark/sql/RDDApi;]
DataFrame.printSchema ( ) : void
[mangled: org/apache/spark/sql/DataFrame.printSchema:()V]
DataFrame.queryExecution ( ) : SQLContext.QueryExecution
[mangled: org/apache/spark/sql/DataFrame.queryExecution:()Lorg/apache/spark/sql/SQLContext$QueryExecution;]
DataFrame.rdd ( ) : org.apache.spark.rdd.RDD<Row>
[mangled: org/apache/spark/sql/DataFrame.rdd:()Lorg/apache/spark/rdd/RDD;]
DataFrame.registerTempTable ( String tableName ) : void
[mangled: org/apache/spark/sql/DataFrame.registerTempTable:(Ljava/lang/String;)V]
DataFrame.repartition ( int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.repartition:(I)Lorg/apache/spark/sql/DataFrame;]
DataFrame.resolve ( String colName ) : catalyst.expressions.NamedExpression
[mangled: org/apache/spark/sql/DataFrame.resolve:(Ljava/lang/String;)Lorg/apache/spark/sql/catalyst/expressions/NamedExpression;]
DataFrame.sample ( boolean withReplacement, double fraction ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sample:(ZD)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sample ( boolean withReplacement, double fraction, long seed ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sample:(ZDJ)Lorg/apache/spark/sql/DataFrame;]
DataFrame.save ( String path ) : void
[mangled: org/apache/spark/sql/DataFrame.save:(Ljava/lang/String;)V]
DataFrame.save ( String path, SaveMode mode ) : void
[mangled: org/apache/spark/sql/DataFrame.save:(Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;)V]
DataFrame.save ( String path, String source ) : void
[mangled: org/apache/spark/sql/DataFrame.save:(Ljava/lang/String;Ljava/lang/String;)V]
DataFrame.save ( String path, String source, SaveMode mode ) : void
[mangled: org/apache/spark/sql/DataFrame.save:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;)V]
DataFrame.save ( String source, SaveMode mode, java.util.Map<String,String> options ) : void
[mangled: org/apache/spark/sql/DataFrame.save:(Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Ljava/util/Map;)V]
DataFrame.save ( String source, SaveMode mode, scala.collection.immutable.Map<String,String> options ) : void
[mangled: org/apache/spark/sql/DataFrame.save:(Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;)V]
DataFrame.saveAsParquetFile ( String path ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsParquetFile:(Ljava/lang/String;)V]
DataFrame.saveAsTable ( String tableName ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsTable:(Ljava/lang/String;)V]
DataFrame.saveAsTable ( String tableName, SaveMode mode ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsTable:(Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;)V]
DataFrame.saveAsTable ( String tableName, String source ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsTable:(Ljava/lang/String;Ljava/lang/String;)V]
DataFrame.saveAsTable ( String tableName, String source, SaveMode mode ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsTable:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;)V]
DataFrame.saveAsTable ( String tableName, String source, SaveMode mode, java.util.Map<String,String> options ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsTable:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Ljava/util/Map;)V]
DataFrame.saveAsTable ( String tableName, String source, SaveMode mode, scala.collection.immutable.Map<String,String> options ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsTable:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;)V]
DataFrame.schema ( ) : types.StructType
[mangled: org/apache/spark/sql/DataFrame.schema:()Lorg/apache/spark/sql/types/StructType;]
DataFrame.select ( Column... cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.select ( scala.collection.Seq<Column> cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.select ( String col, scala.collection.Seq<String> cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.select ( String col, String... cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.selectExpr ( scala.collection.Seq<String> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.selectExpr:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.selectExpr ( String... exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.selectExpr:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.show ( ) : void
[mangled: org/apache/spark/sql/DataFrame.show:()V]
DataFrame.show ( int numRows ) : void
[mangled: org/apache/spark/sql/DataFrame.show:(I)V]
DataFrame.showString ( int numRows ) : String
[mangled: org/apache/spark/sql/DataFrame.showString:(I)Ljava/lang/String;]
DataFrame.sort ( Column... sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sort ( scala.collection.Seq<Column> sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sort ( String sortCol, scala.collection.Seq<String> sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sort ( String sortCol, String... sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sqlContext ( ) : SQLContext
[mangled: org/apache/spark/sql/DataFrame.sqlContext:()Lorg/apache/spark/sql/SQLContext;]
DataFrame.take ( int n ) : Object
[mangled: org/apache/spark/sql/DataFrame.take:(I)Ljava/lang/Object;]
DataFrame.take ( int n ) : Row[ ]
[mangled: org/apache/spark/sql/DataFrame.take:(I)[Lorg/apache/spark/sql/Row;]
DataFrame.toDF ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.toDF:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.toDF ( scala.collection.Seq<String> colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.toDF:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.toDF ( String... colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.toDF:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.toJavaRDD ( ) : org.apache.spark.api.java.JavaRDD<Row>
[mangled: org/apache/spark/sql/DataFrame.toJavaRDD:()Lorg/apache/spark/api/java/JavaRDD;]
DataFrame.toJSON ( ) : org.apache.spark.rdd.RDD<String>
[mangled: org/apache/spark/sql/DataFrame.toJSON:()Lorg/apache/spark/rdd/RDD;]
DataFrame.toString ( ) : String
[mangled: org/apache/spark/sql/DataFrame.toString:()Ljava/lang/String;]
DataFrame.unionAll ( DataFrame other ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.unionAll:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.unpersist ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.unpersist:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.unpersist ( ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.unpersist:()Lorg/apache/spark/sql/RDDApi;]
DataFrame.unpersist ( boolean blocking ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.unpersist:(Z)Lorg/apache/spark/sql/DataFrame;]
DataFrame.unpersist ( boolean blocking ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.unpersist:(Z)Lorg/apache/spark/sql/RDDApi;]
DataFrame.where ( Column condition ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.where:(Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.withColumn ( String colName, Column col ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.withColumn:(Ljava/lang/String;Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.withColumnRenamed ( String existingName, String newName ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.withColumnRenamed:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
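Every DataFrame method above exists only on the 1.3.0 side of this comparison, since the class itself is absent from spark-sql 1.1.0. As a minimal sketch of the impact (the JSON path and column name are hypothetical), code like the following compiles against 1.3.0 but typically dies with NoClassDefFoundError or NoSuchMethodError when run against 1.1.0:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object DataFrameProbe {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("compat-probe").setMaster("local"))
    val sqlContext = new SQLContext(sc)
    // jsonFile returns DataFrame in 1.3.0 but SchemaRDD in 1.1.0, so the
    // link against this signature already fails on the older jar.
    val df = sqlContext.jsonFile("people.json")
    df.filter("age > 21").groupBy("age").count().show()
    sc.stop()
  }
}
```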
spark-sql_2.10-1.3.0.jar, SQLContext.class
package org.apache.spark.sql
SQLContext.applySchemaToPythonRDD ( org.apache.spark.rdd.RDD<Object[ ]> rdd, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.applySchemaToPythonRDD:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.applySchemaToPythonRDD ( org.apache.spark.rdd.RDD<Object[ ]> rdd, String schemaString ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.applySchemaToPythonRDD:(Lorg/apache/spark/rdd/RDD;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.baseRelationToDataFrame ( sources.BaseRelation baseRelation ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.baseRelationToDataFrame:(Lorg/apache/spark/sql/sources/BaseRelation;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.cacheManager ( ) : CacheManager
[mangled: org/apache/spark/sql/SQLContext.cacheManager:()Lorg/apache/spark/sql/CacheManager;]
SQLContext.checkAnalysis ( ) : catalyst.analysis.CheckAnalysis
[mangled: org/apache/spark/sql/SQLContext.checkAnalysis:()Lorg/apache/spark/sql/catalyst/analysis/CheckAnalysis;]
SQLContext.clearCache ( ) : void
[mangled: org/apache/spark/sql/SQLContext.clearCache:()V]
SQLContext.conf ( ) : SQLConf
[mangled: org/apache/spark/sql/SQLContext.conf:()Lorg/apache/spark/sql/SQLConf;]
SQLContext.createDataFrame ( org.apache.spark.api.java.JavaRDD<?> rdd, Class<?> beanClass ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/api/java/JavaRDD;Ljava/lang/Class;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.api.java.JavaRDD<Row> rowRDD, java.util.List<String> columns ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/api/java/JavaRDD;Ljava/util/List;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.api.java.JavaRDD<Row> rowRDD, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/api/java/JavaRDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.rdd.RDD<?> rdd, Class<?> beanClass ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/rdd/RDD;Ljava/lang/Class;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.rdd.RDD<A> rdd, scala.reflect.api.TypeTags.TypeTag<A> p2 ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.rdd.RDD<Row> rowRDD, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( scala.collection.Seq<A> data, scala.reflect.api.TypeTags.TypeTag<A> p2 ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lscala/collection/Seq;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String path ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String path, String source ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String source, java.util.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String source, types.StructType schema, java.util.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/types/StructType;Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String source, types.StructType schema, scala.collection.immutable.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/types/StructType;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String source, scala.collection.immutable.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.ddlParser ( ) : sources.DDLParser
[mangled: org/apache/spark/sql/SQLContext.ddlParser:()Lorg/apache/spark/sql/sources/DDLParser;]
SQLContext.dropTempTable ( String tableName ) : void
[mangled: org/apache/spark/sql/SQLContext.dropTempTable:(Ljava/lang/String;)V]
SQLContext.emptyDataFrame ( ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.emptyDataFrame:()Lorg/apache/spark/sql/DataFrame;]
SQLContext.experimental ( ) : ExperimentalMethods
[mangled: org/apache/spark/sql/SQLContext.experimental:()Lorg/apache/spark/sql/ExperimentalMethods;]
SQLContext.getSchema ( Class<?> beanClass ) : scala.collection.Seq<catalyst.expressions.AttributeReference>
[mangled: org/apache/spark/sql/SQLContext.getSchema:(Ljava/lang/Class;)Lscala/collection/Seq;]
SQLContext.implicits ( ) : SQLContext.implicits.
[mangled: org/apache/spark/sql/SQLContext.implicits:()Lorg/apache/spark/sql/SQLContext$implicits$;]
SQLContext.jdbc ( String url, String table ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jdbc:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jdbc ( String url, String table, String columnName, long lowerBound, long upperBound, int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jdbc:(Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;JJI)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jdbc ( String url, String table, String[ ] theParts ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jdbc:(Ljava/lang/String;Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonFile ( String path ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonFile:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonFile ( String path, double samplingRatio ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonFile:(Ljava/lang/String;D)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonFile ( String path, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonFile:(Ljava/lang/String;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonRDD ( org.apache.spark.api.java.JavaRDD<String> json ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/api/java/JavaRDD;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonRDD ( org.apache.spark.api.java.JavaRDD<String> json, double samplingRatio ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/api/java/JavaRDD;D)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonRDD ( org.apache.spark.api.java.JavaRDD<String> json, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/api/java/JavaRDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonRDD ( org.apache.spark.rdd.RDD<String> json ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/rdd/RDD;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonRDD ( org.apache.spark.rdd.RDD<String> json, double samplingRatio ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/rdd/RDD;D)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonRDD ( org.apache.spark.rdd.RDD<String> json, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.load ( String path ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.load:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.load ( String path, String source ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.load:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.load ( String source, java.util.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.load:(Ljava/lang/String;Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.load ( String source, types.StructType schema, java.util.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.load:(Ljava/lang/String;Lorg/apache/spark/sql/types/StructType;Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.load ( String source, types.StructType schema, scala.collection.immutable.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.load:(Ljava/lang/String;Lorg/apache/spark/sql/types/StructType;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.load ( String source, scala.collection.immutable.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.load:(Ljava/lang/String;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.optimizer ( ) : catalyst.optimizer.Optimizer
[mangled: org/apache/spark/sql/SQLContext.optimizer:()Lorg/apache/spark/sql/catalyst/optimizer/Optimizer;]
SQLContext.parquetFile ( scala.collection.Seq<String> paths ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.parquetFile:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.parquetFile ( String... paths ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.parquetFile:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.parseDataType ( String dataTypeString ) : types.DataType
[mangled: org/apache/spark/sql/SQLContext.parseDataType:(Ljava/lang/String;)Lorg/apache/spark/sql/types/DataType;]
SQLContext.registerDataFrameAsTable ( DataFrame df, String tableName ) : void
[mangled: org/apache/spark/sql/SQLContext.registerDataFrameAsTable:(Lorg/apache/spark/sql/DataFrame;Ljava/lang/String;)V]
SQLContext.sql ( String sqlText ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.sql:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.SQLContext ( org.apache.spark.api.java.JavaSparkContext sparkContext )
[mangled: org/apache/spark/sql/SQLContext."<init>":(Lorg/apache/spark/api/java/JavaSparkContext;)V]
SQLContext.sqlParser ( ) : SparkSQLParser
[mangled: org/apache/spark/sql/SQLContext.sqlParser:()Lorg/apache/spark/sql/SparkSQLParser;]
SQLContext.table ( String tableName ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.table:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.tableNames ( ) : String[ ]
[mangled: org/apache/spark/sql/SQLContext.tableNames:()[Ljava/lang/String;]
SQLContext.tableNames ( String databaseName ) : String[ ]
[mangled: org/apache/spark/sql/SQLContext.tableNames:(Ljava/lang/String;)[Ljava/lang/String;]
SQLContext.tables ( ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.tables:()Lorg/apache/spark/sql/DataFrame;]
SQLContext.tables ( String databaseName ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.tables:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.udf ( ) : UDFRegistration
[mangled: org/apache/spark/sql/SQLContext.udf:()Lorg/apache/spark/sql/UDFRegistration;]
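The same holds for the SQLContext entries above. A short sketch (assuming a `sqlContext` already exists) that links against several of the listed 1.3.0-only signatures:

```scala
import org.apache.spark.sql.SQLContext

// Each call resolves against a 1.3.0-only signature from the listing
// above, so on a 1.1.0 classpath it can fail with NoSuchMethodError (or
// NoClassDefFoundError once the missing DataFrame type is touched).
def probe(sqlContext: SQLContext): Unit = {
  val df = sqlContext.sql("SELECT 1 AS one") // returns DataFrame only in 1.3.0
  df.registerTempTable("probe")
  sqlContext.tables().show()
  sqlContext.dropTempTable("probe")
}
```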
spark-streaming_2.10-1.3.0.jar, DStream<T>.class
package org.apache.spark.streaming.dstream
DStream<T>.creationSite ( ) : org.apache.spark.util.CallSite
[mangled: org/apache/spark/streaming/dstream/DStream<T>.creationSite:()Lorg/apache/spark/util/CallSite;]
DStream<T>.print ( int num ) : void
[mangled: org/apache/spark/streaming/dstream/DStream<T>.print:(I)V]
DStream<T>.toPairDStreamFunctions ( DStream<scala.Tuple2<K,V>> p1, scala.reflect.ClassTag<K> p2, scala.reflect.ClassTag<V> p3, scala.math.Ordering<K> p4 ) [static] : PairDStreamFunctions<K,V>
[mangled: org/apache/spark/streaming/dstream/DStream<T>.toPairDStreamFunctions:(Lorg/apache/spark/streaming/dstream/DStream;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/streaming/dstream/PairDStreamFunctions;]
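For example, DStream.print(int) above is 1.3.0-only (1.1.0 ships only the no-argument print()), so a sketch like this (the stream itself is assumed) breaks at runtime on 1.1.0:

```scala
import org.apache.spark.streaming.dstream.DStream

// Compiled against 1.3.0; on 1.1.0 this call site throws
// NoSuchMethodError because print(Int) does not exist there.
def preview(lines: DStream[String]): Unit = lines.print(5)
```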
spark-streaming_2.10-1.3.0.jar, Duration.class
package org.apache.spark.streaming
Duration.div ( Duration that ) : double
[mangled: org/apache/spark/streaming/Duration.div:(Lorg/apache/spark/streaming/Duration;)D]
Duration.greater ( Duration that ) : boolean
[mangled: org/apache/spark/streaming/Duration.greater:(Lorg/apache/spark/streaming/Duration;)Z]
Duration.greaterEq ( Duration that ) : boolean
[mangled: org/apache/spark/streaming/Duration.greaterEq:(Lorg/apache/spark/streaming/Duration;)Z]
Duration.less ( Duration that ) : boolean
[mangled: org/apache/spark/streaming/Duration.less:(Lorg/apache/spark/streaming/Duration;)Z]
Duration.lessEq ( Duration that ) : boolean
[mangled: org/apache/spark/streaming/Duration.lessEq:(Lorg/apache/spark/streaming/Duration;)Z]
Duration.minus ( Duration that ) : Duration
[mangled: org/apache/spark/streaming/Duration.minus:(Lorg/apache/spark/streaming/Duration;)Lorg/apache/spark/streaming/Duration;]
Duration.Duration..millis ( ) : long
[mangled: org/apache/spark/streaming/Duration.org.apache.spark.streaming.Duration..millis:()J]
Duration.plus ( Duration that ) : Duration
[mangled: org/apache/spark/streaming/Duration.plus:(Lorg/apache/spark/streaming/Duration;)Lorg/apache/spark/streaming/Duration;]
Duration.times ( int times ) : Duration
[mangled: org/apache/spark/streaming/Duration.times:(I)Lorg/apache/spark/streaming/Duration;]
spark-streaming_2.10-1.3.0.jar, JobScheduler.class
package org.apache.spark.streaming.scheduler
JobScheduler.clock ( ) : org.apache.spark.util.Clock
[mangled: org/apache/spark/streaming/scheduler/JobScheduler.clock:()Lorg/apache/spark/util/Clock;]
spark-streaming_2.10-1.3.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.awaitTerminationOrTimeout ( long timeout ) : boolean
[mangled: org/apache/spark/streaming/StreamingContext.awaitTerminationOrTimeout:(J)Z]
StreamingContext.binaryRecordsStream ( String directory, int recordLength ) : dstream.DStream<byte[ ]>
[mangled: org/apache/spark/streaming/StreamingContext.binaryRecordsStream:(Ljava/lang/String;I)Lorg/apache/spark/streaming/dstream/DStream;]
StreamingContext.fileStream ( String directory, scala.Function1<org.apache.hadoop.fs.Path,Object> filter, boolean newFilesOnly, org.apache.hadoop.conf.Configuration conf, scala.reflect.ClassTag<K> p5, scala.reflect.ClassTag<V> p6, scala.reflect.ClassTag<F> p7 ) : dstream.InputDStream<scala.Tuple2<K,V>>
[mangled: org/apache/spark/streaming/StreamingContext.fileStream:(Ljava/lang/String;Lscala/Function1;ZLorg/apache/hadoop/conf/Configuration;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/streaming/dstream/InputDStream;]
StreamingContext.progressListener ( ) : ui.StreamingJobProgressListener
[mangled: org/apache/spark/streaming/StreamingContext.progressListener:()Lorg/apache/spark/streaming/ui/StreamingJobProgressListener;]
StreamingContext.uiTab ( ) : scala.Option<ui.StreamingTab>
[mangled: org/apache/spark/streaming/StreamingContext.uiTab:()Lscala/Option;]
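Likewise for StreamingContext.awaitTerminationOrTimeout, listed above; a minimal sketch (the context is assumed to be already constructed and started):

```scala
import org.apache.spark.streaming.StreamingContext

// 1.3.0-only signature; on 1.1.0 the nearest equivalent is
// awaitTermination(timeout), so this call fails with NoSuchMethodError.
def waitBriefly(ssc: StreamingContext): Boolean =
  ssc.awaitTerminationOrTimeout(10000L)
```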
spark-streaming_2.10-1.3.0.jar, Time.class
package org.apache.spark.streaming
Time.greater ( Time that ) : boolean
[mangled: org/apache/spark/streaming/Time.greater:(Lorg/apache/spark/streaming/Time;)Z]
Time.greaterEq ( Time that ) : boolean
[mangled: org/apache/spark/streaming/Time.greaterEq:(Lorg/apache/spark/streaming/Time;)Z]
Time.less ( Time that ) : boolean
[mangled: org/apache/spark/streaming/Time.less:(Lorg/apache/spark/streaming/Time;)Z]
Time.lessEq ( Time that ) : boolean
[mangled: org/apache/spark/streaming/Time.lessEq:(Lorg/apache/spark/streaming/Time;)Z]
Time.minus ( Duration that ) : Time
[mangled: org/apache/spark/streaming/Time.minus:(Lorg/apache/spark/streaming/Duration;)Lorg/apache/spark/streaming/Time;]
Time.minus ( Time that ) : Duration
[mangled: org/apache/spark/streaming/Time.minus:(Lorg/apache/spark/streaming/Time;)Lorg/apache/spark/streaming/Duration;]
Time.plus ( Duration that ) : Time
[mangled: org/apache/spark/streaming/Time.plus:(Lorg/apache/spark/streaming/Duration;)Lorg/apache/spark/streaming/Time;]
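The Duration and Time helpers above are flagged the same way; a sketch combining several of the listed methods:

```scala
import org.apache.spark.streaming.{Duration, Seconds, Time}

// Every method used here (times, minus, plus, less) comes from the
// 1.3.0-only listings above, so each call site is a potential
// NoSuchMethodError on a 1.1.0 classpath.
def window(end: Time, batch: Duration): (Time, Boolean) = {
  val start = end.minus(batch.times(3))
  (start.plus(Seconds(1)), start.less(end))
}
```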
Problems with Data Types, High Severity (15)
spark-catalyst_2.10-1.3.0.jar
package org.apache.spark.sql
[+] Row (1)
| # | Change | Effect |
|---|---|---|
| 1 | This interface has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (31)
anyNull ( ) - This abstract method is from 'Row' interface.
apply ( int ) - This abstract method is from 'Row' interface.
copy ( ) - This abstract method is from 'Row' interface.
equals ( java.lang.Object ) - This abstract method is from 'Row' interface.
get ( int ) - This abstract method is from 'Row' interface.
getAs ( int ) - This abstract method is from 'Row' interface.
getBoolean ( int ) - This abstract method is from 'Row' interface.
getByte ( int ) - This abstract method is from 'Row' interface.
getDate ( int ) - This abstract method is from 'Row' interface.
getDecimal ( int ) - This abstract method is from 'Row' interface.
getDouble ( int ) - This abstract method is from 'Row' interface.
getFloat ( int ) - This abstract method is from 'Row' interface.
getInt ( int ) - This abstract method is from 'Row' interface.
getJavaMap ( int ) - This abstract method is from 'Row' interface.
getList ( int ) - This abstract method is from 'Row' interface.
getLong ( int ) - This abstract method is from 'Row' interface.
getMap ( int ) - This abstract method is from 'Row' interface.
getSeq ( int ) - This abstract method is from 'Row' interface.
getShort ( int ) - This abstract method is from 'Row' interface.
getString ( int ) - This abstract method is from 'Row' interface.
getStruct ( int ) - This abstract method is from 'Row' interface.
hashCode ( ) - This abstract method is from 'Row' interface.
isNullAt ( int ) - This abstract method is from 'Row' interface.
length ( ) - This abstract method is from 'Row' interface.
mkString ( ) - This abstract method is from 'Row' interface.
mkString ( java.lang.String ) - This abstract method is from 'Row' interface.
mkString ( java.lang.String, java.lang.String, java.lang.String ) - This abstract method is from 'Row' interface.
schema ( ) - This abstract method is from 'Row' interface.
size ( ) - This abstract method is from 'Row' interface.
toSeq ( ) - This abstract method is from 'Row' interface.
toString ( ) - This abstract method is from 'Row' interface.
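Because Row itself is gone on the 1.1.0 side, any client class that so much as mentions the type fails to load; a minimal sketch:

```scala
import org.apache.spark.sql.Row

// Compiled against spark-catalyst 1.3.0. On 1.1.0 the Row interface is
// absent, so loading this class throws NoClassDefFoundError before
// either method below ever runs.
def describe(row: Row): String =
  if (row.isNullAt(0)) "null" else row.getString(0)
```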
package org.apache.spark.sql.catalyst.plans.logical
[+] LogicalPlan (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.Logging. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (4)
executePlan ( LogicalPlan ) - 1st parameter 'plan' of this method has type 'LogicalPlan'.
executePlan ( LogicalPlan ) - 1st parameter 'plan' of this method has type 'LogicalPlan'.
executePlan ( LogicalPlan ) - 1st parameter 'plan' of this method has type 'LogicalPlan'.
parseSql ( java.lang.String ) - Return value of this method has type 'LogicalPlan'.
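A hedged sketch of the LogicalPlan hazard: against 1.3.0 the upcast below type-checks because LogicalPlan mixes in org.apache.spark.Logging; once that super-interface is gone, bytecode that relies on the relationship can fail with the NoSuchMethodError the report predicts.

```scala
import org.apache.spark.Logging
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan

// Legal against 1.3.0 only; the inherited-interface assumption is
// what breaks when this code runs against the 1.1.0 jars.
def asLogging(plan: LogicalPlan): Logging = plan
```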
spark-core_2.10-1.3.0.jar
package org.apache.spark
[+] SparkContext (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface ExecutorAllocationClient. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (160)
JavaSparkContext ( SparkContext ) - 1st parameter 'sc' of this method has type 'SparkContext'.
context ( ) - Return value of this method has type 'SparkContext'.
rdd.RDD ( SparkContext, scala.collection.Seq<Dependency<?>>, scala.reflect.ClassTag<T> ) - 1st parameter '_sc' of this method has type 'SparkContext'.
sparkContext ( ) - Return value of this method has type 'SparkContext'.
accumulable ( R, java.lang.String, AccumulableParam<R,T> ) - This method is from 'SparkContext' class.
accumulable ( R, AccumulableParam<R,T> ) - This method is from 'SparkContext' class.
accumulableCollection ( R, scala.Function1<R,scala.collection.generic.Growable<T>>, scala.reflect.ClassTag<R> ) - This method is from 'SparkContext' class.
accumulator ( T, java.lang.String, AccumulatorParam<T> ) - This method is from 'SparkContext' class.
accumulator ( T, AccumulatorParam<T> ) - This method is from 'SparkContext' class.
addedFiles ( ) - This method is from 'SparkContext' class.
addedJars ( ) - This method is from 'SparkContext' class.
addFile ( java.lang.String ) - This method is from 'SparkContext' class.
addJar ( java.lang.String ) - This method is from 'SparkContext' class.
addSparkListener ( scheduler.SparkListener ) - This method is from 'SparkContext' class.
appName ( ) - This method is from 'SparkContext' class.
booleanWritableConverter ( ) - This method is from 'SparkContext' class.
boolToBoolWritable ( boolean ) - This method is from 'SparkContext' class.
broadcast ( T, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
bytesToBytesWritable ( byte[ ] ) - This method is from 'SparkContext' class.
bytesWritableConverter ( ) - This method is from 'SparkContext' class.
cancelAllJobs ( ) - This method is from 'SparkContext' class.
cancelJob ( int ) - This method is from 'SparkContext' class.
cancelJobGroup ( java.lang.String ) - This method is from 'SparkContext' class.
cancelStage ( int ) - This method is from 'SparkContext' class.
checkpointDir ( ) - This method is from 'SparkContext' class.
checkpointDir_.eq ( scala.Option<java.lang.String> ) - This method is from 'SparkContext' class.
checkpointFile ( java.lang.String, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
clean ( F, boolean ) - This method is from 'SparkContext' class.
cleaner ( ) - This method is from 'SparkContext' class.
cleanup ( long ) - This method is from 'SparkContext' class.
clearCallSite ( ) - This method is from 'SparkContext' class.
clearJobGroup ( ) - This method is from 'SparkContext' class.
conf ( ) - This method is from 'SparkContext' class.
dagScheduler ( ) - This method is from 'SparkContext' class.
dagScheduler_.eq ( scheduler.DAGScheduler ) - This method is from 'SparkContext' class.
defaultMinPartitions ( ) - This method is from 'SparkContext' class.
defaultParallelism ( ) - This method is from 'SparkContext' class.
doubleRDDToDoubleRDDFunctions ( rdd.RDD<java.lang.Object> ) - This method is from 'SparkContext' class.
doubleToDoubleWritable ( double ) - This method is from 'SparkContext' class.
doubleWritableConverter ( ) - This method is from 'SparkContext' class.
emptyRDD ( scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
env ( ) - This method is from 'SparkContext' class.
eventLogger ( ) - This method is from 'SparkContext' class.
executorEnvs ( ) - This method is from 'SparkContext' class.
executorMemory ( ) - This method is from 'SparkContext' class.
files ( ) - This method is from 'SparkContext' class.
floatToFloatWritable ( float ) - This method is from 'SparkContext' class.
floatWritableConverter ( ) - This method is from 'SparkContext' class.
getAllPools ( ) - This method is from 'SparkContext' class.
getCallSite ( ) - This method is from 'SparkContext' class.
getCheckpointDir ( ) - This method is from 'SparkContext' class.
getConf ( ) - This method is from 'SparkContext' class.
getExecutorMemoryStatus ( ) - This method is from 'SparkContext' class.
getExecutorStorageStatus ( ) - This method is from 'SparkContext' class.
getLocalProperties ( ) - This method is from 'SparkContext' class.
getLocalProperty ( java.lang.String ) - This method is from 'SparkContext' class.
getPersistentRDDs ( ) - This method is from 'SparkContext' class.
getPoolForName ( java.lang.String ) - This method is from 'SparkContext' class.
getPreferredLocs ( rdd.RDD<?>, int ) - This method is from 'SparkContext' class.
getRDDStorageInfo ( ) - This method is from 'SparkContext' class.
getSchedulingMode ( ) - This method is from 'SparkContext' class.
getSparkHome ( ) - This method is from 'SparkContext' class.
hadoopConfiguration ( ) - This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, int, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) - This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, java.lang.Class<? extends org.apache.hadoop.mapred.InputFormat<K,V>>, java.lang.Class<K>, java.lang.Class<V>, int ) - This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) - This method is from 'SparkContext' class.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<? extends org.apache.hadoop.mapred.InputFormat<K,V>>, java.lang.Class<K>, java.lang.Class<V>, int ) - This method is from 'SparkContext' class.
intToIntWritable ( int ) - This method is from 'SparkContext' class.
intWritableConverter ( ) - This method is from 'SparkContext' class.
isLocal ( ) - This method is from 'SparkContext' class.
isTraceEnabled ( ) - This method is from 'SparkContext' class.
jarOfClass ( java.lang.Class<?> ) - This method is from 'SparkContext' class.
jarOfObject ( java.lang.Object ) - This method is from 'SparkContext' class.
jars ( ) - This method is from 'SparkContext' class.
listenerBus ( ) - This method is from 'SparkContext' class.
log ( ) - This method is from 'SparkContext' class.
logDebug ( scala.Function0<java.lang.String> ) - This method is from 'SparkContext' class.
logDebug ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This method is from 'SparkContext' class.
logError ( scala.Function0<java.lang.String> ) - This method is from 'SparkContext' class.
logError ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This method is from 'SparkContext' class.
logInfo ( scala.Function0<java.lang.String> ) - This method is from 'SparkContext' class.
logInfo ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This method is from 'SparkContext' class.
logName ( ) - This method is from 'SparkContext' class.
logTrace ( scala.Function0<java.lang.String> ) - This method is from 'SparkContext' class.
logTrace ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This method is from 'SparkContext' class.
logWarning ( scala.Function0<java.lang.String> ) - This method is from 'SparkContext' class.
logWarning ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This method is from 'SparkContext' class.
longToLongWritable ( long ) - This method is from 'SparkContext' class.
longWritableConverter ( ) - This method is from 'SparkContext' class.
makeRDD ( scala.collection.Seq<scala.Tuple2<T,scala.collection.Seq<java.lang.String>>>, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
makeRDD ( scala.collection.Seq<T>, int, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
master ( ) - This method is from 'SparkContext' class.
metadataCleaner ( ) - This method is from 'SparkContext' class.
newAPIHadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, org.apache.hadoop.conf.Configuration ) - This method is from 'SparkContext' class.
newAPIHadoopFile ( java.lang.String, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) - This method is from 'SparkContext' class.
newAPIHadoopRDD ( org.apache.hadoop.conf.Configuration, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) - This method is from 'SparkContext' class.
newRddId ( ) - This method is from 'SparkContext' class.
newShuffleId ( ) - This method is from 'SparkContext' class.
numericRDDToDoubleRDDFunctions ( rdd.RDD<T>, scala.math.Numeric<T> ) - This method is from 'SparkContext' class.
objectFile ( java.lang.String, int, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
Logging..log_ ( ) - This method is from 'SparkContext' class.
Logging..log__.eq ( org.slf4j.Logger ) - This method is from 'SparkContext' class.
SparkContext..warnSparkMem ( java.lang.String ) - This method is from 'SparkContext' class.
parallelize ( scala.collection.Seq<T>, int, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
persistentRdds ( ) - This method is from 'SparkContext' class.
persistRDD ( rdd.RDD<?> ) - This method is from 'SparkContext' class.
preferredNodeLocationData ( ) - This method is from 'SparkContext' class.
preferredNodeLocationData_.eq ( scala.collection.Map<java.lang.String,scala.collection.Set<scheduler.SplitInfo>> ) - This method is from 'SparkContext' class.
rddToAsyncRDDActions ( rdd.RDD<T>, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
rddToOrderedRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.math.Ordering<K>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V> ) - This method is from 'SparkContext' class.
rddToPairRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.math.Ordering<K> ) - This method is from 'SparkContext' class.
rddToSequenceFileRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.Function1<K,org.apache.hadoop.io.Writable>, scala.reflect.ClassTag<K>, scala.Function1<V,org.apache.hadoop.io.Writable>, scala.reflect.ClassTag<V> ) - This method is from 'SparkContext' class.
runApproximateJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, partial.ApproximateEvaluator<U,R>, long ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, boolean, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, boolean, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, boolean, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, int, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.Function0<WritableConverter<K>>, scala.Function0<WritableConverter<V>> ) - This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V> ) - This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V>, int ) - This method is from 'SparkContext' class.
setCallSite ( java.lang.String ) - This method is from 'SparkContext' class.
setCheckpointDir ( java.lang.String ) - This method is from 'SparkContext' class.
setJobDescription ( java.lang.String ) - This method is from 'SparkContext' class.
setJobGroup ( java.lang.String, java.lang.String, boolean ) - This method is from 'SparkContext' class.
setLocalProperties ( java.util.Properties ) - This method is from 'SparkContext' class.
setLocalProperty ( java.lang.String, java.lang.String ) - This method is from 'SparkContext' class.
SparkContext ( ) - This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String ) - This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String ) - This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String, scala.collection.Seq<java.lang.String> ) - This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String, scala.collection.Seq<java.lang.String>, scala.collection.Map<java.lang.String,java.lang.String>, scala.collection.Map<java.lang.String,scala.collection.Set<scheduler.SplitInfo>> ) - This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, SparkConf ) - This constructor is from 'SparkContext' class.
SparkContext ( SparkConf ) - This constructor is from 'SparkContext' class.
SparkContext ( SparkConf, scala.collection.Map<java.lang.String,scala.collection.Set<scheduler.SplitInfo>> ) - This constructor is from 'SparkContext' class.
sparkUser ( ) - This method is from 'SparkContext' class.
startTime ( ) - This method is from 'SparkContext' class.
stop ( ) - This method is from 'SparkContext' class.
stringToText ( java.lang.String ) - This method is from 'SparkContext' class.
stringWritableConverter ( ) - This method is from 'SparkContext' class.
submitJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.Function0<R> ) - This method is from 'SparkContext' class.
tachyonFolderName ( ) - This method is from 'SparkContext' class.
taskScheduler ( ) - This method is from 'SparkContext' class.
taskScheduler_.eq ( scheduler.TaskScheduler ) - This method is from 'SparkContext' class.
textFile ( java.lang.String, int ) - This method is from 'SparkContext' class.
union ( rdd.RDD<T>, scala.collection.Seq<rdd.RDD<T>>, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
union ( scala.collection.Seq<rdd.RDD<T>>, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
unpersistRDD ( int, boolean ) - This method is from 'SparkContext' class.
version ( ) - This method is from 'SparkContext' class.
wholeTextFiles ( java.lang.String, int ) - This method is from 'SparkContext' class.
writableWritableConverter ( ) - This method is from 'SparkContext' class.
HiveContext ( SparkContext ) - 1st parameter 'sc' of this method has type 'SparkContext'.
sparkContext ( ) - Return value of this method has type 'SparkContext'.
SQLContext ( SparkContext ) - 1st parameter 'sparkContext' of this method has type 'SparkContext'.
sc ( ) - Return value of this method has type 'SparkContext'.
sparkContext ( ) - Return value of this method has type 'SparkContext'.
StreamingContext ( SparkContext, streaming.Checkpoint, streaming.Duration ) - 1st parameter 'sc_' of this method has type 'SparkContext'.
StreamingContext ( SparkContext, streaming.Duration ) - 1st parameter 'sparkContext' of this method has type 'SparkContext'.
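A sketch of how the missing ExecutorAllocationClient surfaces (requestExecutors is assumed here to be the interface method exposed on SparkContext in 1.3.0):

```scala
import org.apache.spark.SparkContext

// Compiled against 1.3.0, where SparkContext implements
// ExecutorAllocationClient; on 1.1.0 the dispatch can fail with the
// NoSuchMethodError the report warns about.
def grow(sc: SparkContext): Boolean = sc.requestExecutors(2)
```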
[+] TaskContext (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface java.io.Serializable. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (3)
compute ( Partition, TaskContext ) - 2nd parameter 'p2' of this abstract method has type 'TaskContext'.
computeOrReadCheckpoint ( Partition, TaskContext ) - 2nd parameter 'context' of this method has type 'TaskContext'.
iterator ( Partition, TaskContext ) - 2nd parameter 'context' of this method has type 'TaskContext'.
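A sketch of the TaskContext change: against 1.3.0 a TaskContext is a java.io.Serializable, so the ascription below compiles; on 1.1.0 that super-interface is gone and serialization attempts fail at runtime.

```scala
import java.io.{ByteArrayOutputStream, ObjectOutputStream}
import org.apache.spark.TaskContext

def snapshot(ctx: TaskContext): Array[Byte] = {
  val bytes = new ByteArrayOutputStream()
  val out = new ObjectOutputStream(bytes)
  // The ascription type-checks only while TaskContext extends
  // Serializable (1.3.0); on 1.1.0 this write fails at runtime.
  out.writeObject(ctx: java.io.Serializable)
  out.close()
  bytes.toByteArray
}
```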
package org.apache.spark.api.java
[+] JavaSparkContext (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface java.io.Closeable. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (1)
JavaSparkContext ( org.apache.spark.SparkContext ) - This constructor is from 'JavaSparkContext' class.
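Losing java.io.Closeable breaks resource-management patterns; a minimal sketch:

```scala
import java.io.Closeable
import org.apache.spark.api.java.JavaSparkContext

// Generic helper; passing a JavaSparkContext here compiles only while
// it implements Closeable (1.3.0). On 1.1.0 the upcast no longer holds.
def withResource[A](resource: Closeable)(body: => A): A =
  try body finally resource.close()

def use(jsc: JavaSparkContext): Long =
  withResource(jsc) {
    jsc.parallelize(java.util.Arrays.asList(1, 2, 3)).count()
  }
```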
package org.apache.spark.broadcast
[+] Broadcast<T> (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.Logging. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (1)
broadcast ( T, scala.reflect.ClassTag<T> ) - Return value of this method has type 'Broadcast<T>'.
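The Broadcast entry is the same shape as the LogicalPlan one above; sketched briefly:

```scala
import org.apache.spark.Logging
import org.apache.spark.broadcast.Broadcast

// Type-checks against 1.3.0, where Broadcast[T] mixes in Logging; the
// removed super-interface is what the report's NoSuchMethodError is about.
def asLogging(b: Broadcast[Int]): Logging = b
```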
package org.apache.spark.executor
[+] OutputMetrics (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (1)
recordsWritten ( ) - This method is from 'OutputMetrics' class.
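Since the whole class is absent on 1.1.0, even a trivial accessor like this sketch fails at class-load time (the Long return type of recordsWritten is assumed from the listing):

```scala
import org.apache.spark.executor.OutputMetrics

// NoClassDefFoundError on 1.1.0: the OutputMetrics type referenced in
// this signature does not exist there.
def written(metrics: OutputMetrics): Long = metrics.recordsWritten
```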
package org.apache.spark.rdd
[+] PairRDDFunctions<K,V> (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.mapreduce.SparkHadoopMapReduceUtil. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (1)
rddToPairRDDFunctions ( RDD<scala.Tuple2<K,V>>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.math.Ordering<K> ) - Return value of this method has type 'PairRDDFunctions<K,V>'.
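The PairRDDFunctions change bites indirectly: the implicit wrapper still links, but the new-Hadoop-API save path leans on the removed mix-in. A hedged sketch (the output path is hypothetical):

```scala
import org.apache.hadoop.io.{IntWritable, Text}
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat
import org.apache.spark.SparkContext._ // brings rddToPairRDDFunctions into scope
import org.apache.spark.rdd.RDD

// saveAsNewAPIHadoopFile is backed by the SparkHadoopMapReduceUtil
// super-interface in 1.3.0; with it removed, the call can end in
// NoSuchMethodError inside Spark itself.
def save(pairs: RDD[(Text, IntWritable)]): Unit =
  pairs.saveAsNewAPIHadoopFile[TextOutputFormat[Text, IntWritable]]("hdfs:///tmp/out")
```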
package org.apache.spark.scheduler
[+] SparkListener (2)
| # | Change | Effect |
|---|---|---|
| 1 | Abstract method onExecutorAdded ( SparkListenerExecutorAdded ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
| 2 | Abstract method onExecutorRemoved ( SparkListenerExecutorRemoved ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (15)
onApplicationEnd ( SparkListenerApplicationEnd ) - This abstract method is from 'SparkListener' interface.
onApplicationStart ( SparkListenerApplicationStart ) - This abstract method is from 'SparkListener' interface.
onBlockManagerAdded ( SparkListenerBlockManagerAdded ) - This abstract method is from 'SparkListener' interface.
onBlockManagerRemoved ( SparkListenerBlockManagerRemoved ) - This abstract method is from 'SparkListener' interface.
onEnvironmentUpdate ( SparkListenerEnvironmentUpdate ) - This abstract method is from 'SparkListener' interface.
onExecutorMetricsUpdate ( SparkListenerExecutorMetricsUpdate ) - This abstract method is from 'SparkListener' interface.
onJobEnd ( SparkListenerJobEnd ) - This abstract method is from 'SparkListener' interface.
onJobStart ( SparkListenerJobStart ) - This abstract method is from 'SparkListener' interface.
onStageCompleted ( SparkListenerStageCompleted ) - This abstract method is from 'SparkListener' interface.
onStageSubmitted ( SparkListenerStageSubmitted ) - This abstract method is from 'SparkListener' interface.
onTaskEnd ( SparkListenerTaskEnd ) - This abstract method is from 'SparkListener' interface.
onTaskGettingResult ( SparkListenerTaskGettingResult ) - This abstract method is from 'SparkListener' interface.
onTaskStart ( SparkListenerTaskStart ) - This abstract method is from 'SparkListener' interface.
onUnpersistRDD ( SparkListenerUnpersistRDD ) - This abstract method is from 'SparkListener' interface.
addSparkListener ( SparkListener ) - 1st parameter 'listener' of this method has type 'SparkListener'.
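A sketch of a listener that compiles against 1.3.0 but cannot even be loaded on 1.1.0, because both the overridden callbacks and their event types are gone there:

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerExecutorAdded, SparkListenerExecutorRemoved}

class ExecutorAuditListener extends SparkListener {
  // Neither callback nor event class exists in 1.1.0, so registering
  // this listener there fails (NoSuchMethodError / NoClassDefFoundError).
  override def onExecutorAdded(added: SparkListenerExecutorAdded): Unit =
    println(s"executor up: ${added.executorId}")
  override def onExecutorRemoved(removed: SparkListenerExecutorRemoved): Unit =
    println(s"executor down: ${removed.executorId} (${removed.reason})")
}
```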
[+] SparkListenerExecutorAdded (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (15)
canEqual ( java.lang.Object ) - This method is from 'SparkListenerExecutorAdded' class.
copy ( long, java.lang.String, cluster.ExecutorInfo ) - This method is from 'SparkListenerExecutorAdded' class.
curried ( ) - This method is from 'SparkListenerExecutorAdded' class.
equals ( java.lang.Object ) - This method is from 'SparkListenerExecutorAdded' class.
executorId ( ) - This method is from 'SparkListenerExecutorAdded' class.
executorInfo ( ) - This method is from 'SparkListenerExecutorAdded' class.
hashCode ( ) - This method is from 'SparkListenerExecutorAdded' class.
productArity ( ) - This method is from 'SparkListenerExecutorAdded' class.
productElement ( int ) - This method is from 'SparkListenerExecutorAdded' class.
productIterator ( ) - This method is from 'SparkListenerExecutorAdded' class.
productPrefix ( ) - This method is from 'SparkListenerExecutorAdded' class.
SparkListenerExecutorAdded ( long, java.lang.String, cluster.ExecutorInfo ) - This constructor is from 'SparkListenerExecutorAdded' class.
time ( ) - This method is from 'SparkListenerExecutorAdded' class.
toString ( ) - This method is from 'SparkListenerExecutorAdded' class.
tupled ( ) - This method is from 'SparkListenerExecutorAdded' class.
[+] SparkListenerExecutorRemoved (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (15)
canEqual ( java.lang.Object ) - This method is from 'SparkListenerExecutorRemoved' class.
copy ( long, java.lang.String, java.lang.String ) - This method is from 'SparkListenerExecutorRemoved' class.
curried ( ) - This method is from 'SparkListenerExecutorRemoved' class.
equals ( java.lang.Object ) - This method is from 'SparkListenerExecutorRemoved' class.
executorId ( ) - This method is from 'SparkListenerExecutorRemoved' class.
hashCode ( ) - This method is from 'SparkListenerExecutorRemoved' class.
productArity ( ) - This method is from 'SparkListenerExecutorRemoved' class.
productElement ( int ) - This method is from 'SparkListenerExecutorRemoved' class.
productIterator ( ) - This method is from 'SparkListenerExecutorRemoved' class.
productPrefix ( ) - This method is from 'SparkListenerExecutorRemoved' class.
reason ( ) - This method is from 'SparkListenerExecutorRemoved' class.
SparkListenerExecutorRemoved ( long, java.lang.String, java.lang.String ) - This constructor is from 'SparkListenerExecutorRemoved' class.
time ( ) - This method is from 'SparkListenerExecutorRemoved' class.
toString ( ) - This method is from 'SparkListenerExecutorRemoved' class.
tupled ( ) - This method is from 'SparkListenerExecutorRemoved' class.
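Even without registering a listener, merely pattern-matching on the two removed event classes bakes references to them into the client's bytecode; a sketch:

```scala
import org.apache.spark.scheduler.{SparkListenerEvent, SparkListenerExecutorAdded, SparkListenerExecutorRemoved}

// On 1.1.0 this method's class fails to load with NoClassDefFoundError
// because both case classes are referenced in the match.
def describe(event: SparkListenerEvent): String = event match {
  case e: SparkListenerExecutorAdded   => s"added ${e.executorId} at ${e.time}"
  case e: SparkListenerExecutorRemoved => s"removed ${e.executorId}: ${e.reason}"
  case _                               => "other event"
}
```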
spark-hive_2.10-1.3.0.jar
package org.apache.spark.sql.hive
[+] HiveContext.QueryExecution (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class became abstract. | A client program may be interrupted by an InstantiationError exception. |
[+] affected methods (1)
executePlan ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan ) - Return value of this method has type 'HiveContext.QueryExecution'.
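Since the nested class is normally created by Spark internals, the easiest way to observe this change from client code is reflection; a sketch (the binary name is assumed from the nesting shown above):

```scala
// Instantiating an abstract class is what the reported
// InstantiationError refers to at the bytecode level; through
// reflection the same condition is visible as an abstract modifier.
object QueryExecutionProbe {
  def main(args: Array[String]): Unit = {
    val cls = Class.forName("org.apache.spark.sql.hive.HiveContext$QueryExecution")
    val isAbstract = java.lang.reflect.Modifier.isAbstract(cls.getModifiers)
    println(s"HiveContext.QueryExecution abstract? $isAbstract") // expected true on 1.1.0
  }
}
```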
spark-sql_2.10-1.3.0.jar
package org.apache.spark.sql
[+] DataFrame (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (114)
agg ( java.util.Map<java.lang.String,java.lang.String> ) - This method is from 'DataFrame' class.
agg ( Column, Column... ) - This method is from 'DataFrame' class.
agg ( Column, scala.collection.Seq<Column> ) - This method is from 'DataFrame' class.
agg ( scala.collection.immutable.Map<java.lang.String,java.lang.String> ) - This method is from 'DataFrame' class.
agg ( scala.Tuple2<java.lang.String,java.lang.String>, scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.String>> ) - This method is from 'DataFrame' class.
apply ( java.lang.String ) - This method is from 'DataFrame' class.
as ( java.lang.String ) - This method is from 'DataFrame' class.
as ( scala.Symbol ) - This method is from 'DataFrame' class.
cache ( ) - This method is from 'DataFrame' class.
cache ( ) - This method is from 'DataFrame' class.
col ( java.lang.String ) - This method is from 'DataFrame' class.
collect ( ) - This method is from 'DataFrame' class.
collect ( ) - This method is from 'DataFrame' class.
collectAsList ( ) - This method is from 'DataFrame' class.
columns ( ) - This method is from 'DataFrame' class.
count ( ) - This method is from 'DataFrame' class.
createJDBCTable ( java.lang.String, java.lang.String, boolean ) - This method is from 'DataFrame' class.
DataFrame ( SQLContext, catalyst.plans.logical.LogicalPlan ) - This constructor is from 'DataFrame' class.
DataFrame ( SQLContext, SQLContext.QueryExecution ) - This constructor is from 'DataFrame' class.
distinct ( ) - This method is from 'DataFrame' class.
dtypes ( ) - This method is from 'DataFrame' class.
except ( DataFrame ) - This method is from 'DataFrame' class.
explain ( ) - This method is from 'DataFrame' class.
explain ( boolean ) - This method is from 'DataFrame' class.
explode ( java.lang.String, java.lang.String, scala.Function1<A,scala.collection.TraversableOnce<B>>, scala.reflect.api.TypeTags.TypeTag<B> ) - This method is from 'DataFrame' class.
explode ( scala.collection.Seq<Column>, scala.Function1<Row,scala.collection.TraversableOnce<A>>, scala.reflect.api.TypeTags.TypeTag<A> ) - This method is from 'DataFrame' class.
filter ( java.lang.String ) - This method is from 'DataFrame' class.
filter ( Column ) - This method is from 'DataFrame' class.
first ( ) - This method is from 'DataFrame' class.
first ( ) - This method is from 'DataFrame' class.
flatMap ( scala.Function1<Row,scala.collection.TraversableOnce<R>>, scala.reflect.ClassTag<R> ) - This method is from 'DataFrame' class.
foreach ( scala.Function1<Row,scala.runtime.BoxedUnit> ) - This method is from 'DataFrame' class.
foreachPartition ( scala.Function1<scala.collection.Iterator<Row>,scala.runtime.BoxedUnit> ) - This method is from 'DataFrame' class.
groupBy ( java.lang.String, java.lang.String... ) - This method is from 'DataFrame' class.
groupBy ( java.lang.String, scala.collection.Seq<java.lang.String> ) - This method is from 'DataFrame' class.
groupBy ( Column... ) - This method is from 'DataFrame' class.
groupBy ( scala.collection.Seq<Column> ) - This method is from 'DataFrame' class.
head ( ) - This method is from 'DataFrame' class.
head ( int ) - This method is from 'DataFrame' class.
insertInto ( java.lang.String ) - This method is from 'DataFrame' class.
insertInto ( java.lang.String, boolean ) - This method is from 'DataFrame' class.
insertIntoJDBC ( java.lang.String, java.lang.String, boolean ) - This method is from 'DataFrame' class.
intersect ( DataFrame ) - This method is from 'DataFrame' class.
isLocal ( ) - This method is from 'DataFrame' class.
javaRDD ( ) - This method is from 'DataFrame' class.
javaToPython ( ) - This method is from 'DataFrame' class.
join ( DataFrame ) - This method is from 'DataFrame' class.
join ( DataFrame, Column ) - This method is from 'DataFrame' class.
join ( DataFrame, Column, java.lang.String ) - This method is from 'DataFrame' class.
limit ( int ) - This method is from 'DataFrame' class.
logicalPlan ( ) - This method is from 'DataFrame' class.
map ( scala.Function1<Row,R>, scala.reflect.ClassTag<R> ) - This method is from 'DataFrame' class.
mapPartitions ( scala.Function1<scala.collection.Iterator<Row>,scala.collection.Iterator<R>>, scala.reflect.ClassTag<R> ) - This method is from 'DataFrame' class.
numericColumns ( ) - This method is from 'DataFrame' class.
orderBy ( java.lang.String, java.lang.String... ) - This method is from 'DataFrame' class.
orderBy ( java.lang.String, scala.collection.Seq<java.lang.String> ) - This method is from 'DataFrame' class.
orderBy ( Column... ) - This method is from 'DataFrame' class.
orderBy ( scala.collection.Seq<Column> ) - This method is from 'DataFrame' class.
persist ( ) - This method is from 'DataFrame' class.
persist ( ) - This method is from 'DataFrame' class.
persist ( org.apache.spark.storage.StorageLevel ) - This method is from 'DataFrame' class.
persist ( org.apache.spark.storage.StorageLevel ) - This method is from 'DataFrame' class.
printSchema ( ) - This method is from 'DataFrame' class.
queryExecution ( ) - This method is from 'DataFrame' class.
rdd ( ) - This method is from 'DataFrame' class.
registerTempTable ( java.lang.String ) - This method is from 'DataFrame' class.
repartition ( int ) - This method is from 'DataFrame' class.
resolve ( java.lang.String ) - This method is from 'DataFrame' class.
sample ( boolean, double ) - This method is from 'DataFrame' class.
sample ( boolean, double, long ) - This method is from 'DataFrame' class.
save ( java.lang.String ) - This method is from 'DataFrame' class.
save ( java.lang.String, java.lang.String ) - This method is from 'DataFrame' class.
save ( java.lang.String, java.lang.String, SaveMode ) - This method is from 'DataFrame' class.
save ( java.lang.String, SaveMode ) - This method is from 'DataFrame' class.
save ( java.lang.String, SaveMode, java.util.Map<java.lang.String,java.lang.String> ) - This method is from 'DataFrame' class.
save ( java.lang.String, SaveMode, scala.collection.immutable.Map<java.lang.String,java.lang.String> ) - This method is from 'DataFrame' class.
saveAsParquetFile ( java.lang.String ) - This method is from 'DataFrame' class.
saveAsTable ( java.lang.String ) - This method is from 'DataFrame' class.
saveAsTable ( java.lang.String, java.lang.String ) - This method is from 'DataFrame' class.
saveAsTable ( java.lang.String, java.lang.String, SaveMode ) - This method is from 'DataFrame' class.
saveAsTable ( java.lang.String, java.lang.String, SaveMode, java.util.Map<java.lang.String,java.lang.String> ) - This method is from 'DataFrame' class.
saveAsTable ( java.lang.String, java.lang.String, SaveMode, scala.collection.immutable.Map<java.lang.String,java.lang.String> ) - This method is from 'DataFrame' class.
saveAsTable ( java.lang.String, SaveMode ) - This method is from 'DataFrame' class.
schema ( ) - This method is from 'DataFrame' class.
select ( java.lang.String, java.lang.String... ) - This method is from 'DataFrame' class.
select ( java.lang.String, scala.collection.Seq<java.lang.String> ) - This method is from 'DataFrame' class.
select ( Column... ) - This method is from 'DataFrame' class.
select ( scala.collection.Seq<Column> ) - This method is from 'DataFrame' class.
selectExpr ( java.lang.String... ) - This method is from 'DataFrame' class.
selectExpr ( scala.collection.Seq<java.lang.String> ) - This method is from 'DataFrame' class.
show ( ) - This method is from 'DataFrame' class.
show ( int ) - This method is from 'DataFrame' class.
showString ( int ) - This method is from 'DataFrame' class.
sort ( java.lang.String, java.lang.String... ) - This method is from 'DataFrame' class.
sort ( java.lang.String, scala.collection.Seq<java.lang.String> ) - This method is from 'DataFrame' class.
sort ( Column... ) - This method is from 'DataFrame' class.
sort ( scala.collection.Seq<Column> ) - This method is from 'DataFrame' class.
sqlContext ( ) - This method is from 'DataFrame' class.
take ( int ) - This method is from 'DataFrame' class.
take ( int ) - This method is from 'DataFrame' class.
toDF ( ) - This method is from 'DataFrame' class.
toDF ( java.lang.String... ) - This method is from 'DataFrame' class.
toDF ( scala.collection.Seq<java.lang.String> ) - This method is from 'DataFrame' class.
toJavaRDD ( ) - This method is from 'DataFrame' class.
toJSON ( ) - This method is from 'DataFrame' class.
toString ( ) - This method is from 'DataFrame' class.
unionAll ( DataFrame ) - This method is from 'DataFrame' class.
unpersist ( ) - This method is from 'DataFrame' class.
unpersist ( ) - This method is from 'DataFrame' class.
unpersist ( boolean ) - This method is from 'DataFrame' class.
unpersist ( boolean ) - This method is from 'DataFrame' class.
where ( Column ) - This method is from 'DataFrame' class.
withColumn ( java.lang.String, Column ) - This method is from 'DataFrame' class.
withColumnRenamed ( java.lang.String, java.lang.String ) - This method is from 'DataFrame' class.
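Note: in 1.1.0 the org.apache.spark.sql.DataFrame class does not exist at all (its 1.1.0-era predecessor was SchemaRDD), so the failure is immediate on class load. A hypothetical client compiled against spark-sql 1.3.0 (standard 1.3.0 calls only; the object name `DataFrameRepro` is invented):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object DataFrameRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("repro"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._
    // toDF returns org.apache.spark.sql.DataFrame. Run the compiled jar
    // against the 1.1.0 jars and resolving that type raises
    // java.lang.NoClassDefFoundError: org/apache/spark/sql/DataFrame.
    val df = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "name")
    df.show()
  }
}
```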
[+] SQLContext.QueryExecution (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class became abstract. | A client program may be interrupted by an InstantiationError exception. |
[+] affected methods (3)
executePlan ( catalyst.plans.logical.LogicalPlan ) - Return value of this method has type 'SQLContext.QueryExecution'.
executePlan ( catalyst.plans.logical.LogicalPlan ) - Return value of this method has type 'SQLContext.QueryExecution'.
executeSql ( java.lang.String ) - Return value of this method has type 'SQLContext.QueryExecution'.
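Note: the factory methods above remain callable despite the abstractness change, assuming (as is typical) that they return a concrete subclass at run time; only bytecode that performs a direct `new` on QueryExecution is at risk. A self-contained sketch under that assumption (hypothetical names `Exec`, `Factory`, `Client`; not Spark code):

```scala
// Library version 2: the class is now abstract, like QueryExecution in 1.1.0.
abstract class Exec { def run(): String }

object Factory {
  // Hands back a concrete anonymous subclass, so clients that only
  // call Factory.make() never execute a `new Exec` of their own.
  def make(): Exec = new Exec { def run(): String = "ok" }
}

object Client extends App {
  println(Factory.make().run()) // safe against both library versions
  // A client compiled while Exec was still concrete may instead carry
  // `new Exec()` bytecode; that call site is what raises
  // java.lang.InstantiationError against version 2.
}
```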
Problems with Data Types, Medium Severity (1)
spark-core_2.10-1.3.0.jar
package org.apache.spark.scheduler
[+] LiveListenerBus (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class org.apache.spark.util.AsynchronousListenerBus<SparkListener,SparkListenerEvent>. | A client program's access to fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |
[+] affected methods (1)
listenerBus ( ) - Return value of this method has type 'LiveListenerBus'.
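Note: a removed super-class matters to clients because JVM method resolution searches inherited members. A self-contained sketch of the hazard (hypothetical names `Base`, `Bus`, `Client`; not Spark code):

```scala
// Library version 1: Bus inherits post() from its super-class.
class Base { def post(msg: String): Unit = println(s"posted: $msg") }
class Bus extends Base
// Library version 2 (analogous to LiveListenerBus losing
// AsynchronousListenerBus) would instead declare:
//   class Bus   // no `extends Base`, hence no inherited post()

object Client extends App {
  val bus = new Bus
  // Compiled against version 1 this resolves post() through Base; the
  // same bytecode run against version 2 raises java.lang.NoSuchMethodError.
  bus.post("event")
}
```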
Problems with Data Types, Low Severity (1)
spark-hive_2.10-1.3.0.jar
package org.apache.spark.sql.hive
[+] HiveContext (1)
| # | Change | Effect |
|---|---|---|
| 1 | Method sql ( java.lang.String ) has been moved up the type hierarchy to the super-class's sql ( java.lang.String ). | The inherited sql ( java.lang.String ) will be called instead of the removed override in a client program. |
[+] affected methods (1)
sql ( java.lang.String ) - The inherited 'sql ( java.lang.String )' will be called instead of this method in a client program.
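Note: this is only low severity because JVM resolution walks up the type hierarchy transparently; the observable change is merely which method body runs. A sketch with hypothetical names (not Spark code):

```scala
// Version 1: sql() is declared on the subclass.
//   class Parent
//   class Child extends Parent { def sql(q: String): String = "child: " + q }
// Version 2: sql() has moved up to the super-class.
class Parent { def sql(q: String): String = "parent: " + q }
class Child extends Parent

object Client extends App {
  // Bytecode referencing Child.sql still links against version 2:
  // resolution finds Parent.sql. Behavior changes only if the two
  // method bodies differ.
  println(new Child().sql("SELECT 1")) // prints "parent: SELECT 1"
}
```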
Problems with Methods, Low Severity (1)
spark-hive_2.10-1.3.0.jar, HiveContext
package org.apache.spark.sql.hive
[+] HiveContext.runHive ( String cmd, int maxRows ) : scala.collection.Seq<String> (1)
[mangled: org/apache/spark/sql/hive/HiveContext.runHive:(Ljava/lang/String;I)Lscala/collection/Seq;]
| # | Change | Effect |
|---|---|---|
| 1 | Method became non-synchronized. | A multi-threaded client program may change behavior. |
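Note: a client that relied on runHive's method-level lock for mutual exclusion now races. A self-contained sketch of the effect (hypothetical `Runner` and `Client`; not Spark code):

```scala
class Runner {
  private var log = List.empty[String]
  // Version 1 (analogous to the synchronized 1.3.0 runHive):
  //   def run(cmd: String): Unit = synchronized { log = cmd :: log }
  // Version 2 drops the lock, so this read-modify-write can race and
  // silently lose updates:
  def run(cmd: String): Unit = { log = cmd :: log }
  def size: Int = log.size
}

object Client extends App {
  val r = new Runner
  val threads = (1 to 4).map { i =>
    new Thread(new Runnable {
      def run(): Unit = (1 to 1000).foreach(_ => r.run(s"cmd$i"))
    })
  }
  threads.foreach(_.start())
  threads.foreach(_.join())
  println(s"expected 4000, got ${r.size}") // often fewer without the lock
}
```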
Java ARchives (8)
spark-catalyst_2.10-1.3.0.jar
spark-core_2.10-1.3.0.jar
spark-hive_2.10-1.3.0.jar
spark-mllib_2.10-1.3.0.jar
spark-sql_2.10-1.3.0.jar
spark-streaming-kafka_2.10-1.3.0.jar
spark-streaming_2.10-1.3.0.jar
spark-yarn_2.10-1.3.0.jar
Generated on Tue Feb 2 00:11:57 2016 for spark-testing-base_2.10-1.3.1_0.3.1 by Java API Compliance Checker 1.4.1
A tool for checking backward compatibility of a Java library API