Binary compatibility report for the spark-testing-base_2.10-0.3.1 library between versions 1.6.0 and 1.1.0 (relating to the portability of the client application spark-testing-base_2.10-0.3.1.jar)
Test Info
| Library Name | spark-testing-base_2.10-0.3.1 |
| Version #1 | 1.6.0 |
| Version #2 | 1.1.0 |
| Java Version | 1.7.0_85 |
Test Results
| Total Java ARchives | 8 |
| Total Methods / Classes | 1404 / 5294 |
| Verdict | Incompatible (48.3%) |
Problem Summary
| | Severity | Count |
| Added Methods | - | 43 |
| Removed Methods | High | 500 |
| Problems with Data Types | High | 22 |
| | Medium | 4 |
| | Low | 4 |
| Problems with Methods | High | 1 |
| | Medium | 0 |
| | Low | 1 |
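The verdict means that roughly half of the checked API surface differs between the two Spark versions, so a client jar built against one version will generally fail to link against the other. A minimal sketch of the typical symptom, using RDD.isEmpty() from the Removed Methods list below (present in 1.6.0, absent in 1.1.0):

```scala
import org.apache.spark.rdd.RDD

// Compiled against spark-core_2.10 1.6.0. Run on a 1.1.0 cluster, this call
// fails at runtime with java.lang.NoSuchMethodError, because RDD.isEmpty()
// does not exist in 1.1.0 (see the Removed Methods list below).
def checkEmpty(rdd: RDD[Int]): Boolean = rdd.isEmpty()
```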
Added Methods (43)
spark-core_2.10-1.1.0.jar, Clock.class
package org.apache.spark.util
Clock.getTime ( ) [abstract] : long
[mangled: org/apache/spark/util/Clock.getTime:()J]
spark-core_2.10-1.1.0.jar, RDD<T>.class
package org.apache.spark.rdd
RDD<T>.markCheckpointed ( RDD<?> checkpointRDD ) : void
[mangled: org/apache/spark/rdd/RDD<T>.markCheckpointed:(Lorg/apache/spark/rdd/RDD;)V]
spark-core_2.10-1.1.0.jar, SparkConf.class
package org.apache.spark
SparkConf.settings ( ) : scala.collection.mutable.HashMap<String,String>
[mangled: org/apache/spark/SparkConf.settings:()Lscala/collection/mutable/HashMap;]
spark-core_2.10-1.1.0.jar, SparkContext.class
package org.apache.spark
SparkContext.preferredNodeLocationData ( ) : scala.collection.Map<String,scala.collection.Set<scheduler.SplitInfo>>
[mangled: org/apache/spark/SparkContext.preferredNodeLocationData:()Lscala/collection/Map;]
SparkContext.preferredNodeLocationData_.eq ( scala.collection.Map<String,scala.collection.Set<scheduler.SplitInfo>> p1 ) : void
[mangled: org/apache/spark/SparkContext.preferredNodeLocationData_.eq:(Lscala/collection/Map;)V]
SparkContext.ui ( ) : ui.SparkUI
[mangled: org/apache/spark/SparkContext.ui:()Lorg/apache/spark/ui/SparkUI;]
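The `_.eq` suffix in entries like `preferredNodeLocationData_.eq` is the checker's rendering of Scala's `_=` setter (bytecode name `_$eq`); it is simply the assignment form of a var. A tiny illustration of the desugaring, using a hypothetical Holder class:

```scala
class Holder {
  private var value0 = 0
  def value: Int = value0
  def value_=(v: Int): Unit = { value0 = v } // compiles to value_$eq(I)V
}

val h = new Holder
h.value = 42 // desugars to h.value_=(42)
```

Note also the return-type flip on SparkContext.ui: 1.1.0 returns the SparkUI directly, while 1.6.0 (see Removed Methods below) wraps it in scala.Option, so pre-built callers of either form fail to link against the other version.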
spark-core_2.10-1.1.0.jar, SparkListenerApplicationStart.class
package org.apache.spark.scheduler
SparkListenerApplicationStart.copy ( String appName, long time, String sparkUser ) : SparkListenerApplicationStart
[mangled: org/apache/spark/scheduler/SparkListenerApplicationStart.copy:(Ljava/lang/String;JLjava/lang/String;)Lorg/apache/spark/scheduler/SparkListenerApplicationStart;]
SparkListenerApplicationStart.SparkListenerApplicationStart ( String appName, long time, String sparkUser )
[mangled: org/apache/spark/scheduler/SparkListenerApplicationStart."<init>":(Ljava/lang/String;JLjava/lang/String;)V]
spark-core_2.10-1.1.0.jar, SparkListenerBlockManagerAdded.class
package org.apache.spark.scheduler
SparkListenerBlockManagerAdded.copy ( org.apache.spark.storage.BlockManagerId blockManagerId, long maxMem ) : SparkListenerBlockManagerAdded
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded.copy:(Lorg/apache/spark/storage/BlockManagerId;J)Lorg/apache/spark/scheduler/SparkListenerBlockManagerAdded;]
SparkListenerBlockManagerAdded.SparkListenerBlockManagerAdded ( org.apache.spark.storage.BlockManagerId blockManagerId, long maxMem )
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded."<init>":(Lorg/apache/spark/storage/BlockManagerId;J)V]
spark-core_2.10-1.1.0.jar, SparkListenerBlockManagerRemoved.class
package org.apache.spark.scheduler
SparkListenerBlockManagerRemoved.andThen ( scala.Function1<SparkListenerBlockManagerRemoved,A> p1 ) [static] : scala.Function1<org.apache.spark.storage.BlockManagerId,A>
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.andThen:(Lscala/Function1;)Lscala/Function1;]
SparkListenerBlockManagerRemoved.compose ( scala.Function1<A,org.apache.spark.storage.BlockManagerId> p1 ) [static] : scala.Function1<A,SparkListenerBlockManagerRemoved>
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.compose:(Lscala/Function1;)Lscala/Function1;]
SparkListenerBlockManagerRemoved.copy ( org.apache.spark.storage.BlockManagerId blockManagerId ) : SparkListenerBlockManagerRemoved
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.copy:(Lorg/apache/spark/storage/BlockManagerId;)Lorg/apache/spark/scheduler/SparkListenerBlockManagerRemoved;]
SparkListenerBlockManagerRemoved.SparkListenerBlockManagerRemoved ( org.apache.spark.storage.BlockManagerId blockManagerId )
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved."<init>":(Lorg/apache/spark/storage/BlockManagerId;)V]
spark-core_2.10-1.1.0.jar, SparkListenerJobEnd.class
package org.apache.spark.scheduler
SparkListenerJobEnd.copy ( int jobId, JobResult jobResult ) : SparkListenerJobEnd
[mangled: org/apache/spark/scheduler/SparkListenerJobEnd.copy:(ILorg/apache/spark/scheduler/JobResult;)Lorg/apache/spark/scheduler/SparkListenerJobEnd;]
SparkListenerJobEnd.SparkListenerJobEnd ( int jobId, JobResult jobResult )
[mangled: org/apache/spark/scheduler/SparkListenerJobEnd."<init>":(ILorg/apache/spark/scheduler/JobResult;)V]
spark-core_2.10-1.1.0.jar, SparkListenerJobStart.class
package org.apache.spark.scheduler
SparkListenerJobStart.copy ( int jobId, scala.collection.Seq<Object> stageIds, java.util.Properties properties ) : SparkListenerJobStart
[mangled: org/apache/spark/scheduler/SparkListenerJobStart.copy:(ILscala/collection/Seq;Ljava/util/Properties;)Lorg/apache/spark/scheduler/SparkListenerJobStart;]
SparkListenerJobStart.SparkListenerJobStart ( int jobId, scala.collection.Seq<Object> stageIds, java.util.Properties properties )
[mangled: org/apache/spark/scheduler/SparkListenerJobStart."<init>":(ILscala/collection/Seq;Ljava/util/Properties;)V]
spark-core_2.10-1.1.0.jar, TaskMetrics.class
package org.apache.spark.executor
TaskMetrics.diskBytesSpilled_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.diskBytesSpilled_.eq:(J)V]
TaskMetrics.executorDeserializeTime_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.executorDeserializeTime_.eq:(J)V]
TaskMetrics.executorRunTime_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.executorRunTime_.eq:(J)V]
TaskMetrics.hostname_.eq ( String p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.hostname_.eq:(Ljava/lang/String;)V]
TaskMetrics.inputMetrics_.eq ( scala.Option<InputMetrics> p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.inputMetrics_.eq:(Lscala/Option;)V]
TaskMetrics.jvmGCTime_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.jvmGCTime_.eq:(J)V]
TaskMetrics.memoryBytesSpilled_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.memoryBytesSpilled_.eq:(J)V]
TaskMetrics.resultSerializationTime_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.resultSerializationTime_.eq:(J)V]
TaskMetrics.resultSize_.eq ( long p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.resultSize_.eq:(J)V]
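These `_.eq` entries show that in 1.1.0 the task metrics are writable vars, whereas 1.6.0 replaced them with explicit set* methods (listed under Removed Methods below). A sketch, assuming a metrics instance in hand (the values are illustrative):

```scala
import org.apache.spark.executor.TaskMetrics

def record(metrics: TaskMetrics): Unit = {
  // Against 1.1.0: plain var-style setters (hostname_=, executorRunTime_=, ...).
  metrics.hostname = "worker-1"
  metrics.executorRunTime = 42L

  // Against 1.6.0 the equivalents are the explicit setters:
  //   metrics.setHostname("worker-1")
  //   metrics.setExecutorRunTime(42L)
}
```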
spark-hive_2.10-1.1.0.jar, HiveContext.class
package org.apache.spark.sql.hive
HiveContext.createTable ( String tableName, boolean allowExisting, scala.reflect.api.TypeTags.TypeTag<A> p3 ) : void
[mangled: org/apache/spark/sql/hive/HiveContext.createTable:(Ljava/lang/String;ZLscala/reflect/api/TypeTags$TypeTag;)V]
HiveContext.dialect ( ) : String
[mangled: org/apache/spark/sql/hive/HiveContext.dialect:()Ljava/lang/String;]
HiveContext.executePlan ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan plan ) : org.apache.spark.sql.SQLContext.QueryExecution
[mangled: org/apache/spark/sql/hive/HiveContext.executePlan:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/SQLContext$QueryExecution;]
HiveContext.functionRegistry ( ) : HiveFunctionRegistry
[mangled: org/apache/spark/sql/hive/HiveContext.functionRegistry:()Lorg/apache/spark/sql/hive/HiveFunctionRegistry;]
HiveContext.outputBuffer ( ) : java.io.OutputStream
[mangled: org/apache/spark/sql/hive/HiveContext.outputBuffer:()Ljava/io/OutputStream;]
HiveContext.runHive ( String cmd, int maxRows ) : scala.collection.Seq<String>
[mangled: org/apache/spark/sql/hive/HiveContext.runHive:(Ljava/lang/String;I)Lscala/collection/Seq;]
HiveContext.sessionState ( ) : org.apache.hadoop.hive.ql.session.SessionState
[mangled: org/apache/spark/sql/hive/HiveContext.sessionState:()Lorg/apache/hadoop/hive/ql/session/SessionState;]
HiveContext.sql ( String sqlText ) : org.apache.spark.sql.SchemaRDD
[mangled: org/apache/spark/sql/hive/HiveContext.sql:(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;]
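In 1.1.0 HiveContext.sql still returns a SchemaRDD; by 1.6.0 the SQL entry points return DataFrame instead (its methods appear under Removed Methods below). A sketch that compiles against either version, since both result types expose count():

```scala
import org.apache.spark.sql.hive.HiveContext

// 1.1.0: hive.sql(...) : org.apache.spark.sql.SchemaRDD
// 1.6.0: hive.sql(...) : org.apache.spark.sql.DataFrame
def rowCount(hive: HiveContext, query: String): Long = hive.sql(query).count()
```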
spark-streaming_2.10-1.1.0.jar, Checkpoint.class
package org.apache.spark.streaming
Checkpoint.getCheckpointFiles ( String p1, org.apache.hadoop.fs.FileSystem p2 ) [static] : scala.collection.Seq<org.apache.hadoop.fs.Path>
[mangled: org/apache/spark/streaming/Checkpoint.getCheckpointFiles:(Ljava/lang/String;Lorg/apache/hadoop/fs/FileSystem;)Lscala/collection/Seq;]
Checkpoint.sparkConf ( ) : org.apache.spark.SparkConf
[mangled: org/apache/spark/streaming/Checkpoint.sparkConf:()Lorg/apache/spark/SparkConf;]
spark-streaming_2.10-1.1.0.jar, DStream<T>.class
package org.apache.spark.streaming.dstream
DStream<T>.validate ( ) : void
[mangled: org/apache/spark/streaming/dstream/DStream<T>.validate:()V]
spark-streaming_2.10-1.1.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.getNewReceiverStreamId ( ) : int
[mangled: org/apache/spark/streaming/StreamingContext.getNewReceiverStreamId:()I]
StreamingContext.state ( ) : scala.Enumeration.Value
[mangled: org/apache/spark/streaming/StreamingContext.state:()Lscala/Enumeration$Value;]
StreamingContext.state_.eq ( scala.Enumeration.Value p1 ) : void
[mangled: org/apache/spark/streaming/StreamingContext.state_.eq:(Lscala/Enumeration$Value;)V]
StreamingContext.StreamingContextState ( ) : StreamingContext.StreamingContextState.
[mangled: org/apache/spark/streaming/StreamingContext.StreamingContextState:()Lorg/apache/spark/streaming/StreamingContext$StreamingContextState$;]
StreamingContext.uiTab ( ) : ui.StreamingTab
[mangled: org/apache/spark/streaming/StreamingContext.uiTab:()Lorg/apache/spark/streaming/ui/StreamingTab;]
Removed Methods (500)
spark-catalyst_2.10-1.6.0.jar, Row.class
package org.apache.spark.sql
Row.anyNull ( ) [abstract] : boolean
[mangled: org/apache/spark/sql/Row.anyNull:()Z]
Row.apply ( int p1 ) [abstract] : Object
[mangled: org/apache/spark/sql/Row.apply:(I)Ljava/lang/Object;]
Row.copy ( ) [abstract] : Row
[mangled: org/apache/spark/sql/Row.copy:()Lorg/apache/spark/sql/Row;]
Row.equals ( Object p1 ) [abstract] : boolean
[mangled: org/apache/spark/sql/Row.equals:(Ljava/lang/Object;)Z]
Row.fieldIndex ( String p1 ) [abstract] : int
[mangled: org/apache/spark/sql/Row.fieldIndex:(Ljava/lang/String;)I]
Row.get ( int p1 ) [abstract] : Object
[mangled: org/apache/spark/sql/Row.get:(I)Ljava/lang/Object;]
Row.getAs ( int p1 ) [abstract] : T
[mangled: org/apache/spark/sql/Row.getAs:(I)Ljava/lang/Object;]
Row.getAs ( String p1 ) [abstract] : T
[mangled: org/apache/spark/sql/Row.getAs:(Ljava/lang/String;)Ljava/lang/Object;]
Row.getBoolean ( int p1 ) [abstract] : boolean
[mangled: org/apache/spark/sql/Row.getBoolean:(I)Z]
Row.getByte ( int p1 ) [abstract] : byte
[mangled: org/apache/spark/sql/Row.getByte:(I)B]
Row.getDate ( int p1 ) [abstract] : java.sql.Date
[mangled: org/apache/spark/sql/Row.getDate:(I)Ljava/sql/Date;]
Row.getDecimal ( int p1 ) [abstract] : java.math.BigDecimal
[mangled: org/apache/spark/sql/Row.getDecimal:(I)Ljava/math/BigDecimal;]
Row.getDouble ( int p1 ) [abstract] : double
[mangled: org/apache/spark/sql/Row.getDouble:(I)D]
Row.getFloat ( int p1 ) [abstract] : float
[mangled: org/apache/spark/sql/Row.getFloat:(I)F]
Row.getInt ( int p1 ) [abstract] : int
[mangled: org/apache/spark/sql/Row.getInt:(I)I]
Row.getJavaMap ( int p1 ) [abstract] : java.util.Map<K,V>
[mangled: org/apache/spark/sql/Row.getJavaMap:(I)Ljava/util/Map;]
Row.getList ( int p1 ) [abstract] : java.util.List<T>
[mangled: org/apache/spark/sql/Row.getList:(I)Ljava/util/List;]
Row.getLong ( int p1 ) [abstract] : long
[mangled: org/apache/spark/sql/Row.getLong:(I)J]
Row.getMap ( int p1 ) [abstract] : scala.collection.Map<K,V>
[mangled: org/apache/spark/sql/Row.getMap:(I)Lscala/collection/Map;]
Row.getSeq ( int p1 ) [abstract] : scala.collection.Seq<T>
[mangled: org/apache/spark/sql/Row.getSeq:(I)Lscala/collection/Seq;]
Row.getShort ( int p1 ) [abstract] : short
[mangled: org/apache/spark/sql/Row.getShort:(I)S]
Row.getString ( int p1 ) [abstract] : String
[mangled: org/apache/spark/sql/Row.getString:(I)Ljava/lang/String;]
Row.getStruct ( int p1 ) [abstract] : Row
[mangled: org/apache/spark/sql/Row.getStruct:(I)Lorg/apache/spark/sql/Row;]
Row.getTimestamp ( int p1 ) [abstract] : java.sql.Timestamp
[mangled: org/apache/spark/sql/Row.getTimestamp:(I)Ljava/sql/Timestamp;]
Row.getValuesMap ( scala.collection.Seq<String> p1 ) [abstract] : scala.collection.immutable.Map<String,T>
[mangled: org/apache/spark/sql/Row.getValuesMap:(Lscala/collection/Seq;)Lscala/collection/immutable/Map;]
Row.hashCode ( ) [abstract] : int
[mangled: org/apache/spark/sql/Row.hashCode:()I]
Row.isNullAt ( int p1 ) [abstract] : boolean
[mangled: org/apache/spark/sql/Row.isNullAt:(I)Z]
Row.length ( ) [abstract] : int
[mangled: org/apache/spark/sql/Row.length:()I]
Row.mkString ( ) [abstract] : String
[mangled: org/apache/spark/sql/Row.mkString:()Ljava/lang/String;]
Row.mkString ( String p1 ) [abstract] : String
[mangled: org/apache/spark/sql/Row.mkString:(Ljava/lang/String;)Ljava/lang/String;]
Row.mkString ( String p1, String p2, String p3 ) [abstract] : String
[mangled: org/apache/spark/sql/Row.mkString:(Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;)Ljava/lang/String;]
Row.schema ( ) [abstract] : types.StructType
[mangled: org/apache/spark/sql/Row.schema:()Lorg/apache/spark/sql/types/StructType;]
Row.size ( ) [abstract] : int
[mangled: org/apache/spark/sql/Row.size:()I]
Row.toSeq ( ) [abstract] : scala.collection.Seq<Object>
[mangled: org/apache/spark/sql/Row.toSeq:()Lscala/collection/Seq;]
Row.toString ( ) [abstract] : String
[mangled: org/apache/spark/sql/Row.toString:()Ljava/lang/String;]
spark-core_2.10-1.6.0.jar, Clock.class
package org.apache.spark.util
Clock.getTimeMillis ( ) [abstract] : long
[mangled: org/apache/spark/util/Clock.getTimeMillis:()J]
Clock.waitTillTime ( long p1 ) [abstract] : long
[mangled: org/apache/spark/util/Clock.waitTillTime:(J)J]
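The Clock trait a manual test clock must implement changed shape: 1.1.0 declares getTime() (see Added Methods above), while 1.6.0 declares getTimeMillis() and waitTillTime(long). A sketch per version; Clock is package-private to Spark at the Scala source level (the report checks bytecode), so a real shim typically has to live under a matching package, as assumed here:

```scala
package org.apache.spark.util

// Against 1.6.0: two abstract methods.
class ManualClock160(var now: Long) extends Clock {
  override def getTimeMillis(): Long = now
  override def waitTillTime(targetTime: Long): Long = {
    now = math.max(now, targetTime)
    now
  }
}

// Against 1.1.0 the single abstract method is getTime():
//   class ManualClock110(var now: Long) extends Clock {
//     override def getTime(): Long = now
//   }
```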
spark-core_2.10-1.6.0.jar, InputMetrics.class
package org.apache.spark.executor
InputMetrics.recordsRead ( ) : long
[mangled: org/apache/spark/executor/InputMetrics.recordsRead:()J]
spark-core_2.10-1.6.0.jar, JavaSparkContext.class
package org.apache.spark.api.java
JavaSparkContext.setLogLevel ( String logLevel ) : void
[mangled: org/apache/spark/api/java/JavaSparkContext.setLogLevel:(Ljava/lang/String;)V]
spark-core_2.10-1.6.0.jar, OutputMetrics.class
package org.apache.spark.executor
OutputMetrics.recordsWritten ( ) : long
[mangled: org/apache/spark/executor/OutputMetrics.recordsWritten:()J]
spark-core_2.10-1.6.0.jar, RDD<T>.class
package org.apache.spark.rdd
RDD<T>.doubleRDDToDoubleRDDFunctions ( RDD<Object> p1 ) [static] : DoubleRDDFunctions
[mangled: org/apache/spark/rdd/RDD<T>.doubleRDDToDoubleRDDFunctions:(Lorg/apache/spark/rdd/RDD;)Lorg/apache/spark/rdd/DoubleRDDFunctions;]
RDD<T>.getNumPartitions ( ) : int
[mangled: org/apache/spark/rdd/RDD<T>.getNumPartitions:()I]
RDD<T>.isCheckpointedAndMaterialized ( ) : boolean
[mangled: org/apache/spark/rdd/RDD<T>.isCheckpointedAndMaterialized:()Z]
RDD<T>.isEmpty ( ) : boolean
[mangled: org/apache/spark/rdd/RDD<T>.isEmpty:()Z]
RDD<T>.isLocallyCheckpointed ( ) : boolean
[mangled: org/apache/spark/rdd/RDD<T>.isLocallyCheckpointed:()Z]
RDD<T>.localCheckpoint ( ) : RDD<T>
[mangled: org/apache/spark/rdd/RDD<T>.localCheckpoint:()Lorg/apache/spark/rdd/RDD;]
RDD<T>.mapPartitionsInternal ( scala.Function1<scala.collection.Iterator<T>,scala.collection.Iterator<U>> f, boolean preservesPartitioning, scala.reflect.ClassTag<U> p3 ) : RDD<U>
[mangled: org/apache/spark/rdd/RDD<T>.mapPartitionsInternal:(Lscala/Function1;ZLscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
RDD<T>.markCheckpointed ( ) : void
[mangled: org/apache/spark/rdd/RDD<T>.markCheckpointed:()V]
RDD<T>.numericRDDToDoubleRDDFunctions ( RDD<T> p1, scala.math.Numeric<T> p2 ) [static] : DoubleRDDFunctions
[mangled: org/apache/spark/rdd/RDD<T>.numericRDDToDoubleRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/math/Numeric;)Lorg/apache/spark/rdd/DoubleRDDFunctions;]
RDD<T>.RDD..doCheckpointCalled ( ) : boolean
[mangled: org/apache/spark/rdd/RDD<T>.org.apache.spark.rdd.RDD..doCheckpointCalled:()Z]
RDD<T>.RDD..doCheckpointCalled_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/rdd/RDD<T>.org.apache.spark.rdd.RDD..doCheckpointCalled_.eq:(Z)V]
RDD<T>.RDD..sc ( ) : org.apache.spark.SparkContext
[mangled: org/apache/spark/rdd/RDD<T>.org.apache.spark.rdd.RDD..sc:()Lorg/apache/spark/SparkContext;]
RDD<T>.parent ( int j, scala.reflect.ClassTag<U> p2 ) : RDD<U>
[mangled: org/apache/spark/rdd/RDD<T>.parent:(ILscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
RDD<T>.randomSampleWithRange ( double lb, double ub, long seed ) : RDD<T>
[mangled: org/apache/spark/rdd/RDD<T>.randomSampleWithRange:(DDJ)Lorg/apache/spark/rdd/RDD;]
RDD<T>.rddToAsyncRDDActions ( RDD<T> p1, scala.reflect.ClassTag<T> p2 ) [static] : AsyncRDDActions<T>
[mangled: org/apache/spark/rdd/RDD<T>.rddToAsyncRDDActions:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/AsyncRDDActions;]
RDD<T>.rddToOrderedRDDFunctions ( RDD<scala.Tuple2<K,V>> p1, scala.math.Ordering<K> p2, scala.reflect.ClassTag<K> p3, scala.reflect.ClassTag<V> p4 ) [static] : OrderedRDDFunctions<K,V,scala.Tuple2<K,V>>
[mangled: org/apache/spark/rdd/RDD<T>.rddToOrderedRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/math/Ordering;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/OrderedRDDFunctions;]
RDD<T>.rddToPairRDDFunctions ( RDD<scala.Tuple2<K,V>> p1, scala.reflect.ClassTag<K> p2, scala.reflect.ClassTag<V> p3, scala.math.Ordering<K> p4 ) [static] : PairRDDFunctions<K,V>
[mangled: org/apache/spark/rdd/RDD<T>.rddToPairRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/rdd/PairRDDFunctions;]
RDD<T>.rddToSequenceFileRDDFunctions ( RDD<scala.Tuple2<K,V>> p1, scala.reflect.ClassTag<K> p2, scala.reflect.ClassTag<V> p3, org.apache.spark.WritableFactory<K> p4, org.apache.spark.WritableFactory<V> p5 ) [static] : SequenceFileRDDFunctions<K,V>
[mangled: org/apache/spark/rdd/RDD<T>.rddToSequenceFileRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lorg/apache/spark/WritableFactory;Lorg/apache/spark/WritableFactory;)Lorg/apache/spark/rdd/SequenceFileRDDFunctions;]
RDD<T>.scope ( ) : scala.Option<RDDOperationScope>
[mangled: org/apache/spark/rdd/RDD<T>.scope:()Lscala/Option;]
RDD<T>.treeAggregate ( U zeroValue, scala.Function2<U,T,U> seqOp, scala.Function2<U,U,U> combOp, int depth, scala.reflect.ClassTag<U> p5 ) : U
[mangled: org/apache/spark/rdd/RDD<T>.treeAggregate:(Ljava/lang/Object;Lscala/Function2;Lscala/Function2;ILscala/reflect/ClassTag;)Ljava/lang/Object;]
RDD<T>.treeReduce ( scala.Function2<T,T,T> f, int depth ) : T
[mangled: org/apache/spark/rdd/RDD<T>.treeReduce:(Lscala/Function2;I)Ljava/lang/Object;]
RDD<T>.withScope ( scala.Function0<U> body ) : U
[mangled: org/apache/spark/rdd/RDD<T>.withScope:(Lscala/Function0;)Ljava/lang/Object;]
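Several of these removed RDD methods are conveniences with direct 1.1.0 fallbacks built from calls that exist in both versions; a sketch of the two most commonly used:

```scala
import org.apache.spark.rdd.RDD

// 1.6.0: rdd.getNumPartitions and rdd.isEmpty()
// 1.1.0-safe equivalents:
def numPartitions(rdd: RDD[_]): Int = rdd.partitions.length
def isEmpty(rdd: RDD[_]): Boolean = rdd.take(1).isEmpty
```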
spark-core_2.10-1.6.0.jar, SparkConf.class
package org.apache.spark
SparkConf.getAppId ( ) : String
[mangled: org/apache/spark/SparkConf.getAppId:()Ljava/lang/String;]
SparkConf.getAvroSchema ( ) : scala.collection.immutable.Map<Object,String>
[mangled: org/apache/spark/SparkConf.getAvroSchema:()Lscala/collection/immutable/Map;]
SparkConf.getDeprecatedConfig ( String p1, SparkConf p2 ) [static] : scala.Option<String>
[mangled: org/apache/spark/SparkConf.getDeprecatedConfig:(Ljava/lang/String;Lorg/apache/spark/SparkConf;)Lscala/Option;]
SparkConf.getSizeAsBytes ( String key ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsBytes:(Ljava/lang/String;)J]
SparkConf.getSizeAsBytes ( String key, long defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsBytes:(Ljava/lang/String;J)J]
SparkConf.getSizeAsBytes ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsBytes:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getSizeAsGb ( String key ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsGb:(Ljava/lang/String;)J]
SparkConf.getSizeAsGb ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsGb:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getSizeAsKb ( String key ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsKb:(Ljava/lang/String;)J]
SparkConf.getSizeAsKb ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsKb:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getSizeAsMb ( String key ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsMb:(Ljava/lang/String;)J]
SparkConf.getSizeAsMb ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsMb:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getTimeAsMs ( String key ) : long
[mangled: org/apache/spark/SparkConf.getTimeAsMs:(Ljava/lang/String;)J]
SparkConf.getTimeAsMs ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getTimeAsMs:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getTimeAsSeconds ( String key ) : long
[mangled: org/apache/spark/SparkConf.getTimeAsSeconds:(Ljava/lang/String;)J]
SparkConf.getTimeAsSeconds ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getTimeAsSeconds:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.logDeprecationWarning ( String p1 ) [static] : void
[mangled: org/apache/spark/SparkConf.logDeprecationWarning:(Ljava/lang/String;)V]
SparkConf.registerAvroSchemas ( scala.collection.Seq<org.apache.avro.Schema> schemas ) : SparkConf
[mangled: org/apache/spark/SparkConf.registerAvroSchemas:(Lscala/collection/Seq;)Lorg/apache/spark/SparkConf;]
SparkConf.registerKryoClasses ( Class<?>[ ] classes ) : SparkConf
[mangled: org/apache/spark/SparkConf.registerKryoClasses:([Ljava/lang/Class;)Lorg/apache/spark/SparkConf;]
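The typed size/time getters and the Kryo/Avro helpers all postdate 1.1.0. A hedged sketch of 1.1.0-safe replacements for two of them (the config key, default, and registrator class name are illustrative):

```scala
import org.apache.spark.SparkConf

def networkTimeoutMs(conf: SparkConf): Long =
  // 1.6.0: conf.getTimeAsMs("spark.network.timeout", "120s")
  // 1.1.0 fallback: plain string getter plus manual parsing (older Spark
  // stores such timeouts as bare seconds):
  conf.get("spark.network.timeout", "120").toLong * 1000L

def registerKryo(conf: SparkConf): SparkConf =
  // 1.6.0: conf.registerKryoClasses(Array(classOf[MyEvent]))
  // 1.1.0 fallback: point Kryo at a custom registrator (hypothetical class):
  conf.set("spark.kryo.registrator", "com.example.MyRegistrator")
```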
spark-core_2.10-1.6.0.jar, SparkContext.class
package org.apache.spark
SparkContext.addFile ( String path, boolean recursive ) : void
[mangled: org/apache/spark/SparkContext.addFile:(Ljava/lang/String;Z)V]
SparkContext.applicationAttemptId ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.applicationAttemptId:()Lscala/Option;]
SparkContext.applicationId ( ) : String
[mangled: org/apache/spark/SparkContext.applicationId:()Ljava/lang/String;]
SparkContext.binaryFiles ( String path, int minPartitions ) : rdd.RDD<scala.Tuple2<String,input.PortableDataStream>>
[mangled: org/apache/spark/SparkContext.binaryFiles:(Ljava/lang/String;I)Lorg/apache/spark/rdd/RDD;]
SparkContext.binaryRecords ( String path, int recordLength, org.apache.hadoop.conf.Configuration conf ) : rdd.RDD<byte[ ]>
[mangled: org/apache/spark/SparkContext.binaryRecords:(Ljava/lang/String;ILorg/apache/hadoop/conf/Configuration;)Lorg/apache/spark/rdd/RDD;]
SparkContext.createSparkEnv ( SparkConf conf, boolean isLocal, scheduler.LiveListenerBus listenerBus ) : SparkEnv
[mangled: org/apache/spark/SparkContext.createSparkEnv:(Lorg/apache/spark/SparkConf;ZLorg/apache/spark/scheduler/LiveListenerBus;)Lorg/apache/spark/SparkEnv;]
SparkContext.eventLogCodec ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.eventLogCodec:()Lscala/Option;]
SparkContext.eventLogDir ( ) : scala.Option<java.net.URI>
[mangled: org/apache/spark/SparkContext.eventLogDir:()Lscala/Option;]
SparkContext.executorAllocationManager ( ) : scala.Option<ExecutorAllocationManager>
[mangled: org/apache/spark/SparkContext.executorAllocationManager:()Lscala/Option;]
SparkContext.externalBlockStoreFolderName ( ) : String
[mangled: org/apache/spark/SparkContext.externalBlockStoreFolderName:()Ljava/lang/String;]
SparkContext.getExecutorThreadDump ( String executorId ) : scala.Option<util.ThreadStackTrace[ ]>
[mangled: org/apache/spark/SparkContext.getExecutorThreadDump:(Ljava/lang/String;)Lscala/Option;]
SparkContext.getOrCreate ( ) [static] : SparkContext
[mangled: org/apache/spark/SparkContext.getOrCreate:()Lorg/apache/spark/SparkContext;]
SparkContext.getOrCreate ( SparkConf p1 ) [static] : SparkContext
[mangled: org/apache/spark/SparkContext.getOrCreate:(Lorg/apache/spark/SparkConf;)Lorg/apache/spark/SparkContext;]
SparkContext.getRDDStorageInfo ( scala.Function1<rdd.RDD<?>,Object> filter ) : storage.RDDInfo[ ]
[mangled: org/apache/spark/SparkContext.getRDDStorageInfo:(Lscala/Function1;)[Lorg/apache/spark/storage/RDDInfo;]
SparkContext.isEventLogEnabled ( ) : boolean
[mangled: org/apache/spark/SparkContext.isEventLogEnabled:()Z]
SparkContext.isStopped ( ) : boolean
[mangled: org/apache/spark/SparkContext.isStopped:()Z]
SparkContext.jobProgressListener ( ) : ui.jobs.JobProgressListener
[mangled: org/apache/spark/SparkContext.jobProgressListener:()Lorg/apache/spark/ui/jobs/JobProgressListener;]
SparkContext.killAndReplaceExecutor ( String executorId ) : boolean
[mangled: org/apache/spark/SparkContext.killAndReplaceExecutor:(Ljava/lang/String;)Z]
SparkContext.killExecutor ( String executorId ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutor:(Ljava/lang/String;)Z]
SparkContext.killExecutors ( scala.collection.Seq<String> executorIds ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutors:(Lscala/collection/Seq;)Z]
SparkContext.metricsSystem ( ) : metrics.MetricsSystem
[mangled: org/apache/spark/SparkContext.metricsSystem:()Lorg/apache/spark/metrics/MetricsSystem;]
SparkContext.SparkContext.._applicationId ( ) : String
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._applicationId:()Ljava/lang/String;]
SparkContext.SparkContext.._cleaner ( ) : scala.Option<ContextCleaner>
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._cleaner:()Lscala/Option;]
SparkContext.SparkContext.._conf ( ) : SparkConf
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._conf:()Lorg/apache/spark/SparkConf;]
SparkContext.SparkContext.._dagScheduler ( ) : scheduler.DAGScheduler
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._dagScheduler:()Lorg/apache/spark/scheduler/DAGScheduler;]
SparkContext.SparkContext.._env ( ) : SparkEnv
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._env:()Lorg/apache/spark/SparkEnv;]
SparkContext.SparkContext.._eventLogger ( ) : scala.Option<scheduler.EventLoggingListener>
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._eventLogger:()Lscala/Option;]
SparkContext.SparkContext.._executorAllocationManager ( ) : scala.Option<ExecutorAllocationManager>
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._executorAllocationManager:()Lscala/Option;]
SparkContext.SparkContext.._heartbeatReceiver ( ) : rpc.RpcEndpointRef
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._heartbeatReceiver:()Lorg/apache/spark/rpc/RpcEndpointRef;]
SparkContext.SparkContext.._listenerBusStarted_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._listenerBusStarted_.eq:(Z)V]
SparkContext.SparkContext.._progressBar ( ) : scala.Option<ui.ConsoleProgressBar>
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._progressBar:()Lscala/Option;]
SparkContext.SparkContext.._ui ( ) : scala.Option<ui.SparkUI>
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._ui:()Lscala/Option;]
SparkContext.SparkContext..assertNotStopped ( ) : void
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext..assertNotStopped:()V]
SparkContext.SparkContext..creationSite ( ) : util.CallSite
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext..creationSite:()Lorg/apache/spark/util/CallSite;]
SparkContext.SparkContext..postApplicationEnd ( ) : void
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext..postApplicationEnd:()V]
SparkContext.progressBar ( ) : scala.Option<ui.ConsoleProgressBar>
[mangled: org/apache/spark/SparkContext.progressBar:()Lscala/Option;]
SparkContext.range ( long start, long end, long step, int numSlices ) : rdd.RDD<Object>
[mangled: org/apache/spark/SparkContext.range:(JJJI)Lorg/apache/spark/rdd/RDD;]
SparkContext.requestExecutors ( int numAdditionalExecutors ) : boolean
[mangled: org/apache/spark/SparkContext.requestExecutors:(I)Z]
SparkContext.requestTotalExecutors ( int numExecutors, int localityAwareTasks, scala.collection.immutable.Map<String,Object> hostToLocalTaskCount ) : boolean
[mangled: org/apache/spark/SparkContext.requestTotalExecutors:(IILscala/collection/immutable/Map;)Z]
SparkContext.runJob ( rdd.RDD<T> rdd, scala.Function1<scala.collection.Iterator<T>,U> func, scala.collection.Seq<Object> partitions, scala.reflect.ClassTag<U> p4 ) : Object
[mangled: org/apache/spark/SparkContext.runJob:(Lorg/apache/spark/rdd/RDD;Lscala/Function1;Lscala/collection/Seq;Lscala/reflect/ClassTag;)Ljava/lang/Object;]
SparkContext.runJob ( rdd.RDD<T> rdd, scala.Function2<TaskContext,scala.collection.Iterator<T>,U> func, scala.collection.Seq<Object> partitions, scala.Function2<Object,U,scala.runtime.BoxedUnit> resultHandler, scala.reflect.ClassTag<U> p5 ) : void
[mangled: org/apache/spark/SparkContext.runJob:(Lorg/apache/spark/rdd/RDD;Lscala/Function2;Lscala/collection/Seq;Lscala/Function2;Lscala/reflect/ClassTag;)V]
SparkContext.runJob ( rdd.RDD<T> rdd, scala.Function2<TaskContext,scala.collection.Iterator<T>,U> func, scala.collection.Seq<Object> partitions, scala.reflect.ClassTag<U> p4 ) : Object
[mangled: org/apache/spark/SparkContext.runJob:(Lorg/apache/spark/rdd/RDD;Lscala/Function2;Lscala/collection/Seq;Lscala/reflect/ClassTag;)Ljava/lang/Object;]
SparkContext.schedulerBackend ( ) : scheduler.SchedulerBackend
[mangled: org/apache/spark/SparkContext.schedulerBackend:()Lorg/apache/spark/scheduler/SchedulerBackend;]
SparkContext.schedulerBackend_.eq ( scheduler.SchedulerBackend sb ) : void
[mangled: org/apache/spark/SparkContext.schedulerBackend_.eq:(Lorg/apache/spark/scheduler/SchedulerBackend;)V]
SparkContext.setCallSite ( util.CallSite callSite ) : void
[mangled: org/apache/spark/SparkContext.setCallSite:(Lorg/apache/spark/util/CallSite;)V]
SparkContext.setLogLevel ( String logLevel ) : void
[mangled: org/apache/spark/SparkContext.setLogLevel:(Ljava/lang/String;)V]
SparkContext.statusTracker ( ) : SparkStatusTracker
[mangled: org/apache/spark/SparkContext.statusTracker:()Lorg/apache/spark/SparkStatusTracker;]
SparkContext.stopped ( ) : java.util.concurrent.atomic.AtomicBoolean
[mangled: org/apache/spark/SparkContext.stopped:()Ljava/util/concurrent/atomic/AtomicBoolean;]
SparkContext.submitMapStage ( ShuffleDependency<K,V,C> dependency ) : SimpleFutureAction<MapOutputStatistics>
[mangled: org/apache/spark/SparkContext.submitMapStage:(Lorg/apache/spark/ShuffleDependency;)Lorg/apache/spark/SimpleFutureAction;]
SparkContext.ui ( ) : scala.Option<ui.SparkUI>
[mangled: org/apache/spark/SparkContext.ui:()Lscala/Option;]
SparkContext.withScope ( scala.Function0<U> body ) : U
[mangled: org/apache/spark/SparkContext.withScope:(Lscala/Function0;)Ljava/lang/Object;]
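SparkContext.getOrCreate is probably the most visible loss for a test harness that shares one context across suites; on 1.1.0 the caller has to manage reuse itself. A minimal sketch:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SharedContext {
  // 1.6.0: SparkContext.getOrCreate(conf) does this internally.
  // 1.1.0 has no such method, so cache the instance ourselves:
  @volatile private var sc: SparkContext = _

  def getOrCreate(conf: SparkConf): SparkContext = synchronized {
    if (sc == null) sc = new SparkContext(conf)
    sc
  }
}
```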
spark-core_2.10-1.6.0.jar, SparkListener.class
package org.apache.spark.scheduler
SparkListener.onBlockUpdated ( SparkListenerBlockUpdated p1 ) [abstract] : void
[mangled: org/apache/spark/scheduler/SparkListener.onBlockUpdated:(Lorg/apache/spark/scheduler/SparkListenerBlockUpdated;)V]
SparkListener.onExecutorAdded ( SparkListenerExecutorAdded p1 ) [abstract] : void
[mangled: org/apache/spark/scheduler/SparkListener.onExecutorAdded:(Lorg/apache/spark/scheduler/SparkListenerExecutorAdded;)V]
SparkListener.onExecutorRemoved ( SparkListenerExecutorRemoved p1 ) [abstract] : void
[mangled: org/apache/spark/scheduler/SparkListener.onExecutorRemoved:(Lorg/apache/spark/scheduler/SparkListenerExecutorRemoved;)V]
spark-core_2.10-1.6.0.jar, SparkListenerApplicationStart.class
package org.apache.spark.scheduler
SparkListenerApplicationStart.appAttemptId ( ) : scala.Option<String>
[mangled: org/apache/spark/scheduler/SparkListenerApplicationStart.appAttemptId:()Lscala/Option;]
SparkListenerApplicationStart.appId ( ) : scala.Option<String>
[mangled: org/apache/spark/scheduler/SparkListenerApplicationStart.appId:()Lscala/Option;]
SparkListenerApplicationStart.copy ( String appName, scala.Option<String> appId, long time, String sparkUser, scala.Option<String> appAttemptId, scala.Option<scala.collection.Map<String,String>> driverLogs ) : SparkListenerApplicationStart
[mangled: org/apache/spark/scheduler/SparkListenerApplicationStart.copy:(Ljava/lang/String;Lscala/Option;JLjava/lang/String;Lscala/Option;Lscala/Option;)Lorg/apache/spark/scheduler/SparkListenerApplicationStart;]
SparkListenerApplicationStart.driverLogs ( ) : scala.Option<scala.collection.Map<String,String>>
[mangled: org/apache/spark/scheduler/SparkListenerApplicationStart.driverLogs:()Lscala/Option;]
SparkListenerApplicationStart.SparkListenerApplicationStart ( String appName, scala.Option<String> appId, long time, String sparkUser, scala.Option<String> appAttemptId, scala.Option<scala.collection.Map<String,String>> driverLogs )
[mangled: org/apache/spark/scheduler/SparkListenerApplicationStart."<init>":(Ljava/lang/String;Lscala/Option;JLjava/lang/String;Lscala/Option;Lscala/Option;)V]
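Because the listener events are case classes, the added fields break pattern matches and constructor calls alike, not just direct accessors. A sketch of the 1.6.0 shape (against 1.1.0 the pattern has only the three fields listed under Added Methods above):

```scala
import org.apache.spark.scheduler.{SparkListenerApplicationStart, SparkListenerEvent}

def describe(event: SparkListenerEvent): String = event match {
  // 1.6.0: six fields. Against 1.1.0 this pattern would instead be
  //   SparkListenerApplicationStart(name, time, user)
  case SparkListenerApplicationStart(name, appId, time, user, attemptId, logs) =>
    s"$name started by $user at $time (appId=$appId)"
  case _ =>
    "other event"
}
```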
spark-core_2.10-1.6.0.jar, SparkListenerBlockManagerAdded.class
package org.apache.spark.scheduler
SparkListenerBlockManagerAdded.copy ( long time, org.apache.spark.storage.BlockManagerId blockManagerId, long maxMem ) : SparkListenerBlockManagerAdded
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded.copy:(JLorg/apache/spark/storage/BlockManagerId;J)Lorg/apache/spark/scheduler/SparkListenerBlockManagerAdded;]
SparkListenerBlockManagerAdded.SparkListenerBlockManagerAdded ( long time, org.apache.spark.storage.BlockManagerId blockManagerId, long maxMem )
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded."<init>":(JLorg/apache/spark/storage/BlockManagerId;J)V]
SparkListenerBlockManagerAdded.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded.time:()J]
spark-core_2.10-1.6.0.jar, SparkListenerBlockManagerRemoved.class
package org.apache.spark.scheduler
SparkListenerBlockManagerRemoved.copy ( long time, org.apache.spark.storage.BlockManagerId blockManagerId ) : SparkListenerBlockManagerRemoved
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.copy:(JLorg/apache/spark/storage/BlockManagerId;)Lorg/apache/spark/scheduler/SparkListenerBlockManagerRemoved;]
SparkListenerBlockManagerRemoved.curried ( ) [static] : scala.Function1<Object,scala.Function1<org.apache.spark.storage.BlockManagerId,SparkListenerBlockManagerRemoved>>
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.curried:()Lscala/Function1;]
SparkListenerBlockManagerRemoved.SparkListenerBlockManagerRemoved ( long time, org.apache.spark.storage.BlockManagerId blockManagerId )
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved."<init>":(JLorg/apache/spark/storage/BlockManagerId;)V]
SparkListenerBlockManagerRemoved.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.time:()J]
SparkListenerBlockManagerRemoved.tupled ( ) [static] : scala.Function1<scala.Tuple2<Object,org.apache.spark.storage.BlockManagerId>,SparkListenerBlockManagerRemoved>
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.tupled:()Lscala/Function1;]
spark-core_2.10-1.6.0.jar, SparkListenerBlockUpdated.class
package org.apache.spark.scheduler
SparkListenerBlockUpdated.andThen ( scala.Function1<SparkListenerBlockUpdated,A> p1 ) [static] : scala.Function1<org.apache.spark.storage.BlockUpdatedInfo,A>
[mangled: org/apache/spark/scheduler/SparkListenerBlockUpdated.andThen:(Lscala/Function1;)Lscala/Function1;]
SparkListenerBlockUpdated.blockUpdatedInfo ( ) : org.apache.spark.storage.BlockUpdatedInfo
[mangled: org/apache/spark/scheduler/SparkListenerBlockUpdated.blockUpdatedInfo:()Lorg/apache/spark/storage/BlockUpdatedInfo;]
SparkListenerBlockUpdated.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/scheduler/SparkListenerBlockUpdated.canEqual:(Ljava/lang/Object;)Z]
SparkListenerBlockUpdated.compose ( scala.Function1<A,org.apache.spark.storage.BlockUpdatedInfo> p1 ) [static] : scala.Function1<A,SparkListenerBlockUpdated>
[mangled: org/apache/spark/scheduler/SparkListenerBlockUpdated.compose:(Lscala/Function1;)Lscala/Function1;]
SparkListenerBlockUpdated.copy ( org.apache.spark.storage.BlockUpdatedInfo blockUpdatedInfo ) : SparkListenerBlockUpdated
[mangled: org/apache/spark/scheduler/SparkListenerBlockUpdated.copy:(Lorg/apache/spark/storage/BlockUpdatedInfo;)Lorg/apache/spark/scheduler/SparkListenerBlockUpdated;]
SparkListenerBlockUpdated.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/scheduler/SparkListenerBlockUpdated.equals:(Ljava/lang/Object;)Z]
SparkListenerBlockUpdated.hashCode ( ) : int
[mangled: org/apache/spark/scheduler/SparkListenerBlockUpdated.hashCode:()I]
SparkListenerBlockUpdated.productArity ( ) : int
[mangled: org/apache/spark/scheduler/SparkListenerBlockUpdated.productArity:()I]
SparkListenerBlockUpdated.productElement ( int p1 ) : Object
[mangled: org/apache/spark/scheduler/SparkListenerBlockUpdated.productElement:(I)Ljava/lang/Object;]
SparkListenerBlockUpdated.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/scheduler/SparkListenerBlockUpdated.productIterator:()Lscala/collection/Iterator;]
SparkListenerBlockUpdated.productPrefix ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerBlockUpdated.productPrefix:()Ljava/lang/String;]
SparkListenerBlockUpdated.SparkListenerBlockUpdated ( org.apache.spark.storage.BlockUpdatedInfo blockUpdatedInfo )
[mangled: org/apache/spark/scheduler/SparkListenerBlockUpdated."<init>":(Lorg/apache/spark/storage/BlockUpdatedInfo;)V]
SparkListenerBlockUpdated.toString ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerBlockUpdated.toString:()Ljava/lang/String;]
spark-core_2.10-1.6.0.jar, SparkListenerExecutorAdded.class
package org.apache.spark.scheduler
SparkListenerExecutorAdded.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.canEqual:(Ljava/lang/Object;)Z]
SparkListenerExecutorAdded.copy ( long time, String executorId, cluster.ExecutorInfo executorInfo ) : SparkListenerExecutorAdded
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.copy:(JLjava/lang/String;Lorg/apache/spark/scheduler/cluster/ExecutorInfo;)Lorg/apache/spark/scheduler/SparkListenerExecutorAdded;]
SparkListenerExecutorAdded.curried ( ) [static] : scala.Function1<Object,scala.Function1<String,scala.Function1<cluster.ExecutorInfo,SparkListenerExecutorAdded>>>
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.curried:()Lscala/Function1;]
SparkListenerExecutorAdded.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.equals:(Ljava/lang/Object;)Z]
SparkListenerExecutorAdded.executorId ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.executorId:()Ljava/lang/String;]
SparkListenerExecutorAdded.executorInfo ( ) : cluster.ExecutorInfo
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.executorInfo:()Lorg/apache/spark/scheduler/cluster/ExecutorInfo;]
SparkListenerExecutorAdded.hashCode ( ) : int
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.hashCode:()I]
SparkListenerExecutorAdded.productArity ( ) : int
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.productArity:()I]
SparkListenerExecutorAdded.productElement ( int p1 ) : Object
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.productElement:(I)Ljava/lang/Object;]
SparkListenerExecutorAdded.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.productIterator:()Lscala/collection/Iterator;]
SparkListenerExecutorAdded.productPrefix ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.productPrefix:()Ljava/lang/String;]
SparkListenerExecutorAdded.SparkListenerExecutorAdded ( long time, String executorId, cluster.ExecutorInfo executorInfo )
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded."<init>":(JLjava/lang/String;Lorg/apache/spark/scheduler/cluster/ExecutorInfo;)V]
SparkListenerExecutorAdded.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.time:()J]
SparkListenerExecutorAdded.toString ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.toString:()Ljava/lang/String;]
SparkListenerExecutorAdded.tupled ( ) [static] : scala.Function1<scala.Tuple3<Object,String,cluster.ExecutorInfo>,SparkListenerExecutorAdded>
[mangled: org/apache/spark/scheduler/SparkListenerExecutorAdded.tupled:()Lscala/Function1;]
spark-core_2.10-1.6.0.jar, SparkListenerExecutorRemoved.class
package org.apache.spark.scheduler
SparkListenerExecutorRemoved.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.canEqual:(Ljava/lang/Object;)Z]
SparkListenerExecutorRemoved.copy ( long time, String executorId, String reason ) : SparkListenerExecutorRemoved
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.copy:(JLjava/lang/String;Ljava/lang/String;)Lorg/apache/spark/scheduler/SparkListenerExecutorRemoved;]
SparkListenerExecutorRemoved.curried ( ) [static] : scala.Function1<Object,scala.Function1<String,scala.Function1<String,SparkListenerExecutorRemoved>>>
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.curried:()Lscala/Function1;]
SparkListenerExecutorRemoved.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.equals:(Ljava/lang/Object;)Z]
SparkListenerExecutorRemoved.executorId ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.executorId:()Ljava/lang/String;]
SparkListenerExecutorRemoved.hashCode ( ) : int
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.hashCode:()I]
SparkListenerExecutorRemoved.productArity ( ) : int
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.productArity:()I]
SparkListenerExecutorRemoved.productElement ( int p1 ) : Object
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.productElement:(I)Ljava/lang/Object;]
SparkListenerExecutorRemoved.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.productIterator:()Lscala/collection/Iterator;]
SparkListenerExecutorRemoved.productPrefix ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.productPrefix:()Ljava/lang/String;]
SparkListenerExecutorRemoved.reason ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.reason:()Ljava/lang/String;]
SparkListenerExecutorRemoved.SparkListenerExecutorRemoved ( long time, String executorId, String reason )
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved."<init>":(JLjava/lang/String;Ljava/lang/String;)V]
SparkListenerExecutorRemoved.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.time:()J]
SparkListenerExecutorRemoved.toString ( ) : String
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.toString:()Ljava/lang/String;]
SparkListenerExecutorRemoved.tupled ( ) [static] : scala.Function1<scala.Tuple3<Object,String,String>,SparkListenerExecutorRemoved>
[mangled: org/apache/spark/scheduler/SparkListenerExecutorRemoved.tupled:()Lscala/Function1;]
spark-core_2.10-1.6.0.jar, SparkListenerJobEnd.class
package org.apache.spark.scheduler
SparkListenerJobEnd.copy ( int jobId, long time, JobResult jobResult ) : SparkListenerJobEnd
[mangled: org/apache/spark/scheduler/SparkListenerJobEnd.copy:(IJLorg/apache/spark/scheduler/JobResult;)Lorg/apache/spark/scheduler/SparkListenerJobEnd;]
SparkListenerJobEnd.SparkListenerJobEnd ( int jobId, long time, JobResult jobResult )
[mangled: org/apache/spark/scheduler/SparkListenerJobEnd."<init>":(IJLorg/apache/spark/scheduler/JobResult;)V]
SparkListenerJobEnd.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerJobEnd.time:()J]
spark-core_2.10-1.6.0.jar, SparkListenerJobStart.class
package org.apache.spark.scheduler
SparkListenerJobStart.copy ( int jobId, long time, scala.collection.Seq<StageInfo> stageInfos, java.util.Properties properties ) : SparkListenerJobStart
[mangled: org/apache/spark/scheduler/SparkListenerJobStart.copy:(IJLscala/collection/Seq;Ljava/util/Properties;)Lorg/apache/spark/scheduler/SparkListenerJobStart;]
SparkListenerJobStart.SparkListenerJobStart ( int jobId, long time, scala.collection.Seq<StageInfo> stageInfos, java.util.Properties properties )
[mangled: org/apache/spark/scheduler/SparkListenerJobStart."<init>":(IJLscala/collection/Seq;Ljava/util/Properties;)V]
SparkListenerJobStart.stageInfos ( ) : scala.collection.Seq<StageInfo>
[mangled: org/apache/spark/scheduler/SparkListenerJobStart.stageInfos:()Lscala/collection/Seq;]
SparkListenerJobStart.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerJobStart.time:()J]
spark-core_2.10-1.6.0.jar, TaskMetrics.class
package org.apache.spark.executor
TaskMetrics.accumulatorUpdates ( ) : scala.collection.immutable.Map<Object,Object>
[mangled: org/apache/spark/executor/TaskMetrics.accumulatorUpdates:()Lscala/collection/immutable/Map;]
TaskMetrics.decDiskBytesSpilled ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.decDiskBytesSpilled:(J)V]
TaskMetrics.decMemoryBytesSpilled ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.decMemoryBytesSpilled:(J)V]
TaskMetrics.getCachedHostName ( String p1 ) [static] : String
[mangled: org/apache/spark/executor/TaskMetrics.getCachedHostName:(Ljava/lang/String;)Ljava/lang/String;]
TaskMetrics.getInputMetricsForReadMethod ( scala.Enumeration.Value readMethod ) : InputMetrics
[mangled: org/apache/spark/executor/TaskMetrics.getInputMetricsForReadMethod:(Lscala/Enumeration$Value;)Lorg/apache/spark/executor/InputMetrics;]
TaskMetrics.incDiskBytesSpilled ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.incDiskBytesSpilled:(J)V]
TaskMetrics.incMemoryBytesSpilled ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.incMemoryBytesSpilled:(J)V]
TaskMetrics.TaskMetrics.._hostname ( ) : String
[mangled: org/apache/spark/executor/TaskMetrics.org.apache.spark.executor.TaskMetrics.._hostname:()Ljava/lang/String;]
TaskMetrics.TaskMetrics.._hostname_.eq ( String p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.org.apache.spark.executor.TaskMetrics.._hostname_.eq:(Ljava/lang/String;)V]
TaskMetrics.outputMetrics ( ) : scala.Option<OutputMetrics>
[mangled: org/apache/spark/executor/TaskMetrics.outputMetrics:()Lscala/Option;]
TaskMetrics.outputMetrics_.eq ( scala.Option<OutputMetrics> p1 ) : void
[mangled: org/apache/spark/executor/TaskMetrics.outputMetrics_.eq:(Lscala/Option;)V]
TaskMetrics.setAccumulatorsUpdater ( scala.Function0<scala.collection.immutable.Map<Object,Object>> accumulatorsUpdater ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setAccumulatorsUpdater:(Lscala/Function0;)V]
TaskMetrics.setExecutorDeserializeTime ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setExecutorDeserializeTime:(J)V]
TaskMetrics.setExecutorRunTime ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setExecutorRunTime:(J)V]
TaskMetrics.setHostname ( String value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setHostname:(Ljava/lang/String;)V]
TaskMetrics.setInputMetrics ( scala.Option<InputMetrics> inputMetrics ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setInputMetrics:(Lscala/Option;)V]
TaskMetrics.setJvmGCTime ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setJvmGCTime:(J)V]
TaskMetrics.setResultSerializationTime ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setResultSerializationTime:(J)V]
TaskMetrics.setResultSize ( long value ) : void
[mangled: org/apache/spark/executor/TaskMetrics.setResultSize:(J)V]
TaskMetrics.updateAccumulators ( ) : void
[mangled: org/apache/spark/executor/TaskMetrics.updateAccumulators:()V]
TaskMetrics.updateInputMetrics ( ) : void
[mangled: org/apache/spark/executor/TaskMetrics.updateInputMetrics:()V]
spark-hive_2.10-1.6.0.jar, HiveContext.class
package org.apache.spark.sql.hive
HiveContext.addJar ( String path ) : void
[mangled: org/apache/spark/sql/hive/HiveContext.addJar:(Ljava/lang/String;)V]
HiveContext.conf ( ) : org.apache.spark.sql.SQLConf
[mangled: org/apache/spark/sql/hive/HiveContext.conf:()Lorg/apache/spark/sql/SQLConf;]
HiveContext.configure ( ) : scala.collection.immutable.Map<String,String>
[mangled: org/apache/spark/sql/hive/HiveContext.configure:()Lscala/collection/immutable/Map;]
HiveContext.CONVERT_CTAS ( ) [static] : org.apache.spark.sql.SQLConf.SQLConfEntry<Object>
[mangled: org/apache/spark/sql/hive/HiveContext.CONVERT_CTAS:()Lorg/apache/spark/sql/SQLConf$SQLConfEntry;]
HiveContext.CONVERT_METASTORE_PARQUET ( ) [static] : org.apache.spark.sql.SQLConf.SQLConfEntry<Object>
[mangled: org/apache/spark/sql/hive/HiveContext.CONVERT_METASTORE_PARQUET:()Lorg/apache/spark/sql/SQLConf$SQLConfEntry;]
HiveContext.CONVERT_METASTORE_PARQUET_WITH_SCHEMA_MERGING ( ) [static] : org.apache.spark.sql.SQLConf.SQLConfEntry<Object>
[mangled: org/apache/spark/sql/hive/HiveContext.CONVERT_METASTORE_PARQUET_WITH_SCHEMA_MERGING:()Lorg/apache/spark/sql/SQLConf$SQLConfEntry;]
HiveContext.convertCTAS ( ) : boolean
[mangled: org/apache/spark/sql/hive/HiveContext.convertCTAS:()Z]
HiveContext.convertMetastoreParquetWithSchemaMerging ( ) : boolean
[mangled: org/apache/spark/sql/hive/HiveContext.convertMetastoreParquetWithSchemaMerging:()Z]
HiveContext.defaultOverrides ( ) : void
[mangled: org/apache/spark/sql/hive/HiveContext.defaultOverrides:()V]
HiveContext.executionHive ( ) : client.ClientWrapper
[mangled: org/apache/spark/sql/hive/HiveContext.executionHive:()Lorg/apache/spark/sql/hive/client/ClientWrapper;]
HiveContext.getSQLDialect ( ) : org.apache.spark.sql.catalyst.ParserDialect
[mangled: org/apache/spark/sql/hive/HiveContext.getSQLDialect:()Lorg/apache/spark/sql/catalyst/ParserDialect;]
HiveContext.HIVE_EXECUTION_VERSION ( ) [static] : org.apache.spark.sql.SQLConf.SQLConfEntry<String>
[mangled: org/apache/spark/sql/hive/HiveContext.HIVE_EXECUTION_VERSION:()Lorg/apache/spark/sql/SQLConf$SQLConfEntry;]
HiveContext.HIVE_METASTORE_BARRIER_PREFIXES ( ) [static] : org.apache.spark.sql.SQLConf.SQLConfEntry<scala.collection.Seq<String>>
[mangled: org/apache/spark/sql/hive/HiveContext.HIVE_METASTORE_BARRIER_PREFIXES:()Lorg/apache/spark/sql/SQLConf$SQLConfEntry;]
HiveContext.HIVE_METASTORE_JARS ( ) [static] : org.apache.spark.sql.SQLConf.SQLConfEntry<String>
[mangled: org/apache/spark/sql/hive/HiveContext.HIVE_METASTORE_JARS:()Lorg/apache/spark/sql/SQLConf$SQLConfEntry;]
HiveContext.HIVE_METASTORE_SHARED_PREFIXES ( ) [static] : org.apache.spark.sql.SQLConf.SQLConfEntry<scala.collection.Seq<String>>
[mangled: org/apache/spark/sql/hive/HiveContext.HIVE_METASTORE_SHARED_PREFIXES:()Lorg/apache/spark/sql/SQLConf$SQLConfEntry;]
HiveContext.HIVE_METASTORE_VERSION ( ) [static] : org.apache.spark.sql.SQLConf.SQLConfEntry<String>
[mangled: org/apache/spark/sql/hive/HiveContext.HIVE_METASTORE_VERSION:()Lorg/apache/spark/sql/SQLConf$SQLConfEntry;]
HiveContext.HIVE_THRIFT_SERVER_ASYNC ( ) [static] : org.apache.spark.sql.SQLConf.SQLConfEntry<Object>
[mangled: org/apache/spark/sql/hive/HiveContext.HIVE_THRIFT_SERVER_ASYNC:()Lorg/apache/spark/sql/SQLConf$SQLConfEntry;]
HiveContext.HiveContext ( org.apache.spark.api.java.JavaSparkContext sc )
[mangled: org/apache/spark/sql/hive/HiveContext."<init>":(Lorg/apache/spark/api/java/JavaSparkContext;)V]
HiveContext.HiveContext ( org.apache.spark.SparkContext sc, org.apache.spark.sql.execution.CacheManager cacheManager, org.apache.spark.sql.execution.ui.SQLListener listener, client.ClientWrapper execHive, client.ClientInterface metaHive, boolean isRootContext )
[mangled: org/apache/spark/sql/hive/HiveContext."<init>":(Lorg/apache/spark/SparkContext;Lorg/apache/spark/sql/execution/CacheManager;Lorg/apache/spark/sql/execution/ui/SQLListener;Lorg/apache/spark/sql/hive/client/ClientWrapper;Lorg/apache/spark/sql/hive/client/ClientInterface;Z)V]
HiveContext.hiveExecutionVersion ( ) [static] : String
[mangled: org/apache/spark/sql/hive/HiveContext.hiveExecutionVersion:()Ljava/lang/String;]
HiveContext.hiveMetastoreBarrierPrefixes ( ) : scala.collection.Seq<String>
[mangled: org/apache/spark/sql/hive/HiveContext.hiveMetastoreBarrierPrefixes:()Lscala/collection/Seq;]
HiveContext.hiveMetastoreJars ( ) : String
[mangled: org/apache/spark/sql/hive/HiveContext.hiveMetastoreJars:()Ljava/lang/String;]
HiveContext.hiveMetastoreSharedPrefixes ( ) : scala.collection.Seq<String>
[mangled: org/apache/spark/sql/hive/HiveContext.hiveMetastoreSharedPrefixes:()Lscala/collection/Seq;]
HiveContext.hiveMetastoreVersion ( ) : String
[mangled: org/apache/spark/sql/hive/HiveContext.hiveMetastoreVersion:()Ljava/lang/String;]
HiveContext.hiveThriftServerAsync ( ) : boolean
[mangled: org/apache/spark/sql/hive/HiveContext.hiveThriftServerAsync:()Z]
HiveContext.hiveThriftServerSingleSession ( ) : boolean
[mangled: org/apache/spark/sql/hive/HiveContext.hiveThriftServerSingleSession:()Z]
HiveContext.invalidateTable ( String tableName ) : void
[mangled: org/apache/spark/sql/hive/HiveContext.invalidateTable:(Ljava/lang/String;)V]
HiveContext.metadataHive ( ) : client.ClientInterface
[mangled: org/apache/spark/sql/hive/HiveContext.metadataHive:()Lorg/apache/spark/sql/hive/client/ClientInterface;]
HiveContext.newSession ( ) : HiveContext
[mangled: org/apache/spark/sql/hive/HiveContext.newSession:()Lorg/apache/spark/sql/hive/HiveContext;]
HiveContext.newSession ( ) : org.apache.spark.sql.SQLContext
[mangled: org/apache/spark/sql/hive/HiveContext.newSession:()Lorg/apache/spark/sql/SQLContext;]
HiveContext.newTemporaryConfiguration ( ) [static] : scala.collection.immutable.Map<String,String>
[mangled: org/apache/spark/sql/hive/HiveContext.newTemporaryConfiguration:()Lscala/collection/immutable/Map;]
HiveContext.refreshTable ( String tableName ) : void
[mangled: org/apache/spark/sql/hive/HiveContext.refreshTable:(Ljava/lang/String;)V]
HiveContext.setConf ( org.apache.spark.sql.SQLConf.SQLConfEntry<T> entry, T value ) : void
[mangled: org/apache/spark/sql/hive/HiveContext.setConf:(Lorg/apache/spark/sql/SQLConf$SQLConfEntry;Ljava/lang/Object;)V]
HiveContext.substitutor ( ) : org.apache.hadoop.hive.ql.parse.VariableSubstitution
[mangled: org/apache/spark/sql/hive/HiveContext.substitutor:()Lorg/apache/hadoop/hive/ql/parse/VariableSubstitution;]
HiveContext.toHiveStructString ( scala.Tuple2<Object,org.apache.spark.sql.types.DataType> p1 ) [static] : String
[mangled: org/apache/spark/sql/hive/HiveContext.toHiveStructString:(Lscala/Tuple2;)Ljava/lang/String;]
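Every DataFrame entry that follows is "removed" simply because the class does not exist in 1.1.0 at all; the closest 1.1.0 counterpart is SchemaRDD. A sketch of a version-specific test helper:

```scala
// Against 1.6.0:
import org.apache.spark.sql.DataFrame

def assertNonEmpty(df: DataFrame): Unit =
  assert(df.count() > 0, "expected at least one row")

// Against 1.1.0 the analogous helper would take a SchemaRDD:
//   import org.apache.spark.sql.SchemaRDD
//   def assertNonEmpty(rdd: SchemaRDD): Unit =
//     assert(rdd.count() > 0, "expected at least one row")
```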
spark-sql_2.10-1.6.0.jar, DataFrame.class
package org.apache.spark.sql
DataFrame.agg ( java.util.Map<String,String> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( Column expr, Column... exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lorg/apache/spark/sql/Column;[Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( Column expr, scala.collection.Seq<Column> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lorg/apache/spark/sql/Column;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( scala.collection.immutable.Map<String,String> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( scala.Tuple2<String,String> aggExpr, scala.collection.Seq<scala.Tuple2<String,String>> aggExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lscala/Tuple2;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.alias ( scala.Symbol alias ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.alias:(Lscala/Symbol;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.alias ( String alias ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.alias:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.apply ( String colName ) : Column
[mangled: org/apache/spark/sql/DataFrame.apply:(Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
DataFrame.as ( Encoder<U> p1 ) : Dataset<U>
[mangled: org/apache/spark/sql/DataFrame.as:(Lorg/apache/spark/sql/Encoder;)Lorg/apache/spark/sql/Dataset;]
DataFrame.as ( scala.Symbol alias ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.as:(Lscala/Symbol;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.as ( String alias ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.as:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.cache ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.cache:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.coalesce ( int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.coalesce:(I)Lorg/apache/spark/sql/DataFrame;]
DataFrame.col ( String colName ) : Column
[mangled: org/apache/spark/sql/DataFrame.col:(Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
DataFrame.collect ( ) : Row[ ]
[mangled: org/apache/spark/sql/DataFrame.collect:()[Lorg/apache/spark/sql/Row;]
DataFrame.collectAsList ( ) : java.util.List<Row>
[mangled: org/apache/spark/sql/DataFrame.collectAsList:()Ljava/util/List;]
DataFrame.collectToPython ( ) : int
[mangled: org/apache/spark/sql/DataFrame.collectToPython:()I]
DataFrame.columns ( ) : String[ ]
[mangled: org/apache/spark/sql/DataFrame.columns:()[Ljava/lang/String;]
DataFrame.count ( ) : long
[mangled: org/apache/spark/sql/DataFrame.count:()J]
DataFrame.cube ( Column... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.cube ( scala.collection.Seq<Column> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:(Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.cube ( String col1, scala.collection.Seq<String> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.cube ( String col1, String... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.DataFrame ( SQLContext sqlContext, catalyst.plans.logical.LogicalPlan logicalPlan )
[mangled: org/apache/spark/sql/DataFrame."<init>":(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)V]
DataFrame.DataFrame ( SQLContext sqlContext, execution.QueryExecution queryExecution )
[mangled: org/apache/spark/sql/DataFrame."<init>":(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/execution/QueryExecution;)V]
DataFrame.describe ( scala.collection.Seq<String> cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.describe:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.describe ( String... cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.describe:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.distinct ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.distinct:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.drop ( Column col ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.drop:(Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.drop ( String colName ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.drop:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.dropDuplicates ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.dropDuplicates:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.dropDuplicates ( scala.collection.Seq<String> colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.dropDuplicates:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.dropDuplicates ( String[ ] colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.dropDuplicates:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.dtypes ( ) : scala.Tuple2<String,String>[ ]
[mangled: org/apache/spark/sql/DataFrame.dtypes:()[Lscala/Tuple2;]
DataFrame.except ( DataFrame other ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.except:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.explain ( ) : void
[mangled: org/apache/spark/sql/DataFrame.explain:()V]
DataFrame.explain ( boolean extended ) : void
[mangled: org/apache/spark/sql/DataFrame.explain:(Z)V]
DataFrame.explode ( scala.collection.Seq<Column> input, scala.Function1<Row,scala.collection.TraversableOnce<A>> f, scala.reflect.api.TypeTags.TypeTag<A> p3 ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.explode:(Lscala/collection/Seq;Lscala/Function1;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.explode ( String inputColumn, String outputColumn, scala.Function1<A,scala.collection.TraversableOnce<B>> f, scala.reflect.api.TypeTags.TypeTag<B> p4 ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.explode:(Ljava/lang/String;Ljava/lang/String;Lscala/Function1;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.filter ( Column condition ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.filter:(Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.filter ( String conditionExpr ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.filter:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.first ( ) : Row
[mangled: org/apache/spark/sql/DataFrame.first:()Lorg/apache/spark/sql/Row;]
DataFrame.flatMap ( scala.Function1<Row,scala.collection.TraversableOnce<R>> f, scala.reflect.ClassTag<R> p2 ) : org.apache.spark.rdd.RDD<R>
[mangled: org/apache/spark/sql/DataFrame.flatMap:(Lscala/Function1;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
DataFrame.foreach ( scala.Function1<Row,scala.runtime.BoxedUnit> f ) : void
[mangled: org/apache/spark/sql/DataFrame.foreach:(Lscala/Function1;)V]
DataFrame.foreachPartition ( scala.Function1<scala.collection.Iterator<Row>,scala.runtime.BoxedUnit> f ) : void
[mangled: org/apache/spark/sql/DataFrame.foreachPartition:(Lscala/Function1;)V]
DataFrame.groupBy ( Column... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.groupBy ( scala.collection.Seq<Column> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:(Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.groupBy ( String col1, scala.collection.Seq<String> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.groupBy ( String col1, String... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.head ( ) : Row
[mangled: org/apache/spark/sql/DataFrame.head:()Lorg/apache/spark/sql/Row;]
DataFrame.head ( int n ) : Row[ ]
[mangled: org/apache/spark/sql/DataFrame.head:(I)[Lorg/apache/spark/sql/Row;]
DataFrame.inputFiles ( ) : String[ ]
[mangled: org/apache/spark/sql/DataFrame.inputFiles:()[Ljava/lang/String;]
DataFrame.intersect ( DataFrame other ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.intersect:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.isLocal ( ) : boolean
[mangled: org/apache/spark/sql/DataFrame.isLocal:()Z]
DataFrame.javaRDD ( ) : org.apache.spark.api.java.JavaRDD<Row>
[mangled: org/apache/spark/sql/DataFrame.javaRDD:()Lorg/apache/spark/api/java/JavaRDD;]
DataFrame.javaToPython ( ) : org.apache.spark.api.java.JavaRDD<byte[ ]>
[mangled: org/apache/spark/sql/DataFrame.javaToPython:()Lorg/apache/spark/api/java/JavaRDD;]
DataFrame.join ( DataFrame right ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, Column joinExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, Column joinExprs, String joinType ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Lorg/apache/spark/sql/Column;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, scala.collection.Seq<String> usingColumns ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, scala.collection.Seq<String> usingColumns, String joinType ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Lscala/collection/Seq;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, String usingColumn ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.limit ( int n ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.limit:(I)Lorg/apache/spark/sql/DataFrame;]
DataFrame.logicalPlan ( ) : catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/DataFrame.logicalPlan:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
DataFrame.map ( scala.Function1<Row,R> f, scala.reflect.ClassTag<R> p2 ) : org.apache.spark.rdd.RDD<R>
[mangled: org/apache/spark/sql/DataFrame.map:(Lscala/Function1;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
DataFrame.mapPartitions ( scala.Function1<scala.collection.Iterator<Row>,scala.collection.Iterator<R>> f, scala.reflect.ClassTag<R> p2 ) : org.apache.spark.rdd.RDD<R>
[mangled: org/apache/spark/sql/DataFrame.mapPartitions:(Lscala/Function1;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
DataFrame.na ( ) : DataFrameNaFunctions
[mangled: org/apache/spark/sql/DataFrame.na:()Lorg/apache/spark/sql/DataFrameNaFunctions;]
DataFrame.numericColumns ( ) : scala.collection.Seq<catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/DataFrame.numericColumns:()Lscala/collection/Seq;]
DataFrame.orderBy ( Column... sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.orderBy ( scala.collection.Seq<Column> sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.orderBy ( String sortCol, scala.collection.Seq<String> sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.orderBy ( String sortCol, String... sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.DataFrame..collect ( boolean needCallback ) : Row[ ]
[mangled: org/apache/spark/sql/DataFrame.org.apache.spark.sql.DataFrame..collect:(Z)[Lorg/apache/spark/sql/Row;]
DataFrame.DataFrame..withPlan ( scala.Function0<catalyst.plans.logical.LogicalPlan> logicalPlan ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.org.apache.spark.sql.DataFrame..withPlan:(Lscala/Function0;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.persist ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.persist:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.persist ( org.apache.spark.storage.StorageLevel newLevel ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.persist:(Lorg/apache/spark/storage/StorageLevel;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.printSchema ( ) : void
[mangled: org/apache/spark/sql/DataFrame.printSchema:()V]
DataFrame.queryExecution ( ) : execution.QueryExecution
[mangled: org/apache/spark/sql/DataFrame.queryExecution:()Lorg/apache/spark/sql/execution/QueryExecution;]
DataFrame.randomSplit ( double[ ] weights ) : DataFrame[ ]
[mangled: org/apache/spark/sql/DataFrame.randomSplit:([D)[Lorg/apache/spark/sql/DataFrame;]
DataFrame.randomSplit ( double[ ] weights, long seed ) : DataFrame[ ]
[mangled: org/apache/spark/sql/DataFrame.randomSplit:([DJ)[Lorg/apache/spark/sql/DataFrame;]
DataFrame.randomSplit ( scala.collection.immutable.List<Object> weights, long seed ) : DataFrame[ ]
[mangled: org/apache/spark/sql/DataFrame.randomSplit:(Lscala/collection/immutable/List;J)[Lorg/apache/spark/sql/DataFrame;]
DataFrame.rdd ( ) : org.apache.spark.rdd.RDD<Row>
[mangled: org/apache/spark/sql/DataFrame.rdd:()Lorg/apache/spark/rdd/RDD;]
DataFrame.registerTempTable ( String tableName ) : void
[mangled: org/apache/spark/sql/DataFrame.registerTempTable:(Ljava/lang/String;)V]
DataFrame.repartition ( int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.repartition:(I)Lorg/apache/spark/sql/DataFrame;]
DataFrame.repartition ( int numPartitions, Column... partitionExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.repartition:(I[Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.repartition ( int numPartitions, scala.collection.Seq<Column> partitionExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.repartition:(ILscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.repartition ( Column... partitionExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.repartition:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.repartition ( scala.collection.Seq<Column> partitionExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.repartition:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.resolve ( String colName ) : catalyst.expressions.NamedExpression
[mangled: org/apache/spark/sql/DataFrame.resolve:(Ljava/lang/String;)Lorg/apache/spark/sql/catalyst/expressions/NamedExpression;]
DataFrame.rollup ( Column... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.rollup ( scala.collection.Seq<Column> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:(Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.rollup ( String col1, scala.collection.Seq<String> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.rollup ( String col1, String... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.sample ( boolean withReplacement, double fraction ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sample:(ZD)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sample ( boolean withReplacement, double fraction, long seed ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sample:(ZDJ)Lorg/apache/spark/sql/DataFrame;]
DataFrame.schema ( ) : types.StructType
[mangled: org/apache/spark/sql/DataFrame.schema:()Lorg/apache/spark/sql/types/StructType;]
DataFrame.select ( Column... cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.select ( scala.collection.Seq<Column> cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.select ( String col, scala.collection.Seq<String> cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.select ( String col, String... cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.selectExpr ( scala.collection.Seq<String> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.selectExpr:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.selectExpr ( String... exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.selectExpr:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.show ( ) : void
[mangled: org/apache/spark/sql/DataFrame.show:()V]
DataFrame.show ( boolean truncate ) : void
[mangled: org/apache/spark/sql/DataFrame.show:(Z)V]
DataFrame.show ( int numRows ) : void
[mangled: org/apache/spark/sql/DataFrame.show:(I)V]
DataFrame.show ( int numRows, boolean truncate ) : void
[mangled: org/apache/spark/sql/DataFrame.show:(IZ)V]
DataFrame.showString ( int _numRows, boolean truncate ) : String
[mangled: org/apache/spark/sql/DataFrame.showString:(IZ)Ljava/lang/String;]
DataFrame.sort ( Column... sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sort ( scala.collection.Seq<Column> sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sort ( String sortCol, scala.collection.Seq<String> sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sort ( String sortCol, String... sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sortWithinPartitions ( Column... sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sortWithinPartitions:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sortWithinPartitions ( scala.collection.Seq<Column> sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sortWithinPartitions:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sortWithinPartitions ( String sortCol, scala.collection.Seq<String> sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sortWithinPartitions:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sortWithinPartitions ( String sortCol, String... sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sortWithinPartitions:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sqlContext ( ) : SQLContext
[mangled: org/apache/spark/sql/DataFrame.sqlContext:()Lorg/apache/spark/sql/SQLContext;]
DataFrame.stat ( ) : DataFrameStatFunctions
[mangled: org/apache/spark/sql/DataFrame.stat:()Lorg/apache/spark/sql/DataFrameStatFunctions;]
DataFrame.take ( int n ) : Row[ ]
[mangled: org/apache/spark/sql/DataFrame.take:(I)[Lorg/apache/spark/sql/Row;]
DataFrame.takeAsList ( int n ) : java.util.List<Row>
[mangled: org/apache/spark/sql/DataFrame.takeAsList:(I)Ljava/util/List;]
DataFrame.toDF ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.toDF:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.toDF ( scala.collection.Seq<String> colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.toDF:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.toDF ( String... colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.toDF:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.toJavaRDD ( ) : org.apache.spark.api.java.JavaRDD<Row>
[mangled: org/apache/spark/sql/DataFrame.toJavaRDD:()Lorg/apache/spark/api/java/JavaRDD;]
DataFrame.toJSON ( ) : org.apache.spark.rdd.RDD<String>
[mangled: org/apache/spark/sql/DataFrame.toJSON:()Lorg/apache/spark/rdd/RDD;]
DataFrame.toString ( ) : String
[mangled: org/apache/spark/sql/DataFrame.toString:()Ljava/lang/String;]
DataFrame.transform ( scala.Function1<DataFrame,DataFrame> t ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.transform:(Lscala/Function1;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.unionAll ( DataFrame other ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.unionAll:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.unpersist ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.unpersist:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.unpersist ( boolean blocking ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.unpersist:(Z)Lorg/apache/spark/sql/DataFrame;]
DataFrame.where ( Column condition ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.where:(Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.where ( String conditionExpr ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.where:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.withColumn ( String colName, Column col ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.withColumn:(Ljava/lang/String;Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.withColumn ( String colName, Column col, types.Metadata metadata ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.withColumn:(Ljava/lang/String;Lorg/apache/spark/sql/Column;Lorg/apache/spark/sql/types/Metadata;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.withColumnRenamed ( String existingName, String newName ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.withColumnRenamed:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.withNewExecutionId ( scala.Function0<T> body ) : T
[mangled: org/apache/spark/sql/DataFrame.withNewExecutionId:(Lscala/Function0;)Ljava/lang/Object;]
DataFrame.write ( ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrame.write:()Lorg/apache/spark/sql/DataFrameWriter;]
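The breadth of this list reflects that the whole DataFrame API is newer than 1.1.0, which offered SchemaRDD instead. A minimal sketch, assuming Spark 1.6.0 and a local master, exercising a few of the methods above; the output path is hypothetical:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object DataFrameProbe {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("df-probe").setMaster("local[2]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    val df = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3))).toDF("key", "value")

    df.groupBy("key").count().show()              // groupBy ( String, String... ) from the list above
    df.sortWithinPartitions("key").explain(true)  // sortWithinPartitions was only added in 1.6.0
    df.dropDuplicates(Array("key")).write.json("/tmp/df-probe") // hypothetical output path

    sc.stop()
  }
}
```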
spark-streaming-kafka_2.10-1.6.0.jar, KafkaTestUtils.class
package org.apache.spark.streaming.kafka
KafkaTestUtils.brokerAddress ( ) : String
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.brokerAddress:()Ljava/lang/String;]
KafkaTestUtils.createTopic ( String topic ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.createTopic:(Ljava/lang/String;)V]
KafkaTestUtils.eventually ( org.apache.spark.streaming.Time timeout, org.apache.spark.streaming.Time interval, scala.Function0<T> func ) : T
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.eventually:(Lorg/apache/spark/streaming/Time;Lorg/apache/spark/streaming/Time;Lscala/Function0;)Ljava/lang/Object;]
KafkaTestUtils.isTraceEnabled ( ) : boolean
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.isTraceEnabled:()Z]
KafkaTestUtils.KafkaTestUtils ( )
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils."<init>":()V]
KafkaTestUtils.log ( ) : org.slf4j.Logger
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.log:()Lorg/slf4j/Logger;]
KafkaTestUtils.logDebug ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.logDebug:(Lscala/Function0;)V]
KafkaTestUtils.logDebug ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.logDebug:(Lscala/Function0;Ljava/lang/Throwable;)V]
KafkaTestUtils.logError ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.logError:(Lscala/Function0;)V]
KafkaTestUtils.logError ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.logError:(Lscala/Function0;Ljava/lang/Throwable;)V]
KafkaTestUtils.logInfo ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.logInfo:(Lscala/Function0;)V]
KafkaTestUtils.logInfo ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.logInfo:(Lscala/Function0;Ljava/lang/Throwable;)V]
KafkaTestUtils.logName ( ) : String
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.logName:()Ljava/lang/String;]
KafkaTestUtils.logTrace ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.logTrace:(Lscala/Function0;)V]
KafkaTestUtils.logTrace ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.logTrace:(Lscala/Function0;Ljava/lang/Throwable;)V]
KafkaTestUtils.logWarning ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.logWarning:(Lscala/Function0;)V]
KafkaTestUtils.logWarning ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.logWarning:(Lscala/Function0;Ljava/lang/Throwable;)V]
KafkaTestUtils.org.apache.spark.Logging..log_ ( ) : org.slf4j.Logger
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.org.apache.spark.Logging..log_:()Lorg/slf4j/Logger;]
KafkaTestUtils.org.apache.spark.Logging..log__.eq ( org.slf4j.Logger p1 ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.org.apache.spark.Logging..log__.eq:(Lorg/slf4j/Logger;)V]
KafkaTestUtils.KafkaTestUtils..brokerConf ( ) : kafka.server.KafkaConfig
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.org.apache.spark.streaming.kafka.KafkaTestUtils..brokerConf:()Lkafka/server/KafkaConfig;]
KafkaTestUtils.KafkaTestUtils..brokerConf_.eq ( kafka.server.KafkaConfig p1 ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.org.apache.spark.streaming.kafka.KafkaTestUtils..brokerConf_.eq:(Lkafka/server/KafkaConfig;)V]
KafkaTestUtils.KafkaTestUtils..brokerConfiguration ( ) : java.util.Properties
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.org.apache.spark.streaming.kafka.KafkaTestUtils..brokerConfiguration:()Ljava/util/Properties;]
KafkaTestUtils.KafkaTestUtils..brokerPort_.eq ( int p1 ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.org.apache.spark.streaming.kafka.KafkaTestUtils..brokerPort_.eq:(I)V]
KafkaTestUtils.KafkaTestUtils..server ( ) : kafka.server.KafkaServer
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.org.apache.spark.streaming.kafka.KafkaTestUtils..server:()Lkafka/server/KafkaServer;]
KafkaTestUtils.KafkaTestUtils..server_.eq ( kafka.server.KafkaServer p1 ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.org.apache.spark.streaming.kafka.KafkaTestUtils..server_.eq:(Lkafka/server/KafkaServer;)V]
KafkaTestUtils.sendMessages ( String topic, java.util.Map<String,Integer> messageToFreq ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.sendMessages:(Ljava/lang/String;Ljava/util/Map;)V]
KafkaTestUtils.sendMessages ( String topic, scala.collection.immutable.Map<String,Object> messageToFreq ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.sendMessages:(Ljava/lang/String;Lscala/collection/immutable/Map;)V]
KafkaTestUtils.sendMessages ( String topic, String[ ] messages ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.sendMessages:(Ljava/lang/String;[Ljava/lang/String;)V]
KafkaTestUtils.setup ( ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.setup:()V]
KafkaTestUtils.teardown ( ) : void
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.teardown:()V]
KafkaTestUtils.zkAddress ( ) : String
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.zkAddress:()Ljava/lang/String;]
KafkaTestUtils.zookeeperClient ( ) : org.I0Itec.zkclient.ZkClient
[mangled: org/apache/spark/streaming/kafka/KafkaTestUtils.zookeeperClient:()Lorg/I0Itec/zkclient/ZkClient;]
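A minimal sketch of how a test might drive the embedded broker through the members above; the topic name and payloads are hypothetical. Note that KafkaTestUtils is intended for Spark's own tests, so source-level visibility may be restricted; the report describes it at the bytecode level, where these members are public.

```scala
import org.apache.spark.streaming.kafka.KafkaTestUtils

object KafkaFixture {
  def main(args: Array[String]): Unit = {
    val kafka = new KafkaTestUtils()
    kafka.setup()                                  // boots an embedded Zookeeper and a Kafka broker
    try {
      kafka.createTopic("probe-topic")             // hypothetical topic name
      kafka.sendMessages("probe-topic", Array("a", "b", "c"))
      println(s"broker: ${kafka.brokerAddress}, zookeeper: ${kafka.zkAddress}")
    } finally {
      kafka.teardown()                             // always tear the fixture down
    }
  }
}
```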
spark-streaming_2.10-1.6.0.jar, Checkpoint.class
package org.apache.spark.streaming
Checkpoint.createSparkConf ( ) : org.apache.spark.SparkConf
[mangled: org/apache/spark/streaming/Checkpoint.createSparkConf:()Lorg/apache/spark/SparkConf;]
Checkpoint.deserialize ( java.io.InputStream p1, org.apache.spark.SparkConf p2 ) [static] : Checkpoint
[mangled: org/apache/spark/streaming/Checkpoint.deserialize:(Ljava/io/InputStream;Lorg/apache/spark/SparkConf;)Lorg/apache/spark/streaming/Checkpoint;]
Checkpoint.getCheckpointFiles ( String p1, scala.Option<org.apache.hadoop.fs.FileSystem> p2 ) [static] : scala.collection.Seq<org.apache.hadoop.fs.Path>
[mangled: org/apache/spark/streaming/Checkpoint.getCheckpointFiles:(Ljava/lang/String;Lscala/Option;)Lscala/collection/Seq;]
Checkpoint.serialize ( Checkpoint p1, org.apache.spark.SparkConf p2 ) [static] : byte[ ]
[mangled: org/apache/spark/streaming/Checkpoint.serialize:(Lorg/apache/spark/streaming/Checkpoint;Lorg/apache/spark/SparkConf;)[B]
spark-streaming_2.10-1.6.0.jar, DStream<T>.class
package org.apache.spark.streaming.dstream
DStream<T>.baseScope ( ) : scala.Option<String>
[mangled: org/apache/spark/streaming/dstream/DStream<T>.baseScope:()Lscala/Option;]
DStream<T>.createRDDWithLocalProperties ( org.apache.spark.streaming.Time time, boolean displayInnerRDDOps, scala.Function0<U> body ) : U
[mangled: org/apache/spark/streaming/dstream/DStream<T>.createRDDWithLocalProperties:(Lorg/apache/spark/streaming/Time;ZLscala/Function0;)Ljava/lang/Object;]
DStream<T>.creationSite ( ) : org.apache.spark.util.CallSite
[mangled: org/apache/spark/streaming/dstream/DStream<T>.creationSite:()Lorg/apache/spark/util/CallSite;]
DStream<T>.DStream..foreachRDD ( scala.Function2<org.apache.spark.rdd.RDD<T>,org.apache.spark.streaming.Time,scala.runtime.BoxedUnit> foreachFunc, boolean displayInnerRDDOps ) : void
[mangled: org/apache/spark/streaming/dstream/DStream<T>.org.apache.spark.streaming.dstream.DStream..foreachRDD:(Lscala/Function2;Z)V]
DStream<T>.print ( int num ) : void
[mangled: org/apache/spark/streaming/dstream/DStream<T>.print:(I)V]
DStream<T>.toPairDStreamFunctions ( DStream<scala.Tuple2<K,V>> p1, scala.reflect.ClassTag<K> p2, scala.reflect.ClassTag<V> p3, scala.math.Ordering<K> p4 ) [static] : PairDStreamFunctions<K,V>
[mangled: org/apache/spark/streaming/dstream/DStream<T>.toPairDStreamFunctions:(Lorg/apache/spark/streaming/dstream/DStream;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/streaming/dstream/PairDStreamFunctions;]
DStream<T>.validateAtStart ( ) : void
[mangled: org/apache/spark/streaming/dstream/DStream<T>.validateAtStart:()V]
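Of these, print(int) is the easiest to trip over. A minimal sketch, assuming Spark 1.6.0 and a hypothetical socket source, that compiles and runs on 1.6.0 but fails on 1.1.0, where only the no-argument print() exists:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object DStreamPrintProbe {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("print-probe").setMaster("local[2]"), Seconds(1))

    val lines = ssc.socketTextStream("localhost", 9999) // hypothetical source
    lines.print(20) // print(int) shows up to 20 elements per batch; 1.1.0 has no such overload

    ssc.start()
    ssc.awaitTermination()
  }
}
```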
spark-streaming_2.10-1.6.0.jar, Duration.class
package org.apache.spark.streaming
Duration.div ( Duration that ) : double
[mangled: org/apache/spark/streaming/Duration.div:(Lorg/apache/spark/streaming/Duration;)D]
Duration.greater ( Duration that ) : boolean
[mangled: org/apache/spark/streaming/Duration.greater:(Lorg/apache/spark/streaming/Duration;)Z]
Duration.greaterEq ( Duration that ) : boolean
[mangled: org/apache/spark/streaming/Duration.greaterEq:(Lorg/apache/spark/streaming/Duration;)Z]
Duration.less ( Duration that ) : boolean
[mangled: org/apache/spark/streaming/Duration.less:(Lorg/apache/spark/streaming/Duration;)Z]
Duration.lessEq ( Duration that ) : boolean
[mangled: org/apache/spark/streaming/Duration.lessEq:(Lorg/apache/spark/streaming/Duration;)Z]
Duration.minus ( Duration that ) : Duration
[mangled: org/apache/spark/streaming/Duration.minus:(Lorg/apache/spark/streaming/Duration;)Lorg/apache/spark/streaming/Duration;]
Duration.Duration..millis ( ) : long
[mangled: org/apache/spark/streaming/Duration.org.apache.spark.streaming.Duration..millis:()J]
Duration.plus ( Duration that ) : Duration
[mangled: org/apache/spark/streaming/Duration.plus:(Lorg/apache/spark/streaming/Duration;)Lorg/apache/spark/streaming/Duration;]
Duration.times ( int times ) : Duration
[mangled: org/apache/spark/streaming/Duration.times:(I)Lorg/apache/spark/streaming/Duration;]
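These are ordinary methods rather than symbolic operators, so they are callable from Java as well as Scala. A small self-contained sketch of the arithmetic:

```scala
import org.apache.spark.streaming.{Duration, Seconds}

object DurationArithmetic {
  def main(args: Array[String]): Unit = {
    val batch  = Seconds(10)
    val window = Seconds(30)

    val sum: Duration    = batch.plus(window)  // 40 seconds
    val diff: Duration   = window.minus(batch) // 20 seconds
    val triple: Duration = batch.times(3)      // 30 seconds
    val ratio: Double    = window.div(batch)   // 3.0

    assert(window.greater(batch) && batch.lessEq(window))
    println(s"sum=$sum diff=$diff triple=$triple ratio=$ratio")
  }
}
```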
spark-streaming_2.10-1.6.0.jar, InputDStream<T>.class
package org.apache.spark.streaming.dstream
InputDStream<T>.baseScope ( ) : scala.Option<String>
[mangled: org/apache/spark/streaming/dstream/InputDStream<T>.baseScope:()Lscala/Option;]
InputDStream<T>.id ( ) : int
[mangled: org/apache/spark/streaming/dstream/InputDStream<T>.id:()I]
InputDStream<T>.name ( ) : String
[mangled: org/apache/spark/streaming/dstream/InputDStream<T>.name:()Ljava/lang/String;]
InputDStream<T>.rateController ( ) : scala.Option<org.apache.spark.streaming.scheduler.RateController>
[mangled: org/apache/spark/streaming/dstream/InputDStream<T>.rateController:()Lscala/Option;]
spark-streaming_2.10-1.6.0.jar, JobScheduler.class
package org.apache.spark.streaming.scheduler
JobScheduler.clock ( ) : org.apache.spark.util.Clock
[mangled: org/apache/spark/streaming/scheduler/JobScheduler.clock:()Lorg/apache/spark/util/Clock;]
spark-streaming_2.10-1.6.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.awaitTerminationOrTimeout ( long timeout ) : boolean
[mangled: org/apache/spark/streaming/StreamingContext.awaitTerminationOrTimeout:(J)Z]
StreamingContext.binaryRecordsStream ( String directory, int recordLength ) : dstream.DStream<byte[ ]>
[mangled: org/apache/spark/streaming/StreamingContext.binaryRecordsStream:(Ljava/lang/String;I)Lorg/apache/spark/streaming/dstream/DStream;]
StreamingContext.fileStream ( String directory, scala.Function1<org.apache.hadoop.fs.Path,Object> filter, boolean newFilesOnly, org.apache.hadoop.conf.Configuration conf, scala.reflect.ClassTag<K> p5, scala.reflect.ClassTag<V> p6, scala.reflect.ClassTag<F> p7 ) : dstream.InputDStream<scala.Tuple2<K,V>>
[mangled: org/apache/spark/streaming/StreamingContext.fileStream:(Ljava/lang/String;Lscala/Function1;ZLorg/apache/hadoop/conf/Configuration;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/streaming/dstream/InputDStream;]
StreamingContext.getActive ( ) [static] : scala.Option<StreamingContext>
[mangled: org/apache/spark/streaming/StreamingContext.getActive:()Lscala/Option;]
StreamingContext.getActiveOrCreate ( scala.Function0<StreamingContext> p1 ) [static] : StreamingContext
[mangled: org/apache/spark/streaming/StreamingContext.getActiveOrCreate:(Lscala/Function0;)Lorg/apache/spark/streaming/StreamingContext;]
StreamingContext.getActiveOrCreate ( String p1, scala.Function0<StreamingContext> p2, org.apache.hadoop.conf.Configuration p3, boolean p4 ) [static] : StreamingContext
[mangled: org/apache/spark/streaming/StreamingContext.getActiveOrCreate:(Ljava/lang/String;Lscala/Function0;Lorg/apache/hadoop/conf/Configuration;Z)Lorg/apache/spark/streaming/StreamingContext;]
StreamingContext.getNewInputStreamId ( ) : int
[mangled: org/apache/spark/streaming/StreamingContext.getNewInputStreamId:()I]
StreamingContext.getStartSite ( ) : org.apache.spark.util.CallSite
[mangled: org/apache/spark/streaming/StreamingContext.getStartSite:()Lorg/apache/spark/util/CallSite;]
StreamingContext.getState ( ) : StreamingContextState
[mangled: org/apache/spark/streaming/StreamingContext.getState:()Lorg/apache/spark/streaming/StreamingContextState;]
StreamingContext.isCheckpointingEnabled ( ) : boolean
[mangled: org/apache/spark/streaming/StreamingContext.isCheckpointingEnabled:()Z]
StreamingContext.StreamingContext..startSite ( ) : java.util.concurrent.atomic.AtomicReference<org.apache.spark.util.CallSite>
[mangled: org/apache/spark/streaming/StreamingContext.org.apache.spark.streaming.StreamingContext..startSite:()Ljava/util/concurrent/atomic/AtomicReference;]
StreamingContext.StreamingContext..stopOnShutdown ( ) : void
[mangled: org/apache/spark/streaming/StreamingContext.org.apache.spark.streaming.StreamingContext..stopOnShutdown:()V]
StreamingContext.progressListener ( ) : ui.StreamingJobProgressListener
[mangled: org/apache/spark/streaming/StreamingContext.progressListener:()Lorg/apache/spark/streaming/ui/StreamingJobProgressListener;]
StreamingContext.StreamingContext ( String path, org.apache.spark.SparkContext sparkContext )
[mangled: org/apache/spark/streaming/StreamingContext."<init>":(Ljava/lang/String;Lorg/apache/spark/SparkContext;)V]
StreamingContext.uiTab ( ) : scala.Option<ui.StreamingTab>
[mangled: org/apache/spark/streaming/StreamingContext.uiTab:()Lscala/Option;]
StreamingContext.withNamedScope ( String name, scala.Function0<U> body ) : U
[mangled: org/apache/spark/streaming/StreamingContext.withNamedScope:(Ljava/lang/String;Lscala/Function0;)Ljava/lang/Object;]
StreamingContext.withScope ( scala.Function0<U> body ) : U
[mangled: org/apache/spark/streaming/StreamingContext.withScope:(Lscala/Function0;)Ljava/lang/Object;]
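A minimal sketch combining two of the additions above, getActiveOrCreate and awaitTerminationOrTimeout, assuming Spark 1.6.0 and a hypothetical socket source:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingContextProbe {
  private def createContext(): StreamingContext = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("ssc-probe").setMaster("local[2]"), Seconds(1))
    ssc.socketTextStream("localhost", 9999).print() // hypothetical source
    ssc
  }

  def main(args: Array[String]): Unit = {
    // Reuses an already-started context if one exists, otherwise builds one.
    val ssc = StreamingContext.getActiveOrCreate(createContext _)
    ssc.start()
    // Returns false if the 60 s timeout elapses before termination.
    if (!ssc.awaitTerminationOrTimeout(60000L)) {
      ssc.stop(stopSparkContext = true, stopGracefully = true)
    }
  }
}
```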
spark-streaming_2.10-1.6.0.jar, StreamingListener.class
package org.apache.spark.streaming.scheduler
StreamingListener.onOutputOperationCompleted ( StreamingListenerOutputOperationCompleted p1 ) [abstract] : void
[mangled: org/apache/spark/streaming/scheduler/StreamingListener.onOutputOperationCompleted:(Lorg/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted;)V]
StreamingListener.onOutputOperationStarted ( StreamingListenerOutputOperationStarted p1 ) [abstract] : void
[mangled: org/apache/spark/streaming/scheduler/StreamingListener.onOutputOperationStarted:(Lorg/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted;)V]
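These two callbacks mean listener implementations must be prepared for output-operation events that 1.1.0 never emits. A sketch of a listener overriding them (in Scala the trait supplies empty default bodies, so only the hooks of interest need overriding):

```scala
import org.apache.spark.streaming.scheduler.{
  StreamingListener,
  StreamingListenerOutputOperationCompleted,
  StreamingListenerOutputOperationStarted
}

class OutputOpLogger extends StreamingListener {
  override def onOutputOperationStarted(
      started: StreamingListenerOutputOperationStarted): Unit =
    println(s"output op started: ${started.outputOperationInfo}")

  override def onOutputOperationCompleted(
      completed: StreamingListenerOutputOperationCompleted): Unit =
    println(s"output op completed: ${completed.outputOperationInfo}")
}
```

Such a listener would be registered with ssc.addStreamingListener(new OutputOpLogger).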
spark-streaming_2.10-1.6.0.jar, StreamingListenerOutputOperationCompleted.class
package org.apache.spark.streaming.scheduler
StreamingListenerOutputOperationCompleted.andThen ( scala.Function1<StreamingListenerOutputOperationCompleted,A> p1 ) [static] : scala.Function1<OutputOperationInfo,A>
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted.andThen:(Lscala/Function1;)Lscala/Function1;]
StreamingListenerOutputOperationCompleted.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted.canEqual:(Ljava/lang/Object;)Z]
StreamingListenerOutputOperationCompleted.compose ( scala.Function1<A,OutputOperationInfo> p1 ) [static] : scala.Function1<A,StreamingListenerOutputOperationCompleted>
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted.compose:(Lscala/Function1;)Lscala/Function1;]
StreamingListenerOutputOperationCompleted.copy ( OutputOperationInfo outputOperationInfo ) : StreamingListenerOutputOperationCompleted
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted.copy:(Lorg/apache/spark/streaming/scheduler/OutputOperationInfo;)Lorg/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted;]
StreamingListenerOutputOperationCompleted.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted.equals:(Ljava/lang/Object;)Z]
StreamingListenerOutputOperationCompleted.hashCode ( ) : int
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted.hashCode:()I]
StreamingListenerOutputOperationCompleted.outputOperationInfo ( ) : OutputOperationInfo
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted.outputOperationInfo:()Lorg/apache/spark/streaming/scheduler/OutputOperationInfo;]
StreamingListenerOutputOperationCompleted.productArity ( ) : int
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted.productArity:()I]
StreamingListenerOutputOperationCompleted.productElement ( int p1 ) : Object
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted.productElement:(I)Ljava/lang/Object;]
StreamingListenerOutputOperationCompleted.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted.productIterator:()Lscala/collection/Iterator;]
StreamingListenerOutputOperationCompleted.productPrefix ( ) : String
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted.productPrefix:()Ljava/lang/String;]
StreamingListenerOutputOperationCompleted.StreamingListenerOutputOperationCompleted ( OutputOperationInfo outputOperationInfo )
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted."<init>":(Lorg/apache/spark/streaming/scheduler/OutputOperationInfo;)V]
StreamingListenerOutputOperationCompleted.toString ( ) : String
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationCompleted.toString:()Ljava/lang/String;]
spark-streaming_2.10-1.6.0.jar, StreamingListenerOutputOperationStarted.class
package org.apache.spark.streaming.scheduler
StreamingListenerOutputOperationStarted.andThen ( scala.Function1<StreamingListenerOutputOperationStarted,A> p1 ) [static] : scala.Function1<OutputOperationInfo,A>
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted.andThen:(Lscala/Function1;)Lscala/Function1;]
StreamingListenerOutputOperationStarted.canEqual ( Object p1 ) : boolean
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted.canEqual:(Ljava/lang/Object;)Z]
StreamingListenerOutputOperationStarted.compose ( scala.Function1<A,OutputOperationInfo> p1 ) [static] : scala.Function1<A,StreamingListenerOutputOperationStarted>
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted.compose:(Lscala/Function1;)Lscala/Function1;]
StreamingListenerOutputOperationStarted.copy ( OutputOperationInfo outputOperationInfo ) : StreamingListenerOutputOperationStarted
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted.copy:(Lorg/apache/spark/streaming/scheduler/OutputOperationInfo;)Lorg/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted;]
StreamingListenerOutputOperationStarted.equals ( Object p1 ) : boolean
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted.equals:(Ljava/lang/Object;)Z]
StreamingListenerOutputOperationStarted.hashCode ( ) : int
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted.hashCode:()I]
StreamingListenerOutputOperationStarted.outputOperationInfo ( ) : OutputOperationInfo
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted.outputOperationInfo:()Lorg/apache/spark/streaming/scheduler/OutputOperationInfo;]
StreamingListenerOutputOperationStarted.productArity ( ) : int
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted.productArity:()I]
StreamingListenerOutputOperationStarted.productElement ( int p1 ) : Object
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted.productElement:(I)Ljava/lang/Object;]
StreamingListenerOutputOperationStarted.productIterator ( ) : scala.collection.Iterator<Object>
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted.productIterator:()Lscala/collection/Iterator;]
StreamingListenerOutputOperationStarted.productPrefix ( ) : String
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted.productPrefix:()Ljava/lang/String;]
StreamingListenerOutputOperationStarted.StreamingListenerOutputOperationStarted ( OutputOperationInfo outputOperationInfo )
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted."<init>":(Lorg/apache/spark/streaming/scheduler/OutputOperationInfo;)V]
StreamingListenerOutputOperationStarted.toString ( ) : String
[mangled: org/apache/spark/streaming/scheduler/StreamingListenerOutputOperationStarted.toString:()Ljava/lang/String;]
spark-streaming_2.10-1.6.0.jar, Time.class
package org.apache.spark.streaming
Time.floor ( Duration that, Time zeroTime ) : Time
[mangled: org/apache/spark/streaming/Time.floor:(Lorg/apache/spark/streaming/Duration;Lorg/apache/spark/streaming/Time;)Lorg/apache/spark/streaming/Time;]
Time.greater ( Time that ) : boolean
[mangled: org/apache/spark/streaming/Time.greater:(Lorg/apache/spark/streaming/Time;)Z]
Time.greaterEq ( Time that ) : boolean
[mangled: org/apache/spark/streaming/Time.greaterEq:(Lorg/apache/spark/streaming/Time;)Z]
Time.less ( Time that ) : boolean
[mangled: org/apache/spark/streaming/Time.less:(Lorg/apache/spark/streaming/Time;)Z]
Time.lessEq ( Time that ) : boolean
[mangled: org/apache/spark/streaming/Time.lessEq:(Lorg/apache/spark/streaming/Time;)Z]
Time.minus ( Duration that ) : Time
[mangled: org/apache/spark/streaming/Time.minus:(Lorg/apache/spark/streaming/Duration;)Lorg/apache/spark/streaming/Time;]
Time.minus ( Time that ) : Duration
[mangled: org/apache/spark/streaming/Time.minus:(Lorg/apache/spark/streaming/Time;)Lorg/apache/spark/streaming/Duration;]
Time.plus ( Duration that ) : Time
[mangled: org/apache/spark/streaming/Time.plus:(Lorg/apache/spark/streaming/Duration;)Lorg/apache/spark/streaming/Time;]
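Time mirrors Duration's comparison operators but mixes the two types: subtracting a Time yields a Duration, while adding or subtracting a Duration yields a Time. A small sketch; reading floor as aligning a timestamp down to a Duration-sized grid anchored at zeroTime is an assumption, not something stated in the report:

```scala
import org.apache.spark.streaming.{Duration, Seconds, Time}

object TimeArithmetic {
  def main(args: Array[String]): Unit = {
    val t0 = new Time(1000000L)          // absolute timestamp in milliseconds
    val t1 = t0.plus(Seconds(25))        // Time + Duration => Time

    val elapsed: Duration = t1.minus(t0) // Time - Time => Duration
    // Assumed semantics: round t1 down to a multiple of 10 s counted from t0.
    val aligned: Time = t1.floor(Seconds(10), t0)

    assert(t1.greater(t0) && t0.lessEq(t1))
    println(s"elapsed=$elapsed aligned=$aligned")
  }
}
```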
Problems with Data Types, High Severity (22)
spark-catalyst_2.10-1.6.0.jar
package org.apache.spark.sql
Row (1)
| | Change | Effect |
|---|---|---|
| 1 | This interface has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
affected methods (35):
anyNull ( ) This abstract method is from 'Row' interface.
apply ( int ) This abstract method is from 'Row' interface.
copy ( ) This abstract method is from 'Row' interface.
equals ( java.lang.Object ) This abstract method is from 'Row' interface.
fieldIndex ( java.lang.String ) This abstract method is from 'Row' interface.
get ( int ) This abstract method is from 'Row' interface.
getAs ( int ) This abstract method is from 'Row' interface.
getAs ( java.lang.String ) This abstract method is from 'Row' interface.
getBoolean ( int ) This abstract method is from 'Row' interface.
getByte ( int ) This abstract method is from 'Row' interface.
getDate ( int ) This abstract method is from 'Row' interface.
getDecimal ( int ) This abstract method is from 'Row' interface.
getDouble ( int ) This abstract method is from 'Row' interface.
getFloat ( int ) This abstract method is from 'Row' interface.
getInt ( int ) This abstract method is from 'Row' interface.
getJavaMap ( int ) This abstract method is from 'Row' interface.
getList ( int ) This abstract method is from 'Row' interface.
getLong ( int ) This abstract method is from 'Row' interface.
getMap ( int ) This abstract method is from 'Row' interface.
getSeq ( int ) This abstract method is from 'Row' interface.
getShort ( int ) This abstract method is from 'Row' interface.
getString ( int ) This abstract method is from 'Row' interface.
getStruct ( int ) This abstract method is from 'Row' interface.
getTimestamp ( int ) This abstract method is from 'Row' interface.
getValuesMap ( scala.collection.Seq<java.lang.String> ) This abstract method is from 'Row' interface.
hashCode ( ) This abstract method is from 'Row' interface.
isNullAt ( int ) This abstract method is from 'Row' interface.
length ( ) This abstract method is from 'Row' interface.
mkString ( ) This abstract method is from 'Row' interface.
mkString ( java.lang.String ) This abstract method is from 'Row' interface.
mkString ( java.lang.String, java.lang.String, java.lang.String ) This abstract method is from 'Row' interface.
schema ( ) This abstract method is from 'Row' interface.
size ( ) This abstract method is from 'Row' interface.
toSeq ( ) This abstract method is from 'Row' interface.
toString ( ) This abstract method is from 'Row' interface.
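To see why this is high severity, consider a minimal sketch compiled against 1.6.0. On a 1.1.0 classpath the org.apache.spark.sql.Row interface has no class file of its own (1.1.0 exposes Row only as a type alias for a Catalyst type), so the JVM raises NoClassDefFoundError as soon as it links a class whose signatures mention Row, before any method body runs:

```scala
import org.apache.spark.sql.Row

object RowProbe {
  def describe(row: Row): String =
    if (row.anyNull) "has nulls"
    else row.mkString("[", ", ", "]")

  def main(args: Array[String]): Unit =
    println(describe(Row("a", 1))) // Row.apply from the 1.6.0 companion object
}
```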
package org.apache.spark.sql.catalyst.analysis
Analyzer (1)
| | Change | Effect |
|---|---|---|
| 1 | Removed super-interface CheckAnalysis. | A client program may be interrupted by a NoSuchMethodError exception. |
affected methods (1):
analyzer ( ) Return value of this method has type 'Analyzer'.
package org.apache.spark.sql.catalyst.plans.logical
LogicalPlan (1)
| | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.Logging. | A client program may be interrupted by a NoSuchMethodError exception. |
affected methods (1):
executePlan ( LogicalPlan ) 1st parameter 'plan' of this method has type 'LogicalPlan'.
spark-core_2.10-1.6.0.jar
package org.apache.spark
SparkContext (1)
| | Change | Effect |
|---|---|---|
| 1 | Removed super-interface ExecutorAllocationClient. | A client program may be interrupted by a NoSuchMethodError exception. |
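The practical symptom: methods that SparkContext picks up through the ExecutorAllocationClient contract, such as requestExecutors and killExecutor, exist in 1.6.0 but have no counterpart in 1.1.0. A minimal sketch, assuming Spark 1.6.0 and a coarse-grained cluster manager (outside such backends these calls only log a warning and return false):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ElasticityProbe {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("alloc-probe"))

    // Both methods belong to the ExecutorAllocationClient contract; neither
    // exists in 1.1.0, so this bytecode fails there with NoSuchMethodError.
    if (sc.requestExecutors(2)) println("asked the cluster manager for 2 more executors")
    sc.killExecutor("1") // "1" is a hypothetical executor id

    sc.stop()
  }
}
```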
affected methods (151):
JavaSparkContext ( SparkContext ) 1st parameter 'sc' of this method has type 'SparkContext'.
context ( ) Return value of this method has type 'SparkContext'.
rdd.RDD ( SparkContext, scala.collection.Seq<Dependency<?>>, scala.reflect.ClassTag<T> ) 1st parameter '_sc' of this method has type 'SparkContext'.
sparkContext ( ) Return value of this method has type 'SparkContext'.
accumulable ( R, java.lang.String, AccumulableParam<R,T> ) This method is from 'SparkContext' class.
accumulable ( R, AccumulableParam<R,T> ) This method is from 'SparkContext' class.
accumulableCollection ( R, scala.Function1<R,scala.collection.generic.Growable<T>>, scala.reflect.ClassTag<R> ) This method is from 'SparkContext' class.
accumulator ( T, java.lang.String, AccumulatorParam<T> ) This method is from 'SparkContext' class.
accumulator ( T, AccumulatorParam<T> ) This method is from 'SparkContext' class.
addedFiles ( ) This method is from 'SparkContext' class.
addedJars ( ) This method is from 'SparkContext' class.
addFile ( java.lang.String ) This method is from 'SparkContext' class.
addJar ( java.lang.String ) This method is from 'SparkContext' class.
addSparkListener ( scheduler.SparkListener ) This method is from 'SparkContext' class.
appName ( ) This method is from 'SparkContext' class.
booleanWritableConverter ( ) This method is from 'SparkContext' class.
boolToBoolWritable ( boolean ) This method is from 'SparkContext' class.
broadcast ( T, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
bytesToBytesWritable ( byte[ ] ) This method is from 'SparkContext' class.
bytesWritableConverter ( ) This method is from 'SparkContext' class.
cancelAllJobs ( ) This method is from 'SparkContext' class.
cancelJob ( int ) This method is from 'SparkContext' class.
cancelJobGroup ( java.lang.String ) This method is from 'SparkContext' class.
cancelStage ( int ) This method is from 'SparkContext' class.
checkpointDir ( ) This method is from 'SparkContext' class.
checkpointDir_.eq ( scala.Option<java.lang.String> ) This method is from 'SparkContext' class.
checkpointFile ( java.lang.String, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
clean ( F, boolean ) This method is from 'SparkContext' class.
cleaner ( ) This method is from 'SparkContext' class.
cleanup ( long ) This method is from 'SparkContext' class.
clearCallSite ( ) This method is from 'SparkContext' class.
clearJobGroup ( ) This method is from 'SparkContext' class.
conf ( ) This method is from 'SparkContext' class.
dagScheduler ( ) This method is from 'SparkContext' class.
dagScheduler_.eq ( scheduler.DAGScheduler ) This method is from 'SparkContext' class.
defaultMinPartitions ( ) This method is from 'SparkContext' class.
defaultParallelism ( ) This method is from 'SparkContext' class.
doubleRDDToDoubleRDDFunctions ( rdd.RDD<java.lang.Object> ) This method is from 'SparkContext' class.
doubleToDoubleWritable ( double ) This method is from 'SparkContext' class.
doubleWritableConverter ( ) This method is from 'SparkContext' class.
emptyRDD ( scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
env ( ) This method is from 'SparkContext' class.
eventLogger ( ) This method is from 'SparkContext' class.
executorEnvs ( ) This method is from 'SparkContext' class.
executorMemory ( ) This method is from 'SparkContext' class.
files ( ) This method is from 'SparkContext' class.
floatToFloatWritable ( float ) This method is from 'SparkContext' class.
floatWritableConverter ( ) This method is from 'SparkContext' class.
getAllPools ( ) This method is from 'SparkContext' class.
getCallSite ( ) This method is from 'SparkContext' class.
getCheckpointDir ( ) This method is from 'SparkContext' class.
getConf ( ) This method is from 'SparkContext' class.
getExecutorMemoryStatus ( ) This method is from 'SparkContext' class.
getExecutorStorageStatus ( ) This method is from 'SparkContext' class.
getLocalProperties ( ) This method is from 'SparkContext' class.
getLocalProperty ( java.lang.String ) This method is from 'SparkContext' class.
getPersistentRDDs ( ) This method is from 'SparkContext' class.
getPoolForName ( java.lang.String ) This method is from 'SparkContext' class.
getPreferredLocs ( rdd.RDD<?>, int ) This method is from 'SparkContext' class.
getRDDStorageInfo ( ) This method is from 'SparkContext' class.
getSchedulingMode ( ) This method is from 'SparkContext' class.
getSparkHome ( ) This method is from 'SparkContext' class.
hadoopConfiguration ( ) This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, int, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, java.lang.Class<? extends org.apache.hadoop.mapred.InputFormat<K,V>>, java.lang.Class<K>, java.lang.Class<V>, int ) This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) This method is from 'SparkContext' class.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<? extends org.apache.hadoop.mapred.InputFormat<K,V>>, java.lang.Class<K>, java.lang.Class<V>, int ) This method is from 'SparkContext' class.
intToIntWritable ( int ) This method is from 'SparkContext' class.
intWritableConverter ( ) This method is from 'SparkContext' class.
isLocal ( ) This method is from 'SparkContext' class.
isTraceEnabled ( ) This method is from 'SparkContext' class.
jarOfClass ( java.lang.Class<?> ) This method is from 'SparkContext' class.
jarOfObject ( java.lang.Object ) This method is from 'SparkContext' class.
jars ( ) This method is from 'SparkContext' class.
listenerBus ( ) This method is from 'SparkContext' class.
localProperties ( ) This method is from 'SparkContext' class.
log ( ) This method is from 'SparkContext' class.
logDebug ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logDebug ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
logError ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logError ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
logInfo ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logInfo ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
logName ( ) This method is from 'SparkContext' class.
logTrace ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logTrace ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
logWarning ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logWarning ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
longToLongWritable ( long ) This method is from 'SparkContext' class.
longWritableConverter ( ) This method is from 'SparkContext' class.
makeRDD ( scala.collection.Seq<scala.Tuple2<T,scala.collection.Seq<java.lang.String>>>, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
makeRDD ( scala.collection.Seq<T>, int, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
master ( ) This method is from 'SparkContext' class.
metadataCleaner ( ) This method is from 'SparkContext' class.
newAPIHadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, org.apache.hadoop.conf.Configuration ) This method is from 'SparkContext' class.
newAPIHadoopFile ( java.lang.String, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) This method is from 'SparkContext' class.
newAPIHadoopRDD ( org.apache.hadoop.conf.Configuration, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) This method is from 'SparkContext' class.
newRddId ( ) This method is from 'SparkContext' class.
newShuffleId ( ) This method is from 'SparkContext' class.
numericRDDToDoubleRDDFunctions ( rdd.RDD<T>, scala.math.Numeric<T> ) This method is from 'SparkContext' class.
objectFile ( java.lang.String, int, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
Logging..log_ ( ) This method is from 'SparkContext' class.
Logging..log__.eq ( org.slf4j.Logger ) This method is from 'SparkContext' class.
SparkContext..warnSparkMem ( java.lang.String ) This method is from 'SparkContext' class.
parallelize ( scala.collection.Seq<T>, int, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
persistentRdds ( ) This method is from 'SparkContext' class.
persistRDD ( rdd.RDD<?> ) This method is from 'SparkContext' class.
rddToAsyncRDDActions ( rdd.RDD<T>, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
rddToOrderedRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.math.Ordering<K>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V> ) This method is from 'SparkContext' class.
rddToPairRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.math.Ordering<K> ) This method is from 'SparkContext' class.
rddToSequenceFileRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.Function1<K,org.apache.hadoop.io.Writable>, scala.reflect.ClassTag<K>, scala.Function1<V,org.apache.hadoop.io.Writable>, scala.reflect.ClassTag<V> ) This method is from 'SparkContext' class.
runApproximateJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, partial.ApproximateEvaluator<U,R>, long ) This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> ) This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.reflect.ClassTag<U> ) This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> ) This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.reflect.ClassTag<U> ) This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, int, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.Function0<WritableConverter<K>>, scala.Function0<WritableConverter<V>> ) This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V> ) This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V>, int ) This method is from 'SparkContext' class.
setCallSite ( java.lang.String ) This method is from 'SparkContext' class.
setCheckpointDir ( java.lang.String ) This method is from 'SparkContext' class.
setJobDescription ( java.lang.String ) This method is from 'SparkContext' class.
setJobGroup ( java.lang.String, java.lang.String, boolean ) This method is from 'SparkContext' class.
setLocalProperties ( java.util.Properties ) This method is from 'SparkContext' class.
setLocalProperty ( java.lang.String, java.lang.String ) This method is from 'SparkContext' class.
SparkContext ( ) This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String ) This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String ) This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String, scala.collection.Seq<java.lang.String> ) This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, SparkConf ) This constructor is from 'SparkContext' class.
SparkContext ( SparkConf ) This constructor is from 'SparkContext' class.
sparkUser ( ) This method is from 'SparkContext' class.
startTime ( ) This method is from 'SparkContext' class.
stop ( ) This method is from 'SparkContext' class.
stringToText ( java.lang.String ) This method is from 'SparkContext' class.
stringWritableConverter ( ) This method is from 'SparkContext' class.
submitJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.Function0<R> ) This method is from 'SparkContext' class.
taskScheduler ( ) This method is from 'SparkContext' class.
taskScheduler_.eq ( scheduler.TaskScheduler ) This method is from 'SparkContext' class.
textFile ( java.lang.String, int ) This method is from 'SparkContext' class.
union ( rdd.RDD<T>, scala.collection.Seq<rdd.RDD<T>>, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
union ( scala.collection.Seq<rdd.RDD<T>>, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
unpersistRDD ( int, boolean ) This method is from 'SparkContext' class.
version ( ) This method is from 'SparkContext' class.
wholeTextFiles ( java.lang.String, int ) This method is from 'SparkContext' class.
writableWritableConverter ( ) This method is from 'SparkContext' class.
HiveContext ( SparkContext ) 1st parameter 'sc' of this method has type 'SparkContext'.
sc ( )Return value of this method has type 'SparkContext'.
sparkContext ( )Return value of this method has type 'SparkContext'.
StreamingContext ( SparkContext, streaming.Checkpoint, streaming.Duration )1st parameter 'sc_' of this method has type 'SparkContext'.
StreamingContext ( SparkContext, streaming.Duration )1st parameter 'sparkContext' of this method has type 'SparkContext'.
TaskContext (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface java.io.Serializable. | A client program may be interrupted by a NoSuchMethodError exception. |

affected methods (3)
compute ( Partition, TaskContext ) 2nd parameter 'p2' of this abstract method has type 'TaskContext'.
computeOrReadCheckpoint ( Partition, TaskContext ) 2nd parameter 'context' of this method has type 'TaskContext'.
iterator ( Partition, TaskContext ) 2nd parameter 'context' of this method has type 'TaskContext'.
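In practice the lost marker interface bites any client that serializes a TaskContext. A minimal Scala sketch of the failing pattern; the `snapshot` helper is hypothetical, not part of either API:

```scala
import java.io.{ByteArrayOutputStream, ObjectOutputStream}
import org.apache.spark.TaskContext

// Serializing a TaskContext relies on the removed java.io.Serializable
// super-interface: against 1.6.0 this can succeed, while on a 1.1.0
// classpath writeObject throws java.io.NotSerializableException at runtime.
def snapshot(ctx: TaskContext): Array[Byte] = {
  val buffer = new ByteArrayOutputStream()
  val out = new ObjectOutputStream(buffer)
  out.writeObject(ctx) // depends on the removed super-interface
  out.close()
  buffer.toByteArray
}
```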
package org.apache.spark.api.java
JavaSparkContext (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface java.io.Closeable. | A client program may be interrupted by a NoSuchMethodError exception. |

affected methods (1)
JavaSparkContext ( org.apache.spark.SparkContext ) This constructor is from 'JavaSparkContext' class.
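The removed interface matters to clients that manage the context as a generic Closeable resource. A hedged sketch, assuming compilation against the 1.6.0 artifact; `withJavaContext` is a hypothetical loan-pattern helper:

```scala
import java.io.Closeable
import org.apache.spark.api.java.JavaSparkContext

// The ascription to Closeable type-checks only against 1.6.0, where
// JavaSparkContext declares the interface; run against 1.1.0 the
// invokeinterface on close() can fail with an
// IncompatibleClassChangeError/NoSuchMethodError.
def withJavaContext[T](jsc: JavaSparkContext)(body: JavaSparkContext => T): T =
  try body(jsc) finally (jsc: Closeable).close()
```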
package org.apache.spark.broadcast
Broadcast<T> (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.Logging. | A client program may be interrupted by a NoSuchMethodError exception. |

affected methods (1)
broadcast ( T, scala.reflect.ClassTag<T> ) Return value of this method has type 'Broadcast<T>'.
package org.apache.spark.executor
OutputMetrics (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

affected methods (1)
recordsWritten ( ) This method is from 'OutputMetrics' class.
package org.apache.spark.rdd
PairRDDFunctions<K,V> (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.mapreduce.SparkHadoopMapReduceUtil. | A client program may be interrupted by a NoSuchMethodError exception. |

affected methods (1)
rddToPairRDDFunctions ( RDD<scala.Tuple2<K,V>>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.math.Ordering<K> ) Return value of this method has type 'PairRDDFunctions<K,V>'.
package org.apache.spark.scheduler
SparkListener (3)

| # | Change | Effect |
|---|---|---|
| 1 | Abstract method onBlockUpdated ( SparkListenerBlockUpdated ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
| 2 | Abstract method onExecutorAdded ( SparkListenerExecutorAdded ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
| 3 | Abstract method onExecutorRemoved ( SparkListenerExecutorRemoved ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |

affected methods (15)
onApplicationEnd ( SparkListenerApplicationEnd ) This abstract method is from 'SparkListener' interface.
onApplicationStart ( SparkListenerApplicationStart ) This abstract method is from 'SparkListener' interface.
onBlockManagerAdded ( SparkListenerBlockManagerAdded ) This abstract method is from 'SparkListener' interface.
onBlockManagerRemoved ( SparkListenerBlockManagerRemoved ) This abstract method is from 'SparkListener' interface.
onEnvironmentUpdate ( SparkListenerEnvironmentUpdate ) This abstract method is from 'SparkListener' interface.
onExecutorMetricsUpdate ( SparkListenerExecutorMetricsUpdate ) This abstract method is from 'SparkListener' interface.
onJobEnd ( SparkListenerJobEnd ) This abstract method is from 'SparkListener' interface.
onJobStart ( SparkListenerJobStart ) This abstract method is from 'SparkListener' interface.
onStageCompleted ( SparkListenerStageCompleted ) This abstract method is from 'SparkListener' interface.
onStageSubmitted ( SparkListenerStageSubmitted ) This abstract method is from 'SparkListener' interface.
onTaskEnd ( SparkListenerTaskEnd ) This abstract method is from 'SparkListener' interface.
onTaskGettingResult ( SparkListenerTaskGettingResult ) This abstract method is from 'SparkListener' interface.
onTaskStart ( SparkListenerTaskStart ) This abstract method is from 'SparkListener' interface.
onUnpersistRDD ( SparkListenerUnpersistRDD ) This abstract method is from 'SparkListener' interface.
addSparkListener ( SparkListener ) 1st parameter 'listener' of this method has type 'SparkListener'.
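Listener implementations that override the removed callbacks are the most exposed client code. A sketch compiled against 1.6.0; the `ExecutorTracker` class is illustrative:

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerExecutorAdded}

// Compiled against 1.6.0. On a 1.1.0 classpath the parameter type
// SparkListenerExecutorAdded no longer exists, so loading or dispatching
// into this listener typically fails with NoClassDefFoundError before the
// callback body ever runs.
class ExecutorTracker extends SparkListener {
  override def onExecutorAdded(added: SparkListenerExecutorAdded): Unit =
    println(s"executor ${added.executorId} added at ${added.time}")
}
```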
SparkListenerBlockUpdated (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

affected methods (13)
andThen ( scala.Function1<SparkListenerBlockUpdated,A> ) This method is from 'SparkListenerBlockUpdated' class.
blockUpdatedInfo ( ) This method is from 'SparkListenerBlockUpdated' class.
canEqual ( java.lang.Object ) This method is from 'SparkListenerBlockUpdated' class.
compose ( scala.Function1<A,org.apache.spark.storage.BlockUpdatedInfo> ) This method is from 'SparkListenerBlockUpdated' class.
copy ( org.apache.spark.storage.BlockUpdatedInfo ) This method is from 'SparkListenerBlockUpdated' class.
equals ( java.lang.Object ) This method is from 'SparkListenerBlockUpdated' class.
hashCode ( ) This method is from 'SparkListenerBlockUpdated' class.
productArity ( ) This method is from 'SparkListenerBlockUpdated' class.
productElement ( int ) This method is from 'SparkListenerBlockUpdated' class.
productIterator ( ) This method is from 'SparkListenerBlockUpdated' class.
productPrefix ( ) This method is from 'SparkListenerBlockUpdated' class.
SparkListenerBlockUpdated ( org.apache.spark.storage.BlockUpdatedInfo ) This constructor is from 'SparkListenerBlockUpdated' class.
toString ( ) This method is from 'SparkListenerBlockUpdated' class.
SparkListenerExecutorAdded (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

affected methods (15)
canEqual ( java.lang.Object ) This method is from 'SparkListenerExecutorAdded' class.
copy ( long, java.lang.String, cluster.ExecutorInfo ) This method is from 'SparkListenerExecutorAdded' class.
curried ( ) This method is from 'SparkListenerExecutorAdded' class.
equals ( java.lang.Object ) This method is from 'SparkListenerExecutorAdded' class.
executorId ( ) This method is from 'SparkListenerExecutorAdded' class.
executorInfo ( ) This method is from 'SparkListenerExecutorAdded' class.
hashCode ( ) This method is from 'SparkListenerExecutorAdded' class.
productArity ( ) This method is from 'SparkListenerExecutorAdded' class.
productElement ( int ) This method is from 'SparkListenerExecutorAdded' class.
productIterator ( ) This method is from 'SparkListenerExecutorAdded' class.
productPrefix ( ) This method is from 'SparkListenerExecutorAdded' class.
SparkListenerExecutorAdded ( long, java.lang.String, cluster.ExecutorInfo ) This constructor is from 'SparkListenerExecutorAdded' class.
time ( ) This method is from 'SparkListenerExecutorAdded' class.
toString ( ) This method is from 'SparkListenerExecutorAdded' class.
tupled ( ) This method is from 'SparkListenerExecutorAdded' class.
SparkListenerExecutorRemoved (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

affected methods (15)
canEqual ( java.lang.Object ) This method is from 'SparkListenerExecutorRemoved' class.
copy ( long, java.lang.String, java.lang.String ) This method is from 'SparkListenerExecutorRemoved' class.
curried ( ) This method is from 'SparkListenerExecutorRemoved' class.
equals ( java.lang.Object ) This method is from 'SparkListenerExecutorRemoved' class.
executorId ( ) This method is from 'SparkListenerExecutorRemoved' class.
hashCode ( ) This method is from 'SparkListenerExecutorRemoved' class.
productArity ( ) This method is from 'SparkListenerExecutorRemoved' class.
productElement ( int ) This method is from 'SparkListenerExecutorRemoved' class.
productIterator ( ) This method is from 'SparkListenerExecutorRemoved' class.
productPrefix ( ) This method is from 'SparkListenerExecutorRemoved' class.
reason ( ) This method is from 'SparkListenerExecutorRemoved' class.
SparkListenerExecutorRemoved ( long, java.lang.String, java.lang.String ) This constructor is from 'SparkListenerExecutorRemoved' class.
time ( ) This method is from 'SparkListenerExecutorRemoved' class.
toString ( ) This method is from 'SparkListenerExecutorRemoved' class.
tupled ( ) This method is from 'SparkListenerExecutorRemoved' class.
spark-hive_2.10-1.6.0.jar
package org.apache.spark.sql.hive
HiveContext.QueryExecution (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class became abstract. | A client program may be interrupted by an InstantiationError exception. |

affected methods (1)
executePlan ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan ) Return value of this method has type 'HiveContext.QueryExecution'.
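Only direct instantiation is affected; factory paths such as executePlan keep returning instances. A defensive probe a test harness might run before relying on direct construction; `isInstantiable` is a hypothetical helper:

```scala
import java.lang.reflect.Modifier

// A class that became abstract can no longer be instantiated directly;
// attempts compiled against the older definition fail with
// InstantiationError. This check detects the change up front.
def isInstantiable(className: String): Boolean = {
  val cls = Class.forName(className)
  !cls.isInterface && !Modifier.isAbstract(cls.getModifiers)
}

// Scala inner classes use the Outer$Inner binary name.
println(isInstantiable("org.apache.spark.sql.hive.HiveContext$QueryExecution"))
```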
spark-sql_2.10-1.6.0.jar
package org.apache.spark.sql
DataFrame (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

affected methods (136)
agg ( java.util.Map<java.lang.String,java.lang.String> ) This method is from 'DataFrame' class.
agg ( Column, Column... ) This method is from 'DataFrame' class.
agg ( Column, scala.collection.Seq<Column> ) This method is from 'DataFrame' class.
agg ( scala.collection.immutable.Map<java.lang.String,java.lang.String> ) This method is from 'DataFrame' class.
agg ( scala.Tuple2<java.lang.String,java.lang.String>, scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.String>> ) This method is from 'DataFrame' class.
alias ( java.lang.String ) This method is from 'DataFrame' class.
alias ( scala.Symbol ) This method is from 'DataFrame' class.
apply ( java.lang.String ) This method is from 'DataFrame' class.
as ( java.lang.String ) This method is from 'DataFrame' class.
as ( Encoder<U> ) This method is from 'DataFrame' class.
as ( scala.Symbol ) This method is from 'DataFrame' class.
cache ( ) This method is from 'DataFrame' class.
coalesce ( int ) This method is from 'DataFrame' class.
col ( java.lang.String ) This method is from 'DataFrame' class.
collect ( ) This method is from 'DataFrame' class.
collectAsList ( ) This method is from 'DataFrame' class.
collectToPython ( ) This method is from 'DataFrame' class.
columns ( ) This method is from 'DataFrame' class.
count ( ) This method is from 'DataFrame' class.
cube ( java.lang.String, java.lang.String... ) This method is from 'DataFrame' class.
cube ( java.lang.String, scala.collection.Seq<java.lang.String> ) This method is from 'DataFrame' class.
cube ( Column... ) This method is from 'DataFrame' class.
cube ( scala.collection.Seq<Column> ) This method is from 'DataFrame' class.
DataFrame ( SQLContext, catalyst.plans.logical.LogicalPlan ) This constructor is from 'DataFrame' class.
DataFrame ( SQLContext, execution.QueryExecution ) This constructor is from 'DataFrame' class.
describe ( java.lang.String... ) This method is from 'DataFrame' class.
describe ( scala.collection.Seq<java.lang.String> ) This method is from 'DataFrame' class.
distinct ( ) This method is from 'DataFrame' class.
drop ( java.lang.String ) This method is from 'DataFrame' class.
drop ( Column ) This method is from 'DataFrame' class.
dropDuplicates ( ) This method is from 'DataFrame' class.
dropDuplicates ( java.lang.String[ ] ) This method is from 'DataFrame' class.
dropDuplicates ( scala.collection.Seq<java.lang.String> ) This method is from 'DataFrame' class.
dtypes ( ) This method is from 'DataFrame' class.
except ( DataFrame ) This method is from 'DataFrame' class.
explain ( ) This method is from 'DataFrame' class.
explain ( boolean ) This method is from 'DataFrame' class.
explode ( java.lang.String, java.lang.String, scala.Function1<A,scala.collection.TraversableOnce<B>>, scala.reflect.api.TypeTags.TypeTag<B> ) This method is from 'DataFrame' class.
explode ( scala.collection.Seq<Column>, scala.Function1<Row,scala.collection.TraversableOnce<A>>, scala.reflect.api.TypeTags.TypeTag<A> ) This method is from 'DataFrame' class.
filter ( java.lang.String ) This method is from 'DataFrame' class.
filter ( Column ) This method is from 'DataFrame' class.
first ( ) This method is from 'DataFrame' class.
flatMap ( scala.Function1<Row,scala.collection.TraversableOnce<R>>, scala.reflect.ClassTag<R> ) This method is from 'DataFrame' class.
foreach ( scala.Function1<Row,scala.runtime.BoxedUnit> ) This method is from 'DataFrame' class.
foreachPartition ( scala.Function1<scala.collection.Iterator<Row>,scala.runtime.BoxedUnit> ) This method is from 'DataFrame' class.
groupBy ( java.lang.String, java.lang.String... ) This method is from 'DataFrame' class.
groupBy ( java.lang.String, scala.collection.Seq<java.lang.String> ) This method is from 'DataFrame' class.
groupBy ( Column... ) This method is from 'DataFrame' class.
groupBy ( scala.collection.Seq<Column> ) This method is from 'DataFrame' class.
head ( ) This method is from 'DataFrame' class.
head ( int ) This method is from 'DataFrame' class.
inputFiles ( ) This method is from 'DataFrame' class.
intersect ( DataFrame ) This method is from 'DataFrame' class.
isLocal ( ) This method is from 'DataFrame' class.
javaRDD ( ) This method is from 'DataFrame' class.
javaToPython ( ) This method is from 'DataFrame' class.
join ( DataFrame ) This method is from 'DataFrame' class.
join ( DataFrame, java.lang.String ) This method is from 'DataFrame' class.
join ( DataFrame, Column ) This method is from 'DataFrame' class.
join ( DataFrame, Column, java.lang.String ) This method is from 'DataFrame' class.
join ( DataFrame, scala.collection.Seq<java.lang.String> ) This method is from 'DataFrame' class.
join ( DataFrame, scala.collection.Seq<java.lang.String>, java.lang.String ) This method is from 'DataFrame' class.
limit ( int ) This method is from 'DataFrame' class.
logicalPlan ( ) This method is from 'DataFrame' class.
map ( scala.Function1<Row,R>, scala.reflect.ClassTag<R> ) This method is from 'DataFrame' class.
mapPartitions ( scala.Function1<scala.collection.Iterator<Row>,scala.collection.Iterator<R>>, scala.reflect.ClassTag<R> ) This method is from 'DataFrame' class.
na ( ) This method is from 'DataFrame' class.
numericColumns ( ) This method is from 'DataFrame' class.
orderBy ( java.lang.String, java.lang.String... ) This method is from 'DataFrame' class.
orderBy ( java.lang.String, scala.collection.Seq<java.lang.String> ) This method is from 'DataFrame' class.
orderBy ( Column... ) This method is from 'DataFrame' class.
orderBy ( scala.collection.Seq<Column> ) This method is from 'DataFrame' class.
DataFrame..collect ( boolean ) This method is from 'DataFrame' class.
DataFrame..withPlan ( scala.Function0<catalyst.plans.logical.LogicalPlan> ) This method is from 'DataFrame' class.
persist ( ) This method is from 'DataFrame' class.
persist ( org.apache.spark.storage.StorageLevel ) This method is from 'DataFrame' class.
printSchema ( ) This method is from 'DataFrame' class.
queryExecution ( ) This method is from 'DataFrame' class.
randomSplit ( double[ ] ) This method is from 'DataFrame' class.
randomSplit ( double[ ], long ) This method is from 'DataFrame' class.
randomSplit ( scala.collection.immutable.List<java.lang.Object>, long ) This method is from 'DataFrame' class.
rdd ( ) This method is from 'DataFrame' class.
registerTempTable ( java.lang.String ) This method is from 'DataFrame' class.
repartition ( int ) This method is from 'DataFrame' class.
repartition ( int, Column... ) This method is from 'DataFrame' class.
repartition ( int, scala.collection.Seq<Column> ) This method is from 'DataFrame' class.
repartition ( Column... ) This method is from 'DataFrame' class.
repartition ( scala.collection.Seq<Column> ) This method is from 'DataFrame' class.
resolve ( java.lang.String ) This method is from 'DataFrame' class.
rollup ( java.lang.String, java.lang.String... ) This method is from 'DataFrame' class.
rollup ( java.lang.String, scala.collection.Seq<java.lang.String> ) This method is from 'DataFrame' class.
rollup ( Column... ) This method is from 'DataFrame' class.
rollup ( scala.collection.Seq<Column> ) This method is from 'DataFrame' class.
sample ( boolean, double ) This method is from 'DataFrame' class.
sample ( boolean, double, long ) This method is from 'DataFrame' class.
schema ( ) This method is from 'DataFrame' class.
select ( java.lang.String, java.lang.String... ) This method is from 'DataFrame' class.
select ( java.lang.String, scala.collection.Seq<java.lang.String> ) This method is from 'DataFrame' class.
select ( Column... ) This method is from 'DataFrame' class.
select ( scala.collection.Seq<Column> ) This method is from 'DataFrame' class.
selectExpr ( java.lang.String... ) This method is from 'DataFrame' class.
selectExpr ( scala.collection.Seq<java.lang.String> ) This method is from 'DataFrame' class.
show ( ) This method is from 'DataFrame' class.
show ( boolean ) This method is from 'DataFrame' class.
show ( int ) This method is from 'DataFrame' class.
show ( int, boolean ) This method is from 'DataFrame' class.
showString ( int, boolean ) This method is from 'DataFrame' class.
sort ( java.lang.String, java.lang.String... ) This method is from 'DataFrame' class.
sort ( java.lang.String, scala.collection.Seq<java.lang.String> ) This method is from 'DataFrame' class.
sort ( Column... ) This method is from 'DataFrame' class.
sort ( scala.collection.Seq<Column> ) This method is from 'DataFrame' class.
sortWithinPartitions ( java.lang.String, java.lang.String... ) This method is from 'DataFrame' class.
sortWithinPartitions ( java.lang.String, scala.collection.Seq<java.lang.String> ) This method is from 'DataFrame' class.
sortWithinPartitions ( Column... ) This method is from 'DataFrame' class.
sortWithinPartitions ( scala.collection.Seq<Column> ) This method is from 'DataFrame' class.
sqlContext ( ) This method is from 'DataFrame' class.
stat ( ) This method is from 'DataFrame' class.
take ( int ) This method is from 'DataFrame' class.
takeAsList ( int ) This method is from 'DataFrame' class.
toDF ( ) This method is from 'DataFrame' class.
toDF ( java.lang.String... ) This method is from 'DataFrame' class.
toDF ( scala.collection.Seq<java.lang.String> ) This method is from 'DataFrame' class.
toJavaRDD ( ) This method is from 'DataFrame' class.
toJSON ( ) This method is from 'DataFrame' class.
toString ( ) This method is from 'DataFrame' class.
transform ( scala.Function1<DataFrame,DataFrame> ) This method is from 'DataFrame' class.
unionAll ( DataFrame ) This method is from 'DataFrame' class.
unpersist ( ) This method is from 'DataFrame' class.
unpersist ( boolean ) This method is from 'DataFrame' class.
where ( java.lang.String ) This method is from 'DataFrame' class.
where ( Column ) This method is from 'DataFrame' class.
withColumn ( java.lang.String, Column ) This method is from 'DataFrame' class.
withColumn ( java.lang.String, Column, types.Metadata ) This method is from 'DataFrame' class.
withColumnRenamed ( java.lang.String, java.lang.String ) This method is from 'DataFrame' class.
withNewExecutionId ( scala.Function0<T> ) This method is from 'DataFrame' class.
write ( ) This method is from 'DataFrame' class.
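Because the whole class is absent on the other side of the comparison, all 136 methods above fail together with NoClassDefFoundError at first use. A hedged sketch of a classpath guard a portability-minded test suite could apply; `dataFrameAvailable` is a hypothetical helper:

```scala
// Probing for the class by name keeps the failure explicit instead of
// letting the first DataFrame call die with NoClassDefFoundError.
def dataFrameAvailable: Boolean =
  try {
    Class.forName("org.apache.spark.sql.DataFrame")
    true
  } catch {
    case _: ClassNotFoundException => false
  }

if (!dataFrameAvailable)
  println("running against a Spark without DataFrame; skipping those tests")
```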
spark-streaming-kafka_2.10-1.6.0.jar
package org.apache.spark.streaming.kafka
KafkaTestUtils (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

affected methods (32)
brokerAddress ( ) This method is from 'KafkaTestUtils' class.
createTopic ( java.lang.String ) This method is from 'KafkaTestUtils' class.
eventually ( org.apache.spark.streaming.Time, org.apache.spark.streaming.Time, scala.Function0<T> ) This method is from 'KafkaTestUtils' class.
isTraceEnabled ( ) This method is from 'KafkaTestUtils' class.
KafkaTestUtils ( ) This constructor is from 'KafkaTestUtils' class.
log ( ) This method is from 'KafkaTestUtils' class.
logDebug ( scala.Function0<java.lang.String> ) This method is from 'KafkaTestUtils' class.
logDebug ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'KafkaTestUtils' class.
logError ( scala.Function0<java.lang.String> ) This method is from 'KafkaTestUtils' class.
logError ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'KafkaTestUtils' class.
logInfo ( scala.Function0<java.lang.String> ) This method is from 'KafkaTestUtils' class.
logInfo ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'KafkaTestUtils' class.
logName ( ) This method is from 'KafkaTestUtils' class.
logTrace ( scala.Function0<java.lang.String> ) This method is from 'KafkaTestUtils' class.
logTrace ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'KafkaTestUtils' class.
logWarning ( scala.Function0<java.lang.String> ) This method is from 'KafkaTestUtils' class.
logWarning ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'KafkaTestUtils' class.
org.apache.spark.Logging..log_ ( ) This method is from 'KafkaTestUtils' class.
org.apache.spark.Logging..log__.eq ( org.slf4j.Logger ) This method is from 'KafkaTestUtils' class.
KafkaTestUtils..brokerConf ( ) This method is from 'KafkaTestUtils' class.
KafkaTestUtils..brokerConf_.eq ( kafka.server.KafkaConfig ) This method is from 'KafkaTestUtils' class.
KafkaTestUtils..brokerConfiguration ( ) This method is from 'KafkaTestUtils' class.
KafkaTestUtils..brokerPort_.eq ( int ) This method is from 'KafkaTestUtils' class.
KafkaTestUtils..server ( ) This method is from 'KafkaTestUtils' class.
KafkaTestUtils..server_.eq ( kafka.server.KafkaServer ) This method is from 'KafkaTestUtils' class.
sendMessages ( java.lang.String, java.lang.String[ ] ) This method is from 'KafkaTestUtils' class.
sendMessages ( java.lang.String, java.util.Map<java.lang.String,java.lang.Integer> ) This method is from 'KafkaTestUtils' class.
sendMessages ( java.lang.String, scala.collection.immutable.Map<java.lang.String,java.lang.Object> ) This method is from 'KafkaTestUtils' class.
setup ( ) This method is from 'KafkaTestUtils' class.
teardown ( ) This method is from 'KafkaTestUtils' class.
zkAddress ( ) This method is from 'KafkaTestUtils' class.
zookeeperClient ( ) This method is from 'KafkaTestUtils' class.
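Typical fixture usage, restricted to methods listed above, compiles against 1.6.0 but cannot load against the 1.1.0 artifact. Sketch only; the topic name and payloads are made up:

```scala
import org.apache.spark.streaming.kafka.KafkaTestUtils

// Against the 1.1.0 artifact the class is missing entirely, so the first
// use of KafkaTestUtils raises NoClassDefFoundError.
val kafka = new KafkaTestUtils()
kafka.setup()
try {
  kafka.createTopic("events") // hypothetical topic
  kafka.sendMessages("events", Array("a", "b", "c"))
} finally {
  kafka.teardown()
}
```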
spark-streaming_2.10-1.6.0.jar
package org.apache.spark.streaming.scheduler
StreamingListener (2)

| # | Change | Effect |
|---|---|---|
| 1 | Abstract method onOutputOperationCompleted ( StreamingListenerOutputOperationCompleted ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
| 2 | Abstract method onOutputOperationStarted ( StreamingListenerOutputOperationStarted ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |

affected methods (7)
onBatchCompleted ( StreamingListenerBatchCompleted ) This abstract method is from 'StreamingListener' interface.
onBatchStarted ( StreamingListenerBatchStarted ) This abstract method is from 'StreamingListener' interface.
onBatchSubmitted ( StreamingListenerBatchSubmitted ) This abstract method is from 'StreamingListener' interface.
onReceiverError ( StreamingListenerReceiverError ) This abstract method is from 'StreamingListener' interface.
onReceiverStarted ( StreamingListenerReceiverStarted ) This abstract method is from 'StreamingListener' interface.
onReceiverStopped ( StreamingListenerReceiverStopped ) This abstract method is from 'StreamingListener' interface.
addStreamingListener ( StreamingListener ) 1st parameter 'streamingListener' of this method has type 'StreamingListener'.
StreamingListenerOutputOperationCompleted (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

affected methods (13)
andThen ( scala.Function1<StreamingListenerOutputOperationCompleted,A> ) This method is from 'StreamingListenerOutputOperationCompleted' class.
canEqual ( java.lang.Object ) This method is from 'StreamingListenerOutputOperationCompleted' class.
compose ( scala.Function1<A,OutputOperationInfo> ) This method is from 'StreamingListenerOutputOperationCompleted' class.
copy ( OutputOperationInfo ) This method is from 'StreamingListenerOutputOperationCompleted' class.
equals ( java.lang.Object ) This method is from 'StreamingListenerOutputOperationCompleted' class.
hashCode ( ) This method is from 'StreamingListenerOutputOperationCompleted' class.
outputOperationInfo ( ) This method is from 'StreamingListenerOutputOperationCompleted' class.
productArity ( ) This method is from 'StreamingListenerOutputOperationCompleted' class.
productElement ( int ) This method is from 'StreamingListenerOutputOperationCompleted' class.
productIterator ( ) This method is from 'StreamingListenerOutputOperationCompleted' class.
productPrefix ( ) This method is from 'StreamingListenerOutputOperationCompleted' class.
StreamingListenerOutputOperationCompleted ( OutputOperationInfo ) This constructor is from 'StreamingListenerOutputOperationCompleted' class.
toString ( ) This method is from 'StreamingListenerOutputOperationCompleted' class.
StreamingListenerOutputOperationStarted (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

affected methods (13)
andThen ( scala.Function1<StreamingListenerOutputOperationStarted,A> ) This method is from 'StreamingListenerOutputOperationStarted' class.
canEqual ( java.lang.Object ) This method is from 'StreamingListenerOutputOperationStarted' class.
compose ( scala.Function1<A,OutputOperationInfo> ) This method is from 'StreamingListenerOutputOperationStarted' class.
copy ( OutputOperationInfo ) This method is from 'StreamingListenerOutputOperationStarted' class.
equals ( java.lang.Object ) This method is from 'StreamingListenerOutputOperationStarted' class.
hashCode ( ) This method is from 'StreamingListenerOutputOperationStarted' class.
outputOperationInfo ( ) This method is from 'StreamingListenerOutputOperationStarted' class.
productArity ( ) This method is from 'StreamingListenerOutputOperationStarted' class.
productElement ( int ) This method is from 'StreamingListenerOutputOperationStarted' class.
productIterator ( ) This method is from 'StreamingListenerOutputOperationStarted' class.
productPrefix ( ) This method is from 'StreamingListenerOutputOperationStarted' class.
StreamingListenerOutputOperationStarted ( OutputOperationInfo ) This constructor is from 'StreamingListenerOutputOperationStarted' class.
toString ( ) This method is from 'StreamingListenerOutputOperationStarted' class.
Problems with Methods, High Severity (1)
spark-core_2.10-1.6.0.jar, SparkContext
package org.apache.spark
SparkContext.localProperties ( ) : InheritableThreadLocal<java.util.Properties> (1)
[mangled: org/apache/spark/SparkContext.localProperties:()Ljava/lang/InheritableThreadLocal;]

| # | Change | Effect |
|---|---|---|
| 1 | Access level has been changed from public to private. | A client program may be interrupted by an IllegalAccessError exception. |
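Both direct calls and reflective lookups feel this change. A sketch of the reflective route; `localProps` is a hypothetical helper. getMethod only sees public members, so it fails with NoSuchMethodException once the accessor becomes private, while bytecode compiled against the public 1.6.0 signature fails with IllegalAccessError as noted above:

```scala
import java.util.Properties
import org.apache.spark.SparkContext

// Reads the thread-local properties map through the public-only
// reflective lookup; breaks as soon as the member is no longer public.
def localProps(sc: SparkContext): Properties =
  classOf[SparkContext]
    .getMethod("localProperties")
    .invoke(sc)
    .asInstanceOf[InheritableThreadLocal[Properties]]
    .get()
```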
Problems with Data Types, Medium Severity (4)
spark-core_2.10-1.6.0.jar
package org.apache.spark.api.java
JavaRDD<T> (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaRDDLike<T,JavaRDD<T>>. | Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |

affected methods (1)
toJavaRDD ( ) Return value of this method has type 'JavaRDD<T>'.
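Whether a given call site actually breaks depends on which class the compiler anchored the call against; calls linked through the removed super-class are the risky ones. Illustrative sketch only:

```scala
import org.apache.spark.api.java.JavaRDD

// first() is inherited rather than declared on JavaRDD itself; if the
// client's bytecode resolves it through the removed AbstractJavaRDDLike,
// the JVM can raise NoSuchMethodError, as the effect column warns.
def firstElement[T](rdd: JavaRDD[T]): T = rdd.first()
```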
package org.apache.spark.scheduler
LiveListenerBus (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class org.apache.spark.util.AsynchronousListenerBus<SparkListener,SparkListenerEvent>. | Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |

affected methods (1)
listenerBus ( ) Return value of this method has type 'LiveListenerBus'.
spark-hive_2.10-1.6.0.jar
package org.apache.spark.sql.hive
HiveContext.QueryExecution (1)

| # | Change | Effect |
|---|---|---|
| 1 | Superclass has been changed from org.apache.spark.sql.execution.QueryExecution to org.apache.spark.sql.SQLContext.QueryExecution. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |

affected methods (1)
executePlan ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan ) Return value of this method has type 'HiveContext.QueryExecution'.
spark-sql_2.10-1.6.0.jar
package org.apache.spark.sql
SQLContext.SparkPlanner (1)

| # | Change | Effect |
|---|---|---|
| 1 | Superclass has been changed from execution.SparkPlanner to execution.SparkStrategies. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from the new super-class and cause an IncompatibleClassChangeError exception. |

affected methods (1)
planner ( ) Return value of this method has type 'SQLContext.SparkPlanner'.
Problems with Data Types, Low Severity (4)
spark-hive_2.10-1.6.0.jar
package org.apache.spark.sql.hive
HiveContext (4)

| # | Change | Effect |
|---|---|---|
| 1 | Method executePlan ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan ) has been moved up the type hierarchy. | The inherited executePlan ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan ) will be called instead of it in a client program. |
| 2 | Method parseSql ( java.lang.String ) has been moved up the type hierarchy. | The inherited parseSql ( java.lang.String ) will be called instead of it in a client program. |
| 3 | Method planner ( ) has been moved up the type hierarchy. | The inherited planner ( ) will be called instead of it in a client program. |
| 4 | Method functionRegistry ( ) has been overridden. | The overriding functionRegistry ( ) will be called instead of it in a client program. |

affected methods (4)
executePlan ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan ) Method 'executePlan ( org.apache.spark.sql.catalyst.plans.logical.LogicalPlan )' will be called instead of this method in a client program.
parseSql ( java.lang.String ) Method 'parseSql ( java.lang.String )' will be called instead of this method in a client program.
planner ( ) Method 'planner ( )' will be called instead of this method in a client program.
functionRegistry ( ) Method 'functionRegistry ( )' will be called instead of this method in a client program.
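Unlike the high-severity entries, nothing throws here: the call site silently binds to the implementation higher in the hierarchy, and behavior only changes if the two implementations differ. A sketch of a probe that makes the rebinding visible in a test run, assuming the member stays public at the bytecode level as the report implies:

```scala
import org.apache.spark.sql.hive.HiveContext

// Logs which class actually declares the resolved method, so a silent
// move up the type hierarchy shows up in test output.
val declaringClass = classOf[HiveContext]
  .getMethod("parseSql", classOf[String])
  .getDeclaringClass
println(s"parseSql resolves to ${declaringClass.getName}")
```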
Problems with Methods, Low Severity (1)
spark-streaming_2.10-1.6.0.jar, StreamingContext
package org.apache.spark.streaming
StreamingContext.stop ( boolean stopSparkContext, boolean stopGracefully ) : void (1)
[mangled: org/apache/spark/streaming/StreamingContext.stop:(ZZ)V]

| # | Change | Effect |
|---|---|---|
| 1 | Method became synchronized. | A multi-threaded client program may change behavior. |
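The change is behavioral, not structural: concurrent stop() calls now serialize on the context's monitor. A sketch of the pattern whose timing shifts; `shutdownFromTwoThreads` is hypothetical:

```scala
import org.apache.spark.streaming.StreamingContext

// With stop(boolean, boolean) synchronized, the second thread blocks on
// the StreamingContext monitor until the first (possibly long, graceful)
// stop returns, where previously the two calls could interleave.
def shutdownFromTwoThreads(ssc: StreamingContext): Unit = {
  val graceful = new Thread(new Runnable {
    def run(): Unit = ssc.stop(true, true)
  })
  val forced = new Thread(new Runnable {
    def run(): Unit = ssc.stop(false, false)
  })
  graceful.start(); forced.start()
  graceful.join(); forced.join()
}
```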
Java ARchives (8)
spark-catalyst_2.10-1.6.0.jar
spark-core_2.10-1.6.0.jar
spark-hive_2.10-1.6.0.jar
spark-mllib_2.10-1.6.0.jar
spark-sql_2.10-1.6.0.jar
spark-streaming-kafka_2.10-1.6.0.jar
spark-streaming_2.10-1.6.0.jar
spark-yarn_2.10-1.6.0.jar
Generated on Tue Feb 2 01:42:10 2016 for spark-testing-base_2.10-0.3.1 by Java API Compliance Checker 1.4.1
A tool for checking backward compatibility of a Java library API