Binary compatibility report for the elasticsearch-spark_2.10-2.1.0.Beta1 library between versions 1.0.0 and 1.4.0 of spark-core_2.10 (relating to the portability of the client application elasticsearch-spark_2.10-2.1.0.Beta1.jar)
Test Info

| Library Name | elasticsearch-spark_2.10-2.1.0.Beta1 |
| Version #1 | 1.0.0 |
| Version #2 | 1.4.0 |
| Java Version | 1.7.0_75 |
Test Results

| Total Java ARchives | 1 |
| Total Methods / Classes | 500 / 997 |
| Verdict | Incompatible (8.4%) |
Problem Summary

| | Severity | Count |
|---|---|---|
| Added Methods | - | 106 |
| Removed Methods | High | 11 |
| Problems with Data Types | High | 20 |
| | Medium | 0 |
| | Low | 4 |
| Problems with Methods | High | 6 |
| | Medium | 0 |
| | Low | 0 |
| Other Changes in Data Types | - | 1 |
Added Methods (106)
spark-core_2.10-1.4.0.jar, JavaSparkContext.class
package org.apache.spark.api.java
JavaSparkContext.accumulable ( T initialValue, String name, org.apache.spark.AccumulableParam<T,R> param ) : org.apache.spark.Accumulable<T,R>
[mangled: org/apache/spark/api/java/JavaSparkContext.accumulable:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulableParam;)Lorg/apache/spark/Accumulable;]
JavaSparkContext.accumulator ( double initialValue, String name ) : org.apache.spark.Accumulator<Double>
[mangled: org/apache/spark/api/java/JavaSparkContext.accumulator:(DLjava/lang/String;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.accumulator ( int initialValue, String name ) : org.apache.spark.Accumulator<Integer>
[mangled: org/apache/spark/api/java/JavaSparkContext.accumulator:(ILjava/lang/String;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.accumulator ( T initialValue, String name, org.apache.spark.AccumulatorParam<T> accumulatorParam ) : org.apache.spark.Accumulator<T>
[mangled: org/apache/spark/api/java/JavaSparkContext.accumulator:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulatorParam;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.binaryFiles ( String path ) : JavaPairRDD<String,org.apache.spark.input.PortableDataStream>
[mangled: org/apache/spark/api/java/JavaSparkContext.binaryFiles:(Ljava/lang/String;)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaSparkContext.binaryFiles ( String path, int minPartitions ) : JavaPairRDD<String,org.apache.spark.input.PortableDataStream>
[mangled: org/apache/spark/api/java/JavaSparkContext.binaryFiles:(Ljava/lang/String;I)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaSparkContext.binaryRecords ( String path, int recordLength ) : JavaRDD<byte[ ]>
[mangled: org/apache/spark/api/java/JavaSparkContext.binaryRecords:(Ljava/lang/String;I)Lorg/apache/spark/api/java/JavaRDD;]
JavaSparkContext.close ( ) : void
[mangled: org/apache/spark/api/java/JavaSparkContext.close:()V]
JavaSparkContext.doubleAccumulator ( double initialValue, String name ) : org.apache.spark.Accumulator<Double>
[mangled: org/apache/spark/api/java/JavaSparkContext.doubleAccumulator:(DLjava/lang/String;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.emptyRDD ( ) : JavaRDD<T>
[mangled: org/apache/spark/api/java/JavaSparkContext.emptyRDD:()Lorg/apache/spark/api/java/JavaRDD;]
JavaSparkContext.intAccumulator ( int initialValue, String name ) : org.apache.spark.Accumulator<Integer>
[mangled: org/apache/spark/api/java/JavaSparkContext.intAccumulator:(ILjava/lang/String;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.setLogLevel ( String logLevel ) : void
[mangled: org/apache/spark/api/java/JavaSparkContext.setLogLevel:(Ljava/lang/String;)V]
JavaSparkContext.statusTracker ( ) : JavaSparkStatusTracker
[mangled: org/apache/spark/api/java/JavaSparkContext.statusTracker:()Lorg/apache/spark/api/java/JavaSparkStatusTracker;]
JavaSparkContext.version ( ) : String
[mangled: org/apache/spark/api/java/JavaSparkContext.version:()Ljava/lang/String;]
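For orientation, a minimal Scala sketch of how a client might exercise some of the JavaSparkContext additions above against spark-core 1.4.0; the app name, master setting and the commented-out path are placeholder assumptions, not values taken from this report:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.api.java.JavaSparkContext

object JavaSparkContextAdditions {
  def main(args: Array[String]): Unit = {
    val jsc = new JavaSparkContext(new SparkConf().setAppName("jsc-demo").setMaster("local[2]"))

    jsc.setLogLevel("WARN")                          // added in 1.4.0
    println(s"Running on Spark ${jsc.version}")      // version accessor, absent in 1.0.0

    val counter = jsc.intAccumulator(0, "records")   // named accumulator variant, absent in 1.0.0
    counter.add(1)
    println(s"records seen: ${counter.value}")

    println(s"active jobs: ${jsc.statusTracker.getActiveJobIds().length}")
    // binaryFiles pairs each file name with a PortableDataStream for whole-file reads:
    // val blobs = jsc.binaryFiles("hdfs:///data/blobs")   // path is a placeholder
    jsc.close()                                      // Closeable-style shutdown, added in 1.4.0
  }
}
```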
spark-core_2.10-1.4.0.jar, RDD<T>.class
package org.apache.spark.rdd
RDD<T>.countApproxDistinct ( int p, int sp ) : long
[mangled: org/apache/spark/rdd/RDD<T>.countApproxDistinct:(II)J]
RDD<T>.creationSite ( ) : org.apache.spark.util.CallSite
[mangled: org/apache/spark/rdd/RDD<T>.creationSite:()Lorg/apache/spark/util/CallSite;]
RDD<T>.doubleRDDToDoubleRDDFunctions ( RDD<Object> p1 ) [static] : DoubleRDDFunctions
[mangled: org/apache/spark/rdd/RDD<T>.doubleRDDToDoubleRDDFunctions:(Lorg/apache/spark/rdd/RDD;)Lorg/apache/spark/rdd/DoubleRDDFunctions;]
RDD<T>.isEmpty ( ) : boolean
[mangled: org/apache/spark/rdd/RDD<T>.isEmpty:()Z]
RDD<T>.logName ( ) : String
[mangled: org/apache/spark/rdd/RDD<T>.logName:()Ljava/lang/String;]
RDD<T>.numericRDDToDoubleRDDFunctions ( RDD<T> p1, scala.math.Numeric<T> p2 ) [static] : DoubleRDDFunctions
[mangled: org/apache/spark/rdd/RDD<T>.numericRDDToDoubleRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/math/Numeric;)Lorg/apache/spark/rdd/DoubleRDDFunctions;]
RDD<T>.RDD..doCheckpointCalled ( ) : boolean
[mangled: org/apache/spark/rdd/RDD<T>.org.apache.spark.rdd.RDD..doCheckpointCalled:()Z]
RDD<T>.RDD..doCheckpointCalled_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/rdd/RDD<T>.org.apache.spark.rdd.RDD..doCheckpointCalled_.eq:(Z)V]
RDD<T>.RDD..sc ( ) : org.apache.spark.SparkContext
[mangled: org/apache/spark/rdd/RDD<T>.org.apache.spark.rdd.RDD..sc:()Lorg/apache/spark/SparkContext;]
RDD<T>.parent ( int j, scala.reflect.ClassTag<U> p2 ) : RDD<U>
[mangled: org/apache/spark/rdd/RDD<T>.parent:(ILscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
RDD<T>.randomSampleWithRange ( double lb, double ub, long seed ) : RDD<T>
[mangled: org/apache/spark/rdd/RDD<T>.randomSampleWithRange:(DDJ)Lorg/apache/spark/rdd/RDD;]
RDD<T>.rddToAsyncRDDActions ( RDD<T> p1, scala.reflect.ClassTag<T> p2 ) [static] : AsyncRDDActions<T>
[mangled: org/apache/spark/rdd/RDD<T>.rddToAsyncRDDActions:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/AsyncRDDActions;]
RDD<T>.rddToOrderedRDDFunctions ( RDD<scala.Tuple2<K,V>> p1, scala.math.Ordering<K> p2, scala.reflect.ClassTag<K> p3, scala.reflect.ClassTag<V> p4 ) [static] : OrderedRDDFunctions<K,V,scala.Tuple2<K,V>>
[mangled: org/apache/spark/rdd/RDD<T>.rddToOrderedRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/math/Ordering;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/OrderedRDDFunctions;]
RDD<T>.rddToPairRDDFunctions ( RDD<scala.Tuple2<K,V>> p1, scala.reflect.ClassTag<K> p2, scala.reflect.ClassTag<V> p3, scala.math.Ordering<K> p4 ) [static] : PairRDDFunctions<K,V>
[mangled: org/apache/spark/rdd/RDD<T>.rddToPairRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/rdd/PairRDDFunctions;]
RDD<T>.rddToSequenceFileRDDFunctions ( RDD<scala.Tuple2<K,V>> p1, scala.reflect.ClassTag<K> p2, scala.reflect.ClassTag<V> p3, org.apache.spark.WritableFactory<K> p4, org.apache.spark.WritableFactory<V> p5 ) [static] : SequenceFileRDDFunctions<K,V>
[mangled: org/apache/spark/rdd/RDD<T>.rddToSequenceFileRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lorg/apache/spark/WritableFactory;Lorg/apache/spark/WritableFactory;)Lorg/apache/spark/rdd/SequenceFileRDDFunctions;]
RDD<T>.retag ( Class<T> cls ) : RDD<T>
[mangled: org/apache/spark/rdd/RDD<T>.retag:(Ljava/lang/Class;)Lorg/apache/spark/rdd/RDD;]
RDD<T>.retag ( scala.reflect.ClassTag<T> classTag ) : RDD<T>
[mangled: org/apache/spark/rdd/RDD<T>.retag:(Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
RDD<T>.scope ( ) : scala.Option<RDDOperationScope>
[mangled: org/apache/spark/rdd/RDD<T>.scope:()Lscala/Option;]
RDD<T>.sortBy ( scala.Function1<T,K> f, boolean ascending, int numPartitions, scala.math.Ordering<K> ord, scala.reflect.ClassTag<K> ctag ) : RDD<T>
[mangled: org/apache/spark/rdd/RDD<T>.sortBy:(Lscala/Function1;ZILscala/math/Ordering;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
RDD<T>.treeAggregate ( U zeroValue, scala.Function2<U,T,U> seqOp, scala.Function2<U,U,U> combOp, int depth, scala.reflect.ClassTag<U> p5 ) : U
[mangled: org/apache/spark/rdd/RDD<T>.treeAggregate:(Ljava/lang/Object;Lscala/Function2;Lscala/Function2;ILscala/reflect/ClassTag;)Ljava/lang/Object;]
RDD<T>.treeReduce ( scala.Function2<T,T,T> f, int depth ) : T
[mangled: org/apache/spark/rdd/RDD<T>.treeReduce:(Lscala/Function2;I)Ljava/lang/Object;]
RDD<T>.withScope ( scala.Function0<U> body ) : U
[mangled: org/apache/spark/rdd/RDD<T>.withScope:(Lscala/Function0;)Ljava/lang/Object;]
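A minimal sketch of a few of the new RDD operations listed above (isEmpty, sortBy, treeAggregate, the (p, sp) overload of countApproxDistinct), assuming a local spark-core 1.4.0 runtime:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RddAdditions {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("rdd-demo").setMaster("local[2]"))
    val nums = sc.parallelize(1 to 1000)

    println(s"empty? ${nums.isEmpty()}")                        // absent in 1.0.0
    val descending = nums.sortBy(n => n, ascending = false)     // sortBy added after 1.0.0
    println(descending.take(3).mkString(","))

    // treeAggregate combines partial aggregates through a multi-level tree (depth 2 here)
    val sum = nums.treeAggregate(0)(_ + _, _ + _, depth = 2)
    // the (p, sp) overload of countApproxDistinct tunes HyperLogLog precision directly
    println(s"sum=$sum approxDistinct=${nums.countApproxDistinct(p = 12, sp = 0)}")

    sc.stop()
  }
}
```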
spark-core_2.10-1.4.0.jar, SparkConf.class
package org.apache.spark
SparkConf.getAppId ( ) : String
[mangled: org/apache/spark/SparkConf.getAppId:()Ljava/lang/String;]
SparkConf.getDeprecatedConfig ( String p1, SparkConf p2 ) [static] : scala.Option<String>
[mangled: org/apache/spark/SparkConf.getDeprecatedConfig:(Ljava/lang/String;Lorg/apache/spark/SparkConf;)Lscala/Option;]
SparkConf.getenv ( String name ) : String
[mangled: org/apache/spark/SparkConf.getenv:(Ljava/lang/String;)Ljava/lang/String;]
SparkConf.getSizeAsBytes ( String key ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsBytes:(Ljava/lang/String;)J]
SparkConf.getSizeAsBytes ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsBytes:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getSizeAsGb ( String key ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsGb:(Ljava/lang/String;)J]
SparkConf.getSizeAsGb ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsGb:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getSizeAsKb ( String key ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsKb:(Ljava/lang/String;)J]
SparkConf.getSizeAsKb ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsKb:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getSizeAsMb ( String key ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsMb:(Ljava/lang/String;)J]
SparkConf.getSizeAsMb ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsMb:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getTimeAsMs ( String key ) : long
[mangled: org/apache/spark/SparkConf.getTimeAsMs:(Ljava/lang/String;)J]
SparkConf.getTimeAsMs ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getTimeAsMs:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getTimeAsSeconds ( String key ) : long
[mangled: org/apache/spark/SparkConf.getTimeAsSeconds:(Ljava/lang/String;)J]
SparkConf.getTimeAsSeconds ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getTimeAsSeconds:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.isAkkaConf ( String p1 ) [static] : boolean
[mangled: org/apache/spark/SparkConf.isAkkaConf:(Ljava/lang/String;)Z]
SparkConf.isExecutorStartupConf ( String p1 ) [static] : boolean
[mangled: org/apache/spark/SparkConf.isExecutorStartupConf:(Ljava/lang/String;)Z]
SparkConf.isSparkPortConf ( String p1 ) [static] : boolean
[mangled: org/apache/spark/SparkConf.isSparkPortConf:(Ljava/lang/String;)Z]
SparkConf.logDeprecationWarning ( String p1 ) [static] : void
[mangled: org/apache/spark/SparkConf.logDeprecationWarning:(Ljava/lang/String;)V]
SparkConf.logName ( ) : String
[mangled: org/apache/spark/SparkConf.logName:()Ljava/lang/String;]
SparkConf.registerKryoClasses ( Class<?>[ ] classes ) : SparkConf
[mangled: org/apache/spark/SparkConf.registerKryoClasses:([Ljava/lang/Class;)Lorg/apache/spark/SparkConf;]
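A minimal sketch of the suffix-aware SparkConf getters and the Kryo registration helper listed above; the configuration keys are ordinary Spark settings chosen for illustration:

```scala
import org.apache.spark.SparkConf

object SparkConfAdditions {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .set("spark.network.timeout", "120s")
      .set("spark.executor.memory", "512m")
      .registerKryoClasses(Array[Class[_]](classOf[Array[Byte]], classOf[java.util.HashMap[_, _]]))

    // The suffix-aware getters added in 1.4.0 normalise "120s" / "512m" style values:
    println(conf.getTimeAsMs("spark.network.timeout"))              // 120000
    println(conf.getSizeAsMb("spark.executor.memory"))              // 512
    println(conf.getSizeAsBytes("spark.broadcast.blockSize", "4m")) // falls back to the default
  }
}
```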
spark-core_2.10-1.4.0.jar, SparkContext.class
package org.apache.spark
SparkContext.accumulable ( R initialValue, String name, AccumulableParam<R,T> param ) : Accumulable<R,T>
[mangled: org/apache/spark/SparkContext.accumulable:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulableParam;)Lorg/apache/spark/Accumulable;]
SparkContext.accumulator ( T initialValue, String name, AccumulatorParam<T> param ) : Accumulator<T>
[mangled: org/apache/spark/SparkContext.accumulator:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulatorParam;)Lorg/apache/spark/Accumulator;]
SparkContext.addFile ( String path, boolean recursive ) : void
[mangled: org/apache/spark/SparkContext.addFile:(Ljava/lang/String;Z)V]
SparkContext.applicationAttemptId ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.applicationAttemptId:()Lscala/Option;]
SparkContext.applicationId ( ) : String
[mangled: org/apache/spark/SparkContext.applicationId:()Ljava/lang/String;]
SparkContext.binaryFiles ( String path, int minPartitions ) : rdd.RDD<scala.Tuple2<String,input.PortableDataStream>>
[mangled: org/apache/spark/SparkContext.binaryFiles:(Ljava/lang/String;I)Lorg/apache/spark/rdd/RDD;]
SparkContext.binaryRecords ( String path, int recordLength, org.apache.hadoop.conf.Configuration conf ) : rdd.RDD<byte[ ]>
[mangled: org/apache/spark/SparkContext.binaryRecords:(Ljava/lang/String;ILorg/apache/hadoop/conf/Configuration;)Lorg/apache/spark/rdd/RDD;]
SparkContext.clean ( F f, boolean checkSerializable ) : F
[mangled: org/apache/spark/SparkContext.clean:(Ljava/lang/Object;Z)Ljava/lang/Object;]
SparkContext.createSparkEnv ( SparkConf conf, boolean isLocal, scheduler.LiveListenerBus listenerBus ) : SparkEnv
[mangled: org/apache/spark/SparkContext.createSparkEnv:(Lorg/apache/spark/SparkConf;ZLorg/apache/spark/scheduler/LiveListenerBus;)Lorg/apache/spark/SparkEnv;]
SparkContext.eventLogCodec ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.eventLogCodec:()Lscala/Option;]
SparkContext.eventLogDir ( ) : scala.Option<java.net.URI>
[mangled: org/apache/spark/SparkContext.eventLogDir:()Lscala/Option;]
SparkContext.executorAllocationManager ( ) : scala.Option<ExecutorAllocationManager>
[mangled: org/apache/spark/SparkContext.executorAllocationManager:()Lscala/Option;]
SparkContext.externalBlockStoreFolderName ( ) : String
[mangled: org/apache/spark/SparkContext.externalBlockStoreFolderName:()Ljava/lang/String;]
SparkContext.getCallSite ( ) : util.CallSite
[mangled: org/apache/spark/SparkContext.getCallSite:()Lorg/apache/spark/util/CallSite;]
SparkContext.getExecutorThreadDump ( String executorId ) : scala.Option<util.ThreadStackTrace[ ]>
[mangled: org/apache/spark/SparkContext.getExecutorThreadDump:(Ljava/lang/String;)Lscala/Option;]
SparkContext.getOrCreate ( ) [static] : SparkContext
[mangled: org/apache/spark/SparkContext.getOrCreate:()Lorg/apache/spark/SparkContext;]
SparkContext.getOrCreate ( SparkConf p1 ) [static] : SparkContext
[mangled: org/apache/spark/SparkContext.getOrCreate:(Lorg/apache/spark/SparkConf;)Lorg/apache/spark/SparkContext;]
SparkContext.isEventLogEnabled ( ) : boolean
[mangled: org/apache/spark/SparkContext.isEventLogEnabled:()Z]
SparkContext.jobProgressListener ( ) : ui.jobs.JobProgressListener
[mangled: org/apache/spark/SparkContext.jobProgressListener:()Lorg/apache/spark/ui/jobs/JobProgressListener;]
SparkContext.killExecutor ( String executorId ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutor:(Ljava/lang/String;)Z]
SparkContext.killExecutors ( scala.collection.Seq<String> executorIds ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutors:(Lscala/collection/Seq;)Z]
SparkContext.logName ( ) : String
[mangled: org/apache/spark/SparkContext.logName:()Ljava/lang/String;]
SparkContext.metricsSystem ( ) : metrics.MetricsSystem
[mangled: org/apache/spark/SparkContext.metricsSystem:()Lorg/apache/spark/metrics/MetricsSystem;]
SparkContext.SparkContext.._conf ( ) : SparkConf
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._conf:()Lorg/apache/spark/SparkConf;]
SparkContext.SparkContext.._env ( ) : SparkEnv
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._env:()Lorg/apache/spark/SparkEnv;]
SparkContext.SparkContext..assertNotStopped ( ) : void
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext..assertNotStopped:()V]
SparkContext.SparkContext..creationSite ( ) : util.CallSite
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext..creationSite:()Lorg/apache/spark/util/CallSite;]
SparkContext.progressBar ( ) : scala.Option<ui.ConsoleProgressBar>
[mangled: org/apache/spark/SparkContext.progressBar:()Lscala/Option;]
SparkContext.range ( long start, long end, long step, int numSlices ) : rdd.RDD<Object>
[mangled: org/apache/spark/SparkContext.range:(JJJI)Lorg/apache/spark/rdd/RDD;]
SparkContext.requestExecutors ( int numAdditionalExecutors ) : boolean
[mangled: org/apache/spark/SparkContext.requestExecutors:(I)Z]
SparkContext.requestTotalExecutors ( int numExecutors ) : boolean
[mangled: org/apache/spark/SparkContext.requestTotalExecutors:(I)Z]
SparkContext.schedulerBackend ( ) : scheduler.SchedulerBackend
[mangled: org/apache/spark/SparkContext.schedulerBackend:()Lorg/apache/spark/scheduler/SchedulerBackend;]
SparkContext.schedulerBackend_.eq ( scheduler.SchedulerBackend sb ) : void
[mangled: org/apache/spark/SparkContext.schedulerBackend_.eq:(Lorg/apache/spark/scheduler/SchedulerBackend;)V]
SparkContext.setCallSite ( util.CallSite callSite ) : void
[mangled: org/apache/spark/SparkContext.setCallSite:(Lorg/apache/spark/util/CallSite;)V]
SparkContext.setLogLevel ( String logLevel ) : void
[mangled: org/apache/spark/SparkContext.setLogLevel:(Ljava/lang/String;)V]
SparkContext.statusTracker ( ) : SparkStatusTracker
[mangled: org/apache/spark/SparkContext.statusTracker:()Lorg/apache/spark/SparkStatusTracker;]
SparkContext.supportDynamicAllocation ( ) : boolean
[mangled: org/apache/spark/SparkContext.supportDynamicAllocation:()Z]
SparkContext.ui ( ) : scala.Option<ui.SparkUI>
[mangled: org/apache/spark/SparkContext.ui:()Lscala/Option;]
SparkContext.withScope ( scala.Function0<U> body ) : U
[mangled: org/apache/spark/SparkContext.withScope:(Lscala/Function0;)Ljava/lang/Object;]
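A minimal sketch touching several of the SparkContext additions above (getOrCreate, setLogLevel, range, statusTracker), assuming a local 1.4.0 runtime; the app name and master are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkContextAdditions {
  def main(args: Array[String]): Unit = {
    // getOrCreate (added in 1.4.0) reuses an already-running context instead of failing
    val sc = SparkContext.getOrCreate(new SparkConf().setAppName("ctx-demo").setMaster("local[2]"))

    sc.setLogLevel("ERROR")                        // added in 1.4.0
    println(s"applicationId=${sc.applicationId}")  // absent in spark-core 1.0.0

    val evens = sc.range(0L, 100L, step = 2L, numSlices = 4)   // RDD of Long, added in 1.4.0
    println(s"count=${evens.count()}")

    println(s"active stages: ${sc.statusTracker.getActiveStageIds().mkString(",")}")
    sc.stop()
  }
}
```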
spark-core_2.10-1.4.0.jar, TaskContext.class
package org.apache.spark
TaskContext.addTaskCompletionListener ( util.TaskCompletionListener p1 ) [abstract] : TaskContext
[mangled: org/apache/spark/TaskContext.addTaskCompletionListener:(Lorg/apache/spark/util/TaskCompletionListener;)Lorg/apache/spark/TaskContext;]
TaskContext.addTaskCompletionListener ( scala.Function1<TaskContext,scala.runtime.BoxedUnit> p1 ) [abstract] : TaskContext
[mangled: org/apache/spark/TaskContext.addTaskCompletionListener:(Lscala/Function1;)Lorg/apache/spark/TaskContext;]
TaskContext.attemptNumber ( ) [abstract] : int
[mangled: org/apache/spark/TaskContext.attemptNumber:()I]
TaskContext.get ( ) [static] : TaskContext
[mangled: org/apache/spark/TaskContext.get:()Lorg/apache/spark/TaskContext;]
TaskContext.isCompleted ( ) [abstract] : boolean
[mangled: org/apache/spark/TaskContext.isCompleted:()Z]
TaskContext.isInterrupted ( ) [abstract] : boolean
[mangled: org/apache/spark/TaskContext.isInterrupted:()Z]
TaskContext.isRunningLocally ( ) [abstract] : boolean
[mangled: org/apache/spark/TaskContext.isRunningLocally:()Z]
TaskContext.taskAttemptId ( ) [abstract] : long
[mangled: org/apache/spark/TaskContext.taskAttemptId:()J]
TaskContext.TaskContext ( )
[mangled: org/apache/spark/TaskContext."<init>":()V]
TaskContext.taskMemoryManager ( ) [abstract] : unsafe.memory.TaskMemoryManager
[mangled: org/apache/spark/TaskContext.taskMemoryManager:()Lorg/apache/spark/unsafe/memory/TaskMemoryManager;]
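A minimal sketch of the new task-side accessors listed above, read through the static TaskContext.get() entry point from inside a running task:

```scala
import org.apache.spark.{SparkConf, SparkContext, TaskContext}

object TaskContextAccessors {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("tc-demo").setMaster("local[2]"))

    val described = sc.parallelize(1 to 8, 2).map { n =>
      val ctx = TaskContext.get()   // static accessor, absent in 1.0.0
      s"value=$n partition=${ctx.partitionId()} attempt=${ctx.attemptNumber()} taskAttemptId=${ctx.taskAttemptId()}"
    }
    described.collect().foreach(println)

    sc.stop()
  }
}
```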
Removed Methods (11)
spark-core_2.10-1.0.0.jar, RDD<T>.class
package org.apache.spark.rdd
RDD<T>.creationSiteInfo ( ) : org.apache.spark.util.Utils.CallSiteInfo
[mangled: org/apache/spark/rdd/RDD<T>.creationSiteInfo:()Lorg/apache/spark/util/Utils$CallSiteInfo;]
spark-core_2.10-1.0.0.jar, SparkConf.class
package org.apache.spark
SparkConf.SparkConf..settings ( ) : scala.collection.mutable.HashMap<String,String>
[mangled: org/apache/spark/SparkConf.org.apache.spark.SparkConf..settings:()Lscala/collection/mutable/HashMap;]
spark-core_2.10-1.0.0.jar, SparkContext.class
package org.apache.spark
SparkContext.clean ( F f ) : F
[mangled: org/apache/spark/SparkContext.clean:(Ljava/lang/Object;)Ljava/lang/Object;]
SparkContext.getCallSite ( ) : String
[mangled: org/apache/spark/SparkContext.getCallSite:()Ljava/lang/String;]
SparkContext.ui ( ) : ui.SparkUI
[mangled: org/apache/spark/SparkContext.ui:()Lorg/apache/spark/ui/SparkUI;]
spark-core_2.10-1.0.0.jar, TaskContext.class
package org.apache.spark
TaskContext.completed ( ) : boolean
[mangled: org/apache/spark/TaskContext.completed:()Z]
TaskContext.completed_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/TaskContext.completed_.eq:(Z)V]
TaskContext.executeOnCompleteCallbacks ( ) : void
[mangled: org/apache/spark/TaskContext.executeOnCompleteCallbacks:()V]
TaskContext.interrupted ( ) : boolean
[mangled: org/apache/spark/TaskContext.interrupted:()Z]
TaskContext.interrupted_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/TaskContext.interrupted_.eq:(Z)V]
TaskContext.TaskContext ( int stageId, int partitionId, long attemptId, boolean runningLocally, executor.TaskMetrics taskMetrics )
[mangled: org/apache/spark/TaskContext."<init>":(IIJZLorg/apache/spark/executor/TaskMetrics;)V]
Problems with Data Types, High Severity (20)
spark-core_2.10-1.0.0.jar
package org.apache.spark
[+] TaskContext (16)
| | Change | Effect |
|---|---|---|
1 | This class became abstract. | A client program may be interrupted by InstantiationError exception. |
2 | Method addOnCompleteCallback ( scala.Function0<scala.runtime.BoxedUnit> ) became abstract. | A client program may be interrupted by InstantiationError exception. |
3 | Method attemptId ( ) became abstract. | A client program may be interrupted by InstantiationError exception. |
4 | Method partitionId ( ) became abstract. | A client program may be interrupted by InstantiationError exception. |
5 | Method runningLocally ( ) became abstract. | A client program may be interrupted by InstantiationError exception. |
6 | Method stageId ( ) became abstract. | A client program may be interrupted by InstantiationError exception. |
7 | Method taskMetrics ( ) became abstract. | A client program may be interrupted by InstantiationError exception. |
8 | Removed super-interface scala.Serializable. | A client program may be interrupted by NoSuchMethodError exception. |
9 | Abstract method addTaskCompletionListener ( util.TaskCompletionListener ) has been added to this class. | This class became abstract and a client program may be interrupted by InstantiationError exception. |
10 | Abstract method addTaskCompletionListener ( scala.Function1<TaskContext,scala.runtime.BoxedUnit> ) has been added to this class. | This class became abstract and a client program may be interrupted by InstantiationError exception. |
11 | Abstract method attemptNumber ( ) has been added to this class. | This class became abstract and a client program may be interrupted by InstantiationError exception. |
12 | Abstract method isCompleted ( ) has been added to this class. | This class became abstract and a client program may be interrupted by InstantiationError exception. |
13 | Abstract method isInterrupted ( ) has been added to this class. | This class became abstract and a client program may be interrupted by InstantiationError exception. |
14 | Abstract method isRunningLocally ( ) has been added to this class. | This class became abstract and a client program may be interrupted by InstantiationError exception. |
15 | Abstract method taskAttemptId ( ) has been added to this class. | This class became abstract and a client program may be interrupted by InstantiationError exception. |
16 | Abstract method taskMemoryManager ( ) has been added to this class. | This class became abstract and a client program may be interrupted by InstantiationError exception. |
[+] affected methods (9)
compute ( Partition, TaskContext ) - 2nd parameter 'p2' of this abstract method has type 'TaskContext'.
computeOrReadCheckpoint ( Partition, TaskContext ) - 2nd parameter 'context' of this method has type 'TaskContext'.
iterator ( Partition, TaskContext ) - 2nd parameter 'context' of this method has type 'TaskContext'.
addOnCompleteCallback ( scala.Function0<scala.runtime.BoxedUnit> ) - This method is from 'TaskContext' class.
attemptId ( ) - This method is from 'TaskContext' class.
partitionId ( ) - This method is from 'TaskContext' class.
runningLocally ( ) - This method is from 'TaskContext' class.
stageId ( ) - This method is from 'TaskContext' class.
taskMetrics ( ) - This method is from 'TaskContext' class.
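The effect column above boils down to this: a call site compiled against 1.0.0 that does `new TaskContext(...)` can no longer be linked once the class is abstract. A small self-contained check that makes the change visible on whatever spark-core version is on the classpath:

```scala
import java.lang.reflect.Modifier

import org.apache.spark.TaskContext

object TaskContextAbstractCheck {
  def main(args: Array[String]): Unit = {
    val mods = classOf[TaskContext].getModifiers
    // true on spark-core 1.4.0, false on 1.0.0; a 1.0.0-compiled `new TaskContext(...)`
    // call site therefore fails with InstantiationError when run against 1.4.0
    println(s"TaskContext is abstract on this classpath: ${Modifier.isAbstract(mods)}")
  }
}
```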
package org.apache.spark.api.java
[+] JavaDoubleRDD (1)
| | Change | Effect |
|---|---|---|
1 | Removed super-interface JavaRDDLike<java.lang.Double,JavaDoubleRDD>. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (3)
parallelizeDoubles ( java.util.List<java.lang.Double> ) - Return value of this method has type 'JavaDoubleRDD'.
parallelizeDoubles ( java.util.List<java.lang.Double>, int ) - Return value of this method has type 'JavaDoubleRDD'.
union ( JavaDoubleRDD, java.util.List<JavaDoubleRDD> ) - 1st parameter 'first' of this method has type 'JavaDoubleRDD'.
[+] JavaPairRDD<K,V> (1)
| | Change | Effect |
|---|---|---|
1 | Removed super-interface JavaRDDLike<scala.Tuple2<K,V>,JavaPairRDD<K,V>>. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (11)
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int ) - Return value of this method has type 'JavaPairRDD<K,V>'.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int ) - Return value of this method has type 'JavaPairRDD<K,V>'.
newAPIHadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, org.apache.hadoop.conf.Configuration ) - Return value of this method has type 'JavaPairRDD<K,V>'.
newAPIHadoopRDD ( org.apache.hadoop.conf.Configuration, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
parallelizePairs ( java.util.List<scala.Tuple2<K,V>> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
parallelizePairs ( java.util.List<scala.Tuple2<K,V>>, int ) - Return value of this method has type 'JavaPairRDD<K,V>'.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V>, int ) - Return value of this method has type 'JavaPairRDD<K,V>'.
union ( JavaPairRDD<K,V>, java.util.List<JavaPairRDD<K,V>> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
[+] JavaRDD<T> (1)
| | Change | Effect |
|---|---|---|
1 | Removed super-interface JavaRDDLike<T,JavaRDD<T>>. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (7)
checkpointFile ( java.lang.String ) - Return value of this method has type 'JavaRDD<T>'.
objectFile ( java.lang.String ) - Return value of this method has type 'JavaRDD<T>'.
objectFile ( java.lang.String, int ) - Return value of this method has type 'JavaRDD<T>'.
parallelize ( java.util.List<T> ) - Return value of this method has type 'JavaRDD<T>'.
parallelize ( java.util.List<T>, int ) - Return value of this method has type 'JavaRDD<T>'.
union ( JavaRDD<T>, java.util.List<JavaRDD<T>> ) - Return value of this method has type 'JavaRDD<T>'.
toJavaRDD ( ) - Return value of this method has type 'JavaRDD<T>'.
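The usage pattern the checker is guarding against here is client code that widens a JavaRDD to the JavaRDDLike interface and calls through that reference. A hedged sketch of that pattern follows; it compiles against either version, since 1.4.0 still reaches JavaRDDLike indirectly via the new AbstractJavaRDDLike super-class noted in the low-severity section below:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.api.java.{JavaRDD, JavaRDDLike, JavaSparkContext}

object JavaRddLikeUsage {
  def main(args: Array[String]): Unit = {
    val jsc = new JavaSparkContext(new SparkConf().setAppName("jrdd-demo").setMaster("local[2]"))

    val rdd = jsc.parallelize(java.util.Arrays.asList(1, 2, 3))
    val like: JavaRDDLike[Int, JavaRDD[Int]] = rdd   // interface-typed reference to a JavaRDD
    println(s"count=${like.count()}")                // dispatched through JavaRDDLike

    jsc.stop()
  }
}
```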
package org.apache.spark.rdd
[+] PairRDDFunctions<K,V> (1)
| | Change | Effect |
|---|---|---|
1 | Removed super-interface org.apache.hadoop.mapreduce.SparkHadoopMapReduceUtil. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (1)
rddToPairRDDFunctions ( RDD<scala.Tuple2<K,V>>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.math.Ordering<K> ) - Return value of this method has type 'PairRDDFunctions<K,V>'.
Problems with Methods, High Severity (6)
spark-core_2.10-1.0.0.jar, TaskContext
package org.apache.spark
[+] TaskContext.addOnCompleteCallback ( scala.Function0<scala.runtime.BoxedUnit> f ) : void (1)
[mangled: org/apache/spark/TaskContext.addOnCompleteCallback:(Lscala/Function0;)V]
| | Change | Effect |
|---|---|---|
| 1 | Method became abstract. | A client program trying to create an instance of the method's class may be interrupted by InstantiationError exception. |
[+] TaskContext.attemptId ( ) : long (1)
[mangled: org/apache/spark/TaskContext.attemptId:()J]
| | Change | Effect |
|---|---|---|
| 1 | Method became abstract. | A client program trying to create an instance of the method's class may be interrupted by InstantiationError exception. |
[+] TaskContext.partitionId ( ) : int (1)
[mangled: org/apache/spark/TaskContext.partitionId:()I]
| | Change | Effect |
|---|---|---|
| 1 | Method became abstract. | A client program trying to create an instance of the method's class may be interrupted by InstantiationError exception. |
[+] TaskContext.runningLocally ( ) : boolean (1)
[mangled: org/apache/spark/TaskContext.runningLocally:()Z]
| | Change | Effect |
|---|---|---|
| 1 | Method became abstract. | A client program trying to create an instance of the method's class may be interrupted by InstantiationError exception. |
[+] TaskContext.stageId ( ) : int (1)
[mangled: org/apache/spark/TaskContext.stageId:()I]
| | Change | Effect |
|---|---|---|
| 1 | Method became abstract. | A client program trying to create an instance of the method's class may be interrupted by InstantiationError exception. |
[+] TaskContext.taskMetrics ( ) : executor.TaskMetrics (1)
[mangled: org/apache/spark/TaskContext.taskMetrics:()Lorg/apache/spark/executor/TaskMetrics;]
| | Change | Effect |
|---|---|---|
| 1 | Method became abstract. | A client program trying to create an instance of the method's class may be interrupted by InstantiationError exception. |
Problems with Data Types, Low Severity (4)
spark-core_2.10-1.0.0.jar
package org.apache.spark.api.java
[+] JavaDoubleRDD (1)
| | Change | Effect |
|---|---|---|
1 | Added super-class AbstractJavaRDDLike<java.lang.Double,JavaDoubleRDD>. | A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (3)
parallelizeDoubles ( java.util.List<java.lang.Double> ) - Return value of this method has type 'JavaDoubleRDD'.
parallelizeDoubles ( java.util.List<java.lang.Double>, int ) - Return value of this method has type 'JavaDoubleRDD'.
union ( JavaDoubleRDD, java.util.List<JavaDoubleRDD> ) - Return value of this method has type 'JavaDoubleRDD'.
[+] JavaPairRDD<K,V> (1)
| | Change | Effect |
|---|---|---|
1 | Added super-class AbstractJavaRDDLike<scala.Tuple2<K,V>,JavaPairRDD<K,V>>. | A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (11)
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int ) - Return value of this method has type 'JavaPairRDD<K,V>'.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int ) - Return value of this method has type 'JavaPairRDD<K,V>'.
newAPIHadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, org.apache.hadoop.conf.Configuration ) - Return value of this method has type 'JavaPairRDD<K,V>'.
newAPIHadoopRDD ( org.apache.hadoop.conf.Configuration, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
parallelizePairs ( java.util.List<scala.Tuple2<K,V>> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
parallelizePairs ( java.util.List<scala.Tuple2<K,V>>, int ) - Return value of this method has type 'JavaPairRDD<K,V>'.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V>, int ) - Return value of this method has type 'JavaPairRDD<K,V>'.
union ( JavaPairRDD<K,V>, java.util.List<JavaPairRDD<K,V>> ) - 1st parameter 'first' of this method has type 'JavaPairRDD<K,V>'.
[+] JavaRDD<T> (1)
| | Change | Effect |
|---|---|---|
1 | Added super-class AbstractJavaRDDLike<T,JavaRDD<T>>. | A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (7)
checkpointFile ( java.lang.String ) - Return value of this method has type 'JavaRDD<T>'.
objectFile ( java.lang.String ) - Return value of this method has type 'JavaRDD<T>'.
objectFile ( java.lang.String, int ) - Return value of this method has type 'JavaRDD<T>'.
parallelize ( java.util.List<T> ) - Return value of this method has type 'JavaRDD<T>'.
parallelize ( java.util.List<T>, int ) - Return value of this method has type 'JavaRDD<T>'.
union ( JavaRDD<T>, java.util.List<JavaRDD<T>> ) - 1st parameter 'first' of this method has type 'JavaRDD<T>'.
toJavaRDD ( ) - Return value of this method has type 'JavaRDD<T>'.
package org.apache.spark.scheduler
[+] LiveListenerBus (1)
| | Change | Effect |
|---|---|---|
1 | Added super-class org.apache.spark.util.AsynchronousListenerBus<SparkListener,SparkListenerEvent>. | A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (1)
listenerBus ( ) - Return value of this method has type 'LiveListenerBus'.
Other Changes in Data Types (1)
spark-core_2.10-1.0.0.jar
package org.apache.spark.broadcast
[+] Broadcast<T> (1)
| | Change | Effect |
|---|---|---|
1 | Added super-interface org.apache.spark.Logging. | No effect. |
[+] affected methods (2)
broadcast ( T ) - Return value of this method has type 'Broadcast<T>'.
broadcast ( T, scala.reflect.ClassTag<T> ) - Return value of this method has type 'Broadcast<T>'.
Java ARchives (1)
spark-core_2.10-1.0.0.jar
Generated on Mon Jun 29 20:27:03 2015 for elasticsearch-spark_2.10-2.1.0.Beta1 by Java API Compliance Checker 1.4.1
A tool for checking backward compatibility of a Java library API