Binary compatibility report for the infinispan-spark_2.10-0.1 library between versions 1.4.0 and 1.0.0 (relating to the portability of the client application infinispan-spark_2.10-0.1.jar)
Test Info

| Library Name | infinispan-spark_2.10-0.1 |
|---|---|
| Version #1 | 1.4.0 |
| Version #2 | 1.0.0 |
| Java Version | 1.7.0_75 |
Test Results

| Total Java ARchives | 3 |
|---|---|
| Total Methods / Classes | 495 / 2720 |
| Verdict | Incompatible (79.8%) |
Problem Summary

| | Severity | Count |
|---|---|---|
| Added Methods | - | 15 |
| Removed Methods | High | 113 |
| Problems with Data Types | High | 19 |
| | Medium | 6 |
| | Low | 3 |
| Problems with Methods | High | 0 |
| | Medium | 0 |
| | Low | 3 |
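The High-severity counts above translate directly into runtime link failures: a client compiled against 1.4.0 that calls one of the 113 removed methods will still compile and load, but the first call throws NoSuchMethodError on a 1.0.0 classpath. A minimal, hedged probe for this situation is sketched below; it is demonstrated on a JDK class, since the Spark jars are not assumed to be on the classpath here, but the same helper could be pointed at a Spark class name from the listings that follow.

```java
public class CompatProbe {
    // Returns true if className can be loaded and declares a public method
    // with the given name and parameter types.
    static boolean hasMethod(String className, String methodName, Class<?>... paramTypes) {
        try {
            Class.forName(className).getMethod(methodName, paramTypes);
            return true;
        } catch (ClassNotFoundException | NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Stand-in demo on java.lang.String. Against a real deployment one
        // would probe, e.g., "org.apache.spark.SparkContext" for "setLogLevel"
        // (present in 1.4.0, absent in 1.0.0 per the Removed Methods listing).
        System.out.println(hasMethod("java.lang.String", "isEmpty"));   // true
        System.out.println(hasMethod("java.lang.String", "missing"));   // false
    }
}
```

Probing once at startup and failing fast with a clear message is usually preferable to letting a NoSuchMethodError surface mid-job.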
Added Methods (15)
spark-core_2.10-1.0.0.jar, SparkContext.class
package org.apache.spark
SparkContext.clean ( F f ) : F
[mangled: org/apache/spark/SparkContext.clean:(Ljava/lang/Object;)Ljava/lang/Object;]
SparkContext.getCallSite ( ) : String
[mangled: org/apache/spark/SparkContext.getCallSite:()Ljava/lang/String;]
SparkContext.ui ( ) : ui.SparkUI
[mangled: org/apache/spark/SparkContext.ui:()Lorg/apache/spark/ui/SparkUI;]
spark-core_2.10-1.0.0.jar, TaskContext.class
package org.apache.spark
TaskContext.completed ( ) : boolean
[mangled: org/apache/spark/TaskContext.completed:()Z]
TaskContext.completed_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/TaskContext.completed_.eq:(Z)V]
TaskContext.executeOnCompleteCallbacks ( ) : void
[mangled: org/apache/spark/TaskContext.executeOnCompleteCallbacks:()V]
TaskContext.interrupted ( ) : boolean
[mangled: org/apache/spark/TaskContext.interrupted:()Z]
TaskContext.interrupted_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/TaskContext.interrupted_.eq:(Z)V]
TaskContext.TaskContext ( int stageId, int partitionId, long attemptId, boolean runningLocally, executor.TaskMetrics taskMetrics )
[mangled: org/apache/spark/TaskContext."<init>":(IIJZLorg/apache/spark/executor/TaskMetrics;)V]
spark-streaming_2.10-1.0.0.jar, JavaStreamingContext.class
package org.apache.spark.streaming.api.java
JavaStreamingContext.fileStream ( String directory ) : JavaPairInputDStream<K,V>
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.fileStream:(Ljava/lang/String;)Lorg/apache/spark/streaming/api/java/JavaPairInputDStream;]
spark-streaming_2.10-1.0.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.getNewReceiverStreamId ( ) : int
[mangled: org/apache/spark/streaming/StreamingContext.getNewReceiverStreamId:()I]
StreamingContext.state ( ) : scala.Enumeration.Value
[mangled: org/apache/spark/streaming/StreamingContext.state:()Lscala/Enumeration$Value;]
StreamingContext.state_.eq ( scala.Enumeration.Value p1 ) : void
[mangled: org/apache/spark/streaming/StreamingContext.state_.eq:(Lscala/Enumeration$Value;)V]
StreamingContext.StreamingContextState ( ) : StreamingContext.StreamingContextState.
[mangled: org/apache/spark/streaming/StreamingContext.StreamingContextState:()Lorg/apache/spark/streaming/StreamingContext$StreamingContextState$;]
StreamingContext.uiTab ( ) : ui.StreamingTab
[mangled: org/apache/spark/streaming/StreamingContext.uiTab:()Lorg/apache/spark/streaming/ui/StreamingTab;]
Removed Methods (113)
spark-core_2.10-1.4.0.jar, DeserializationStream.class
package org.apache.spark.serializer
DeserializationStream.asKeyValueIterator ( ) : scala.collection.Iterator<scala.Tuple2<Object,Object>>
[mangled: org/apache/spark/serializer/DeserializationStream.asKeyValueIterator:()Lscala/collection/Iterator;]
DeserializationStream.DeserializationStream ( )
[mangled: org/apache/spark/serializer/DeserializationStream."<init>":()V]
DeserializationStream.readKey ( scala.reflect.ClassTag<T> p1 ) : T
[mangled: org/apache/spark/serializer/DeserializationStream.readKey:(Lscala/reflect/ClassTag;)Ljava/lang/Object;]
DeserializationStream.readValue ( scala.reflect.ClassTag<T> p1 ) : T
[mangled: org/apache/spark/serializer/DeserializationStream.readValue:(Lscala/reflect/ClassTag;)Ljava/lang/Object;]
spark-core_2.10-1.4.0.jar, JavaSparkContext.class
package org.apache.spark.api.java
JavaSparkContext.accumulable ( T initialValue, String name, org.apache.spark.AccumulableParam<T,R> param ) : org.apache.spark.Accumulable<T,R>
[mangled: org/apache/spark/api/java/JavaSparkContext.accumulable:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulableParam;)Lorg/apache/spark/Accumulable;]
JavaSparkContext.accumulator ( double initialValue, String name ) : org.apache.spark.Accumulator<Double>
[mangled: org/apache/spark/api/java/JavaSparkContext.accumulator:(DLjava/lang/String;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.accumulator ( int initialValue, String name ) : org.apache.spark.Accumulator<Integer>
[mangled: org/apache/spark/api/java/JavaSparkContext.accumulator:(ILjava/lang/String;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.accumulator ( T initialValue, String name, org.apache.spark.AccumulatorParam<T> accumulatorParam ) : org.apache.spark.Accumulator<T>
[mangled: org/apache/spark/api/java/JavaSparkContext.accumulator:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulatorParam;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.binaryFiles ( String path ) : JavaPairRDD<String,org.apache.spark.input.PortableDataStream>
[mangled: org/apache/spark/api/java/JavaSparkContext.binaryFiles:(Ljava/lang/String;)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaSparkContext.binaryFiles ( String path, int minPartitions ) : JavaPairRDD<String,org.apache.spark.input.PortableDataStream>
[mangled: org/apache/spark/api/java/JavaSparkContext.binaryFiles:(Ljava/lang/String;I)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaSparkContext.binaryRecords ( String path, int recordLength ) : JavaRDD<byte[ ]>
[mangled: org/apache/spark/api/java/JavaSparkContext.binaryRecords:(Ljava/lang/String;I)Lorg/apache/spark/api/java/JavaRDD;]
JavaSparkContext.close ( ) : void
[mangled: org/apache/spark/api/java/JavaSparkContext.close:()V]
JavaSparkContext.doubleAccumulator ( double initialValue, String name ) : org.apache.spark.Accumulator<Double>
[mangled: org/apache/spark/api/java/JavaSparkContext.doubleAccumulator:(DLjava/lang/String;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.emptyRDD ( ) : JavaRDD<T>
[mangled: org/apache/spark/api/java/JavaSparkContext.emptyRDD:()Lorg/apache/spark/api/java/JavaRDD;]
JavaSparkContext.intAccumulator ( int initialValue, String name ) : org.apache.spark.Accumulator<Integer>
[mangled: org/apache/spark/api/java/JavaSparkContext.intAccumulator:(ILjava/lang/String;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.setLogLevel ( String logLevel ) : void
[mangled: org/apache/spark/api/java/JavaSparkContext.setLogLevel:(Ljava/lang/String;)V]
JavaSparkContext.statusTracker ( ) : JavaSparkStatusTracker
[mangled: org/apache/spark/api/java/JavaSparkContext.statusTracker:()Lorg/apache/spark/api/java/JavaSparkStatusTracker;]
JavaSparkContext.version ( ) : String
[mangled: org/apache/spark/api/java/JavaSparkContext.version:()Ljava/lang/String;]
spark-core_2.10-1.4.0.jar, Logging.class
package org.apache.spark
Logging.logName ( ) [abstract] : String
[mangled: org/apache/spark/Logging.logName:()Ljava/lang/String;]
spark-core_2.10-1.4.0.jar, SerializationStream.class
package org.apache.spark.serializer
SerializationStream.SerializationStream ( )
[mangled: org/apache/spark/serializer/SerializationStream."<init>":()V]
SerializationStream.writeKey ( T key, scala.reflect.ClassTag<T> p2 ) : SerializationStream
[mangled: org/apache/spark/serializer/SerializationStream.writeKey:(Ljava/lang/Object;Lscala/reflect/ClassTag;)Lorg/apache/spark/serializer/SerializationStream;]
SerializationStream.writeValue ( T value, scala.reflect.ClassTag<T> p2 ) : SerializationStream
[mangled: org/apache/spark/serializer/SerializationStream.writeValue:(Ljava/lang/Object;Lscala/reflect/ClassTag;)Lorg/apache/spark/serializer/SerializationStream;]
spark-core_2.10-1.4.0.jar, Serializer.class
package org.apache.spark.serializer
Serializer.defaultClassLoader ( ) : scala.Option<ClassLoader>
[mangled: org/apache/spark/serializer/Serializer.defaultClassLoader:()Lscala/Option;]
Serializer.defaultClassLoader_.eq ( scala.Option<ClassLoader> p1 ) : void
[mangled: org/apache/spark/serializer/Serializer.defaultClassLoader_.eq:(Lscala/Option;)V]
Serializer.getSerializer ( Serializer p1 ) [static] : Serializer
[mangled: org/apache/spark/serializer/Serializer.getSerializer:(Lorg/apache/spark/serializer/Serializer;)Lorg/apache/spark/serializer/Serializer;]
Serializer.getSerializer ( scala.Option<Serializer> p1 ) [static] : Serializer
[mangled: org/apache/spark/serializer/Serializer.getSerializer:(Lscala/Option;)Lorg/apache/spark/serializer/Serializer;]
Serializer.Serializer ( )
[mangled: org/apache/spark/serializer/Serializer."<init>":()V]
Serializer.setDefaultClassLoader ( ClassLoader classLoader ) : Serializer
[mangled: org/apache/spark/serializer/Serializer.setDefaultClassLoader:(Ljava/lang/ClassLoader;)Lorg/apache/spark/serializer/Serializer;]
Serializer.supportsRelocationOfSerializedObjects ( ) : boolean
[mangled: org/apache/spark/serializer/Serializer.supportsRelocationOfSerializedObjects:()Z]
spark-core_2.10-1.4.0.jar, SparkContext.class
package org.apache.spark
SparkContext.accumulable ( R initialValue, String name, AccumulableParam<R,T> param ) : Accumulable<R,T>
[mangled: org/apache/spark/SparkContext.accumulable:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulableParam;)Lorg/apache/spark/Accumulable;]
SparkContext.accumulator ( T initialValue, String name, AccumulatorParam<T> param ) : Accumulator<T>
[mangled: org/apache/spark/SparkContext.accumulator:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulatorParam;)Lorg/apache/spark/Accumulator;]
SparkContext.addFile ( String path, boolean recursive ) : void
[mangled: org/apache/spark/SparkContext.addFile:(Ljava/lang/String;Z)V]
SparkContext.applicationAttemptId ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.applicationAttemptId:()Lscala/Option;]
SparkContext.applicationId ( ) : String
[mangled: org/apache/spark/SparkContext.applicationId:()Ljava/lang/String;]
SparkContext.binaryFiles ( String path, int minPartitions ) : rdd.RDD<scala.Tuple2<String,input.PortableDataStream>>
[mangled: org/apache/spark/SparkContext.binaryFiles:(Ljava/lang/String;I)Lorg/apache/spark/rdd/RDD;]
SparkContext.binaryRecords ( String path, int recordLength, org.apache.hadoop.conf.Configuration conf ) : rdd.RDD<byte[ ]>
[mangled: org/apache/spark/SparkContext.binaryRecords:(Ljava/lang/String;ILorg/apache/hadoop/conf/Configuration;)Lorg/apache/spark/rdd/RDD;]
SparkContext.clean ( F f, boolean checkSerializable ) : F
[mangled: org/apache/spark/SparkContext.clean:(Ljava/lang/Object;Z)Ljava/lang/Object;]
SparkContext.createSparkEnv ( SparkConf conf, boolean isLocal, scheduler.LiveListenerBus listenerBus ) : SparkEnv
[mangled: org/apache/spark/SparkContext.createSparkEnv:(Lorg/apache/spark/SparkConf;ZLorg/apache/spark/scheduler/LiveListenerBus;)Lorg/apache/spark/SparkEnv;]
SparkContext.eventLogCodec ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.eventLogCodec:()Lscala/Option;]
SparkContext.eventLogDir ( ) : scala.Option<java.net.URI>
[mangled: org/apache/spark/SparkContext.eventLogDir:()Lscala/Option;]
SparkContext.executorAllocationManager ( ) : scala.Option<ExecutorAllocationManager>
[mangled: org/apache/spark/SparkContext.executorAllocationManager:()Lscala/Option;]
SparkContext.externalBlockStoreFolderName ( ) : String
[mangled: org/apache/spark/SparkContext.externalBlockStoreFolderName:()Ljava/lang/String;]
SparkContext.getCallSite ( ) : util.CallSite
[mangled: org/apache/spark/SparkContext.getCallSite:()Lorg/apache/spark/util/CallSite;]
SparkContext.getExecutorThreadDump ( String executorId ) : scala.Option<util.ThreadStackTrace[ ]>
[mangled: org/apache/spark/SparkContext.getExecutorThreadDump:(Ljava/lang/String;)Lscala/Option;]
SparkContext.getOrCreate ( ) [static] : SparkContext
[mangled: org/apache/spark/SparkContext.getOrCreate:()Lorg/apache/spark/SparkContext;]
SparkContext.getOrCreate ( SparkConf p1 ) [static] : SparkContext
[mangled: org/apache/spark/SparkContext.getOrCreate:(Lorg/apache/spark/SparkConf;)Lorg/apache/spark/SparkContext;]
SparkContext.isEventLogEnabled ( ) : boolean
[mangled: org/apache/spark/SparkContext.isEventLogEnabled:()Z]
SparkContext.jobProgressListener ( ) : ui.jobs.JobProgressListener
[mangled: org/apache/spark/SparkContext.jobProgressListener:()Lorg/apache/spark/ui/jobs/JobProgressListener;]
SparkContext.killExecutor ( String executorId ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutor:(Ljava/lang/String;)Z]
SparkContext.killExecutors ( scala.collection.Seq<String> executorIds ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutors:(Lscala/collection/Seq;)Z]
SparkContext.logName ( ) : String
[mangled: org/apache/spark/SparkContext.logName:()Ljava/lang/String;]
SparkContext.metricsSystem ( ) : metrics.MetricsSystem
[mangled: org/apache/spark/SparkContext.metricsSystem:()Lorg/apache/spark/metrics/MetricsSystem;]
SparkContext.SparkContext.._conf ( ) : SparkConf
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._conf:()Lorg/apache/spark/SparkConf;]
SparkContext.SparkContext.._env ( ) : SparkEnv
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._env:()Lorg/apache/spark/SparkEnv;]
SparkContext.SparkContext..assertNotStopped ( ) : void
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext..assertNotStopped:()V]
SparkContext.SparkContext..creationSite ( ) : util.CallSite
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext..creationSite:()Lorg/apache/spark/util/CallSite;]
SparkContext.progressBar ( ) : scala.Option<ui.ConsoleProgressBar>
[mangled: org/apache/spark/SparkContext.progressBar:()Lscala/Option;]
SparkContext.range ( long start, long end, long step, int numSlices ) : rdd.RDD<Object>
[mangled: org/apache/spark/SparkContext.range:(JJJI)Lorg/apache/spark/rdd/RDD;]
SparkContext.requestExecutors ( int numAdditionalExecutors ) : boolean
[mangled: org/apache/spark/SparkContext.requestExecutors:(I)Z]
SparkContext.requestTotalExecutors ( int numExecutors ) : boolean
[mangled: org/apache/spark/SparkContext.requestTotalExecutors:(I)Z]
SparkContext.schedulerBackend ( ) : scheduler.SchedulerBackend
[mangled: org/apache/spark/SparkContext.schedulerBackend:()Lorg/apache/spark/scheduler/SchedulerBackend;]
SparkContext.schedulerBackend_.eq ( scheduler.SchedulerBackend sb ) : void
[mangled: org/apache/spark/SparkContext.schedulerBackend_.eq:(Lorg/apache/spark/scheduler/SchedulerBackend;)V]
SparkContext.setCallSite ( util.CallSite callSite ) : void
[mangled: org/apache/spark/SparkContext.setCallSite:(Lorg/apache/spark/util/CallSite;)V]
SparkContext.setLogLevel ( String logLevel ) : void
[mangled: org/apache/spark/SparkContext.setLogLevel:(Ljava/lang/String;)V]
SparkContext.statusTracker ( ) : SparkStatusTracker
[mangled: org/apache/spark/SparkContext.statusTracker:()Lorg/apache/spark/SparkStatusTracker;]
SparkContext.supportDynamicAllocation ( ) : boolean
[mangled: org/apache/spark/SparkContext.supportDynamicAllocation:()Z]
SparkContext.ui ( ) : scala.Option<ui.SparkUI>
[mangled: org/apache/spark/SparkContext.ui:()Lscala/Option;]
SparkContext.withScope ( scala.Function0<U> body ) : U
[mangled: org/apache/spark/SparkContext.withScope:(Lscala/Function0;)Ljava/lang/Object;]
spark-core_2.10-1.4.0.jar, StorageLevel.class
package org.apache.spark.storage
StorageLevel.fromString ( String p1 ) [static] : StorageLevel
[mangled: org/apache/spark/storage/StorageLevel.fromString:(Ljava/lang/String;)Lorg/apache/spark/storage/StorageLevel;]
StorageLevel.StorageLevel.._deserialized_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/storage/StorageLevel.org.apache.spark.storage.StorageLevel.._deserialized_.eq:(Z)V]
StorageLevel.StorageLevel.._replication ( ) : int
[mangled: org/apache/spark/storage/StorageLevel.org.apache.spark.storage.StorageLevel.._replication:()I]
StorageLevel.StorageLevel.._replication_.eq ( int p1 ) : void
[mangled: org/apache/spark/storage/StorageLevel.org.apache.spark.storage.StorageLevel.._replication_.eq:(I)V]
StorageLevel.StorageLevel.._useDisk_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/storage/StorageLevel.org.apache.spark.storage.StorageLevel.._useDisk_.eq:(Z)V]
StorageLevel.StorageLevel.._useMemory_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/storage/StorageLevel.org.apache.spark.storage.StorageLevel.._useMemory_.eq:(Z)V]
StorageLevel.StorageLevel.._useOffHeap_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/storage/StorageLevel.org.apache.spark.storage.StorageLevel.._useOffHeap_.eq:(Z)V]
spark-core_2.10-1.4.0.jar, TaskContext.class
package org.apache.spark
TaskContext.addTaskCompletionListener ( util.TaskCompletionListener p1 ) [abstract] : TaskContext
[mangled: org/apache/spark/TaskContext.addTaskCompletionListener:(Lorg/apache/spark/util/TaskCompletionListener;)Lorg/apache/spark/TaskContext;]
TaskContext.addTaskCompletionListener ( scala.Function1<TaskContext,scala.runtime.BoxedUnit> p1 ) [abstract] : TaskContext
[mangled: org/apache/spark/TaskContext.addTaskCompletionListener:(Lscala/Function1;)Lorg/apache/spark/TaskContext;]
TaskContext.attemptNumber ( ) [abstract] : int
[mangled: org/apache/spark/TaskContext.attemptNumber:()I]
TaskContext.get ( ) [static] : TaskContext
[mangled: org/apache/spark/TaskContext.get:()Lorg/apache/spark/TaskContext;]
TaskContext.isCompleted ( ) [abstract] : boolean
[mangled: org/apache/spark/TaskContext.isCompleted:()Z]
TaskContext.isInterrupted ( ) [abstract] : boolean
[mangled: org/apache/spark/TaskContext.isInterrupted:()Z]
TaskContext.isRunningLocally ( ) [abstract] : boolean
[mangled: org/apache/spark/TaskContext.isRunningLocally:()Z]
TaskContext.taskAttemptId ( ) [abstract] : long
[mangled: org/apache/spark/TaskContext.taskAttemptId:()J]
TaskContext.TaskContext ( )
[mangled: org/apache/spark/TaskContext."<init>":()V]
TaskContext.taskMemoryManager ( ) [abstract] : unsafe.memory.TaskMemoryManager
[mangled: org/apache/spark/TaskContext.taskMemoryManager:()Lorg/apache/spark/unsafe/memory/TaskMemoryManager;]
spark-streaming_2.10-1.4.0.jar, JavaStreamingContext.class
package org.apache.spark.streaming.api.java
JavaStreamingContext.awaitTerminationOrTimeout ( long timeout ) : boolean
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.awaitTerminationOrTimeout:(J)Z]
JavaStreamingContext.binaryRecordsStream ( String directory, int recordLength ) : JavaDStream<byte[ ]>
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.binaryRecordsStream:(Ljava/lang/String;I)Lorg/apache/spark/streaming/api/java/JavaDStream;]
JavaStreamingContext.close ( ) : void
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.close:()V]
JavaStreamingContext.fileStream ( String directory, Class<K> kClass, Class<V> vClass, Class<F> fClass ) : JavaPairInputDStream<K,V>
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.fileStream:(Ljava/lang/String;Ljava/lang/Class;Ljava/lang/Class;Ljava/lang/Class;)Lorg/apache/spark/streaming/api/java/JavaPairInputDStream;]
JavaStreamingContext.fileStream ( String directory, Class<K> kClass, Class<V> vClass, Class<F> fClass, org.apache.spark.api.java.function.Function<org.apache.hadoop.fs.Path,Boolean> filter, boolean newFilesOnly ) : JavaPairInputDStream<K,V>
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.fileStream:(Ljava/lang/String;Ljava/lang/Class;Ljava/lang/Class;Ljava/lang/Class;Lorg/apache/spark/api/java/function/Function;Z)Lorg/apache/spark/streaming/api/java/JavaPairInputDStream;]
JavaStreamingContext.fileStream ( String directory, Class<K> kClass, Class<V> vClass, Class<F> fClass, org.apache.spark.api.java.function.Function<org.apache.hadoop.fs.Path,Boolean> filter, boolean newFilesOnly, org.apache.hadoop.conf.Configuration conf ) : JavaPairInputDStream<K,V>
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.fileStream:(Ljava/lang/String;Ljava/lang/Class;Ljava/lang/Class;Ljava/lang/Class;Lorg/apache/spark/api/java/function/Function;ZLorg/apache/hadoop/conf/Configuration;)Lorg/apache/spark/streaming/api/java/JavaPairInputDStream;]
JavaStreamingContext.getOrCreate ( String p1, org.apache.spark.api.java.function.Function0<JavaStreamingContext> p2 ) [static] : JavaStreamingContext
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.getOrCreate:(Ljava/lang/String;Lorg/apache/spark/api/java/function/Function0;)Lorg/apache/spark/streaming/api/java/JavaStreamingContext;]
JavaStreamingContext.getOrCreate ( String p1, org.apache.spark.api.java.function.Function0<JavaStreamingContext> p2, org.apache.hadoop.conf.Configuration p3 ) [static] : JavaStreamingContext
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.getOrCreate:(Ljava/lang/String;Lorg/apache/spark/api/java/function/Function0;Lorg/apache/hadoop/conf/Configuration;)Lorg/apache/spark/streaming/api/java/JavaStreamingContext;]
JavaStreamingContext.getOrCreate ( String p1, org.apache.spark.api.java.function.Function0<JavaStreamingContext> p2, org.apache.hadoop.conf.Configuration p3, boolean p4 ) [static] : JavaStreamingContext
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.getOrCreate:(Ljava/lang/String;Lorg/apache/spark/api/java/function/Function0;Lorg/apache/hadoop/conf/Configuration;Z)Lorg/apache/spark/streaming/api/java/JavaStreamingContext;]
JavaStreamingContext.getState ( ) : org.apache.spark.streaming.StreamingContextState
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.getState:()Lorg/apache/spark/streaming/StreamingContextState;]
spark-streaming_2.10-1.4.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.awaitTerminationOrTimeout ( long timeout ) : boolean
[mangled: org/apache/spark/streaming/StreamingContext.awaitTerminationOrTimeout:(J)Z]
StreamingContext.binaryRecordsStream ( String directory, int recordLength ) : dstream.DStream<byte[ ]>
[mangled: org/apache/spark/streaming/StreamingContext.binaryRecordsStream:(Ljava/lang/String;I)Lorg/apache/spark/streaming/dstream/DStream;]
StreamingContext.fileStream ( String directory, scala.Function1<org.apache.hadoop.fs.Path,Object> filter, boolean newFilesOnly, org.apache.hadoop.conf.Configuration conf, scala.reflect.ClassTag<K> p5, scala.reflect.ClassTag<V> p6, scala.reflect.ClassTag<F> p7 ) : dstream.InputDStream<scala.Tuple2<K,V>>
[mangled: org/apache/spark/streaming/StreamingContext.fileStream:(Ljava/lang/String;Lscala/Function1;ZLorg/apache/hadoop/conf/Configuration;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/streaming/dstream/InputDStream;]
StreamingContext.getActive ( ) [static] : scala.Option<StreamingContext>
[mangled: org/apache/spark/streaming/StreamingContext.getActive:()Lscala/Option;]
StreamingContext.getActiveOrCreate ( scala.Function0<StreamingContext> p1 ) [static] : StreamingContext
[mangled: org/apache/spark/streaming/StreamingContext.getActiveOrCreate:(Lscala/Function0;)Lorg/apache/spark/streaming/StreamingContext;]
StreamingContext.getActiveOrCreate ( String p1, scala.Function0<StreamingContext> p2, org.apache.hadoop.conf.Configuration p3, boolean p4 ) [static] : StreamingContext
[mangled: org/apache/spark/streaming/StreamingContext.getActiveOrCreate:(Ljava/lang/String;Lscala/Function0;Lorg/apache/hadoop/conf/Configuration;Z)Lorg/apache/spark/streaming/StreamingContext;]
StreamingContext.getNewInputStreamId ( ) : int
[mangled: org/apache/spark/streaming/StreamingContext.getNewInputStreamId:()I]
StreamingContext.getState ( ) : StreamingContextState
[mangled: org/apache/spark/streaming/StreamingContext.getState:()Lorg/apache/spark/streaming/StreamingContextState;]
StreamingContext.isCheckpointingEnabled ( ) : boolean
[mangled: org/apache/spark/streaming/StreamingContext.isCheckpointingEnabled:()Z]
StreamingContext.logName ( ) : String
[mangled: org/apache/spark/streaming/StreamingContext.logName:()Ljava/lang/String;]
StreamingContext.StreamingContext..startSite ( ) : java.util.concurrent.atomic.AtomicReference<org.apache.spark.util.CallSite>
[mangled: org/apache/spark/streaming/StreamingContext.org.apache.spark.streaming.StreamingContext..startSite:()Ljava/util/concurrent/atomic/AtomicReference;]
StreamingContext.StreamingContext..stopOnShutdown ( ) : void
[mangled: org/apache/spark/streaming/StreamingContext.org.apache.spark.streaming.StreamingContext..stopOnShutdown:()V]
StreamingContext.progressListener ( ) : ui.StreamingJobProgressListener
[mangled: org/apache/spark/streaming/StreamingContext.progressListener:()Lorg/apache/spark/streaming/ui/StreamingJobProgressListener;]
StreamingContext.StreamingContext ( String path )
[mangled: org/apache/spark/streaming/StreamingContext."<init>":(Ljava/lang/String;)V]
StreamingContext.StreamingContext ( String path, org.apache.spark.SparkContext sparkContext )
[mangled: org/apache/spark/streaming/StreamingContext."<init>":(Ljava/lang/String;Lorg/apache/spark/SparkContext;)V]
StreamingContext.uiTab ( ) : scala.Option<ui.StreamingTab>
[mangled: org/apache/spark/streaming/StreamingContext.uiTab:()Lscala/Option;]
StreamingContext.withNamedScope ( String name, scala.Function0<U> body ) : U
[mangled: org/apache/spark/streaming/StreamingContext.withNamedScope:(Ljava/lang/String;Lscala/Function0;)Ljava/lang/Object;]
StreamingContext.withScope ( scala.Function0<U> body ) : U
[mangled: org/apache/spark/streaming/StreamingContext.withScope:(Lscala/Function0;)Ljava/lang/Object;]
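Several of the removed streaming methods, such as awaitTerminationOrTimeout ( long ), are ones an application is likely to call on its main path. A hedged sketch of a runtime guard for code that must run against both API surfaces: invoke the method reflectively when it exists (1.4.0), and degrade to a plain timed wait when it does not (1.0.0). The fallback behavior here is an illustrative choice, not anything the Spark API prescribes.

```java
import java.lang.reflect.Method;

public class StreamingAwait {
    // Calls awaitTerminationOrTimeout(long) reflectively when the runtime
    // class provides it (1.4.0 does; 1.0.0 does not, per the listing above).
    // Otherwise sleeps for the timeout and reports "not terminated" (false).
    static boolean awaitTerminationOrTimeout(Object streamingContext, long timeoutMs)
            throws Exception {
        try {
            Method m = streamingContext.getClass()
                                       .getMethod("awaitTerminationOrTimeout", long.class);
            return (Boolean) m.invoke(streamingContext, timeoutMs);
        } catch (NoSuchMethodException e) {
            Thread.sleep(timeoutMs); // fallback for the older API surface
            return false;
        }
    }

    public static void main(String[] args) throws Exception {
        // Stand-in object without the method, modelling the 1.0.0 surface.
        System.out.println(awaitTerminationOrTimeout(new Object(), 10)); // false
    }
}
```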
Problems with Data Types, High Severity (19)
spark-core_2.10-1.4.0.jar
package org.apache.spark
[+] Logging (1)

| | Change | Effect |
|---|---|---|
| 1 | Abstract method logName ( ) has been removed from this interface. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (14)
isTraceEnabled ( ) This abstract method is from 'Logging' interface.
log ( ) This abstract method is from 'Logging' interface.
logDebug ( scala.Function0<java.lang.String> ) This abstract method is from 'Logging' interface.
logDebug ( scala.Function0<java.lang.String>, java.lang.Throwable ) This abstract method is from 'Logging' interface.
logError ( scala.Function0<java.lang.String> ) This abstract method is from 'Logging' interface.
logError ( scala.Function0<java.lang.String>, java.lang.Throwable ) This abstract method is from 'Logging' interface.
logInfo ( scala.Function0<java.lang.String> ) This abstract method is from 'Logging' interface.
logInfo ( scala.Function0<java.lang.String>, java.lang.Throwable ) This abstract method is from 'Logging' interface.
logTrace ( scala.Function0<java.lang.String> ) This abstract method is from 'Logging' interface.
logTrace ( scala.Function0<java.lang.String>, java.lang.Throwable ) This abstract method is from 'Logging' interface.
logWarning ( scala.Function0<java.lang.String> ) This abstract method is from 'Logging' interface.
logWarning ( scala.Function0<java.lang.String>, java.lang.Throwable ) This abstract method is from 'Logging' interface.
Logging..log_ ( ) This abstract method is from 'Logging' interface.
Logging..log__.eq ( org.slf4j.Logger ) This abstract method is from 'Logging' interface.
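Why removing an abstract interface method is High severity: already-compiled callers keep an invocation site that dispatches through the interface, and that site fails at link time even though no source in the client mentions the missing version. The sketch below models this with small hypothetical stand-in interfaces (Spark's actual Logging trait is not reproduced here).

```java
// Hypothetical stand-ins modelling the two API versions of the interface.
interface LoggingV14 { String logName(); } // models 1.4.0: logName() declared
interface LoggingV10 { }                   // models 1.0.0: logName() removed

public class LogNameDemo {
    // A client compiled against the 1.4.0 interface dispatches logName()
    // through it. If the interface on the runtime classpath no longer
    // declares the method (as in 1.0.0), this already-compiled invocation
    // site throws NoSuchMethodError; recompiling against 1.0.0 would instead
    // surface the problem as a compile error.
    static String describe(LoggingV14 logger) {
        return "logger: " + logger.logName();
    }

    public static void main(String[] args) {
        System.out.println(describe(() -> "MyJob")); // logger: MyJob
    }
}
```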
[+] SparkContext (1)

| | Change | Effect |
|---|---|---|
| 1 | Removed super-interface ExecutorAllocationClient. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (151)
fromSparkContext ( SparkContext ) 1st parameter 'p1' of this method has type 'SparkContext'.
JavaSparkContext ( SparkContext ) 1st parameter 'sc' of this method has type 'SparkContext'.
sc ( ) Return value of this method has type 'SparkContext'.
toSparkContext ( api.java.JavaSparkContext ) Return value of this method has type 'SparkContext'.
accumulable ( R, AccumulableParam<R,T> ) This method is from 'SparkContext' class.
accumulableCollection ( R, scala.Function1<R,scala.collection.generic.Growable<T>>, scala.reflect.ClassTag<R> ) This method is from 'SparkContext' class.
accumulator ( T, AccumulatorParam<T> ) This method is from 'SparkContext' class.
addedFiles ( ) This method is from 'SparkContext' class.
addedJars ( ) This method is from 'SparkContext' class.
addFile ( java.lang.String ) This method is from 'SparkContext' class.
addJar ( java.lang.String ) This method is from 'SparkContext' class.
addSparkListener ( scheduler.SparkListener ) This method is from 'SparkContext' class.
appName ( ) This method is from 'SparkContext' class.
booleanWritableConverter ( ) This method is from 'SparkContext' class.
boolToBoolWritable ( boolean ) This method is from 'SparkContext' class.
broadcast ( T, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
bytesToBytesWritable ( byte[ ] ) This method is from 'SparkContext' class.
bytesWritableConverter ( ) This method is from 'SparkContext' class.
cancelAllJobs ( ) This method is from 'SparkContext' class.
cancelJob ( int ) This method is from 'SparkContext' class.
cancelJobGroup ( java.lang.String ) This method is from 'SparkContext' class.
cancelStage ( int ) This method is from 'SparkContext' class.
checkpointDir ( ) This method is from 'SparkContext' class.
checkpointDir_.eq ( scala.Option<java.lang.String> ) This method is from 'SparkContext' class.
checkpointFile ( java.lang.String, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
cleaner ( ) This method is from 'SparkContext' class.
cleanup ( long ) This method is from 'SparkContext' class.
clearCallSite ( ) This method is from 'SparkContext' class.
clearJobGroup ( ) This method is from 'SparkContext' class.
conf ( ) This method is from 'SparkContext' class.
dagScheduler ( ) This method is from 'SparkContext' class.
dagScheduler_.eq ( scheduler.DAGScheduler ) This method is from 'SparkContext' class.
defaultMinPartitions ( ) This method is from 'SparkContext' class.
defaultParallelism ( ) This method is from 'SparkContext' class.
doubleRDDToDoubleRDDFunctions ( rdd.RDD<java.lang.Object> ) This method is from 'SparkContext' class.
doubleToDoubleWritable ( double ) This method is from 'SparkContext' class.
doubleWritableConverter ( ) This method is from 'SparkContext' class.
emptyRDD ( scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
env ( ) This method is from 'SparkContext' class.
eventLogger ( ) This method is from 'SparkContext' class.
executorEnvs ( ) This method is from 'SparkContext' class.
executorMemory ( ) This method is from 'SparkContext' class.
files ( ) This method is from 'SparkContext' class.
floatToFloatWritable ( float ) This method is from 'SparkContext' class.
floatWritableConverter ( ) This method is from 'SparkContext' class.
getAllPools ( ) This method is from 'SparkContext' class.
getCheckpointDir ( ) This method is from 'SparkContext' class.
getConf ( ) This method is from 'SparkContext' class.
getExecutorMemoryStatus ( ) This method is from 'SparkContext' class.
getExecutorStorageStatus ( ) This method is from 'SparkContext' class.
getLocalProperties ( ) This method is from 'SparkContext' class.
getLocalProperty ( java.lang.String ) This method is from 'SparkContext' class.
getPersistentRDDs ( ) This method is from 'SparkContext' class.
getPoolForName ( java.lang.String ) This method is from 'SparkContext' class.
getPreferredLocs ( rdd.RDD<?>, int ) This method is from 'SparkContext' class.
getRDDStorageInfo ( ) This method is from 'SparkContext' class.
getSchedulingMode ( ) This method is from 'SparkContext' class.
getSparkHome ( ) This method is from 'SparkContext' class.
hadoopConfiguration ( ) This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, int, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, java.lang.Class<? extends org.apache.hadoop.mapred.InputFormat<K,V>>, java.lang.Class<K>, java.lang.Class<V>, int ) This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) This method is from 'SparkContext' class.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<? extends org.apache.hadoop.mapred.InputFormat<K,V>>, java.lang.Class<K>, java.lang.Class<V>, int ) This method is from 'SparkContext' class.
intToIntWritable ( int ) This method is from 'SparkContext' class.
intWritableConverter ( ) This method is from 'SparkContext' class.
isLocal ( ) This method is from 'SparkContext' class.
isTraceEnabled ( ) This method is from 'SparkContext' class.
jarOfClass ( java.lang.Class<?> ) This method is from 'SparkContext' class.
jarOfObject ( java.lang.Object ) This method is from 'SparkContext' class.
jars ( ) This method is from 'SparkContext' class.
listenerBus ( ) This method is from 'SparkContext' class.
log ( ) This method is from 'SparkContext' class.
logDebug ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logDebug ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
logError ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logError ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
logInfo ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logInfo ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
logTrace ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logTrace ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
logWarning ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logWarning ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
longToLongWritable ( long ) This method is from 'SparkContext' class.
longWritableConverter ( ) This method is from 'SparkContext' class.
makeRDD ( scala.collection.Seq<scala.Tuple2<T,scala.collection.Seq<java.lang.String>>>, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
makeRDD ( scala.collection.Seq<T>, int, scala.reflect.ClassTag<T> )This method is from 'SparkContext' class.
master ( )This method is from 'SparkContext' class.
metadataCleaner ( )This method is from 'SparkContext' class.
newAPIHadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, org.apache.hadoop.conf.Configuration )This method is from 'SparkContext' class.
newAPIHadoopFile ( java.lang.String, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> )This method is from 'SparkContext' class.
newAPIHadoopRDD ( org.apache.hadoop.conf.Configuration, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> )This method is from 'SparkContext' class.
newRddId ( )This method is from 'SparkContext' class.
newShuffleId ( )This method is from 'SparkContext' class.
numericRDDToDoubleRDDFunctions ( rdd.RDD<T>, scala.math.Numeric<T> )This method is from 'SparkContext' class.
objectFile ( java.lang.String, int, scala.reflect.ClassTag<T> )This method is from 'SparkContext' class.
Logging..log_ ( )This method is from 'SparkContext' class.
Logging..log__.eq ( org.slf4j.Logger )This method is from 'SparkContext' class.
SparkContext..warnSparkMem ( java.lang.String )This method is from 'SparkContext' class.
parallelize ( scala.collection.Seq<T>, int, scala.reflect.ClassTag<T> )This method is from 'SparkContext' class.
persistentRdds ( )This method is from 'SparkContext' class.
persistRDD ( rdd.RDD<?> )This method is from 'SparkContext' class.
preferredNodeLocationData ( )This method is from 'SparkContext' class.
preferredNodeLocationData_.eq ( scala.collection.Map<java.lang.String,scala.collection.Set<scheduler.SplitInfo>> )This method is from 'SparkContext' class.
rddToAsyncRDDActions ( rdd.RDD<T>, scala.reflect.ClassTag<T> )This method is from 'SparkContext' class.
rddToOrderedRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.math.Ordering<K>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V> )This method is from 'SparkContext' class.
rddToPairRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.math.Ordering<K> )This method is from 'SparkContext' class.
rddToSequenceFileRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.Function1<K,org.apache.hadoop.io.Writable>, scala.reflect.ClassTag<K>, scala.Function1<V,org.apache.hadoop.io.Writable>, scala.reflect.ClassTag<V> )This method is from 'SparkContext' class.
runApproximateJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, partial.ApproximateEvaluator<U,R>, long )This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, boolean, scala.reflect.ClassTag<U> )This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> )This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.reflect.ClassTag<U> )This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, boolean, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> )This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, boolean, scala.reflect.ClassTag<U> )This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> )This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.reflect.ClassTag<U> )This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, int, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.Function0<WritableConverter<K>>, scala.Function0<WritableConverter<V>> )This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V> )This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V>, int )This method is from 'SparkContext' class.
setCallSite ( java.lang.String )This method is from 'SparkContext' class.
setCheckpointDir ( java.lang.String )This method is from 'SparkContext' class.
setJobDescription ( java.lang.String )This method is from 'SparkContext' class.
setJobGroup ( java.lang.String, java.lang.String, boolean )This method is from 'SparkContext' class.
setLocalProperties ( java.util.Properties )This method is from 'SparkContext' class.
setLocalProperty ( java.lang.String, java.lang.String )This method is from 'SparkContext' class.
SparkContext ( )This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String )This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String )This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String, scala.collection.Seq<java.lang.String> )This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String, scala.collection.Seq<java.lang.String>, scala.collection.Map<java.lang.String,java.lang.String>, scala.collection.Map<java.lang.String,scala.collection.Set<scheduler.SplitInfo>> )This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, SparkConf )This constructor is from 'SparkContext' class.
SparkContext ( SparkConf )This constructor is from 'SparkContext' class.
SparkContext ( SparkConf, scala.collection.Map<java.lang.String,scala.collection.Set<scheduler.SplitInfo>> )This constructor is from 'SparkContext' class.
sparkUser ( )This method is from 'SparkContext' class.
startTime ( )This method is from 'SparkContext' class.
stop ( )This method is from 'SparkContext' class.
stringToText ( java.lang.String )This method is from 'SparkContext' class.
stringWritableConverter ( )This method is from 'SparkContext' class.
submitJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.Function0<R> )This method is from 'SparkContext' class.
taskScheduler ( )This method is from 'SparkContext' class.
taskScheduler_.eq ( scheduler.TaskScheduler )This method is from 'SparkContext' class.
textFile ( java.lang.String, int )This method is from 'SparkContext' class.
union ( rdd.RDD<T>, scala.collection.Seq<rdd.RDD<T>>, scala.reflect.ClassTag<T> )This method is from 'SparkContext' class.
union ( scala.collection.Seq<rdd.RDD<T>>, scala.reflect.ClassTag<T> )This method is from 'SparkContext' class.
unpersistRDD ( int, boolean )This method is from 'SparkContext' class.
version ( )This method is from 'SparkContext' class.
wholeTextFiles ( java.lang.String, int )This method is from 'SparkContext' class.
writableWritableConverter ( )This method is from 'SparkContext' class.
sc ( )Return value of this method has type 'SparkContext'.
sparkContext ( )Return value of this method has type 'SparkContext'.
StreamingContext ( SparkContext, streaming.Checkpoint, streaming.Duration )1st parameter 'sc_' of this method has type 'SparkContext'.
StreamingContext ( SparkContext, streaming.Duration )1st parameter 'sparkContext' of this method has type 'SparkContext'.
TaskContext (9)

| # | Change | Effect |
|---|---|---|
| 1 | Abstract method addTaskCompletionListener ( util.TaskCompletionListener ) has been removed from this class. | A client program may be interrupted by NoSuchMethodError exception. |
| 2 | Abstract method addTaskCompletionListener ( scala.Function1<TaskContext,scala.runtime.BoxedUnit> ) has been removed from this class. | A client program may be interrupted by NoSuchMethodError exception. |
| 3 | Abstract method attemptNumber ( ) has been removed from this class. | A client program may be interrupted by NoSuchMethodError exception. |
| 4 | Abstract method isCompleted ( ) has been removed from this class. | A client program may be interrupted by NoSuchMethodError exception. |
| 5 | Abstract method isInterrupted ( ) has been removed from this class. | A client program may be interrupted by NoSuchMethodError exception. |
| 6 | Abstract method isRunningLocally ( ) has been removed from this class. | A client program may be interrupted by NoSuchMethodError exception. |
| 7 | Abstract method taskAttemptId ( ) has been removed from this class. | A client program may be interrupted by NoSuchMethodError exception. |
| 8 | Abstract method taskMemoryManager ( ) has been removed from this class. | A client program may be interrupted by NoSuchMethodError exception. |
| 9 | Removed super-interface java.io.Serializable. | A client program may be interrupted by NoSuchMethodError exception. |

affected methods (3)
partitionId ( )This abstract method is from 'TaskContext' abstract class.
stageId ( )This abstract method is from 'TaskContext' abstract class.
taskMetrics ( )This abstract method is from 'TaskContext' abstract class.
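The removed-method entries above can be illustrated with a reflective probe. This is a minimal sketch using a stand-in class (not the real org.apache.spark.TaskContext) that mimics the 1.0.0 surface, where the old completed() accessor exists but the 1.4.x isCompleted() method does not:

```java
// Stand-in mimicking the 1.0.0 TaskContext surface: it exposes the old
// completed() accessor but lacks the 1.4.x isCompleted() method.
class OldTaskContextShape {
    public boolean completed() { return false; }
}

public class RemovedMethodProbe {
    // True if cls exposes a public zero-argument method with this name.
    public static boolean hasMethod(Class<?> cls, String name) {
        try {
            cls.getMethod(name);
            return true;
        } catch (NoSuchMethodException e) {
            // A client compiled against 1.4.0 that calls the method
            // directly fails with NoSuchMethodError at run time instead.
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(hasMethod(OldTaskContextShape.class, "isCompleted")); // false
        System.out.println(hasMethod(OldTaskContextShape.class, "completed"));   // true
    }
}
```

Probing with getMethod before calling, as above, is one way a client can degrade gracefully when it must run against both library versions.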
package org.apache.spark.api.java
JavaSparkContext (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface java.io.Closeable. | A client program may be interrupted by NoSuchMethodError exception. |

affected methods (72)
accumulable ( T, org.apache.spark.AccumulableParam<T,R> )This method is from 'JavaSparkContext' class.
accumulator ( double )This method is from 'JavaSparkContext' class.
accumulator ( int )This method is from 'JavaSparkContext' class.
accumulator ( T, org.apache.spark.AccumulatorParam<T> )This method is from 'JavaSparkContext' class.
addFile ( java.lang.String )This method is from 'JavaSparkContext' class.
addJar ( java.lang.String )This method is from 'JavaSparkContext' class.
appName ( )This method is from 'JavaSparkContext' class.
broadcast ( T )This method is from 'JavaSparkContext' class.
cancelAllJobs ( )This method is from 'JavaSparkContext' class.
cancelJobGroup ( java.lang.String )This method is from 'JavaSparkContext' class.
checkpointFile ( java.lang.String )This method is from 'JavaSparkContext' class.
clearCallSite ( )This method is from 'JavaSparkContext' class.
clearJobGroup ( )This method is from 'JavaSparkContext' class.
defaultMinPartitions ( )This method is from 'JavaSparkContext' class.
defaultParallelism ( )This method is from 'JavaSparkContext' class.
doubleAccumulator ( double )This method is from 'JavaSparkContext' class.
env ( )This method is from 'JavaSparkContext' class.
fromSparkContext ( org.apache.spark.SparkContext )Return value of this method has type 'JavaSparkContext'.
getCheckpointDir ( )This method is from 'JavaSparkContext' class.
getConf ( )This method is from 'JavaSparkContext' class.
getLocalProperty ( java.lang.String )This method is from 'JavaSparkContext' class.
getSparkHome ( )This method is from 'JavaSparkContext' class.
hadoopConfiguration ( )This method is from 'JavaSparkContext' class.
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> )This method is from 'JavaSparkContext' class.
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int )This method is from 'JavaSparkContext' class.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> )This method is from 'JavaSparkContext' class.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int )This method is from 'JavaSparkContext' class.
intAccumulator ( int )This method is from 'JavaSparkContext' class.
isLocal ( )This method is from 'JavaSparkContext' class.
jarOfClass ( java.lang.Class<?> )This method is from 'JavaSparkContext' class.
jarOfObject ( java.lang.Object )This method is from 'JavaSparkContext' class.
jars ( )This method is from 'JavaSparkContext' class.
JavaSparkContext ( )This constructor is from 'JavaSparkContext' class.
JavaSparkContext ( java.lang.String, java.lang.String )This constructor is from 'JavaSparkContext' class.
JavaSparkContext ( java.lang.String, java.lang.String, java.lang.String, java.lang.String )This constructor is from 'JavaSparkContext' class.
JavaSparkContext ( java.lang.String, java.lang.String, java.lang.String, java.lang.String[ ] )This constructor is from 'JavaSparkContext' class.
JavaSparkContext ( java.lang.String, java.lang.String, java.lang.String, java.lang.String[ ], java.util.Map<java.lang.String,java.lang.String> )This constructor is from 'JavaSparkContext' class.
JavaSparkContext ( java.lang.String, java.lang.String, org.apache.spark.SparkConf )This constructor is from 'JavaSparkContext' class.
JavaSparkContext ( org.apache.spark.SparkConf )This constructor is from 'JavaSparkContext' class.
JavaSparkContext ( org.apache.spark.SparkContext )This constructor is from 'JavaSparkContext' class.
master ( )This method is from 'JavaSparkContext' class.
newAPIHadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, org.apache.hadoop.conf.Configuration )This method is from 'JavaSparkContext' class.
newAPIHadoopRDD ( org.apache.hadoop.conf.Configuration, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> )This method is from 'JavaSparkContext' class.
objectFile ( java.lang.String )This method is from 'JavaSparkContext' class.
objectFile ( java.lang.String, int )This method is from 'JavaSparkContext' class.
parallelize ( java.util.List<T> )This method is from 'JavaSparkContext' class.
parallelize ( java.util.List<T>, int )This method is from 'JavaSparkContext' class.
parallelizeDoubles ( java.util.List<java.lang.Double> )This method is from 'JavaSparkContext' class.
parallelizeDoubles ( java.util.List<java.lang.Double>, int )This method is from 'JavaSparkContext' class.
parallelizePairs ( java.util.List<scala.Tuple2<K,V>> )This method is from 'JavaSparkContext' class.
parallelizePairs ( java.util.List<scala.Tuple2<K,V>>, int )This method is from 'JavaSparkContext' class.
sc ( )This method is from 'JavaSparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V> )This method is from 'JavaSparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V>, int )This method is from 'JavaSparkContext' class.
setCallSite ( java.lang.String )This method is from 'JavaSparkContext' class.
setCheckpointDir ( java.lang.String )This method is from 'JavaSparkContext' class.
setJobGroup ( java.lang.String, java.lang.String )This method is from 'JavaSparkContext' class.
setJobGroup ( java.lang.String, java.lang.String, boolean )This method is from 'JavaSparkContext' class.
setLocalProperty ( java.lang.String, java.lang.String )This method is from 'JavaSparkContext' class.
sparkUser ( )This method is from 'JavaSparkContext' class.
startTime ( )This method is from 'JavaSparkContext' class.
stop ( )This method is from 'JavaSparkContext' class.
textFile ( java.lang.String )This method is from 'JavaSparkContext' class.
textFile ( java.lang.String, int )This method is from 'JavaSparkContext' class.
toSparkContext ( JavaSparkContext )1st parameter 'p1' of this method has type 'JavaSparkContext'.
union ( JavaDoubleRDD, java.util.List<JavaDoubleRDD> )This method is from 'JavaSparkContext' class.
union ( JavaPairRDD<K,V>, java.util.List<JavaPairRDD<K,V>> )This method is from 'JavaSparkContext' class.
union ( JavaRDD<T>, java.util.List<JavaRDD<T>> )This method is from 'JavaSparkContext' class.
wholeTextFiles ( java.lang.String )This method is from 'JavaSparkContext' class.
wholeTextFiles ( java.lang.String, int )This method is from 'JavaSparkContext' class.
JavaStreamingContext ( JavaSparkContext, org.apache.spark.streaming.Duration )1st parameter 'sparkContext' of this method has type 'JavaSparkContext'.
sparkContext ( )Return value of this method has type 'JavaSparkContext'.
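Because java.io.Closeable is absent from the 1.0.0 JavaSparkContext, client code compiled against 1.4.0 that manages the context with try-with-resources breaks at run time. A version-portable client calls stop() in a finally block instead. A sketch with a stand-in class (not the real JavaSparkContext) mimicking the 1.0.0 shape:

```java
import java.io.Closeable;

// Stand-in with the 1.0.0 shape: it does not implement java.io.Closeable,
// so it cannot be managed by try-with-resources; clients must stop() it.
class OldContextShape {
    private boolean stopped = false;
    public void stop() { stopped = true; }
    public boolean isStopped() { return stopped; }
}

public class CloseableProbe {
    public static void main(String[] args) {
        // Version-portable pattern: explicit try/finally rather than
        // try-with-resources, which requires the Closeable interface.
        OldContextShape ctx = new OldContextShape();
        try {
            // ... submit jobs here ...
        } finally {
            ctx.stop();
        }
        System.out.println(Closeable.class.isAssignableFrom(OldContextShape.class)); // false
        System.out.println(ctx.isStopped()); // true
    }
}
```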
package org.apache.spark.broadcast
Broadcast<T> (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.Logging. | A client program may be interrupted by NoSuchMethodError exception. |

affected methods (2)
broadcast ( T )Return value of this method has type 'Broadcast<T>'.
broadcast ( T, scala.reflect.ClassTag<T> )Return value of this method has type 'Broadcast<T>'.
package org.apache.spark.rdd
PairRDDFunctions<K,V> (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.mapreduce.SparkHadoopMapReduceUtil. | A client program may be interrupted by NoSuchMethodError exception. |

affected methods (1)
rddToPairRDDFunctions ( RDD<scala.Tuple2<K,V>>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.math.Ordering<K> )Return value of this method has type 'PairRDDFunctions<K,V>'.
package org.apache.spark.serializer
DeserializationStream (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class became interface. | A client program may be interrupted by IncompatibleClassChangeError or InstantiationError exception dependent on the usage of this class. |

affected methods (3)
asIterator ( )This method is from 'DeserializationStream' abstract class.
close ( )This abstract method is from 'DeserializationStream' abstract class.
readObject ( scala.reflect.ClassTag<T> )This abstract method is from 'DeserializationStream' abstract class.
SerializationStream (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class became interface. | A client program may be interrupted by IncompatibleClassChangeError or InstantiationError exception dependent on the usage of this class. |

affected methods (4)
close ( )This abstract method is from 'SerializationStream' abstract class.
flush ( )This abstract method is from 'SerializationStream' abstract class.
writeAll ( scala.collection.Iterator<T>, scala.reflect.ClassTag<T> )Return value of this method has type 'SerializationStream'.
writeObject ( T, scala.reflect.ClassTag<T> )Return value of this abstract method has type 'SerializationStream'.
Serializer (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class became interface. | A client program may be interrupted by IncompatibleClassChangeError or InstantiationError exception dependent on the usage of this class. |

affected methods (1)
newInstance ( )This abstract method is from 'Serializer' abstract class.
SerializerInstance (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class became interface. | A client program may be interrupted by IncompatibleClassChangeError or InstantiationError exception dependent on the usage of this class. |

affected methods (1)
newInstance ( )Return value of this abstract method has type 'SerializerInstance'.
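A class-to-interface change is incompatible because the JVM verifies the kind of a type at link time: a subclass compiled with `extends` against the abstract class, or a call site compiled as invokevirtual, fails with IncompatibleClassChangeError when an interface is loaded in its place. A minimal sketch with stand-in shapes (hypothetical minimal surfaces, not the real Spark signatures):

```java
// Stand-ins for the two shapes of org.apache.spark.serializer.Serializer:
// an abstract class in 1.4.0 versus an interface in 1.0.0.
abstract class SerializerClassShape { public abstract Object newInstance(); }
interface SerializerInterfaceShape { Object newInstance(); }

public class ShapeProbe {
    public static void main(String[] args) {
        // A defensive client can detect the shape reflectively before
        // committing to an `extends`-based or `implements`-based path.
        System.out.println(SerializerClassShape.class.isInterface());     // false
        System.out.println(SerializerInterfaceShape.class.isInterface()); // true
    }
}
```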
spark-streaming_2.10-1.4.0.jar
package org.apache.spark.streaming.api.java
JavaStreamingContext (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface java.io.Closeable. | A client program may be interrupted by NoSuchMethodError exception. |

affected methods (40)
actorStream ( akka.actor.Props, java.lang.String )This method is from 'JavaStreamingContext' class.
actorStream ( akka.actor.Props, java.lang.String, org.apache.spark.storage.StorageLevel )This method is from 'JavaStreamingContext' class.
actorStream ( akka.actor.Props, java.lang.String, org.apache.spark.storage.StorageLevel, akka.actor.SupervisorStrategy )This method is from 'JavaStreamingContext' class.
addStreamingListener ( org.apache.spark.streaming.scheduler.StreamingListener )This method is from 'JavaStreamingContext' class.
awaitTermination ( )This method is from 'JavaStreamingContext' class.
checkpoint ( java.lang.String )This method is from 'JavaStreamingContext' class.
getOrCreate ( java.lang.String, org.apache.hadoop.conf.Configuration, JavaStreamingContextFactory )Return value of this method has type 'JavaStreamingContext'.
getOrCreate ( java.lang.String, org.apache.hadoop.conf.Configuration, JavaStreamingContextFactory, boolean )Return value of this method has type 'JavaStreamingContext'.
getOrCreate ( java.lang.String, JavaStreamingContextFactory )Return value of this method has type 'JavaStreamingContext'.
jarOfClass ( java.lang.Class<?> )This method is from 'JavaStreamingContext' class.
JavaStreamingContext ( java.lang.String )This constructor is from 'JavaStreamingContext' class.
JavaStreamingContext ( java.lang.String, java.lang.String, org.apache.spark.streaming.Duration )This constructor is from 'JavaStreamingContext' class.
JavaStreamingContext ( java.lang.String, java.lang.String, org.apache.spark.streaming.Duration, java.lang.String, java.lang.String )This constructor is from 'JavaStreamingContext' class.
JavaStreamingContext ( java.lang.String, java.lang.String, org.apache.spark.streaming.Duration, java.lang.String, java.lang.String[ ] )This constructor is from 'JavaStreamingContext' class.
JavaStreamingContext ( java.lang.String, java.lang.String, org.apache.spark.streaming.Duration, java.lang.String, java.lang.String[ ], java.util.Map<java.lang.String,java.lang.String> )This constructor is from 'JavaStreamingContext' class.
JavaStreamingContext ( java.lang.String, org.apache.hadoop.conf.Configuration )This constructor is from 'JavaStreamingContext' class.
JavaStreamingContext ( org.apache.spark.api.java.JavaSparkContext, org.apache.spark.streaming.Duration )This constructor is from 'JavaStreamingContext' class.
JavaStreamingContext ( org.apache.spark.SparkConf, org.apache.spark.streaming.Duration )This constructor is from 'JavaStreamingContext' class.
JavaStreamingContext ( org.apache.spark.streaming.StreamingContext )This constructor is from 'JavaStreamingContext' class.
queueStream ( java.util.Queue<org.apache.spark.api.java.JavaRDD<T>> )This method is from 'JavaStreamingContext' class.
queueStream ( java.util.Queue<org.apache.spark.api.java.JavaRDD<T>>, boolean )This method is from 'JavaStreamingContext' class.
queueStream ( java.util.Queue<org.apache.spark.api.java.JavaRDD<T>>, boolean, org.apache.spark.api.java.JavaRDD<T> )This method is from 'JavaStreamingContext' class.
rawSocketStream ( java.lang.String, int )This method is from 'JavaStreamingContext' class.
rawSocketStream ( java.lang.String, int, org.apache.spark.storage.StorageLevel )This method is from 'JavaStreamingContext' class.
receiverStream ( org.apache.spark.streaming.receiver.Receiver<T> )This method is from 'JavaStreamingContext' class.
remember ( org.apache.spark.streaming.Duration )This method is from 'JavaStreamingContext' class.
socketStream ( java.lang.String, int, org.apache.spark.api.java.function.Function<java.io.InputStream,java.lang.Iterable<T>>, org.apache.spark.storage.StorageLevel )This method is from 'JavaStreamingContext' class.
socketTextStream ( java.lang.String, int )This method is from 'JavaStreamingContext' class.
socketTextStream ( java.lang.String, int, org.apache.spark.storage.StorageLevel )This method is from 'JavaStreamingContext' class.
sparkContext ( )This method is from 'JavaStreamingContext' class.
ssc ( )This method is from 'JavaStreamingContext' class.
start ( )This method is from 'JavaStreamingContext' class.
stop ( )This method is from 'JavaStreamingContext' class.
stop ( boolean )This method is from 'JavaStreamingContext' class.
stop ( boolean, boolean )This method is from 'JavaStreamingContext' class.
textFileStream ( java.lang.String )This method is from 'JavaStreamingContext' class.
transform ( java.util.List<JavaDStream<?>>, org.apache.spark.api.java.function.Function2<java.util.List<org.apache.spark.api.java.JavaRDD<?>>,org.apache.spark.streaming.Time,org.apache.spark.api.java.JavaRDD<T>> )This method is from 'JavaStreamingContext' class.
transformToPair ( java.util.List<JavaDStream<?>>, org.apache.spark.api.java.function.Function2<java.util.List<org.apache.spark.api.java.JavaRDD<?>>,org.apache.spark.streaming.Time,org.apache.spark.api.java.JavaPairRDD<K,V>> )This method is from 'JavaStreamingContext' class.
union ( JavaDStream<T>, java.util.List<JavaDStream<T>> )This method is from 'JavaStreamingContext' class.
union ( JavaPairDStream<K,V>, java.util.List<JavaPairDStream<K,V>> )This method is from 'JavaStreamingContext' class.
Problems with Data Types, Medium Severity (6)
spark-core_2.10-1.4.0.jar
package org.apache.spark.api.java
JavaDoubleRDD (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaRDDLike<java.lang.Double,JavaDoubleRDD>. | Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |

affected methods (3)
parallelizeDoubles ( java.util.List<java.lang.Double> )Return value of this method has type 'JavaDoubleRDD'.
parallelizeDoubles ( java.util.List<java.lang.Double>, int )Return value of this method has type 'JavaDoubleRDD'.
union ( JavaDoubleRDD, java.util.List<JavaDoubleRDD> )1st parameter 'first' of this method has type 'JavaDoubleRDD'.
JavaPairRDD<K,V> (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaRDDLike<scala.Tuple2<K,V>,JavaPairRDD<K,V>>. | Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |

affected methods (11)
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> )Return value of this method has type 'JavaPairRDD<K,V>'.
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int )Return value of this method has type 'JavaPairRDD<K,V>'.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> )Return value of this method has type 'JavaPairRDD<K,V>'.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int )Return value of this method has type 'JavaPairRDD<K,V>'.
newAPIHadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, org.apache.hadoop.conf.Configuration )Return value of this method has type 'JavaPairRDD<K,V>'.
newAPIHadoopRDD ( org.apache.hadoop.conf.Configuration, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> )Return value of this method has type 'JavaPairRDD<K,V>'.
parallelizePairs ( java.util.List<scala.Tuple2<K,V>> )Return value of this method has type 'JavaPairRDD<K,V>'.
parallelizePairs ( java.util.List<scala.Tuple2<K,V>>, int )Return value of this method has type 'JavaPairRDD<K,V>'.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V> )Return value of this method has type 'JavaPairRDD<K,V>'.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V>, int )Return value of this method has type 'JavaPairRDD<K,V>'.
union ( JavaPairRDD<K,V>, java.util.List<JavaPairRDD<K,V>> )1st parameter 'first' of this method has type 'JavaPairRDD<K,V>'.
JavaRDD<T> (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaRDDLike<T,JavaRDD<T>>. | Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |

affected methods (7)
checkpointFile ( java.lang.String )Return value of this method has type 'JavaRDD<T>'.
objectFile ( java.lang.String )Return value of this method has type 'JavaRDD<T>'.
objectFile ( java.lang.String, int )Return value of this method has type 'JavaRDD<T>'.
parallelize ( java.util.List<T> )Return value of this method has type 'JavaRDD<T>'.
parallelize ( java.util.List<T>, int )Return value of this method has type 'JavaRDD<T>'.
union ( JavaRDD<T>, java.util.List<JavaRDD<T>> )1st parameter 'first' of this method has type 'JavaRDD<T>'.
queueStream ( java.util.Queue<JavaRDD<T>>, boolean, JavaRDD<T> )3rd parameter 'defaultRDD' of this method has type 'JavaRDD<T>'.
package org.apache.spark.scheduler
LiveListenerBus (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class org.apache.spark.util.AsynchronousListenerBus<SparkListener,SparkListenerEvent>. | Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |

affected methods (1)
listenerBus ( )Return value of this method has type 'LiveListenerBus'.
spark-streaming_2.10-1.4.0.jar
package org.apache.spark.streaming.api.java
JavaDStream<T> (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaDStreamLike<T,JavaDStream<T>,org.apache.spark.api.java.JavaRDD<T>>. | Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |

affected methods (3)
queueStream ( java.util.Queue<org.apache.spark.api.java.JavaRDD<T>> )Return value of this method has type 'JavaDStream<T>'.
transform ( java.util.List<JavaDStream<?>>, org.apache.spark.api.java.function.Function2<java.util.List<org.apache.spark.api.java.JavaRDD<?>>,org.apache.spark.streaming.Time,org.apache.spark.api.java.JavaRDD<T>> )Return value of this method has type 'JavaDStream<T>'.
union ( JavaDStream<T>, java.util.List<JavaDStream<T>> )1st parameter 'first' of this method has type 'JavaDStream<T>'.
JavaPairDStream<K,V> (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaDStreamLike<scala.Tuple2<K,V>,JavaPairDStream<K,V>,org.apache.spark.api.java.JavaPairRDD<K,V>>. | Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |

affected methods (2)
transformToPair ( java.util.List<JavaDStream<?>>, org.apache.spark.api.java.function.Function2<java.util.List<org.apache.spark.api.java.JavaRDD<?>>,org.apache.spark.streaming.Time,org.apache.spark.api.java.JavaPairRDD<K,V>> )Return value of this method has type 'JavaPairDStream<K,V>'.
union ( JavaPairDStream<K,V>, java.util.List<JavaPairDStream<K,V>> )1st parameter 'first' of this method has type 'JavaPairDStream<K,V>'.
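When a super-class is removed from a hierarchy, any member a client resolved through that super-class disappears from the lookup chain, which is why the checker flags NoSuchFieldError/NoSuchMethodError here. A sketch with stand-in classes (names and methods are illustrative, not the real Spark hierarchy):

```java
// Stand-ins: in the 1.4.x hierarchy the RDD-like type inherits helpers
// from an AbstractJavaRDDLike-style super-class; the old hierarchy lacks it.
class AbstractLikeShape { public long helperCount() { return 42L; } }
class NewRddShape extends AbstractLikeShape { }
class OldRddShape { }

public class SuperClassProbe {
    // True if the method resolves anywhere in cls's public hierarchy.
    static boolean resolves(Class<?> cls, String name) {
        try { cls.getMethod(name); return true; }
        catch (NoSuchMethodException e) { return false; }
    }

    public static void main(String[] args) {
        // The helper resolves via the super-class in the new hierarchy only;
        // a client calling it directly against the old one gets
        // NoSuchMethodError at run time.
        System.out.println(resolves(NewRddShape.class, "helperCount")); // true
        System.out.println(resolves(OldRddShape.class, "helperCount")); // false
    }
}
```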
Problems with Data Types, Low Severity (3)
spark-core_2.10-1.4.0.jar
package org.apache.spark
TaskContext (3)

| # | Change | Effect |
|---|---|---|
| 1 | Abstract method partitionId ( ) became non-abstract. | Some methods in this class may change behavior. |
| 2 | Abstract method stageId ( ) became non-abstract. | Some methods in this class may change behavior. |
| 3 | Abstract method taskMetrics ( ) became non-abstract. | Some methods in this class may change behavior. |

affected methods (3)
partitionId ( )This abstract method is from 'TaskContext' abstract class.
stageId ( )This abstract method is from 'TaskContext' abstract class.
taskMetrics ( )This abstract method is from 'TaskContext' abstract class.
Problems with Methods, Low Severity (3)
spark-core_2.10-1.4.0.jar, TaskContext
package org.apache.spark
TaskContext.partitionId ( ) [abstract] : int (1)
[mangled: org/apache/spark/TaskContext.partitionId:()I]

| # | Change | Effect |
|---|---|---|
| 1 | Method became non-abstract. | A client program may change behavior. |
TaskContext.stageId ( ) [abstract] : int (1)
[mangled: org/apache/spark/TaskContext.stageId:()I]

| # | Change | Effect |
|---|---|---|
| 1 | Method became non-abstract. | A client program may change behavior. |
TaskContext.taskMetrics ( ) [abstract] : executor.TaskMetrics (1)
[mangled: org/apache/spark/TaskContext.taskMetrics:()Lorg/apache/spark/executor/TaskMetrics;]

| # | Change | Effect |
|---|---|---|
| 1 | Method became non-abstract. | A client program may change behavior. |
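This low-severity change is a behavior hazard rather than a linkage failure: once an abstract method gains a body, a subclass is no longer forced to override it and may silently inherit the default implementation. A sketch with stand-in shapes (the method bodies here are illustrative, not the real TaskContext internals):

```java
// Stand-ins for the two shapes of TaskContext.partitionId():
// abstract in 1.4.0, concrete in 1.0.0.
abstract class ContextV14Shape { public abstract int partitionId(); }
abstract class ContextV10Shape { public int partitionId() { return 0; } }

public class NonAbstractProbe {
    public static void main(String[] args) {
        // Against the 1.4.0 shape the compiler forces an override; against
        // the 1.0.0 shape a subclass compiles without one and silently
        // picks up the default body.
        ContextV10Shape ctx = new ContextV10Shape() { };
        System.out.println(ctx.partitionId()); // 0
    }
}
```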
Java ARchives (3)
spark-core_2.10-1.4.0.jar
spark-sql_2.10-1.4.0.jar
spark-streaming_2.10-1.4.0.jar
Generated on Thu Aug 13 09:43:45 2015 for infinispan-spark_2.10-0.1 by Java API Compliance Checker 1.4.1
A tool for checking backward compatibility of a Java library API