Binary compatibility report for the infinispan-spark_2.10-0.1 library between Spark versions 1.4.0 and 1.3.0 (relating to the portability of the client application infinispan-spark_2.10-0.1.jar)
Test Info
| Library Name | infinispan-spark_2.10-0.1 |
| Version #1 | 1.4.0 |
| Version #2 | 1.3.0 |
| Java Version | 1.7.0_75 |
Test Results
| Total Java ARchives | 3 |
| Total Methods / Classes | 484 / 2720 |
| Verdict | Incompatible (12.5%) |
Problem Summary
| | Severity | Count |
|---|---|---|
| Added Methods | - | 4 |
| Removed Methods | High | 34 |
| Problems with Data Types | High | 2 |
| | Medium | 5 |
| | Low | 0 |
| Problems with Methods | High | 0 |
| | Medium | 0 |
| | Low | 0 |
Added Methods (4)
spark-streaming_2.10-1.3.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.getNewReceiverStreamId ( ) : int
[mangled: org/apache/spark/streaming/StreamingContext.getNewReceiverStreamId:()I]
StreamingContext.state ( ) : scala.Enumeration.Value
[mangled: org/apache/spark/streaming/StreamingContext.state:()Lscala/Enumeration$Value;]
StreamingContext.state_.eq ( scala.Enumeration.Value p1 ) : void
[mangled: org/apache/spark/streaming/StreamingContext.state_.eq:(Lscala/Enumeration$Value;)V]
StreamingContext.StreamingContextState ( ) : StreamingContext.StreamingContextState.
[mangled: org/apache/spark/streaming/StreamingContext.StreamingContextState:()Lorg/apache/spark/streaming/StreamingContext$StreamingContextState$;]
Removed Methods (34)
spark-core_2.10-1.4.0.jar, DeserializationStream.class
package org.apache.spark.serializer
DeserializationStream.asKeyValueIterator ( ) : scala.collection.Iterator<scala.Tuple2<Object,Object>>
[mangled: org/apache/spark/serializer/DeserializationStream.asKeyValueIterator:()Lscala/collection/Iterator;]
DeserializationStream.readKey ( scala.reflect.ClassTag<T> p1 ) : T
[mangled: org/apache/spark/serializer/DeserializationStream.readKey:(Lscala/reflect/ClassTag;)Ljava/lang/Object;]
DeserializationStream.readValue ( scala.reflect.ClassTag<T> p1 ) : T
[mangled: org/apache/spark/serializer/DeserializationStream.readValue:(Lscala/reflect/ClassTag;)Ljava/lang/Object;]
spark-core_2.10-1.4.0.jar, JavaSparkContext.class
package org.apache.spark.api.java
JavaSparkContext.setLogLevel ( String logLevel ) : void
[mangled: org/apache/spark/api/java/JavaSparkContext.setLogLevel:(Ljava/lang/String;)V]
spark-core_2.10-1.4.0.jar, SerializationStream.class
package org.apache.spark.serializer
SerializationStream.writeKey ( T key, scala.reflect.ClassTag<T> p2 ) : SerializationStream
[mangled: org/apache/spark/serializer/SerializationStream.writeKey:(Ljava/lang/Object;Lscala/reflect/ClassTag;)Lorg/apache/spark/serializer/SerializationStream;]
SerializationStream.writeValue ( T value, scala.reflect.ClassTag<T> p2 ) : SerializationStream
[mangled: org/apache/spark/serializer/SerializationStream.writeValue:(Ljava/lang/Object;Lscala/reflect/ClassTag;)Lorg/apache/spark/serializer/SerializationStream;]
spark-core_2.10-1.4.0.jar, Serializer.class
package org.apache.spark.serializer
Serializer.supportsRelocationOfSerializedObjects ( ) : boolean
[mangled: org/apache/spark/serializer/Serializer.supportsRelocationOfSerializedObjects:()Z]
spark-core_2.10-1.4.0.jar, SparkContext.class
package org.apache.spark
SparkContext.applicationAttemptId ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.applicationAttemptId:()Lscala/Option;]
SparkContext.externalBlockStoreFolderName ( ) : String
[mangled: org/apache/spark/SparkContext.externalBlockStoreFolderName:()Ljava/lang/String;]
SparkContext.getOrCreate ( ) [static] : SparkContext
[mangled: org/apache/spark/SparkContext.getOrCreate:()Lorg/apache/spark/SparkContext;]
SparkContext.getOrCreate ( SparkConf p1 ) [static] : SparkContext
[mangled: org/apache/spark/SparkContext.getOrCreate:(Lorg/apache/spark/SparkConf;)Lorg/apache/spark/SparkContext;]
SparkContext.SparkContext.._conf ( ) : SparkConf
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._conf:()Lorg/apache/spark/SparkConf;]
SparkContext.SparkContext.._env ( ) : SparkEnv
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._env:()Lorg/apache/spark/SparkEnv;]
SparkContext.SparkContext..assertNotStopped ( ) : void
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext..assertNotStopped:()V]
SparkContext.range ( long start, long end, long step, int numSlices ) : rdd.RDD<Object>
[mangled: org/apache/spark/SparkContext.range:(JJJI)Lorg/apache/spark/rdd/RDD;]
SparkContext.setLogLevel ( String logLevel ) : void
[mangled: org/apache/spark/SparkContext.setLogLevel:(Ljava/lang/String;)V]
SparkContext.supportDynamicAllocation ( ) : boolean
[mangled: org/apache/spark/SparkContext.supportDynamicAllocation:()Z]
SparkContext.withScope ( scala.Function0<U> body ) : U
[mangled: org/apache/spark/SparkContext.withScope:(Lscala/Function0;)Ljava/lang/Object;]
spark-core_2.10-1.4.0.jar, TaskContext.class
package org.apache.spark
TaskContext.taskMemoryManager ( ) [abstract] : unsafe.memory.TaskMemoryManager
[mangled: org/apache/spark/TaskContext.taskMemoryManager:()Lorg/apache/spark/unsafe/memory/TaskMemoryManager;]
spark-streaming_2.10-1.4.0.jar, JavaStreamingContext.class
package org.apache.spark.streaming.api.java
JavaStreamingContext.getOrCreate ( String p1, org.apache.spark.api.java.function.Function0<JavaStreamingContext> p2 ) [static] : JavaStreamingContext
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.getOrCreate:(Ljava/lang/String;Lorg/apache/spark/api/java/function/Function0;)Lorg/apache/spark/streaming/api/java/JavaStreamingContext;]
JavaStreamingContext.getOrCreate ( String p1, org.apache.spark.api.java.function.Function0<JavaStreamingContext> p2, org.apache.hadoop.conf.Configuration p3 ) [static] : JavaStreamingContext
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.getOrCreate:(Ljava/lang/String;Lorg/apache/spark/api/java/function/Function0;Lorg/apache/hadoop/conf/Configuration;)Lorg/apache/spark/streaming/api/java/JavaStreamingContext;]
JavaStreamingContext.getOrCreate ( String p1, org.apache.spark.api.java.function.Function0<JavaStreamingContext> p2, org.apache.hadoop.conf.Configuration p3, boolean p4 ) [static] : JavaStreamingContext
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.getOrCreate:(Ljava/lang/String;Lorg/apache/spark/api/java/function/Function0;Lorg/apache/hadoop/conf/Configuration;Z)Lorg/apache/spark/streaming/api/java/JavaStreamingContext;]
JavaStreamingContext.getState ( ) : org.apache.spark.streaming.StreamingContextState
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.getState:()Lorg/apache/spark/streaming/StreamingContextState;]
spark-streaming_2.10-1.4.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.getActive ( ) [static] : scala.Option<StreamingContext>
[mangled: org/apache/spark/streaming/StreamingContext.getActive:()Lscala/Option;]
StreamingContext.getActiveOrCreate ( scala.Function0<StreamingContext> p1 ) [static] : StreamingContext
[mangled: org/apache/spark/streaming/StreamingContext.getActiveOrCreate:(Lscala/Function0;)Lorg/apache/spark/streaming/StreamingContext;]
StreamingContext.getActiveOrCreate ( String p1, scala.Function0<StreamingContext> p2, org.apache.hadoop.conf.Configuration p3, boolean p4 ) [static] : StreamingContext
[mangled: org/apache/spark/streaming/StreamingContext.getActiveOrCreate:(Ljava/lang/String;Lscala/Function0;Lorg/apache/hadoop/conf/Configuration;Z)Lorg/apache/spark/streaming/StreamingContext;]
StreamingContext.getNewInputStreamId ( ) : int
[mangled: org/apache/spark/streaming/StreamingContext.getNewInputStreamId:()I]
StreamingContext.getState ( ) : StreamingContextState
[mangled: org/apache/spark/streaming/StreamingContext.getState:()Lorg/apache/spark/streaming/StreamingContextState;]
StreamingContext.isCheckpointingEnabled ( ) : boolean
[mangled: org/apache/spark/streaming/StreamingContext.isCheckpointingEnabled:()Z]
StreamingContext.StreamingContext..startSite ( ) : java.util.concurrent.atomic.AtomicReference<org.apache.spark.util.CallSite>
[mangled: org/apache/spark/streaming/StreamingContext.org.apache.spark.streaming.StreamingContext..startSite:()Ljava/util/concurrent/atomic/AtomicReference;]
StreamingContext.StreamingContext..stopOnShutdown ( ) : void
[mangled: org/apache/spark/streaming/StreamingContext.org.apache.spark.streaming.StreamingContext..stopOnShutdown:()V]
StreamingContext.StreamingContext ( String path, org.apache.spark.SparkContext sparkContext )
[mangled: org/apache/spark/streaming/StreamingContext."<init>":(Ljava/lang/String;Lorg/apache/spark/SparkContext;)V]
StreamingContext.withNamedScope ( String name, scala.Function0<U> body ) : U
[mangled: org/apache/spark/streaming/StreamingContext.withNamedScope:(Ljava/lang/String;Lscala/Function0;)Ljava/lang/Object;]
StreamingContext.withScope ( scala.Function0<U> body ) : U
[mangled: org/apache/spark/streaming/StreamingContext.withScope:(Lscala/Function0;)Ljava/lang/Object;]
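Every method in the list above exists in the 1.4.0 jars but not in the 1.3.0 jars, so a client compiled against 1.4.0 that reaches one of these call sites on a 1.3.0 runtime fails with NoSuchMethodError. The sketch below is illustrative only (the guard pattern and object name are assumptions, not taken from this report); it exercises two of the removed SparkContext methods and shows one way a client could fall back on a 1.3.0 runtime.

```scala
// Minimal sketch, assuming the client was compiled against Spark 1.4.0 and
// may be deployed on a 1.3.0 runtime. The fallback logic is an illustration,
// not part of infinispan-spark itself.
import org.apache.spark.{SparkConf, SparkContext}

object RemovedMethodGuard {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("compat-demo").setMaster("local[*]")

    // SparkContext.getOrCreate(SparkConf) is in the removed list above:
    // on a 1.3.0 runtime this call throws java.lang.NoSuchMethodError.
    val sc =
      try SparkContext.getOrCreate(conf)
      catch {
        case _: NoSuchMethodError => new SparkContext(conf) // constructor exists in both versions
      }

    // setLogLevel(String) is also 1.4.0-only; skip it when it is missing.
    try sc.setLogLevel("WARN")
    catch { case _: NoSuchMethodError => () }

    sc.stop()
  }
}
```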
Problems with Data Types, High Severity (2)
spark-core_2.10-1.4.0.jar
package org.apache.spark
TaskContext (1)
| | Change | Effect |
|---|---|---|
| 1 | Abstract method taskMemoryManager ( ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError exception. |
affected methods (12)
addTaskCompletionListener ( util.TaskCompletionListener ) - This abstract method is from 'TaskContext' abstract class.
addTaskCompletionListener ( scala.Function1<TaskContext,scala.runtime.BoxedUnit> ) - This abstract method is from 'TaskContext' abstract class.
attemptNumber ( ) - This abstract method is from 'TaskContext' abstract class.
get ( ) - This method is from 'TaskContext' abstract class.
isCompleted ( ) - This abstract method is from 'TaskContext' abstract class.
isInterrupted ( ) - This abstract method is from 'TaskContext' abstract class.
isRunningLocally ( ) - This abstract method is from 'TaskContext' abstract class.
partitionId ( ) - This abstract method is from 'TaskContext' abstract class.
stageId ( ) - This abstract method is from 'TaskContext' abstract class.
taskAttemptId ( ) - This abstract method is from 'TaskContext' abstract class.
TaskContext ( ) - This constructor is from 'TaskContext' abstract class.
taskMetrics ( ) - This abstract method is from 'TaskContext' abstract class.
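Because the removal is of an abstract method on the TaskContext abstract class, any client code that calls taskMemoryManager() or subclasses TaskContext against the 1.4.0 shape is affected. As a hedged illustration (the helper name is an assumption, not from the report), a reflective probe lets the same jar decide at run time whether the method is available:

```scala
// Sketch only: detect whether the running Spark exposes the 1.4.0-era
// TaskContext.taskMemoryManager() before any code path relies on it.
import org.apache.spark.TaskContext

object TaskContextCompat {
  // true on Spark 1.4.0, false on 1.3.0 according to this report
  val hasTaskMemoryManager: Boolean =
    try { classOf[TaskContext].getMethod("taskMemoryManager"); true }
    catch { case _: NoSuchMethodException => false }
}
```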
spark-streaming_2.10-1.4.0.jar
package org.apache.spark.streaming.ui
StreamingJobProgressListener (1)
| | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.scheduler.SparkListener. | A client program may be interrupted by a NoSuchMethodError exception. |
affected methods (1)
progressListener ( ) - Return value of this method has type 'StreamingJobProgressListener'.
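The practical risk is for a client that obtains this object via progressListener() and then treats it as an org.apache.spark.scheduler.SparkListener, which type-checks against 1.4.0 but no longer holds on 1.3.0. The check below is an illustrative sketch (not prescribed by the report) for confirming at run time which hierarchy is present before wiring the listener into SparkListener-based code:

```scala
// Illustrative sketch: confirm that the listener's runtime class still mixes
// in SparkListener (true on 1.4.0, no longer true on 1.3.0 per this report).
import org.apache.spark.scheduler.SparkListener

object ListenerCompat {
  def isSparkListener(listener: AnyRef): Boolean =
    classOf[SparkListener].isAssignableFrom(listener.getClass)
}
```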
Problems with Data Types, Medium Severity (5)
spark-core_2.10-1.4.0.jar
package org.apache.spark.api.java
JavaDoubleRDD (1)
| | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaRDDLike<java.lang.Double,JavaDoubleRDD>. | A client program's access to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |
affected methods (3)
parallelizeDoubles ( java.util.List<java.lang.Double> ) - Return value of this method has type 'JavaDoubleRDD'.
parallelizeDoubles ( java.util.List<java.lang.Double>, int ) - Return value of this method has type 'JavaDoubleRDD'.
union ( JavaDoubleRDD, java.util.List<JavaDoubleRDD> ) - 1st parameter 'first' of this method has type 'JavaDoubleRDD'.
JavaPairRDD<K,V> (1)
| | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaRDDLike<scala.Tuple2<K,V>,JavaPairRDD<K,V>>. | A client program's access to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |
affected methods (11)
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int ) - Return value of this method has type 'JavaPairRDD<K,V>'.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int ) - Return value of this method has type 'JavaPairRDD<K,V>'.
newAPIHadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, org.apache.hadoop.conf.Configuration ) - Return value of this method has type 'JavaPairRDD<K,V>'.
newAPIHadoopRDD ( org.apache.hadoop.conf.Configuration, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
parallelizePairs ( java.util.List<scala.Tuple2<K,V>> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
parallelizePairs ( java.util.List<scala.Tuple2<K,V>>, int ) - Return value of this method has type 'JavaPairRDD<K,V>'.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V> ) - Return value of this method has type 'JavaPairRDD<K,V>'.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V>, int ) - Return value of this method has type 'JavaPairRDD<K,V>'.
union ( JavaPairRDD<K,V>, java.util.List<JavaPairRDD<K,V>> ) - 1st parameter 'first' of this method has type 'JavaPairRDD<K,V>'.
JavaRDD<T> (1)
| | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaRDDLike<T,JavaRDD<T>>. | A client program's access to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |
affected methods (8)
checkpointFile ( java.lang.String ) - Return value of this method has type 'JavaRDD<T>'.
emptyRDD ( ) - Return value of this method has type 'JavaRDD<T>'.
objectFile ( java.lang.String ) - Return value of this method has type 'JavaRDD<T>'.
objectFile ( java.lang.String, int ) - Return value of this method has type 'JavaRDD<T>'.
parallelize ( java.util.List<T> ) - Return value of this method has type 'JavaRDD<T>'.
parallelize ( java.util.List<T>, int ) - Return value of this method has type 'JavaRDD<T>'.
union ( JavaRDD<T>, java.util.List<JavaRDD<T>> ) - 1st parameter 'first' of this method has type 'JavaRDD<T>'.
queueStream ( java.util.Queue<JavaRDD<T>>, boolean, JavaRDD<T> ) - 3rd parameter 'defaultRDD' of this method has type 'JavaRDD<T>'.
spark-streaming_2.10-1.4.0.jar
package org.apache.spark.streaming.api.java
JavaDStream<T> (1)
| | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaDStreamLike<T,JavaDStream<T>,org.apache.spark.api.java.JavaRDD<T>>. | A client program's access to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |
affected methods (3)
queueStream ( java.util.Queue<org.apache.spark.api.java.JavaRDD<T>> ) - Return value of this method has type 'JavaDStream<T>'.
transform ( java.util.List<JavaDStream<?>>, org.apache.spark.api.java.function.Function2<java.util.List<org.apache.spark.api.java.JavaRDD<?>>,org.apache.spark.streaming.Time,org.apache.spark.api.java.JavaRDD<T>> ) - Return value of this method has type 'JavaDStream<T>'.
union ( JavaDStream<T>, java.util.List<JavaDStream<T>> ) - 1st parameter 'first' of this method has type 'JavaDStream<T>'.
JavaPairDStream<K,V> (1)
| | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaDStreamLike<scala.Tuple2<K,V>,JavaPairDStream<K,V>,org.apache.spark.api.java.JavaPairRDD<K,V>>. | A client program's access to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |
affected methods (2)
transformToPair ( java.util.List<JavaDStream<?>>, org.apache.spark.api.java.function.Function2<java.util.List<org.apache.spark.api.java.JavaRDD<?>>,org.apache.spark.streaming.Time,org.apache.spark.api.java.JavaPairRDD<K,V>> ) - Return value of this method has type 'JavaPairDStream<K,V>'.
union ( JavaPairDStream<K,V>, java.util.List<JavaPairDStream<K,V>> ) - 1st parameter 'first' of this method has type 'JavaPairDStream<K,V>'.
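All five medium-severity entries share one root cause: in the 1.4.0 jars the Java API wrappers inherit from an Abstract*Like super-class (AbstractJavaRDDLike, AbstractJavaDStreamLike) that is not part of the 1.3.0 hierarchy, so client bytecode that links against the old super-class can fail as described above. The probe below is a hedged illustration (its object and method names are assumptions); it simply reports which direct super-class the running Spark version gives two of the affected wrappers:

```scala
// Sketch: report which class hierarchy the running Spark version gives the
// Java API wrappers. On 1.4.0 the direct super-class of JavaRDD is
// AbstractJavaRDDLike; per this report it is not a super-class on 1.3.0.
import org.apache.spark.api.java.JavaRDD
import org.apache.spark.streaming.api.java.JavaDStream

object HierarchyProbe {
  private def directSuper(c: Class[_]): String = c.getSuperclass.getName

  def main(args: Array[String]): Unit = {
    println(s"JavaRDD super-class:     ${directSuper(classOf[JavaRDD[_]])}")
    println(s"JavaDStream super-class: ${directSuper(classOf[JavaDStream[_]])}")
  }
}
```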
Java ARchives (3)
spark-core_2.10-1.4.0.jar
spark-sql_2.10-1.4.0.jar
spark-streaming_2.10-1.4.0.jar
Generated on Thu Aug 13 09:49:18 2015 for infinispan-spark_2.10-0.1 by Java API Compliance Checker 1.4.1
A tool for checking backward compatibility of a Java library API