Binary compatibility report for the kafka-spark-consumer-1.0.5 library between versions 1.4.0 and 1.2.0 (relating to the portability of the client application kafka-spark-consumer-1.0.5.jar)
Test Info

| Library Name | kafka-spark-consumer-1.0.5 |
|---|---|
| Version #1 | 1.4.0 |
| Version #2 | 1.2.0 |
| Java Version | 1.7.0_75 |
Test Results

| Total Java ARchives | 2 |
|---|---|
| Total Methods / Classes | 172 / 2175 |
| Verdict | Incompatible (18%) |
Problem Summary

| | Severity | Count |
|---|---|---|
| Added Methods | - | 5 |
| Removed Methods | High | 23 |
| Problems with Data Types | High | 2 |
| | Medium | 3 |
| | Low | 0 |
| Problems with Methods | High | 0 |
| | Medium | 0 |
| | Low | 0 |
Added Methods (5)
spark-streaming_2.10-1.2.0.jar, JavaStreamingContext.class
package org.apache.spark.streaming.api.java
JavaStreamingContext.fileStream ( String directory ) : JavaPairInputDStream<K,V>
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.fileStream:(Ljava/lang/String;)Lorg/apache/spark/streaming/api/java/JavaPairInputDStream;]
spark-streaming_2.10-1.2.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.getNewReceiverStreamId ( ) : int
[mangled: org/apache/spark/streaming/StreamingContext.getNewReceiverStreamId:()I]
StreamingContext.state ( ) : scala.Enumeration.Value
[mangled: org/apache/spark/streaming/StreamingContext.state:()Lscala/Enumeration$Value;]
StreamingContext.state_.eq ( scala.Enumeration.Value p1 ) : void
[mangled: org/apache/spark/streaming/StreamingContext.state_.eq:(Lscala/Enumeration$Value;)V]
StreamingContext.StreamingContextState ( ) : StreamingContext.StreamingContextState.
[mangled: org/apache/spark/streaming/StreamingContext.StreamingContextState:()Lorg/apache/spark/streaming/StreamingContext$StreamingContextState$;]
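Together with the Removed Methods listed below, these additions show that 1.2.0 tracks the streaming context's lifecycle through an internal scala.Enumeration value (state()), whereas 1.4.0 exposes it through getState(). A minimal sketch of how a client might detect at runtime which accessor is available; the class and method names here are illustrative, not part of either Spark API:

```java
import org.apache.spark.streaming.api.java.JavaStreamingContext;

// Illustrative probe (not part of either Spark version's API): check whether the
// 1.4.0-style getState() accessor exists before a client tries to query it.
public final class StateApiProbe {
    public static boolean hasGetState() {
        try {
            JavaStreamingContext.class.getMethod("getState");
            return true;   // 1.4.0-style API present
        } catch (NoSuchMethodException e) {
            return false;  // 1.2.0-style API: no such accessor on the Java facade
        }
    }
}
```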
Removed Methods (23)
spark-streaming_2.10-1.4.0.jar, JavaStreamingContext.class
package org.apache.spark.streaming.api.java
JavaStreamingContext.awaitTerminationOrTimeout ( long timeout ) : boolean
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.awaitTerminationOrTimeout:(J)Z]
JavaStreamingContext.binaryRecordsStream ( String directory, int recordLength ) : JavaDStream<byte[ ]>
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.binaryRecordsStream:(Ljava/lang/String;I)Lorg/apache/spark/streaming/api/java/JavaDStream;]
JavaStreamingContext.fileStream ( String directory, Class<K> kClass, Class<V> vClass, Class<F> fClass ) : JavaPairInputDStream<K,V>
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.fileStream:(Ljava/lang/String;Ljava/lang/Class;Ljava/lang/Class;Ljava/lang/Class;)Lorg/apache/spark/streaming/api/java/JavaPairInputDStream;]
JavaStreamingContext.fileStream ( String directory, Class<K> kClass, Class<V> vClass, Class<F> fClass, org.apache.spark.api.java.function.Function<org.apache.hadoop.fs.Path,Boolean> filter, boolean newFilesOnly ) : JavaPairInputDStream<K,V>
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.fileStream:(Ljava/lang/String;Ljava/lang/Class;Ljava/lang/Class;Ljava/lang/Class;Lorg/apache/spark/api/java/function/Function;Z)Lorg/apache/spark/streaming/api/java/JavaPairInputDStream;]
JavaStreamingContext.fileStream ( String directory, Class<K> kClass, Class<V> vClass, Class<F> fClass, org.apache.spark.api.java.function.Function<org.apache.hadoop.fs.Path,Boolean> filter, boolean newFilesOnly, org.apache.hadoop.conf.Configuration conf ) : JavaPairInputDStream<K,V>
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.fileStream:(Ljava/lang/String;Ljava/lang/Class;Ljava/lang/Class;Ljava/lang/Class;Lorg/apache/spark/api/java/function/Function;ZLorg/apache/hadoop/conf/Configuration;)Lorg/apache/spark/streaming/api/java/JavaPairInputDStream;]
JavaStreamingContext.getOrCreate ( String p1, org.apache.spark.api.java.function.Function0<JavaStreamingContext> p2 ) [static] : JavaStreamingContext
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.getOrCreate:(Ljava/lang/String;Lorg/apache/spark/api/java/function/Function0;)Lorg/apache/spark/streaming/api/java/JavaStreamingContext;]
JavaStreamingContext.getOrCreate ( String p1, org.apache.spark.api.java.function.Function0<JavaStreamingContext> p2, org.apache.hadoop.conf.Configuration p3 ) [static] : JavaStreamingContext
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.getOrCreate:(Ljava/lang/String;Lorg/apache/spark/api/java/function/Function0;Lorg/apache/hadoop/conf/Configuration;)Lorg/apache/spark/streaming/api/java/JavaStreamingContext;]
JavaStreamingContext.getOrCreate ( String p1, org.apache.spark.api.java.function.Function0<JavaStreamingContext> p2, org.apache.hadoop.conf.Configuration p3, boolean p4 ) [static] : JavaStreamingContext
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.getOrCreate:(Ljava/lang/String;Lorg/apache/spark/api/java/function/Function0;Lorg/apache/hadoop/conf/Configuration;Z)Lorg/apache/spark/streaming/api/java/JavaStreamingContext;]
JavaStreamingContext.getState ( ) : org.apache.spark.streaming.StreamingContextState
[mangled: org/apache/spark/streaming/api/java/JavaStreamingContext.getState:()Lorg/apache/spark/streaming/StreamingContextState;]
spark-streaming_2.10-1.4.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.awaitTerminationOrTimeout ( long timeout ) : boolean
[mangled: org/apache/spark/streaming/StreamingContext.awaitTerminationOrTimeout:(J)Z]
StreamingContext.binaryRecordsStream ( String directory, int recordLength ) : dstream.DStream<byte[ ]>
[mangled: org/apache/spark/streaming/StreamingContext.binaryRecordsStream:(Ljava/lang/String;I)Lorg/apache/spark/streaming/dstream/DStream;]
StreamingContext.fileStream ( String directory, scala.Function1<org.apache.hadoop.fs.Path,Object> filter, boolean newFilesOnly, org.apache.hadoop.conf.Configuration conf, scala.reflect.ClassTag<K> p5, scala.reflect.ClassTag<V> p6, scala.reflect.ClassTag<F> p7 ) : dstream.InputDStream<scala.Tuple2<K,V>>
[mangled: org/apache/spark/streaming/StreamingContext.fileStream:(Ljava/lang/String;Lscala/Function1;ZLorg/apache/hadoop/conf/Configuration;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/streaming/dstream/InputDStream;]
StreamingContext.getActive ( ) [static] : scala.Option<StreamingContext>
[mangled: org/apache/spark/streaming/StreamingContext.getActive:()Lscala/Option;]
StreamingContext.getActiveOrCreate ( scala.Function0<StreamingContext> p1 ) [static] : StreamingContext
[mangled: org/apache/spark/streaming/StreamingContext.getActiveOrCreate:(Lscala/Function0;)Lorg/apache/spark/streaming/StreamingContext;]
StreamingContext.getActiveOrCreate ( String p1, scala.Function0<StreamingContext> p2, org.apache.hadoop.conf.Configuration p3, boolean p4 ) [static] : StreamingContext
[mangled: org/apache/spark/streaming/StreamingContext.getActiveOrCreate:(Ljava/lang/String;Lscala/Function0;Lorg/apache/hadoop/conf/Configuration;Z)Lorg/apache/spark/streaming/StreamingContext;]
StreamingContext.getNewInputStreamId ( ) : int
[mangled: org/apache/spark/streaming/StreamingContext.getNewInputStreamId:()I]
StreamingContext.getState ( ) : StreamingContextState
[mangled: org/apache/spark/streaming/StreamingContext.getState:()Lorg/apache/spark/streaming/StreamingContextState;]
StreamingContext.isCheckpointingEnabled ( ) : boolean
[mangled: org/apache/spark/streaming/StreamingContext.isCheckpointingEnabled:()Z]
StreamingContext.StreamingContext..startSite ( ) : java.util.concurrent.atomic.AtomicReference<org.apache.spark.util.CallSite>
[mangled: org/apache/spark/streaming/StreamingContext.org.apache.spark.streaming.StreamingContext..startSite:()Ljava/util/concurrent/atomic/AtomicReference;]
StreamingContext.StreamingContext..stopOnShutdown ( ) : void
[mangled: org/apache/spark/streaming/StreamingContext.org.apache.spark.streaming.StreamingContext..stopOnShutdown:()V]
StreamingContext.StreamingContext ( String path, org.apache.spark.SparkContext sparkContext )
[mangled: org/apache/spark/streaming/StreamingContext."<init>":(Ljava/lang/String;Lorg/apache/spark/SparkContext;)V]
StreamingContext.withNamedScope ( String name, scala.Function0<U> body ) : U
[mangled: org/apache/spark/streaming/StreamingContext.withNamedScope:(Ljava/lang/String;Lscala/Function0;)Ljava/lang/Object;]
StreamingContext.withScope ( scala.Function0<U> body ) : U
[mangled: org/apache/spark/streaming/StreamingContext.withScope:(Lscala/Function0;)Ljava/lang/Object;]
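Every method in this list exists in the 1.4.0 jars but not in 1.2.0, so a client compiled against 1.4.0 fails at the call site when the 1.2.0 jars are on the runtime classpath. A hedged sketch of that failure mode using awaitTerminationOrTimeout as the example; the fallback assumes the older awaitTermination(long) overload is still present in 1.2.0, and the wrapper class is illustrative only:

```java
import org.apache.spark.streaming.api.java.JavaStreamingContext;

// Compiled against spark-streaming 1.4.0, where awaitTerminationOrTimeout(long)
// exists. On a 1.2.0 classpath the call raises NoSuchMethodError, as the report
// predicts, so we fall back to the older awaitTermination(long) overload
// (assumption: present in 1.2.0 and blocking up to the timeout).
public final class PortableAwait {
    public static void awaitUpTo(JavaStreamingContext jssc, long timeoutMs) {
        try {
            jssc.awaitTerminationOrTimeout(timeoutMs);   // 1.4.0 API
        } catch (NoSuchMethodError e) {
            jssc.awaitTermination(timeoutMs);            // pre-1.4 API
        }
    }
}
```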
Problems with Data Types, High Severity (2)
spark-core_2.10-1.4.0.jar
package org.apache.spark
[+] SparkContext (1)

| | Change | Effect |
|---|---|---|
| 1 | Removed super-interface ExecutorAllocationClient. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (4)
sc ( )
Return value of this method has type 'SparkContext'.
sparkContext ( )
Return value of this method has type 'SparkContext'.
StreamingContext ( SparkContext, streaming.Checkpoint, streaming.Duration )
1st parameter 'sc_' of this method has type 'SparkContext'.
StreamingContext ( SparkContext, streaming.Duration )
1st parameter 'sparkContext' of this method has type 'SparkContext'.
spark-streaming_2.10-1.4.0.jar
package org.apache.spark.streaming.ui
[+] StreamingJobProgressListener (1)

| | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.scheduler.SparkListener. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (1)
progressListener ( )
Return value of this method has type 'StreamingJobProgressListener'.
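Both high-severity problems are removed super-types, so client code that relies on SparkContext being an ExecutorAllocationClient (or on StreamingJobProgressListener being a SparkListener) breaks only at runtime. A defensive sketch, assuming nothing beyond the interface name given in the report; the wrapper class is illustrative:

```java
import org.apache.spark.SparkContext;

// Illustrative guard: verify at runtime that SparkContext still implements
// org.apache.spark.ExecutorAllocationClient (true for 1.4.0, not for 1.2.0
// according to the report) before entering any code path that depends on it.
public final class AllocationClientCheck {
    public static boolean supportsExecutorAllocation(SparkContext sc) {
        try {
            Class<?> iface = Class.forName("org.apache.spark.ExecutorAllocationClient");
            return iface.isInstance(sc);
        } catch (ClassNotFoundException e) {
            return false;   // interface not on the classpath at all: treat as unsupported
        }
    }
}
```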
Problems with Data Types, Medium Severity (3)
spark-core_2.10-1.4.0.jar
package org.apache.spark.api.java
[+] JavaRDD<T> (1)

| | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaRDDLike<T,JavaRDD<T>>. | Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |
[+] affected methods (1)
queueStream ( java.util.Queue<JavaRDD<T>>, boolean, JavaRDD<T> )
3rd parameter 'defaultRDD' of this method has type 'JavaRDD<T>'.
spark-streaming_2.10-1.4.0.jar
package org.apache.spark.streaming.api.java
[+] JavaDStream<T> (1)

| | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaDStreamLike<T,JavaDStream<T>,org.apache.spark.api.java.JavaRDD<T>>. | Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |
[+] affected methods (3)
queueStream ( java.util.Queue<org.apache.spark.api.java.JavaRDD<T>> )
Return value of this method has type 'JavaDStream<T>'.
transform ( java.util.List<JavaDStream<?>>, org.apache.spark.api.java.function.Function2<java.util.List<org.apache.spark.api.java.JavaRDD<?>>,org.apache.spark.streaming.Time,org.apache.spark.api.java.JavaRDD<T>> )
Return value of this method has type 'JavaDStream<T>'.
union ( JavaDStream<T>, java.util.List<JavaDStream<T>> )
1st parameter 'first' of this method has type 'JavaDStream<T>'.
[+] JavaPairDStream<K,V> (1)

| | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaDStreamLike<scala.Tuple2<K,V>,JavaPairDStream<K,V>,org.apache.spark.api.java.JavaPairRDD<K,V>>. | Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |
[+] affected methods (2)
transformToPair ( java.util.List<JavaDStream<?>>, org.apache.spark.api.java.function.Function2<java.util.List<org.apache.spark.api.java.JavaRDD<?>>,org.apache.spark.streaming.Time,org.apache.spark.api.java.JavaPairRDD<K,V>> )
Return value of this method has type 'JavaPairDStream<K,V>'.
union ( JavaPairDStream<K,V>, java.util.List<JavaPairDStream<K,V>> )
1st parameter 'first' of this method has type 'JavaPairDStream<K,V>'.
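All three medium-severity problems replace a concrete super-class (AbstractJavaRDDLike or AbstractJavaDStreamLike) in the inheritance chain, so members that a 1.4.0-compiled client resolves through that super-class may be missing at runtime. A small runtime sanity check, assuming only the hierarchy change described above; the wrapper class is illustrative:

```java
import org.apache.spark.streaming.api.java.JavaDStream;

// Sketch: in 1.4.0 JavaDStream<T> extends AbstractJavaDStreamLike, in 1.2.0 it
// does not (per the report), so inspect the actual super-class before trusting
// members inherited through it.
public final class DStreamHierarchyCheck {
    public static boolean hasAbstractBase() {
        String superName = JavaDStream.class.getSuperclass().getName();
        return superName.contains("AbstractJavaDStreamLike");
    }
}
```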
Java ARchives (2)
spark-core_2.10-1.4.0.jar
spark-streaming_2.10-1.4.0.jar
Generated on Thu Oct 8 12:56:18 2015 for kafka-spark-consumer-1.0.5 by Java API Compliance Checker 1.4.1, a tool for checking backward compatibility of a Java library API.