Binary compatibility report for the spark-testing-base_2.10-0.1.0 library between versions 1.4.0 and 1.3.0 (relating to the portability of the client application spark-testing-base_2.10-0.1.0.jar)

Test Info


Library Name:  spark-testing-base_2.10-0.1.0
Version #1:    1.4.0
Version #2:    1.3.0
Java Version:  1.7.0_75

Test Results


Total Java ARchives:      4
Total Methods / Classes:  517 / 2866
Verdict:                  Incompatible (9.3%)

Problem Summary


                          Severity  Count
Added Methods             -         8
Removed Methods           High      47
Problems with Data Types  High      1
                          Medium    0
                          Low       0
Problems with Methods     High      0
                          Medium    0
                          Low       0
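Removed methods are rated high severity because a client jar compiled against 1.4.0 will fail at link time when run against 1.3.0, surfacing as java.lang.NoSuchMethodError. A minimal, self-contained sketch of that failure mode (java.lang.String stands in for SparkConf, since Spark itself is not assumed on the classpath; the method name getSizeAsBytes is taken from the removed-methods list below):

```java
// Illustrative sketch only: simulates what happens when bytecode references a
// method (here, SparkConf.getSizeAsBytes, present in 1.4.0 but not 1.3.0)
// that the runtime classpath does not provide. String stands in for SparkConf.
public class MissingMethodDemo {
    public static void main(String[] args) {
        try {
            // Resolve the method by name and parameter types, much as the JVM
            // linker does when it first executes the call site.
            String.class.getMethod("getSizeAsBytes", String.class);
            System.out.println("method found");
        } catch (NoSuchMethodException e) {
            // In real bytecode linkage this surfaces as NoSuchMethodError.
            System.out.println("missing: " + e.getMessage());
        }
    }
}
```

The same mechanism explains every entry in the "Removed Methods" section: the call sites still exist in the client jar, but their targets do not exist in the 1.3.0 jars.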

Added Methods (8)


spark-core_2.10-1.3.0.jar, SparkConf.class
package org.apache.spark
SparkConf.translateConfKey ( String p1, boolean p2 ) [static]  :  String

spark-streaming_2.10-1.3.0.jar, Checkpoint.class
package org.apache.spark.streaming
Checkpoint.getCheckpointFiles ( String p1, org.apache.hadoop.fs.FileSystem p2 ) [static]  :  scala.collection.Seq<org.apache.hadoop.fs.Path>
Checkpoint.sparkConf ( )  :  org.apache.spark.SparkConf

spark-streaming_2.10-1.3.0.jar, DStream<T>.class
package org.apache.spark.streaming.dstream
DStream<T>.validate ( )  :  void

spark-streaming_2.10-1.3.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.getNewReceiverStreamId ( )  :  int
StreamingContext.state ( )  :  scala.Enumeration.Value
StreamingContext.state_.eq ( scala.Enumeration.Value p1 )  :  void
StreamingContext.StreamingContextState ( )  :  StreamingContext.StreamingContextState.


Removed Methods (47)


spark-core_2.10-1.4.0.jar, SparkConf.class
package org.apache.spark
SparkConf.getDeprecatedConfig ( String p1, SparkConf p2 ) [static]  :  scala.Option<String>
SparkConf.getSizeAsBytes ( String key )  :  long
SparkConf.getSizeAsBytes ( String key, String defaultValue )  :  long
SparkConf.getSizeAsGb ( String key )  :  long
SparkConf.getSizeAsGb ( String key, String defaultValue )  :  long
SparkConf.getSizeAsKb ( String key )  :  long
SparkConf.getSizeAsKb ( String key, String defaultValue )  :  long
SparkConf.getSizeAsMb ( String key )  :  long
SparkConf.getSizeAsMb ( String key, String defaultValue )  :  long
SparkConf.getTimeAsMs ( String key )  :  long
SparkConf.getTimeAsMs ( String key, String defaultValue )  :  long
SparkConf.getTimeAsSeconds ( String key )  :  long
SparkConf.getTimeAsSeconds ( String key, String defaultValue )  :  long
SparkConf.logDeprecationWarning ( String p1 ) [static]  :  void

spark-core_2.10-1.4.0.jar, SparkContext.class
package org.apache.spark
SparkContext.applicationAttemptId ( )  :  scala.Option<String>
SparkContext.externalBlockStoreFolderName ( )  :  String
SparkContext.getOrCreate ( ) [static]  :  SparkContext
SparkContext.getOrCreate ( SparkConf p1 ) [static]  :  SparkContext
SparkContext.SparkContext.._conf ( )  :  SparkConf
SparkContext.SparkContext.._env ( )  :  SparkEnv
SparkContext.SparkContext..assertNotStopped ( )  :  void
SparkContext.range ( long start, long end, long step, int numSlices )  :  rdd.RDD<Object>
SparkContext.setLogLevel ( String logLevel )  :  void
SparkContext.supportDynamicAllocation ( )  :  boolean
SparkContext.withScope ( scala.Function0<U> body )  :  U

spark-streaming_2.10-1.4.0.jar, Checkpoint.class
package org.apache.spark.streaming
Checkpoint.createSparkConf ( )  :  org.apache.spark.SparkConf
Checkpoint.deserialize ( java.io.InputStream p1, org.apache.spark.SparkConf p2 ) [static]  :  Checkpoint
Checkpoint.getCheckpointFiles ( String p1, scala.Option<org.apache.hadoop.fs.FileSystem> p2 ) [static]  :  scala.collection.Seq<org.apache.hadoop.fs.Path>
Checkpoint.serialize ( Checkpoint p1, org.apache.spark.SparkConf p2 ) [static]  :  byte[ ]

spark-streaming_2.10-1.4.0.jar, DStream<T>.class
package org.apache.spark.streaming.dstream
DStream<T>.baseScope ( )  :  scala.Option<String>
DStream<T>.createRDDWithLocalProperties ( org.apache.spark.streaming.Time time, scala.Function0<U> body )  :  U
DStream<T>.validateAtStart ( )  :  void

spark-streaming_2.10-1.4.0.jar, InputDStream<T>.class
package org.apache.spark.streaming.dstream
InputDStream<T>.baseScope ( )  :  scala.Option<String>
InputDStream<T>.id ( )  :  int
InputDStream<T>.name ( )  :  String

spark-streaming_2.10-1.4.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.getActive ( ) [static]  :  scala.Option<StreamingContext>
StreamingContext.getActiveOrCreate ( scala.Function0<StreamingContext> p1 ) [static]  :  StreamingContext
StreamingContext.getActiveOrCreate ( String p1, scala.Function0<StreamingContext> p2, org.apache.hadoop.conf.Configuration p3, boolean p4 ) [static]  :  StreamingContext
StreamingContext.getNewInputStreamId ( )  :  int
StreamingContext.getState ( )  :  StreamingContextState
StreamingContext.isCheckpointingEnabled ( )  :  boolean
StreamingContext.StreamingContext..startSite ( )  :  java.util.concurrent.atomic.AtomicReference<org.apache.spark.util.CallSite>
StreamingContext.StreamingContext..stopOnShutdown ( )  :  void
StreamingContext.StreamingContext ( String path, org.apache.spark.SparkContext sparkContext )
StreamingContext.withNamedScope ( String name, scala.Function0<U> body )  :  U
StreamingContext.withScope ( scala.Function0<U> body )  :  U
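Because StreamingContext.getState() exists only in the 1.4.0 API (the 1.3.0 side exposes state() instead, as shown in the added-methods list above), a client that must run against both versions can probe for the method at runtime rather than linking against it directly. A hedged sketch of such a probe, runnable even without Spark on the classpath:

```java
// Reflection probe: detect which StreamingContext API generation is present
// without taking a hard link-time dependency on either Spark version.
public class ApiProbe {
    public static void main(String[] args) {
        try {
            Class<?> ssc = Class.forName("org.apache.spark.streaming.StreamingContext");
            // getState() exists in Spark 1.4.0 but not in 1.3.0.
            ssc.getMethod("getState");
            System.out.println("getState available (1.4.x API)");
        } catch (ClassNotFoundException e) {
            System.out.println("spark-streaming not on classpath");
        } catch (NoSuchMethodException e) {
            System.out.println("getState missing (1.3.x API)");
        }
    }
}
```

With neither Spark version present, the probe simply reports that spark-streaming is not on the classpath; with one of them present, it distinguishes the two API generations.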

spark-streaming_2.10-1.4.0.jar, Time.class
package org.apache.spark.streaming
Time.floor ( Duration that, Time zeroTime )  :  Time


Problems with Data Types, High Severity (1)


spark-streaming_2.10-1.4.0.jar
package org.apache.spark.streaming.ui
StreamingJobProgressListener (1)


Java ARchives (4)


spark-core_2.10-1.4.0.jar
spark-hive_2.10-1.4.0.jar
spark-sql_2.10-1.4.0.jar
spark-streaming_2.10-1.4.0.jar


Generated on Tue Aug 18 06:30:09 2015 for spark-testing-base_2.10-0.1.0 by Java API Compliance Checker 1.4.1  
A tool for checking backward compatibility of a Java library API