Binary compatibility report for the spark-testing-base-1.2.0_0.0.5 library between versions 1.2.0 and 1.0.0 (relating to the portability of the client application spark-testing-base-1.2.0_0.0.5.jar)

Test Info


Library Name: spark-testing-base-1.2.0_0.0.5
Version #1:   1.2.0
Version #2:   1.0.0
Java Version: 1.7.0_75

Test Results


Total Java ARchives:     2
Total Methods / Classes: 433 / 1414
Verdict:                 Incompatible (11.5%)

Problem Summary


                           Severity  Count
Added Methods              -         6
Removed Methods            High      48
Problems with Data Types   High      2
                           Medium    0
                           Low       0
Problems with Methods      High      0
                           Medium    0
                           Low       0
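The high-severity removals counted above are signature-level breaks: for example, SparkContext.getCallSite() returns String in 1.0.0 but util.CallSite in 1.2.0, so a client compiled against 1.2.0 fails at run time (NoSuchMethodError) on 1.0.0 even though the method name still exists. The kind of signature matching such a checker performs can be sketched with reflection; the hasMethod helper and the String.length example below are illustrative, not part of this report:

```java
import java.lang.reflect.Method;

public class ApiCheck {
    // Hypothetical helper: true if `cls` exposes a public method with the
    // given name, parameter types, AND return type. The return type is part
    // of a method's binary signature on the JVM, which is why a changed
    // return type is reported as a removed method.
    static boolean hasMethod(Class<?> cls, String name,
                             Class<?> returnType, Class<?>... params) {
        try {
            Method m = cls.getMethod(name, params);
            return m.getReturnType().equals(returnType);
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // String.length() : int exists -> true
        System.out.println(hasMethod(String.class, "length", int.class));
        // String.length() : long does not -> false, analogous to
        // getCallSite() : String vs getCallSite() : util.CallSite.
        System.out.println(hasMethod(String.class, "length", long.class));
    }
}
```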

Added Methods (6)


spark-core_2.10-1.0.0.jar, SparkConf.class
package org.apache.spark
SparkConf.SparkConf..settings ( )  :  scala.collection.mutable.HashMap<String,String>

spark-core_2.10-1.0.0.jar, SparkContext.class
package org.apache.spark
SparkContext.clean ( F f )  :  F
SparkContext.getCallSite ( )  :  String
SparkContext.ui ( )  :  ui.SparkUI

spark-streaming_2.10-1.0.0.jar, Checkpoint.class
package org.apache.spark.streaming
Checkpoint.sparkHome ( )  :  String

spark-streaming_2.10-1.0.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.uiTab ( )  :  ui.StreamingTab


Removed Methods (48)


spark-core_2.10-1.2.0.jar, SparkConf.class
package org.apache.spark
SparkConf.getAppId ( )  :  String
SparkConf.getenv ( String name )  :  String
SparkConf.isAkkaConf ( String p1 ) [static]  :  boolean
SparkConf.isExecutorStartupConf ( String p1 ) [static]  :  boolean
SparkConf.isSparkPortConf ( String p1 ) [static]  :  boolean
SparkConf.logName ( )  :  String
SparkConf.registerKryoClasses ( Class<?>[ ] classes )  :  SparkConf
SparkConf.settings ( )  :  scala.collection.mutable.HashMap<String,String>

spark-core_2.10-1.2.0.jar, SparkContext.class
package org.apache.spark
SparkContext.accumulable ( R initialValue, String name, AccumulableParam<R,T> param )  :  Accumulable<R,T>
SparkContext.accumulator ( T initialValue, String name, AccumulatorParam<T> param )  :  Accumulator<T>
SparkContext.applicationId ( )  :  String
SparkContext.binaryFiles ( String path, int minPartitions )  :  rdd.RDD<scala.Tuple2<String,input.PortableDataStream>>
SparkContext.binaryRecords ( String path, int recordLength, org.apache.hadoop.conf.Configuration conf )  :  rdd.RDD<byte[ ]>
SparkContext.clean ( F f, boolean checkSerializable )  :  F
SparkContext.eventLogDir ( )  :  scala.Option<String>
SparkContext.executorAllocationManager ( )  :  scala.Option<ExecutorAllocationManager>
SparkContext.getCallSite ( )  :  util.CallSite
SparkContext.getExecutorThreadDump ( String executorId )  :  scala.Option<util.ThreadStackTrace[ ]>
SparkContext.isEventLogEnabled ( )  :  boolean
SparkContext.jobProgressListener ( )  :  ui.jobs.JobProgressListener
SparkContext.killExecutor ( String executorId )  :  boolean
SparkContext.killExecutors ( scala.collection.Seq<String> executorIds )  :  boolean
SparkContext.logName ( )  :  String
SparkContext.metricsSystem ( )  :  metrics.MetricsSystem
SparkContext.SparkContext..creationSite ( )  :  util.CallSite
SparkContext.progressBar ( )  :  scala.Option<ui.ConsoleProgressBar>
SparkContext.requestExecutors ( int numAdditionalExecutors )  :  boolean
SparkContext.schedulerBackend ( )  :  scheduler.SchedulerBackend
SparkContext.schedulerBackend_.eq ( scheduler.SchedulerBackend p1 )  :  void
SparkContext.setCallSite ( util.CallSite callSite )  :  void
SparkContext.statusTracker ( )  :  SparkStatusTracker
SparkContext.ui ( )  :  scala.Option<ui.SparkUI>

spark-streaming_2.10-1.2.0.jar, Checkpoint.class
package org.apache.spark.streaming
Checkpoint.logName ( )  :  String

spark-streaming_2.10-1.2.0.jar, DStream<T>.class
package org.apache.spark.streaming.dstream
DStream<T>.creationSite ( )  :  org.apache.spark.util.CallSite
DStream<T>.getCreationSite ( ) [static]  :  org.apache.spark.util.CallSite
DStream<T>.logName ( )  :  String

spark-streaming_2.10-1.2.0.jar, Duration.class
package org.apache.spark.streaming
Duration.div ( Duration that )  :  double
Duration.greater ( Duration that )  :  boolean
Duration.greaterEq ( Duration that )  :  boolean
Duration.less ( Duration that )  :  boolean
Duration.lessEq ( Duration that )  :  boolean
Duration.minus ( Duration that )  :  Duration
Duration.plus ( Duration that )  :  Duration
Duration.times ( int times )  :  Duration

spark-streaming_2.10-1.2.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.logName ( )  :  String
StreamingContext.progressListener ( )  :  ui.StreamingJobProgressListener
StreamingContext.StreamingContext ( String path )
StreamingContext.uiTab ( )  :  scala.Option<ui.StreamingTab>


Problems with Data Types, High Severity (2)


spark-core_2.10-1.2.0.jar
package org.apache.spark.broadcast
Broadcast<T> (1)

package org.apache.spark.rdd
PairRDDFunctions<K,V> (1)


Java ARchives (2)


spark-core_2.10-1.2.0.jar
spark-streaming_2.10-1.2.0.jar





Generated on Tue Mar 31 06:37:53 2015 for spark-testing-base-1.2.0_0.0.5 by Java API Compliance Checker 1.4.1
A tool for checking backward compatibility of a Java library API