Test Info
| | |
|---|---|
| Library Name | demo-scala-python-0.1-s_2.10 |
| Version #1 | 1.2.0 |
| Version #2 | 1.1.0 |
| Java Version | 1.7.0_75 |
| Subject | Binary Compatibility |
Test Results
| | |
|---|---|
| Total Java ARchives | 1 |
| Total Methods / Classes | 168 / 1189 |
| Verdict | Incompatible (12.5%) |
Problem Summary
| | Severity | Count |
|---|---|---|
| Added Methods | - | 1 |
| Removed Methods | High | 19 |
| Problems with Data Types | High | 2 |
| | Medium | 0 |
| | Low | 0 |
| Problems with Methods | High | 0 |
| | Medium | 0 |
| | Low | 0 |
Added Methods (1)
spark-core_2.10-1.1.0.jar, SparkContext.class
package org.apache.spark
SparkContext.ui ( ) : ui.SparkUI
[mangled: org/apache/spark/SparkContext.ui:()Lorg/apache/spark/ui/SparkUI;]
Removed Methods (19)
spark-core_2.10-1.2.0.jar, SparkContext.class
package org.apache.spark
SparkContext.applicationId ( ) : String
[mangled: org/apache/spark/SparkContext.applicationId:()Ljava/lang/String;]
SparkContext.binaryFiles ( String path, int minPartitions ) : rdd.RDD<scala.Tuple2<String,input.PortableDataStream>>
[mangled: org/apache/spark/SparkContext.binaryFiles:(Ljava/lang/String;I)Lorg/apache/spark/rdd/RDD;]
SparkContext.binaryRecords ( String path, int recordLength, org.apache.hadoop.conf.Configuration conf ) : rdd.RDD<byte[ ]>
[mangled: org/apache/spark/SparkContext.binaryRecords:(Ljava/lang/String;ILorg/apache/hadoop/conf/Configuration;)Lorg/apache/spark/rdd/RDD;]
SparkContext.eventLogDir ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.eventLogDir:()Lscala/Option;]
SparkContext.executorAllocationManager ( ) : scala.Option<ExecutorAllocationManager>
[mangled: org/apache/spark/SparkContext.executorAllocationManager:()Lscala/Option;]
SparkContext.getExecutorThreadDump ( String executorId ) : scala.Option<util.ThreadStackTrace[ ]>
[mangled: org/apache/spark/SparkContext.getExecutorThreadDump:(Ljava/lang/String;)Lscala/Option;]
SparkContext.isEventLogEnabled ( ) : boolean
[mangled: org/apache/spark/SparkContext.isEventLogEnabled:()Z]
SparkContext.jobProgressListener ( ) : ui.jobs.JobProgressListener
[mangled: org/apache/spark/SparkContext.jobProgressListener:()Lorg/apache/spark/ui/jobs/JobProgressListener;]
SparkContext.killExecutor ( String executorId ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutor:(Ljava/lang/String;)Z]
SparkContext.killExecutors ( scala.collection.Seq<String> executorIds ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutors:(Lscala/collection/Seq;)Z]
SparkContext.metricsSystem ( ) : metrics.MetricsSystem
[mangled: org/apache/spark/SparkContext.metricsSystem:()Lorg/apache/spark/metrics/MetricsSystem;]
SparkContext.org$apache$spark$SparkContext$$creationSite ( ) : util.CallSite
[mangled: org/apache/spark/SparkContext.org$apache$spark$SparkContext$$creationSite:()Lorg/apache/spark/util/CallSite;]
SparkContext.progressBar ( ) : scala.Option<ui.ConsoleProgressBar>
[mangled: org/apache/spark/SparkContext.progressBar:()Lscala/Option;]
SparkContext.requestExecutors ( int numAdditionalExecutors ) : boolean
[mangled: org/apache/spark/SparkContext.requestExecutors:(I)Z]
SparkContext.schedulerBackend ( ) : scheduler.SchedulerBackend
[mangled: org/apache/spark/SparkContext.schedulerBackend:()Lorg/apache/spark/scheduler/SchedulerBackend;]
SparkContext.schedulerBackend_$eq ( scheduler.SchedulerBackend p1 ) : void
[mangled: org/apache/spark/SparkContext.schedulerBackend_$eq:(Lorg/apache/spark/scheduler/SchedulerBackend;)V]
SparkContext.setCallSite ( util.CallSite callSite ) : void
[mangled: org/apache/spark/SparkContext.setCallSite:(Lorg/apache/spark/util/CallSite;)V]
SparkContext.statusTracker ( ) : SparkStatusTracker
[mangled: org/apache/spark/SparkContext.statusTracker:()Lorg/apache/spark/SparkStatusTracker;]
SparkContext.ui ( ) : scala.Option<ui.SparkUI>
[mangled: org/apache/spark/SparkContext.ui:()Lscala/Option;]
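
For illustration only (not part of the generated report), the minimal sketch below shows how one of these removals surfaces at run time. It assumes a hypothetical client that was compiled against spark-core_2.10-1.2.0 and then run with the 1.1.0 jar on the classpath; the local-mode SparkConf setup and the object name are made up for the demo.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical client, compiled against spark-core_2.10-1.2.0.
object RemovedMethodDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("demo").setMaster("local[*]"))
    try {
      // applicationId is listed above as removed in 1.1.0: compiling and linking
      // succeed, but running this class against the 1.1.0 jar fails here with
      // java.lang.NoSuchMethodError.
      println(sc.applicationId)
    } finally {
      sc.stop()
    }
  }
}
```

The same pattern applies to any of the 19 call sites above: the failure is deferred until the missing method is actually invoked, which is why the report flags all of them as high-severity binary problems.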
Problems with Data Types, High Severity (2)
spark-core_2.10-1.2.0.jar
package org.apache.spark.broadcast
Broadcast<T> (1)

| | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.Logging. | A client program may be interrupted by a NoSuchMethodError exception. |

Affected methods (1):
broadcast ( T, scala.reflect.ClassTag<T> ) - return value of this method has type 'Broadcast<T>'.
package org.apache.spark.rdd
PairRDDFunctions<K,V> (1)

| | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.mapreduce.SparkHadoopMapReduceUtil. | A client program may be interrupted by a NoSuchMethodError exception. |

Affected methods (1):
rddToPairRDDFunctions ( RDD<scala.Tuple2<K,V>>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.math.Ordering<K> ) - return value of this method has type 'PairRDDFunctions<K,V>'.
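
Both findings can be double-checked independently of the report with a small reflection probe. This is only an illustrative sketch (the object name is made up, and the expected output is inferred from the table above): run it once with each spark-core jar on the classpath and compare the printed super-interfaces.

```scala
import org.apache.spark.broadcast.Broadcast
import org.apache.spark.rdd.PairRDDFunctions

// Prints the direct super-interfaces of the two reported classes for whichever
// spark-core jar is on the classpath. Per the table above, the 1.2.0 jar should
// list org.apache.spark.Logging for Broadcast and
// org.apache.spark.mapreduce.SparkHadoopMapReduceUtil for PairRDDFunctions,
// while the 1.1.0 jar should not.
object SuperInterfaceProbe {
  def main(args: Array[String]): Unit = {
    Seq(classOf[Broadcast[_]], classOf[PairRDDFunctions[_, _]]).foreach { cls =>
      println(cls.getName + " -> " + cls.getInterfaces.map(_.getName).mkString(", "))
    }
  }
}
```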
Java ARchives (1)
spark-core_2.10-1.2.0.jar
Test Info
| | |
|---|---|
| Library Name | demo-scala-python-0.1-s_2.10 |
| Version #1 | 1.2.0 |
| Version #2 | 1.1.0 |
| Java Version | 1.7.0_75 |
| Subject | Source Compatibility |
Test Results
| | |
|---|---|
| Total Java ARchives | 1 |
| Total Methods / Classes | 168 / 1189 |
| Verdict | Incompatible (12.5%) |
Problem Summary
| | Severity | Count |
|---|---|---|
| Added Methods | - | 1 |
| Removed Methods | High | 19 |
| Problems with Data Types | High | 2 |
| | Medium | 0 |
| | Low | 0 |
| Problems with Methods | High | 0 |
| | Medium | 0 |
| | Low | 0 |
Added Methods (1)
spark-core_2.10-1.1.0.jar, SparkContext.class
package org.apache.spark
SparkContext.ui ( ) : ui.SparkUI
[mangled: org/apache/spark/SparkContext.ui:()Lorg/apache/spark/ui/SparkUI;]
Removed Methods (19)
spark-core_2.10-1.2.0.jar, SparkContext.class
package org.apache.spark
SparkContext.applicationId ( ) : String
[mangled: org/apache/spark/SparkContext.applicationId:()Ljava/lang/String;]
SparkContext.binaryFiles ( String path, int minPartitions ) : rdd.RDD<scala.Tuple2<String,input.PortableDataStream>>
[mangled: org/apache/spark/SparkContext.binaryFiles:(Ljava/lang/String;I)Lorg/apache/spark/rdd/RDD;]
SparkContext.binaryRecords ( String path, int recordLength, org.apache.hadoop.conf.Configuration conf ) : rdd.RDD<byte[ ]>
[mangled: org/apache/spark/SparkContext.binaryRecords:(Ljava/lang/String;ILorg/apache/hadoop/conf/Configuration;)Lorg/apache/spark/rdd/RDD;]
SparkContext.eventLogDir ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.eventLogDir:()Lscala/Option;]
SparkContext.executorAllocationManager ( ) : scala.Option<ExecutorAllocationManager>
[mangled: org/apache/spark/SparkContext.executorAllocationManager:()Lscala/Option;]
SparkContext.getExecutorThreadDump ( String executorId ) : scala.Option<util.ThreadStackTrace[ ]>
[mangled: org/apache/spark/SparkContext.getExecutorThreadDump:(Ljava/lang/String;)Lscala/Option;]
SparkContext.isEventLogEnabled ( ) : boolean
[mangled: org/apache/spark/SparkContext.isEventLogEnabled:()Z]
SparkContext.jobProgressListener ( ) : ui.jobs.JobProgressListener
[mangled: org/apache/spark/SparkContext.jobProgressListener:()Lorg/apache/spark/ui/jobs/JobProgressListener;]
SparkContext.killExecutor ( String executorId ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutor:(Ljava/lang/String;)Z]
SparkContext.killExecutors ( scala.collection.Seq<String> executorIds ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutors:(Lscala/collection/Seq;)Z]
SparkContext.metricsSystem ( ) : metrics.MetricsSystem
[mangled: org/apache/spark/SparkContext.metricsSystem:()Lorg/apache/spark/metrics/MetricsSystem;]
SparkContext.org$apache$spark$SparkContext$$creationSite ( ) : util.CallSite
[mangled: org/apache/spark/SparkContext.org$apache$spark$SparkContext$$creationSite:()Lorg/apache/spark/util/CallSite;]
SparkContext.progressBar ( ) : scala.Option<ui.ConsoleProgressBar>
[mangled: org/apache/spark/SparkContext.progressBar:()Lscala/Option;]
SparkContext.requestExecutors ( int numAdditionalExecutors ) : boolean
[mangled: org/apache/spark/SparkContext.requestExecutors:(I)Z]
SparkContext.schedulerBackend ( ) : scheduler.SchedulerBackend
[mangled: org/apache/spark/SparkContext.schedulerBackend:()Lorg/apache/spark/scheduler/SchedulerBackend;]
SparkContext.schedulerBackend_$eq ( scheduler.SchedulerBackend p1 ) : void
[mangled: org/apache/spark/SparkContext.schedulerBackend_$eq:(Lorg/apache/spark/scheduler/SchedulerBackend;)V]
SparkContext.setCallSite ( util.CallSite callSite ) : void
[mangled: org/apache/spark/SparkContext.setCallSite:(Lorg/apache/spark/util/CallSite;)V]
SparkContext.statusTracker ( ) : SparkStatusTracker
[mangled: org/apache/spark/SparkContext.statusTracker:()Lorg/apache/spark/SparkStatusTracker;]
SparkContext.ui ( ) : scala.Option<ui.SparkUI>
[mangled: org/apache/spark/SparkContext.ui:()Lscala/Option;]
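
At the source level the same removals show up as compile errors instead of linkage errors. A minimal sketch (the object name and path are placeholders): the snippet below compiles against spark-core_2.10-1.2.0 but is rejected by scalac when built against 1.1.0, because binaryFiles, listed above, is not a member of SparkContext in that version.

```scala
import org.apache.spark.SparkContext

object SourceBreakDemo {
  // Compiles against spark-core_2.10-1.2.0; against 1.1.0 scalac reports an
  // error along the lines of:
  //   "value binaryFiles is not a member of org.apache.spark.SparkContext".
  def readBinaries(sc: SparkContext) =
    sc.binaryFiles("hdfs:///some/placeholder/path", 2) // placeholder path, 2 partitions
}
```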
Problems with Data Types, High Severity (2)
spark-core_2.10-1.2.0.jar
package org.apache.spark.broadcast
Broadcast<T> (1)

| | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.Logging. | Recompilation of a client program may be terminated with the message: cannot find method in class Broadcast<T>. |

Affected methods (1):
broadcast ( T, scala.reflect.ClassTag<T> ) - return value of this method has type 'Broadcast<T>'.
package org.apache.spark.rdd
PairRDDFunctions<K,V> (1)

| | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.mapreduce.SparkHadoopMapReduceUtil. | Recompilation of a client program may be terminated with the message: cannot find method in class PairRDDFunctions<K,V>. |

Affected methods (1):
rddToPairRDDFunctions ( RDD<scala.Tuple2<K,V>>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.math.Ordering<K> ) - return value of this method has type 'PairRDDFunctions<K,V>'.
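
The removed super-interfaces also have a source-level counterpart. A hedged sketch, assuming org.apache.spark.Logging is a public type in both versions and using made-up helper names: the upper bound below is satisfied when compiling against 1.2.0, where Broadcast[T] extends Logging, but not against 1.1.0, so the call on the last line stops compiling.

```scala
import org.apache.spark.Logging
import org.apache.spark.broadcast.Broadcast

object SuperInterfaceSourceDemo {
  // A helper whose type bound relies on its argument being a Logging subtype.
  def registerLogger[L <: Logging](l: L): Unit =
    println("registered " + l.getClass.getName)

  // Compiles against 1.2.0, where Broadcast[T] extends Logging; against 1.1.0
  // scalac rejects the call because the bound L <: Logging is no longer satisfied.
  def demo(b: Broadcast[Int]): Unit = registerLogger(b)
}
```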
Java ARchives (1)
spark-core_2.10-1.2.0.jar