Binary compatibility report for the spark-sas7bdat-1.1.4-s_2.10 library between versions 1.5.0 and 1.3.0 (relating to the portability of the client application spark-sas7bdat-1.1.4-s_2.10.jar)
Test Info

| Library Name | spark-sas7bdat-1.1.4-s_2.10 |
| Version #1   | 1.5.0                       |
| Version #2   | 1.3.0                       |
| Java Version | 1.7.0_75                    |
Test Results

| Total Java ARchives     | 1                    |
| Total Methods / Classes | 103 / 638            |
| Verdict                 | Incompatible (23.8%) |
Problem Summary

|                          | Severity | Count |
|--------------------------|----------|-------|
| Added Methods            | -        | 4     |
| Removed Methods          | High     | 23    |
| Problems with Data Types | High     | 1     |
|                          | Medium   | 1     |
|                          | Low      | 1     |
| Problems with Methods    | High     | 0     |
|                          | Medium   | 0     |
|                          | Low      | 0     |
Added Methods (4)
spark-sql_2.10-1.3.0.jar, SQLContext.class
package org.apache.spark.sql
SQLContext.cacheManager ( ) : CacheManager
[mangled: org/apache/spark/sql/SQLContext.cacheManager:()Lorg/apache/spark/sql/CacheManager;]
SQLContext.checkAnalysis ( ) : catalyst.analysis.CheckAnalysis
[mangled: org/apache/spark/sql/SQLContext.checkAnalysis:()Lorg/apache/spark/sql/catalyst/analysis/CheckAnalysis;]
SQLContext.createDataFrame ( org.apache.spark.api.java.JavaRDD<Row> rowRDD, java.util.List<String> columns ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/api/java/JavaRDD;Ljava/util/List;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.ddlParser ( ) : sources.DDLParser
[mangled: org/apache/spark/sql/SQLContext.ddlParser:()Lorg/apache/spark/sql/sources/DDLParser;]
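Of these additions, the only one a typical client would call directly is the createDataFrame overload taking explicit column names. A minimal, hedged sketch (it compiles only against spark-sql 1.3.0, since the overload is absent from 1.5.0; the column names are placeholders):

```scala
import java.util.Arrays

import org.apache.spark.api.java.JavaRDD
import org.apache.spark.sql.{DataFrame, Row, SQLContext}

object CreateDataFrameSketch {
  // Uses the 1.3.0-only overload listed above:
  // createDataFrame(JavaRDD[Row], java.util.List[String]).
  // Compiling this against spark-sql 1.5.0 fails because the overload
  // does not exist there.
  def withNamedColumns(sqlContext: SQLContext, rows: JavaRDD[Row]): DataFrame =
    sqlContext.createDataFrame(rows, Arrays.asList("id", "label"))
}
```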
Removed Methods (23)
spark-sql_2.10-1.5.0.jar, BaseRelation.class
package org.apache.spark.sql.sources
BaseRelation.needConversion ( ) : boolean
[mangled: org/apache/spark/sql/sources/BaseRelation.needConversion:()Z]
spark-sql_2.10-1.5.0.jar, SQLContext.class
package org.apache.spark.sql
SQLContext.cacheManager ( ) : execution.CacheManager
[mangled: org/apache/spark/sql/SQLContext.cacheManager:()Lorg/apache/spark/sql/execution/CacheManager;]
SQLContext.createDataFrame ( org.apache.spark.rdd.RDD<Row> rowRDD, types.StructType schema, boolean needsConversion ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;Z)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.createSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.currentSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.currentSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.ddlParser ( ) : execution.datasources.DDLParser
[mangled: org/apache/spark/sql/SQLContext.ddlParser:()Lorg/apache/spark/sql/execution/datasources/DDLParser;]
SQLContext.defaultSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.defaultSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.detachSession ( ) : void
[mangled: org/apache/spark/sql/SQLContext.detachSession:()V]
SQLContext.dialectClassName ( ) : String
[mangled: org/apache/spark/sql/SQLContext.dialectClassName:()Ljava/lang/String;]
SQLContext.getConf ( SQLConf.SQLConfEntry<T> entry ) : T
[mangled: org/apache/spark/sql/SQLContext.getConf:(Lorg/apache/spark/sql/SQLConf$SQLConfEntry;)Ljava/lang/Object;]
SQLContext.getConf ( SQLConf.SQLConfEntry<T> entry, T defaultValue ) : T
[mangled: org/apache/spark/sql/SQLContext.getConf:(Lorg/apache/spark/sql/SQLConf$SQLConfEntry;Ljava/lang/Object;)Ljava/lang/Object;]
SQLContext.getOrCreate ( org.apache.spark.SparkContext p1 ) [static] : SQLContext
[mangled: org/apache/spark/sql/SQLContext.getOrCreate:(Lorg/apache/spark/SparkContext;)Lorg/apache/spark/sql/SQLContext;]
SQLContext.getSQLDialect ( ) : catalyst.ParserDialect
[mangled: org/apache/spark/sql/SQLContext.getSQLDialect:()Lorg/apache/spark/sql/catalyst/ParserDialect;]
SQLContext.internalCreateDataFrame ( org.apache.spark.rdd.RDD<catalyst.InternalRow> catalystRows, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.internalCreateDataFrame:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.listener ( ) : execution.ui.SQLListener
[mangled: org/apache/spark/sql/SQLContext.listener:()Lorg/apache/spark/sql/execution/ui/SQLListener;]
SQLContext.openSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.openSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.range ( long end ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.range:(J)Lorg/apache/spark/sql/DataFrame;]
SQLContext.range ( long start, long end ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.range:(JJ)Lorg/apache/spark/sql/DataFrame;]
SQLContext.range ( long start, long end, long step, int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.range:(JJJI)Lorg/apache/spark/sql/DataFrame;]
SQLContext.read ( ) : DataFrameReader
[mangled: org/apache/spark/sql/SQLContext.read:()Lorg/apache/spark/sql/DataFrameReader;]
SQLContext.setConf ( SQLConf.SQLConfEntry<T> entry, T value ) : void
[mangled: org/apache/spark/sql/SQLContext.setConf:(Lorg/apache/spark/sql/SQLConf$SQLConfEntry;Ljava/lang/Object;)V]
SQLContext.setSession ( SQLContext.SQLSession session ) : void
[mangled: org/apache/spark/sql/SQLContext.setSession:(Lorg/apache/spark/sql/SQLContext$SQLSession;)V]
SQLContext.tlSession ( ) : ThreadLocal<SQLContext.SQLSession>
[mangled: org/apache/spark/sql/SQLContext.tlSession:()Ljava/lang/ThreadLocal;]
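To make the impact concrete: a client compiled against spark-sql 1.5.0 that calls any of the methods listed above links fine at build time but fails on a 1.3.0 runtime with java.lang.NoSuchMethodError. A minimal, hedged sketch using two of the removed public entry points (SQLContext.range and SQLContext.read); the input path is a placeholder:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object RemovedMethodsSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("removed-methods-sketch").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    // Both calls exist in spark-sql 1.5.0 but are absent from 1.3.0
    // (see the list above), so each throws NoSuchMethodError when a jar
    // compiled against 1.5.0 runs on a 1.3.0 classpath.
    val numbers = sqlContext.range(0L, 10L)
    val people  = sqlContext.read.json("people.json")

    numbers.show()
    people.printSchema()

    sc.stop()
  }
}
```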
Problems with Data Types, High Severity (1)
spark-sql_2.10-1.5.0.jar
package org.apache.spark.sql
[+] SQLConf (1)
|   | Change | Effect |
|---|--------|--------|
| 1 | Removed super-interface catalyst.CatalystConf. | A client program may be interrupted by a NoSuchMethodError exception (see the sketch below). |
[+] affected methods (1)
conf ( ) - Return value of this method has type 'SQLConf'.
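As a hedged, defensive illustration (not part of spark-sas7bdat), a client can probe for this difference with plain reflection before relying on the super-interface; the class names are taken from the report:

```scala
object SqlConfInterfaceProbe {
  def main(args: Array[String]): Unit = {
    try {
      val sqlConf      = Class.forName("org.apache.spark.sql.SQLConf")
      val catalystConf = Class.forName("org.apache.spark.sql.catalyst.CatalystConf")
      // true when SQLConf (transitively) implements CatalystConf, as in 1.5.0;
      // expected to be false on 1.3.0 according to this report.
      println(s"SQLConf implements CatalystConf: ${catalystConf.isAssignableFrom(sqlConf)}")
    } catch {
      case e: ClassNotFoundException =>
        println(s"Class not found on this classpath: ${e.getMessage}")
    }
  }
}
```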
Problems with Data Types, Medium Severity (1)
spark-sql_2.10-1.5.0.jar
package org.apache.spark.sql
[+] SQLContext.implicits. (1)
|   | Change | Effect |
|---|--------|--------|
| 1 | Removed super-class SQLImplicits. | A client program's access to fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions (see the sketch below). |
[+] affected methods (1)
implicits ( ) - Return value of this method has type 'SQLContext.implicits.'.
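Typical client code reaches this object through "import sqlContext.implicits._". A hedged sketch of the usage that may re-link differently across the two versions (whether it actually fails on 1.3.0 depends on which inherited members the 1.5.0 compile bound to):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object ImplicitsSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("implicits-sketch").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    // Against 1.5.0, the toDF enrichment comes from members the implicits
    // object inherits from its SQLImplicits super-class; on 1.3.0 that
    // super-class is gone, so the compiled reference may fail with
    // NoSuchMethodError.
    import sqlContext.implicits._
    val df = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "label")
    df.show()

    sc.stop()
  }
}
```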
Problems with Data Types, Low Severity (1)
spark-sql_2.10-1.5.0.jar
package org.apache.spark.sql
[+] DataFrame (1)
|   | Change | Effect |
|---|--------|--------|
| 1 | Added super-class java.lang.Object. | A static field from a super-interface of a client class may hide a field with the same name inherited from the new super-class and cause an IncompatibleClassChangeError exception (see the probe below). |
[+] affected methods (22)
applySchemaToPythonRDD ( org.apache.spark.rdd.RDD<java.lang.Object[ ]>, java.lang.String ) - Return value of this method has type 'DataFrame'.
applySchemaToPythonRDD ( org.apache.spark.rdd.RDD<java.lang.Object[ ]>, types.StructType ) - Return value of this method has type 'DataFrame'.
baseRelationToDataFrame ( sources.BaseRelation ) - Return value of this method has type 'DataFrame'.
createDataFrame ( org.apache.spark.api.java.JavaRDD<?>, java.lang.Class<?> ) - Return value of this method has type 'DataFrame'.
createDataFrame ( org.apache.spark.api.java.JavaRDD<Row>, types.StructType ) - Return value of this method has type 'DataFrame'.
createDataFrame ( org.apache.spark.rdd.RDD<?>, java.lang.Class<?> ) - Return value of this method has type 'DataFrame'.
createDataFrame ( org.apache.spark.rdd.RDD<A>, scala.reflect.api.TypeTags.TypeTag<A> ) - Return value of this method has type 'DataFrame'.
createDataFrame ( org.apache.spark.rdd.RDD<Row>, types.StructType ) - Return value of this method has type 'DataFrame'.
createDataFrame ( scala.collection.Seq<A>, scala.reflect.api.TypeTags.TypeTag<A> ) - Return value of this method has type 'DataFrame'.
createExternalTable ( java.lang.String, java.lang.String ) - Return value of this method has type 'DataFrame'.
createExternalTable ( java.lang.String, java.lang.String, java.lang.String ) - Return value of this method has type 'DataFrame'.
createExternalTable ( java.lang.String, java.lang.String, java.util.Map<java.lang.String,java.lang.String> ) - Return value of this method has type 'DataFrame'.
createExternalTable ( java.lang.String, java.lang.String, types.StructType, java.util.Map<java.lang.String,java.lang.String> ) - Return value of this method has type 'DataFrame'.
createExternalTable ( java.lang.String, java.lang.String, types.StructType, scala.collection.immutable.Map<java.lang.String,java.lang.String> ) - Return value of this method has type 'DataFrame'.
createExternalTable ( java.lang.String, java.lang.String, scala.collection.immutable.Map<java.lang.String,java.lang.String> ) - Return value of this method has type 'DataFrame'.
emptyDataFrame ( ) - Return value of this method has type 'DataFrame'.
parquetFile ( java.lang.String... ) - Return value of this method has type 'DataFrame'.
registerDataFrameAsTable ( DataFrame, java.lang.String ) - 1st parameter 'df' of this method has type 'DataFrame'.
sql ( java.lang.String ) - Return value of this method has type 'DataFrame'.
table ( java.lang.String ) - Return value of this method has type 'DataFrame'.
tables ( ) - Return value of this method has type 'DataFrame'.
tables ( java.lang.String ) - Return value of this method has type 'DataFrame'.
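The change is rated low severity, and the affected methods only return or accept DataFrame. A hedged reflective probe is the simplest way to observe the hierarchy difference itself on a given classpath:

```scala
import org.apache.spark.sql.DataFrame

object DataFrameHierarchyProbe {
  def main(args: Array[String]): Unit = {
    // Per this report, the direct super-class of DataFrame differs between
    // the two versions (java.lang.Object appears as the super-class in 1.3.0).
    println(s"super-class: ${classOf[DataFrame].getSuperclass}")
    classOf[DataFrame].getInterfaces.foreach(i => println(s"interface:   ${i.getName}"))
  }
}
```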
Java ARchives (1)
spark-sql_2.10-1.5.0.jar
Generated on Wed Oct 7 02:08:46 2015 for spark-sas7bdat-1.1.4-s_2.10 by Java API Compliance Checker 1.4.1
A tool for checking backward compatibility of a Java library API