Binary compatibility report for the spark-connector_2.10-1.0.0 library between versions 1.4.0 and 1.0.0 (concerning the portability of the client application spark-connector_2.10-1.0.0.jar)
Test Info
| Library Name | spark-connector_2.10-1.0.0 |
| Version #1   | 1.4.0 |
| Version #2   | 1.0.0 |
| Java Version | 1.7.0_75 |
Test Results
| Total Java ARchives     | 3 |
| Total Methods / Classes | 1008 / 2720 |
| Verdict                 | Incompatible (64.4%) |
Problem Summary
|                          | Severity | Count |
| Added Methods            | -        | 107   |
| Removed Methods          | High     | 385   |
| Problems with Data Types | High     | 33    |
|                          | Medium   | 4     |
|                          | Low      | 3     |
| Problems with Methods    | High     | 0     |
|                          | Medium   | 0     |
|                          | Low      | 3     |
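To make the verdict concrete, here is a minimal sketch (hypothetical class name, local master) of what a "Removed Method" means for a client: code compiled against spark-core_2.10-1.4.0 that calls JavaRDD.sortBy, which the Removed Methods list below shows is absent from 1.0.0, links fine at compile time but fails at the call site when run against the 1.0.0 jar.

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class SortByClient {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("sortBy-demo").setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<Integer> nums = sc.parallelize(Arrays.asList(3, 1, 2));

        // Resolves at compile time against 1.4.0; the method is absent from
        // 1.0.0, so this call site throws NoSuchMethodError there.
        JavaRDD<Integer> sorted = nums.sortBy(new Function<Integer, Integer>() {
            public Integer call(Integer x) { return x; }
        }, true, 1);

        System.out.println(sorted.collect());
        sc.stop();
    }
}
```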
Added Methods (107)
spark-core_2.10-1.0.0.jar, JavaRDD<T>.class
package org.apache.spark.api.java
JavaRDD<T>.aggregate ( U zeroValue, function.Function2<U,T,U> seqOp, function.Function2<U,U,U> combOp ) : U
[mangled: org/apache/spark/api/java/JavaRDD<T>.aggregate:(Ljava/lang/Object;Lorg/apache/spark/api/java/function/Function2;Lorg/apache/spark/api/java/function/Function2;)Ljava/lang/Object;]
JavaRDD<T>.cartesian ( JavaRDDLike<U,?> other ) : JavaPairRDD<T,U>
[mangled: org/apache/spark/api/java/JavaRDD<T>.cartesian:(Lorg/apache/spark/api/java/JavaRDDLike;)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaRDD<T>.checkpoint ( ) : void
[mangled: org/apache/spark/api/java/JavaRDD<T>.checkpoint:()V]
JavaRDD<T>.collect ( ) : java.util.List<T>
[mangled: org/apache/spark/api/java/JavaRDD<T>.collect:()Ljava/util/List;]
JavaRDD<T>.collectPartitions ( int[ ] partitionIds ) : java.util.List<T>[ ]
[mangled: org/apache/spark/api/java/JavaRDD<T>.collectPartitions:([I)[Ljava/util/List;]
JavaRDD<T>.context ( ) : org.apache.spark.SparkContext
[mangled: org/apache/spark/api/java/JavaRDD<T>.context:()Lorg/apache/spark/SparkContext;]
JavaRDD<T>.count ( ) : long
[mangled: org/apache/spark/api/java/JavaRDD<T>.count:()J]
JavaRDD<T>.countApprox ( long timeout ) : org.apache.spark.partial.PartialResult<org.apache.spark.partial.BoundedDouble>
[mangled: org/apache/spark/api/java/JavaRDD<T>.countApprox:(J)Lorg/apache/spark/partial/PartialResult;]
JavaRDD<T>.countApprox ( long timeout, double confidence ) : org.apache.spark.partial.PartialResult<org.apache.spark.partial.BoundedDouble>
[mangled: org/apache/spark/api/java/JavaRDD<T>.countApprox:(JD)Lorg/apache/spark/partial/PartialResult;]
JavaRDD<T>.countApproxDistinct ( double relativeSD ) : long
[mangled: org/apache/spark/api/java/JavaRDD<T>.countApproxDistinct:(D)J]
JavaRDD<T>.countByValue ( ) : java.util.Map<T,Long>
[mangled: org/apache/spark/api/java/JavaRDD<T>.countByValue:()Ljava/util/Map;]
JavaRDD<T>.countByValueApprox ( long timeout ) : org.apache.spark.partial.PartialResult<java.util.Map<T,org.apache.spark.partial.BoundedDouble>>
[mangled: org/apache/spark/api/java/JavaRDD<T>.countByValueApprox:(J)Lorg/apache/spark/partial/PartialResult;]
JavaRDD<T>.countByValueApprox ( long timeout, double confidence ) : org.apache.spark.partial.PartialResult<java.util.Map<T,org.apache.spark.partial.BoundedDouble>>
[mangled: org/apache/spark/api/java/JavaRDD<T>.countByValueApprox:(JD)Lorg/apache/spark/partial/PartialResult;]
JavaRDD<T>.first ( ) : T
[mangled: org/apache/spark/api/java/JavaRDD<T>.first:()Ljava/lang/Object;]
JavaRDD<T>.flatMap ( function.FlatMapFunction<T,U> f ) : JavaRDD<U>
[mangled: org/apache/spark/api/java/JavaRDD<T>.flatMap:(Lorg/apache/spark/api/java/function/FlatMapFunction;)Lorg/apache/spark/api/java/JavaRDD;]
JavaRDD<T>.flatMapToDouble ( function.DoubleFlatMapFunction<T> f ) : JavaDoubleRDD
[mangled: org/apache/spark/api/java/JavaRDD<T>.flatMapToDouble:(Lorg/apache/spark/api/java/function/DoubleFlatMapFunction;)Lorg/apache/spark/api/java/JavaDoubleRDD;]
JavaRDD<T>.flatMapToPair ( function.PairFlatMapFunction<T,K2,V2> f ) : JavaPairRDD<K2,V2>
[mangled: org/apache/spark/api/java/JavaRDD<T>.flatMapToPair:(Lorg/apache/spark/api/java/function/PairFlatMapFunction;)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaRDD<T>.fold ( T zeroValue, function.Function2<T,T,T> f ) : T
[mangled: org/apache/spark/api/java/JavaRDD<T>.fold:(Ljava/lang/Object;Lorg/apache/spark/api/java/function/Function2;)Ljava/lang/Object;]
JavaRDD<T>.foreach ( function.VoidFunction<T> f ) : void
[mangled: org/apache/spark/api/java/JavaRDD<T>.foreach:(Lorg/apache/spark/api/java/function/VoidFunction;)V]
JavaRDD<T>.foreachPartition ( function.VoidFunction<java.util.Iterator<T>> f ) : void
[mangled: org/apache/spark/api/java/JavaRDD<T>.foreachPartition:(Lorg/apache/spark/api/java/function/VoidFunction;)V]
JavaRDD<T>.getCheckpointFile ( ) : com.google.common.base.Optional<String>
[mangled: org/apache/spark/api/java/JavaRDD<T>.getCheckpointFile:()Lcom/google/common/base/Optional;]
JavaRDD<T>.getStorageLevel ( ) : org.apache.spark.storage.StorageLevel
[mangled: org/apache/spark/api/java/JavaRDD<T>.getStorageLevel:()Lorg/apache/spark/storage/StorageLevel;]
JavaRDD<T>.glom ( ) : JavaRDD<java.util.List<T>>
[mangled: org/apache/spark/api/java/JavaRDD<T>.glom:()Lorg/apache/spark/api/java/JavaRDD;]
JavaRDD<T>.groupBy ( function.Function<T,K> f ) : JavaPairRDD<K,Iterable<T>>
[mangled: org/apache/spark/api/java/JavaRDD<T>.groupBy:(Lorg/apache/spark/api/java/function/Function;)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaRDD<T>.groupBy ( function.Function<T,K> f, int numPartitions ) : JavaPairRDD<K,Iterable<T>>
[mangled: org/apache/spark/api/java/JavaRDD<T>.groupBy:(Lorg/apache/spark/api/java/function/Function;I)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaRDD<T>.id ( ) : int
[mangled: org/apache/spark/api/java/JavaRDD<T>.id:()I]
JavaRDD<T>.isCheckpointed ( ) : boolean
[mangled: org/apache/spark/api/java/JavaRDD<T>.isCheckpointed:()Z]
JavaRDD<T>.iterator ( org.apache.spark.Partition split, org.apache.spark.TaskContext taskContext ) : java.util.Iterator<T>
[mangled: org/apache/spark/api/java/JavaRDD<T>.iterator:(Lorg/apache/spark/Partition;Lorg/apache/spark/TaskContext;)Ljava/util/Iterator;]
JavaRDD<T>.keyBy ( function.Function<T,K> f ) : JavaPairRDD<K,T>
[mangled: org/apache/spark/api/java/JavaRDD<T>.keyBy:(Lorg/apache/spark/api/java/function/Function;)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaRDD<T>.map ( function.Function<T,R> f ) : JavaRDD<R>
[mangled: org/apache/spark/api/java/JavaRDD<T>.map:(Lorg/apache/spark/api/java/function/Function;)Lorg/apache/spark/api/java/JavaRDD;]
JavaRDD<T>.mapPartitions ( function.FlatMapFunction<java.util.Iterator<T>,U> f ) : JavaRDD<U>
[mangled: org/apache/spark/api/java/JavaRDD<T>.mapPartitions:(Lorg/apache/spark/api/java/function/FlatMapFunction;)Lorg/apache/spark/api/java/JavaRDD;]
JavaRDD<T>.mapPartitions ( function.FlatMapFunction<java.util.Iterator<T>,U> f, boolean preservesPartitioning ) : JavaRDD<U>
[mangled: org/apache/spark/api/java/JavaRDD<T>.mapPartitions:(Lorg/apache/spark/api/java/function/FlatMapFunction;Z)Lorg/apache/spark/api/java/JavaRDD;]
JavaRDD<T>.mapPartitionsToDouble ( function.DoubleFlatMapFunction<java.util.Iterator<T>> f ) : JavaDoubleRDD
[mangled: org/apache/spark/api/java/JavaRDD<T>.mapPartitionsToDouble:(Lorg/apache/spark/api/java/function/DoubleFlatMapFunction;)Lorg/apache/spark/api/java/JavaDoubleRDD;]
JavaRDD<T>.mapPartitionsToDouble ( function.DoubleFlatMapFunction<java.util.Iterator<T>> f, boolean preservesPartitioning ) : JavaDoubleRDD
[mangled: org/apache/spark/api/java/JavaRDD<T>.mapPartitionsToDouble:(Lorg/apache/spark/api/java/function/DoubleFlatMapFunction;Z)Lorg/apache/spark/api/java/JavaDoubleRDD;]
JavaRDD<T>.mapPartitionsToPair ( function.PairFlatMapFunction<java.util.Iterator<T>,K2,V2> f ) : JavaPairRDD<K2,V2>
[mangled: org/apache/spark/api/java/JavaRDD<T>.mapPartitionsToPair:(Lorg/apache/spark/api/java/function/PairFlatMapFunction;)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaRDD<T>.mapPartitionsToPair ( function.PairFlatMapFunction<java.util.Iterator<T>,K2,V2> f, boolean preservesPartitioning ) : JavaPairRDD<K2,V2>
[mangled: org/apache/spark/api/java/JavaRDD<T>.mapPartitionsToPair:(Lorg/apache/spark/api/java/function/PairFlatMapFunction;Z)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaRDD<T>.mapPartitionsWithIndex ( function.Function2<Integer,java.util.Iterator<T>,java.util.Iterator<R>> f, boolean preservesPartitioning ) : JavaRDD<R>
[mangled: org/apache/spark/api/java/JavaRDD<T>.mapPartitionsWithIndex:(Lorg/apache/spark/api/java/function/Function2;Z)Lorg/apache/spark/api/java/JavaRDD;]
JavaRDD<T>.mapToDouble ( function.DoubleFunction<T> f ) : JavaDoubleRDD
[mangled: org/apache/spark/api/java/JavaRDD<T>.mapToDouble:(Lorg/apache/spark/api/java/function/DoubleFunction;)Lorg/apache/spark/api/java/JavaDoubleRDD;]
JavaRDD<T>.mapToPair ( function.PairFunction<T,K2,V2> f ) : JavaPairRDD<K2,V2>
[mangled: org/apache/spark/api/java/JavaRDD<T>.mapToPair:(Lorg/apache/spark/api/java/function/PairFunction;)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaRDD<T>.max ( java.util.Comparator<T> comp ) : T
[mangled: org/apache/spark/api/java/JavaRDD<T>.max:(Ljava/util/Comparator;)Ljava/lang/Object;]
JavaRDD<T>.min ( java.util.Comparator<T> comp ) : T
[mangled: org/apache/spark/api/java/JavaRDD<T>.min:(Ljava/util/Comparator;)Ljava/lang/Object;]
JavaRDD<T>.name ( ) : String
[mangled: org/apache/spark/api/java/JavaRDD<T>.name:()Ljava/lang/String;]
JavaRDD<T>.pipe ( java.util.List<String> command ) : JavaRDD<String>
[mangled: org/apache/spark/api/java/JavaRDD<T>.pipe:(Ljava/util/List;)Lorg/apache/spark/api/java/JavaRDD;]
JavaRDD<T>.pipe ( java.util.List<String> command, java.util.Map<String,String> env ) : JavaRDD<String>
[mangled: org/apache/spark/api/java/JavaRDD<T>.pipe:(Ljava/util/List;Ljava/util/Map;)Lorg/apache/spark/api/java/JavaRDD;]
JavaRDD<T>.pipe ( String command ) : JavaRDD<String>
[mangled: org/apache/spark/api/java/JavaRDD<T>.pipe:(Ljava/lang/String;)Lorg/apache/spark/api/java/JavaRDD;]
JavaRDD<T>.reduce ( function.Function2<T,T,T> f ) : T
[mangled: org/apache/spark/api/java/JavaRDD<T>.reduce:(Lorg/apache/spark/api/java/function/Function2;)Ljava/lang/Object;]
JavaRDD<T>.saveAsObjectFile ( String path ) : void
[mangled: org/apache/spark/api/java/JavaRDD<T>.saveAsObjectFile:(Ljava/lang/String;)V]
JavaRDD<T>.saveAsTextFile ( String path ) : void
[mangled: org/apache/spark/api/java/JavaRDD<T>.saveAsTextFile:(Ljava/lang/String;)V]
JavaRDD<T>.saveAsTextFile ( String path, Class<? extends org.apache.hadoop.io.compress.CompressionCodec> codec ) : void
[mangled: org/apache/spark/api/java/JavaRDD<T>.saveAsTextFile:(Ljava/lang/String;Ljava/lang/Class;)V]
JavaRDD<T>.splits ( ) : java.util.List<org.apache.spark.Partition>
[mangled: org/apache/spark/api/java/JavaRDD<T>.splits:()Ljava/util/List;]
JavaRDD<T>.take ( int num ) : java.util.List<T>
[mangled: org/apache/spark/api/java/JavaRDD<T>.take:(I)Ljava/util/List;]
JavaRDD<T>.takeOrdered ( int num ) : java.util.List<T>
[mangled: org/apache/spark/api/java/JavaRDD<T>.takeOrdered:(I)Ljava/util/List;]
JavaRDD<T>.takeOrdered ( int num, java.util.Comparator<T> comp ) : java.util.List<T>
[mangled: org/apache/spark/api/java/JavaRDD<T>.takeOrdered:(ILjava/util/Comparator;)Ljava/util/List;]
JavaRDD<T>.takeSample ( boolean withReplacement, int num ) : java.util.List<T>
[mangled: org/apache/spark/api/java/JavaRDD<T>.takeSample:(ZI)Ljava/util/List;]
JavaRDD<T>.takeSample ( boolean withReplacement, int num, long seed ) : java.util.List<T>
[mangled: org/apache/spark/api/java/JavaRDD<T>.takeSample:(ZIJ)Ljava/util/List;]
JavaRDD<T>.toArray ( ) : java.util.List<T>
[mangled: org/apache/spark/api/java/JavaRDD<T>.toArray:()Ljava/util/List;]
JavaRDD<T>.toDebugString ( ) : String
[mangled: org/apache/spark/api/java/JavaRDD<T>.toDebugString:()Ljava/lang/String;]
JavaRDD<T>.toLocalIterator ( ) : java.util.Iterator<T>
[mangled: org/apache/spark/api/java/JavaRDD<T>.toLocalIterator:()Ljava/util/Iterator;]
JavaRDD<T>.top ( int num ) : java.util.List<T>
[mangled: org/apache/spark/api/java/JavaRDD<T>.top:(I)Ljava/util/List;]
JavaRDD<T>.top ( int num, java.util.Comparator<T> comp ) : java.util.List<T>
[mangled: org/apache/spark/api/java/JavaRDD<T>.top:(ILjava/util/Comparator;)Ljava/util/List;]
JavaRDD<T>.zip ( JavaRDDLike<U,?> other ) : JavaPairRDD<T,U>
[mangled: org/apache/spark/api/java/JavaRDD<T>.zip:(Lorg/apache/spark/api/java/JavaRDDLike;)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaRDD<T>.zipPartitions ( JavaRDDLike<U,?> other, function.FlatMapFunction2<java.util.Iterator<T>,java.util.Iterator<U>,V> f ) : JavaRDD<V>
[mangled: org/apache/spark/api/java/JavaRDD<T>.zipPartitions:(Lorg/apache/spark/api/java/JavaRDDLike;Lorg/apache/spark/api/java/function/FlatMapFunction2;)Lorg/apache/spark/api/java/JavaRDD;]
JavaRDD<T>.zipWithIndex ( ) : JavaPairRDD<T,Long>
[mangled: org/apache/spark/api/java/JavaRDD<T>.zipWithIndex:()Lorg/apache/spark/api/java/JavaPairRDD;]
JavaRDD<T>.zipWithUniqueId ( ) : JavaPairRDD<T,Long>
[mangled: org/apache/spark/api/java/JavaRDD<T>.zipWithUniqueId:()Lorg/apache/spark/api/java/JavaPairRDD;]
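A brief usage sketch (hypothetical class name) for one of the JavaRDD methods listed above, compiled against spark-core_2.10-1.0.0:

```java
import java.util.Arrays;

import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.PairFunction;

import scala.Tuple2;

public class MapToPairDemo {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext("local[2]", "mapToPair-demo");
        JavaRDD<String> words = sc.parallelize(Arrays.asList("a", "b", "a"));

        // JavaRDD<T>.mapToPair ( function.PairFunction<T,K2,V2> f ) : JavaPairRDD<K2,V2>
        JavaPairRDD<String, Integer> pairs = words.mapToPair(
                new PairFunction<String, String, Integer>() {
                    public Tuple2<String, Integer> call(String w) {
                        return new Tuple2<String, Integer>(w, 1);
                    }
                });

        System.out.println(pairs.collect());
        sc.stop();
    }
}
```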
spark-core_2.10-1.0.0.jar, RDD<T>.class
package org.apache.spark.rdd
RDD<T>.creationSiteInfo ( ) : org.apache.spark.util.Utils.CallSiteInfo
[mangled: org/apache/spark/rdd/RDD<T>.creationSiteInfo:()Lorg/apache/spark/util/Utils$CallSiteInfo;]
spark-core_2.10-1.0.0.jar, SparkConf.class
package org.apache.spark
SparkConf.SparkConf..settings ( ) : scala.collection.mutable.HashMap<String,String>
[mangled: org/apache/spark/SparkConf.org.apache.spark.SparkConf..settings:()Lscala/collection/mutable/HashMap;]
spark-core_2.10-1.0.0.jar, SparkContext.class
package org.apache.spark
SparkContext.clean ( F f ) : F
[mangled: org/apache/spark/SparkContext.clean:(Ljava/lang/Object;)Ljava/lang/Object;]
SparkContext.getCallSite ( ) : String
[mangled: org/apache/spark/SparkContext.getCallSite:()Ljava/lang/String;]
SparkContext.ui ( ) : ui.SparkUI
[mangled: org/apache/spark/SparkContext.ui:()Lorg/apache/spark/ui/SparkUI;]
spark-core_2.10-1.0.0.jar, TaskContext.class
package org.apache.spark
TaskContext.completed ( ) : boolean
[mangled: org/apache/spark/TaskContext.completed:()Z]
TaskContext.completed_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/TaskContext.completed_.eq:(Z)V]
TaskContext.executeOnCompleteCallbacks ( ) : void
[mangled: org/apache/spark/TaskContext.executeOnCompleteCallbacks:()V]
TaskContext.interrupted ( ) : boolean
[mangled: org/apache/spark/TaskContext.interrupted:()Z]
TaskContext.interrupted_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/TaskContext.interrupted_.eq:(Z)V]
TaskContext.TaskContext ( int stageId, int partitionId, long attemptId, boolean runningLocally, executor.TaskMetrics taskMetrics )
[mangled: org/apache/spark/TaskContext."<init>":(IIJZLorg/apache/spark/executor/TaskMetrics;)V]
spark-sql_2.10-1.0.0.jar, SQLContext.class
package org.apache.spark.sql
SQLContext.binaryToLiteral ( byte[ ] a ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.binaryToLiteral:([B)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.booleanToLiteral ( boolean b ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.booleanToLiteral:(Z)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.byteToLiteral ( byte b ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.byteToLiteral:(B)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.createParquetFile ( String path, boolean allowExisting, org.apache.hadoop.conf.Configuration conf, scala.reflect.api.TypeTags.TypeTag<A> p4 ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.createParquetFile:(Ljava/lang/String;ZLorg/apache/hadoop/conf/Configuration;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.createSchemaRDD ( org.apache.spark.rdd.RDD<A> rdd, scala.reflect.api.TypeTags.TypeTag<A> p2 ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.createSchemaRDD:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.decimalToLiteral ( scala.math.BigDecimal d ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.decimalToLiteral:(Lscala/math/BigDecimal;)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.doubleToLiteral ( double d ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.doubleToLiteral:(D)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.DslAttribute ( catalyst.expressions.AttributeReference a ) : catalyst.dsl.package.ExpressionConversions.DslAttribute
[mangled: org/apache/spark/sql/SQLContext.DslAttribute:(Lorg/apache/spark/sql/catalyst/expressions/AttributeReference;)Lorg/apache/spark/sql/catalyst/dsl/package$ExpressionConversions$DslAttribute;]
SQLContext.DslExpression ( catalyst.expressions.Expression e ) : catalyst.dsl.package.ExpressionConversions.DslExpression
[mangled: org/apache/spark/sql/SQLContext.DslExpression:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/catalyst/dsl/package$ExpressionConversions$DslExpression;]
SQLContext.DslString ( String s ) : catalyst.dsl.package.ExpressionConversions.DslString
[mangled: org/apache/spark/sql/SQLContext.DslString:(Ljava/lang/String;)Lorg/apache/spark/sql/catalyst/dsl/package$ExpressionConversions$DslString;]
SQLContext.DslSymbol ( scala.Symbol sym ) : catalyst.dsl.package.ExpressionConversions.DslSymbol
[mangled: org/apache/spark/sql/SQLContext.DslSymbol:(Lscala/Symbol;)Lorg/apache/spark/sql/catalyst/dsl/package$ExpressionConversions$DslSymbol;]
SQLContext.floatToLiteral ( float f ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.floatToLiteral:(F)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.inferSchema ( org.apache.spark.rdd.RDD<scala.collection.immutable.Map<String,Object>> rdd ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.inferSchema:(Lorg/apache/spark/rdd/RDD;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.intToLiteral ( int i ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.intToLiteral:(I)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.logger ( ) : com.typesafe.scalalogging.slf4j.Logger
[mangled: org/apache/spark/sql/SQLContext.logger:()Lcom/typesafe/scalalogging/slf4j/Logger;]
SQLContext.logicalPlanToSparkQuery ( catalyst.plans.logical.LogicalPlan plan ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.logicalPlanToSparkQuery:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.longToLiteral ( long l ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.longToLiteral:(J)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.optimizer ( ) : catalyst.optimizer.Optimizer.
[mangled: org/apache/spark/sql/SQLContext.optimizer:()Lorg/apache/spark/sql/catalyst/optimizer/Optimizer$;]
SQLContext.parquetFile ( String path ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.parquetFile:(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.parser ( ) : catalyst.SqlParser
[mangled: org/apache/spark/sql/SQLContext.parser:()Lorg/apache/spark/sql/catalyst/SqlParser;]
SQLContext.registerRDDAsTable ( SchemaRDD rdd, String tableName ) : void
[mangled: org/apache/spark/sql/SQLContext.registerRDDAsTable:(Lorg/apache/spark/sql/SchemaRDD;Ljava/lang/String;)V]
SQLContext.shortToLiteral ( short s ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.shortToLiteral:(S)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.sql ( String sqlText ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.sql:(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.stringToLiteral ( String s ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.stringToLiteral:(Ljava/lang/String;)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.symbolToUnresolvedAttribute ( scala.Symbol s ) : catalyst.analysis.UnresolvedAttribute
[mangled: org/apache/spark/sql/SQLContext.symbolToUnresolvedAttribute:(Lscala/Symbol;)Lorg/apache/spark/sql/catalyst/analysis/UnresolvedAttribute;]
SQLContext.table ( String tableName ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.table:(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.timestampToLiteral ( java.sql.Timestamp t ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.timestampToLiteral:(Ljava/sql/Timestamp;)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
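A sketch of the 1.0.0-era SQLContext workflow using three of the methods listed above (parquetFile, registerRDDAsTable, sql); the class name, file path, and column names are made up for the example:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.SchemaRDD;

public class SqlDemo {
    public static void main(String[] args) {
        SparkContext sc = new SparkContext(
                new SparkConf().setMaster("local[2]").setAppName("sql-demo"));
        SQLContext sqlCtx = new SQLContext(sc);

        // SQLContext.parquetFile ( String ) : SchemaRDD
        SchemaRDD people = sqlCtx.parquetFile("people.parquet");
        // SQLContext.registerRDDAsTable ( SchemaRDD, String ) : void
        sqlCtx.registerRDDAsTable(people, "people");
        // SQLContext.sql ( String ) : SchemaRDD
        SchemaRDD adults = sqlCtx.sql("SELECT name FROM people WHERE age >= 18");
        System.out.println(adults.count());

        sc.stop();
    }
}
```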
spark-streaming_2.10-1.0.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.getNewReceiverStreamId ( ) : int
[mangled: org/apache/spark/streaming/StreamingContext.getNewReceiverStreamId:()I]
StreamingContext.state ( ) : scala.Enumeration.Value
[mangled: org/apache/spark/streaming/StreamingContext.state:()Lscala/Enumeration$Value;]
StreamingContext.state_.eq ( scala.Enumeration.Value p1 ) : void
[mangled: org/apache/spark/streaming/StreamingContext.state_.eq:(Lscala/Enumeration$Value;)V]
StreamingContext.StreamingContextState ( ) : StreamingContext.StreamingContextState.
[mangled: org/apache/spark/streaming/StreamingContext.StreamingContextState:()Lorg/apache/spark/streaming/StreamingContext$StreamingContextState$;]
StreamingContext.uiTab ( ) : ui.StreamingTab
[mangled: org/apache/spark/streaming/StreamingContext.uiTab:()Lorg/apache/spark/streaming/ui/StreamingTab;]
Removed Methods (385)
spark-core_2.10-1.4.0.jar, JavaRDD<T>.class
package org.apache.spark.api.java
JavaRDD<T>.randomSplit ( double[ ] weights ) : JavaRDD<T>[ ]
[mangled: org/apache/spark/api/java/JavaRDD<T>.randomSplit:([D)[Lorg/apache/spark/api/java/JavaRDD;]
JavaRDD<T>.randomSplit ( double[ ] weights, long seed ) : JavaRDD<T>[ ]
[mangled: org/apache/spark/api/java/JavaRDD<T>.randomSplit:([DJ)[Lorg/apache/spark/api/java/JavaRDD;]
JavaRDD<T>.sortBy ( function.Function<T,S> f, boolean ascending, int numPartitions ) : JavaRDD<T>
[mangled: org/apache/spark/api/java/JavaRDD<T>.sortBy:(Lorg/apache/spark/api/java/function/Function;ZI)Lorg/apache/spark/api/java/JavaRDD;]
spark-core_2.10-1.4.0.jar, JavaSparkContext.class
package org.apache.spark.api.java
JavaSparkContext.accumulable ( T initialValue, String name, org.apache.spark.AccumulableParam<T,R> param ) : org.apache.spark.Accumulable<T,R>
[mangled: org/apache/spark/api/java/JavaSparkContext.accumulable:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulableParam;)Lorg/apache/spark/Accumulable;]
JavaSparkContext.accumulator ( double initialValue, String name ) : org.apache.spark.Accumulator<Double>
[mangled: org/apache/spark/api/java/JavaSparkContext.accumulator:(DLjava/lang/String;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.accumulator ( int initialValue, String name ) : org.apache.spark.Accumulator<Integer>
[mangled: org/apache/spark/api/java/JavaSparkContext.accumulator:(ILjava/lang/String;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.accumulator ( T initialValue, String name, org.apache.spark.AccumulatorParam<T> accumulatorParam ) : org.apache.spark.Accumulator<T>
[mangled: org/apache/spark/api/java/JavaSparkContext.accumulator:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulatorParam;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.binaryFiles ( String path ) : JavaPairRDD<String,org.apache.spark.input.PortableDataStream>
[mangled: org/apache/spark/api/java/JavaSparkContext.binaryFiles:(Ljava/lang/String;)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaSparkContext.binaryFiles ( String path, int minPartitions ) : JavaPairRDD<String,org.apache.spark.input.PortableDataStream>
[mangled: org/apache/spark/api/java/JavaSparkContext.binaryFiles:(Ljava/lang/String;I)Lorg/apache/spark/api/java/JavaPairRDD;]
JavaSparkContext.binaryRecords ( String path, int recordLength ) : JavaRDD<byte[ ]>
[mangled: org/apache/spark/api/java/JavaSparkContext.binaryRecords:(Ljava/lang/String;I)Lorg/apache/spark/api/java/JavaRDD;]
JavaSparkContext.close ( ) : void
[mangled: org/apache/spark/api/java/JavaSparkContext.close:()V]
JavaSparkContext.doubleAccumulator ( double initialValue, String name ) : org.apache.spark.Accumulator<Double>
[mangled: org/apache/spark/api/java/JavaSparkContext.doubleAccumulator:(DLjava/lang/String;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.emptyRDD ( ) : JavaRDD<T>
[mangled: org/apache/spark/api/java/JavaSparkContext.emptyRDD:()Lorg/apache/spark/api/java/JavaRDD;]
JavaSparkContext.intAccumulator ( int initialValue, String name ) : org.apache.spark.Accumulator<Integer>
[mangled: org/apache/spark/api/java/JavaSparkContext.intAccumulator:(ILjava/lang/String;)Lorg/apache/spark/Accumulator;]
JavaSparkContext.setLogLevel ( String logLevel ) : void
[mangled: org/apache/spark/api/java/JavaSparkContext.setLogLevel:(Ljava/lang/String;)V]
JavaSparkContext.statusTracker ( ) : JavaSparkStatusTracker
[mangled: org/apache/spark/api/java/JavaSparkContext.statusTracker:()Lorg/apache/spark/api/java/JavaSparkStatusTracker;]
JavaSparkContext.version ( ) : String
[mangled: org/apache/spark/api/java/JavaSparkContext.version:()Ljava/lang/String;]
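Note that only the named-accumulator overloads appear in the removed list above, which suggests the unnamed overloads are present on both sides; a portable client can stick to those. A sketch (hypothetical class name) under that assumption:

```java
import org.apache.spark.Accumulator;
import org.apache.spark.api.java.JavaSparkContext;

public class PortableAccumulators {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext("local[2]", "acc-demo");

        // accumulator(int, String) is 1.4.0-only per the list above; the
        // unnamed accumulator(int) overload is not listed as removed, so it
        // is presumably callable against either jar.
        Accumulator<Integer> counter = sc.accumulator(0);
        counter.add(1);
        System.out.println(counter.value());

        sc.stop();
    }
}
```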
spark-core_2.10-1.4.0.jar, Logging.class
package org.apache.spark
Logging.logName ( ) [abstract] : String
[mangled: org/apache/spark/Logging.logName:()Ljava/lang/String;]
spark-core_2.10-1.4.0.jar, RDD<T>.class
package org.apache.spark.rdd
RDD<T>.countApproxDistinct ( int p, int sp ) : long
[mangled: org/apache/spark/rdd/RDD<T>.countApproxDistinct:(II)J]
RDD<T>.creationSite ( ) : org.apache.spark.util.CallSite
[mangled: org/apache/spark/rdd/RDD<T>.creationSite:()Lorg/apache/spark/util/CallSite;]
RDD<T>.doubleRDDToDoubleRDDFunctions ( RDD<Object> p1 ) [static] : DoubleRDDFunctions
[mangled: org/apache/spark/rdd/RDD<T>.doubleRDDToDoubleRDDFunctions:(Lorg/apache/spark/rdd/RDD;)Lorg/apache/spark/rdd/DoubleRDDFunctions;]
RDD<T>.isEmpty ( ) : boolean
[mangled: org/apache/spark/rdd/RDD<T>.isEmpty:()Z]
RDD<T>.logName ( ) : String
[mangled: org/apache/spark/rdd/RDD<T>.logName:()Ljava/lang/String;]
RDD<T>.numericRDDToDoubleRDDFunctions ( RDD<T> p1, scala.math.Numeric<T> p2 ) [static] : DoubleRDDFunctions
[mangled: org/apache/spark/rdd/RDD<T>.numericRDDToDoubleRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/math/Numeric;)Lorg/apache/spark/rdd/DoubleRDDFunctions;]
RDD<T>.RDD..doCheckpointCalled ( ) : boolean
[mangled: org/apache/spark/rdd/RDD<T>.org.apache.spark.rdd.RDD..doCheckpointCalled:()Z]
RDD<T>.RDD..doCheckpointCalled_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/rdd/RDD<T>.org.apache.spark.rdd.RDD..doCheckpointCalled_.eq:(Z)V]
RDD<T>.RDD..sc ( ) : org.apache.spark.SparkContext
[mangled: org/apache/spark/rdd/RDD<T>.org.apache.spark.rdd.RDD..sc:()Lorg/apache/spark/SparkContext;]
RDD<T>.parent ( int j, scala.reflect.ClassTag<U> p2 ) : RDD<U>
[mangled: org/apache/spark/rdd/RDD<T>.parent:(ILscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
RDD<T>.randomSampleWithRange ( double lb, double ub, long seed ) : RDD<T>
[mangled: org/apache/spark/rdd/RDD<T>.randomSampleWithRange:(DDJ)Lorg/apache/spark/rdd/RDD;]
RDD<T>.rddToAsyncRDDActions ( RDD<T> p1, scala.reflect.ClassTag<T> p2 ) [static] : AsyncRDDActions<T>
[mangled: org/apache/spark/rdd/RDD<T>.rddToAsyncRDDActions:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/AsyncRDDActions;]
RDD<T>.rddToOrderedRDDFunctions ( RDD<scala.Tuple2<K,V>> p1, scala.math.Ordering<K> p2, scala.reflect.ClassTag<K> p3, scala.reflect.ClassTag<V> p4 ) [static] : OrderedRDDFunctions<K,V,scala.Tuple2<K,V>>
[mangled: org/apache/spark/rdd/RDD<T>.rddToOrderedRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/math/Ordering;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/OrderedRDDFunctions;]
RDD<T>.rddToPairRDDFunctions ( RDD<scala.Tuple2<K,V>> p1, scala.reflect.ClassTag<K> p2, scala.reflect.ClassTag<V> p3, scala.math.Ordering<K> p4 ) [static] : PairRDDFunctions<K,V>
[mangled: org/apache/spark/rdd/RDD<T>.rddToPairRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/rdd/PairRDDFunctions;]
RDD<T>.rddToSequenceFileRDDFunctions ( RDD<scala.Tuple2<K,V>> p1, scala.reflect.ClassTag<K> p2, scala.reflect.ClassTag<V> p3, org.apache.spark.WritableFactory<K> p4, org.apache.spark.WritableFactory<V> p5 ) [static] : SequenceFileRDDFunctions<K,V>
[mangled: org/apache/spark/rdd/RDD<T>.rddToSequenceFileRDDFunctions:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lorg/apache/spark/WritableFactory;Lorg/apache/spark/WritableFactory;)Lorg/apache/spark/rdd/SequenceFileRDDFunctions;]
RDD<T>.retag ( Class<T> cls ) : RDD<T>
[mangled: org/apache/spark/rdd/RDD<T>.retag:(Ljava/lang/Class;)Lorg/apache/spark/rdd/RDD;]
RDD<T>.retag ( scala.reflect.ClassTag<T> classTag ) : RDD<T>
[mangled: org/apache/spark/rdd/RDD<T>.retag:(Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
RDD<T>.scope ( ) : scala.Option<RDDOperationScope>
[mangled: org/apache/spark/rdd/RDD<T>.scope:()Lscala/Option;]
RDD<T>.sortBy ( scala.Function1<T,K> f, boolean ascending, int numPartitions, scala.math.Ordering<K> ord, scala.reflect.ClassTag<K> ctag ) : RDD<T>
[mangled: org/apache/spark/rdd/RDD<T>.sortBy:(Lscala/Function1;ZILscala/math/Ordering;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
RDD<T>.treeAggregate ( U zeroValue, scala.Function2<U,T,U> seqOp, scala.Function2<U,U,U> combOp, int depth, scala.reflect.ClassTag<U> p5 ) : U
[mangled: org/apache/spark/rdd/RDD<T>.treeAggregate:(Ljava/lang/Object;Lscala/Function2;Lscala/Function2;ILscala/reflect/ClassTag;)Ljava/lang/Object;]
RDD<T>.treeReduce ( scala.Function2<T,T,T> f, int depth ) : T
[mangled: org/apache/spark/rdd/RDD<T>.treeReduce:(Lscala/Function2;I)Ljava/lang/Object;]
RDD<T>.withScope ( scala.Function0<U> body ) : U
[mangled: org/apache/spark/rdd/RDD<T>.withScope:(Lscala/Function0;)Ljava/lang/Object;]
spark-core_2.10-1.4.0.jar, SparkConf.class
package org.apache.spark
SparkConf.getAppId ( ) : String
[mangled: org/apache/spark/SparkConf.getAppId:()Ljava/lang/String;]
SparkConf.getDeprecatedConfig ( String p1, SparkConf p2 ) [static] : scala.Option<String>
[mangled: org/apache/spark/SparkConf.getDeprecatedConfig:(Ljava/lang/String;Lorg/apache/spark/SparkConf;)Lscala/Option;]
SparkConf.getenv ( String name ) : String
[mangled: org/apache/spark/SparkConf.getenv:(Ljava/lang/String;)Ljava/lang/String;]
SparkConf.getSizeAsBytes ( String key ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsBytes:(Ljava/lang/String;)J]
SparkConf.getSizeAsBytes ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsBytes:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getSizeAsGb ( String key ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsGb:(Ljava/lang/String;)J]
SparkConf.getSizeAsGb ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsGb:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getSizeAsKb ( String key ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsKb:(Ljava/lang/String;)J]
SparkConf.getSizeAsKb ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsKb:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getSizeAsMb ( String key ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsMb:(Ljava/lang/String;)J]
SparkConf.getSizeAsMb ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getSizeAsMb:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getTimeAsMs ( String key ) : long
[mangled: org/apache/spark/SparkConf.getTimeAsMs:(Ljava/lang/String;)J]
SparkConf.getTimeAsMs ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getTimeAsMs:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.getTimeAsSeconds ( String key ) : long
[mangled: org/apache/spark/SparkConf.getTimeAsSeconds:(Ljava/lang/String;)J]
SparkConf.getTimeAsSeconds ( String key, String defaultValue ) : long
[mangled: org/apache/spark/SparkConf.getTimeAsSeconds:(Ljava/lang/String;Ljava/lang/String;)J]
SparkConf.isAkkaConf ( String p1 ) [static] : boolean
[mangled: org/apache/spark/SparkConf.isAkkaConf:(Ljava/lang/String;)Z]
SparkConf.isExecutorStartupConf ( String p1 ) [static] : boolean
[mangled: org/apache/spark/SparkConf.isExecutorStartupConf:(Ljava/lang/String;)Z]
SparkConf.isSparkPortConf ( String p1 ) [static] : boolean
[mangled: org/apache/spark/SparkConf.isSparkPortConf:(Ljava/lang/String;)Z]
SparkConf.logDeprecationWarning ( String p1 ) [static] : void
[mangled: org/apache/spark/SparkConf.logDeprecationWarning:(Ljava/lang/String;)V]
SparkConf.logName ( ) : String
[mangled: org/apache/spark/SparkConf.logName:()Ljava/lang/String;]
SparkConf.registerKryoClasses ( Class<?>[ ] classes ) : SparkConf
[mangled: org/apache/spark/SparkConf.registerKryoClasses:([Ljava/lang/Class;)Lorg/apache/spark/SparkConf;]
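For configuration getters like the size/time variants above, a client that must run against both jars cannot call the 1.4.0-only methods directly. A minimal sketch (hypothetical class and method names) that probes for one of them reflectively and falls back on 1.0.0:

```java
import org.apache.spark.SparkConf;

public class ConfCompat {
    // SparkConf.getTimeAsSeconds(String,String) exists only on the 1.4.0 side
    // per the list above; probe for it reflectively and fall back to a plain
    // parse (value assumed to be a bare number of seconds) on 1.0.0.
    static long timeoutSeconds(SparkConf conf, String key, String dflt) {
        try {
            return (Long) SparkConf.class
                    .getMethod("getTimeAsSeconds", String.class, String.class)
                    .invoke(conf, key, dflt);
        } catch (ReflectiveOperationException e) {
            return Long.parseLong(conf.get(key, dflt));
        }
    }
}
```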
spark-core_2.10-1.4.0.jar, SparkContext.class
package org.apache.spark
SparkContext.accumulable ( R initialValue, String name, AccumulableParam<R,T> param ) : Accumulable<R,T>
[mangled: org/apache/spark/SparkContext.accumulable:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulableParam;)Lorg/apache/spark/Accumulable;]
SparkContext.accumulator ( T initialValue, String name, AccumulatorParam<T> param ) : Accumulator<T>
[mangled: org/apache/spark/SparkContext.accumulator:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulatorParam;)Lorg/apache/spark/Accumulator;]
SparkContext.addFile ( String path, boolean recursive ) : void
[mangled: org/apache/spark/SparkContext.addFile:(Ljava/lang/String;Z)V]
SparkContext.applicationAttemptId ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.applicationAttemptId:()Lscala/Option;]
SparkContext.applicationId ( ) : String
[mangled: org/apache/spark/SparkContext.applicationId:()Ljava/lang/String;]
SparkContext.binaryFiles ( String path, int minPartitions ) : rdd.RDD<scala.Tuple2<String,input.PortableDataStream>>
[mangled: org/apache/spark/SparkContext.binaryFiles:(Ljava/lang/String;I)Lorg/apache/spark/rdd/RDD;]
SparkContext.binaryRecords ( String path, int recordLength, org.apache.hadoop.conf.Configuration conf ) : rdd.RDD<byte[ ]>
[mangled: org/apache/spark/SparkContext.binaryRecords:(Ljava/lang/String;ILorg/apache/hadoop/conf/Configuration;)Lorg/apache/spark/rdd/RDD;]
SparkContext.clean ( F f, boolean checkSerializable ) : F
[mangled: org/apache/spark/SparkContext.clean:(Ljava/lang/Object;Z)Ljava/lang/Object;]
SparkContext.createSparkEnv ( SparkConf conf, boolean isLocal, scheduler.LiveListenerBus listenerBus ) : SparkEnv
[mangled: org/apache/spark/SparkContext.createSparkEnv:(Lorg/apache/spark/SparkConf;ZLorg/apache/spark/scheduler/LiveListenerBus;)Lorg/apache/spark/SparkEnv;]
SparkContext.eventLogCodec ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.eventLogCodec:()Lscala/Option;]
SparkContext.eventLogDir ( ) : scala.Option<java.net.URI>
[mangled: org/apache/spark/SparkContext.eventLogDir:()Lscala/Option;]
SparkContext.executorAllocationManager ( ) : scala.Option<ExecutorAllocationManager>
[mangled: org/apache/spark/SparkContext.executorAllocationManager:()Lscala/Option;]
SparkContext.externalBlockStoreFolderName ( ) : String
[mangled: org/apache/spark/SparkContext.externalBlockStoreFolderName:()Ljava/lang/String;]
SparkContext.getCallSite ( ) : util.CallSite
[mangled: org/apache/spark/SparkContext.getCallSite:()Lorg/apache/spark/util/CallSite;]
SparkContext.getExecutorThreadDump ( String executorId ) : scala.Option<util.ThreadStackTrace[ ]>
[mangled: org/apache/spark/SparkContext.getExecutorThreadDump:(Ljava/lang/String;)Lscala/Option;]
SparkContext.getOrCreate ( ) [static] : SparkContext
[mangled: org/apache/spark/SparkContext.getOrCreate:()Lorg/apache/spark/SparkContext;]
SparkContext.getOrCreate ( SparkConf p1 ) [static] : SparkContext
[mangled: org/apache/spark/SparkContext.getOrCreate:(Lorg/apache/spark/SparkConf;)Lorg/apache/spark/SparkContext;]
SparkContext.isEventLogEnabled ( ) : boolean
[mangled: org/apache/spark/SparkContext.isEventLogEnabled:()Z]
SparkContext.jobProgressListener ( ) : ui.jobs.JobProgressListener
[mangled: org/apache/spark/SparkContext.jobProgressListener:()Lorg/apache/spark/ui/jobs/JobProgressListener;]
SparkContext.killExecutor ( String executorId ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutor:(Ljava/lang/String;)Z]
SparkContext.killExecutors ( scala.collection.Seq<String> executorIds ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutors:(Lscala/collection/Seq;)Z]
SparkContext.logName ( ) : String
[mangled: org/apache/spark/SparkContext.logName:()Ljava/lang/String;]
SparkContext.metricsSystem ( ) : metrics.MetricsSystem
[mangled: org/apache/spark/SparkContext.metricsSystem:()Lorg/apache/spark/metrics/MetricsSystem;]
SparkContext.SparkContext.._conf ( ) : SparkConf
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._conf:()Lorg/apache/spark/SparkConf;]
SparkContext.SparkContext.._env ( ) : SparkEnv
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext.._env:()Lorg/apache/spark/SparkEnv;]
SparkContext.SparkContext..assertNotStopped ( ) : void
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext..assertNotStopped:()V]
SparkContext.SparkContext..creationSite ( ) : util.CallSite
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext..creationSite:()Lorg/apache/spark/util/CallSite;]
SparkContext.progressBar ( ) : scala.Option<ui.ConsoleProgressBar>
[mangled: org/apache/spark/SparkContext.progressBar:()Lscala/Option;]
SparkContext.range ( long start, long end, long step, int numSlices ) : rdd.RDD<Object>
[mangled: org/apache/spark/SparkContext.range:(JJJI)Lorg/apache/spark/rdd/RDD;]
SparkContext.requestExecutors ( int numAdditionalExecutors ) : boolean
[mangled: org/apache/spark/SparkContext.requestExecutors:(I)Z]
SparkContext.requestTotalExecutors ( int numExecutors ) : boolean
[mangled: org/apache/spark/SparkContext.requestTotalExecutors:(I)Z]
SparkContext.schedulerBackend ( ) : scheduler.SchedulerBackend
[mangled: org/apache/spark/SparkContext.schedulerBackend:()Lorg/apache/spark/scheduler/SchedulerBackend;]
SparkContext.schedulerBackend_.eq ( scheduler.SchedulerBackend sb ) : void
[mangled: org/apache/spark/SparkContext.schedulerBackend_.eq:(Lorg/apache/spark/scheduler/SchedulerBackend;)V]
SparkContext.setCallSite ( util.CallSite callSite ) : void
[mangled: org/apache/spark/SparkContext.setCallSite:(Lorg/apache/spark/util/CallSite;)V]
SparkContext.setLogLevel ( String logLevel ) : void
[mangled: org/apache/spark/SparkContext.setLogLevel:(Ljava/lang/String;)V]
SparkContext.statusTracker ( ) : SparkStatusTracker
[mangled: org/apache/spark/SparkContext.statusTracker:()Lorg/apache/spark/SparkStatusTracker;]
SparkContext.supportDynamicAllocation ( ) : boolean
[mangled: org/apache/spark/SparkContext.supportDynamicAllocation:()Z]
SparkContext.ui ( ) : scala.Option<ui.SparkUI>
[mangled: org/apache/spark/SparkContext.ui:()Lscala/Option;]
SparkContext.withScope ( scala.Function0<U> body ) : U
[mangled: org/apache/spark/SparkContext.withScope:(Lscala/Function0;)Ljava/lang/Object;]
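getCallSite illustrates a subtler incompatibility: both versions declare a method with that name and parameter list, but the return type differs (util.CallSite in 1.4.0, String in 1.0.0 per the Added Methods list), and the return type is part of the JVM method descriptor. A sketch (hypothetical class name) compiled against 1.4.0:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;

public class CallSiteDemo {
    public static void main(String[] args) {
        SparkContext sc = new SparkContext(
                new SparkConf().setMaster("local[2]").setAppName("callsite-demo"));

        // Against 1.4.0 this links to getCallSite()Lorg/apache/spark/util/CallSite;
        // 1.0.0 only provides getCallSite()Ljava/lang/String;, so the
        // 1.4.0-compiled call site fails on 1.0.0 with NoSuchMethodError even
        // though a method named getCallSite() exists there.
        System.out.println(sc.getCallSite());

        sc.stop();
    }
}
```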
spark-core_2.10-1.4.0.jar, StorageLevel.class
package org.apache.spark.storage
StorageLevel.fromString ( String p1 ) [static] : StorageLevel
[mangled: org/apache/spark/storage/StorageLevel.fromString:(Ljava/lang/String;)Lorg/apache/spark/storage/StorageLevel;]
StorageLevel.StorageLevel.._deserialized_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/storage/StorageLevel.org.apache.spark.storage.StorageLevel.._deserialized_.eq:(Z)V]
StorageLevel.StorageLevel.._replication ( ) : int
[mangled: org/apache/spark/storage/StorageLevel.org.apache.spark.storage.StorageLevel.._replication:()I]
StorageLevel.StorageLevel.._replication_.eq ( int p1 ) : void
[mangled: org/apache/spark/storage/StorageLevel.org.apache.spark.storage.StorageLevel.._replication_.eq:(I)V]
StorageLevel.StorageLevel.._useDisk_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/storage/StorageLevel.org.apache.spark.storage.StorageLevel.._useDisk_.eq:(Z)V]
StorageLevel.StorageLevel.._useMemory_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/storage/StorageLevel.org.apache.spark.storage.StorageLevel.._useMemory_.eq:(Z)V]
StorageLevel.StorageLevel.._useOffHeap_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/storage/StorageLevel.org.apache.spark.storage.StorageLevel.._useOffHeap_.eq:(Z)V]
spark-core_2.10-1.4.0.jar, TaskContext.class
package org.apache.spark
TaskContext.addTaskCompletionListener ( util.TaskCompletionListener p1 ) [abstract] : TaskContext
[mangled: org/apache/spark/TaskContext.addTaskCompletionListener:(Lorg/apache/spark/util/TaskCompletionListener;)Lorg/apache/spark/TaskContext;]
TaskContext.addTaskCompletionListener ( scala.Function1<TaskContext,scala.runtime.BoxedUnit> p1 ) [abstract] : TaskContext
[mangled: org/apache/spark/TaskContext.addTaskCompletionListener:(Lscala/Function1;)Lorg/apache/spark/TaskContext;]
TaskContext.attemptNumber ( ) [abstract] : int
[mangled: org/apache/spark/TaskContext.attemptNumber:()I]
TaskContext.get ( ) [static] : TaskContext
[mangled: org/apache/spark/TaskContext.get:()Lorg/apache/spark/TaskContext;]
TaskContext.isCompleted ( ) [abstract] : boolean
[mangled: org/apache/spark/TaskContext.isCompleted:()Z]
TaskContext.isInterrupted ( ) [abstract] : boolean
[mangled: org/apache/spark/TaskContext.isInterrupted:()Z]
TaskContext.isRunningLocally ( ) [abstract] : boolean
[mangled: org/apache/spark/TaskContext.isRunningLocally:()Z]
TaskContext.taskAttemptId ( ) [abstract] : long
[mangled: org/apache/spark/TaskContext.taskAttemptId:()J]
TaskContext.TaskContext ( )
[mangled: org/apache/spark/TaskContext."<init>":()V]
TaskContext.taskMemoryManager ( ) [abstract] : unsafe.memory.TaskMemoryManager
[mangled: org/apache/spark/TaskContext.taskMemoryManager:()Lorg/apache/spark/unsafe/memory/TaskMemoryManager;]
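The TaskContext entries show that the 1.4.0 abstract API (static get(), isCompleted(), taskAttemptId(), ...) has no counterpart in 1.0.0, which instead exposes the concrete class listed under Added Methods. A sketch (hypothetical helper, meant to run inside a task) compiled against 1.4.0:

```java
import org.apache.spark.TaskContext;

public class TaskContextProbe {
    // TaskContext.get() and taskAttemptId() are both in the removed list, so
    // this helper fails on 1.0.0 with NoSuchMethodError at the static get()
    // call before taskAttemptId() is even reached.
    static long currentAttemptId() {
        TaskContext tc = TaskContext.get();
        return tc.taskAttemptId();
    }
}
```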
spark-sql_2.10-1.4.0.jar, BaseRelation.class
package org.apache.spark.sql.sources
BaseRelation.BaseRelation ( )
[mangled: org/apache/spark/sql/sources/BaseRelation."<init>":()V]
BaseRelation.needConversion ( ) : boolean
[mangled: org/apache/spark/sql/sources/BaseRelation.needConversion:()Z]
BaseRelation.schema ( ) [abstract] : org.apache.spark.sql.types.StructType
[mangled: org/apache/spark/sql/sources/BaseRelation.schema:()Lorg/apache/spark/sql/types/StructType;]
BaseRelation.sizeInBytes ( ) : long
[mangled: org/apache/spark/sql/sources/BaseRelation.sizeInBytes:()J]
BaseRelation.sqlContext ( ) [abstract] : org.apache.spark.sql.SQLContext
[mangled: org/apache/spark/sql/sources/BaseRelation.sqlContext:()Lorg/apache/spark/sql/SQLContext;]
spark-sql_2.10-1.4.0.jar, CreatableRelationProvider.class
package org.apache.spark.sql.sources
CreatableRelationProvider.createRelation ( org.apache.spark.sql.SQLContext p1, org.apache.spark.sql.SaveMode p2, scala.collection.immutable.Map<String,String> p3, org.apache.spark.sql.DataFrame p4 ) [abstract] : BaseRelation
[mangled: org/apache/spark/sql/sources/CreatableRelationProvider.createRelation:(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/sources/BaseRelation;]
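BaseRelation and CreatableRelationProvider belong to the 1.4.0 data sources API, which has no counterpart in 1.0.0, so here the failure mode is a missing class rather than a missing method. A minimal sketch (hypothetical class name) of a relation implementation under that assumption:

```java
import java.util.Arrays;

import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.sources.BaseRelation;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

// Compiles against spark-sql_2.10-1.4.0. On 1.0.0 neither
// org.apache.spark.sql.sources nor org.apache.spark.sql.types exists, so
// merely loading this class there fails with NoClassDefFoundError.
public class MinimalRelation extends BaseRelation {
    private final SQLContext ctx;

    public MinimalRelation(SQLContext ctx) { this.ctx = ctx; }

    @Override
    public SQLContext sqlContext() { return ctx; }

    @Override
    public StructType schema() {
        return DataTypes.createStructType(Arrays.asList(
                DataTypes.createStructField("value", DataTypes.StringType, false)));
    }
}
```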
spark-sql_2.10-1.4.0.jar, DataFrame.class
package org.apache.spark.sql
DataFrame.agg ( java.util.Map<String,String> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( Column expr, Column... exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lorg/apache/spark/sql/Column;[Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( Column expr, scala.collection.Seq<Column> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lorg/apache/spark/sql/Column;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( scala.collection.immutable.Map<String,String> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( scala.Tuple2<String,String> aggExpr, scala.collection.Seq<scala.Tuple2<String,String>> aggExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lscala/Tuple2;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.apply ( String colName ) : Column
[mangled: org/apache/spark/sql/DataFrame.apply:(Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
DataFrame.as ( scala.Symbol alias ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.as:(Lscala/Symbol;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.as ( String alias ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.as:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.cache ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.cache:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.cache ( ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.cache:()Lorg/apache/spark/sql/RDDApi;]
DataFrame.coalesce ( int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.coalesce:(I)Lorg/apache/spark/sql/DataFrame;]
DataFrame.col ( String colName ) : Column
[mangled: org/apache/spark/sql/DataFrame.col:(Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
DataFrame.collect ( ) : Object
[mangled: org/apache/spark/sql/DataFrame.collect:()Ljava/lang/Object;]
DataFrame.collect ( ) : Row[ ]
[mangled: org/apache/spark/sql/DataFrame.collect:()[Lorg/apache/spark/sql/Row;]
DataFrame.collectAsList ( ) : java.util.List<Row>
[mangled: org/apache/spark/sql/DataFrame.collectAsList:()Ljava/util/List;]
DataFrame.columns ( ) : String[ ]
[mangled: org/apache/spark/sql/DataFrame.columns:()[Ljava/lang/String;]
DataFrame.count ( ) : long
[mangled: org/apache/spark/sql/DataFrame.count:()J]
DataFrame.cube ( Column... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.cube ( scala.collection.Seq<Column> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:(Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.cube ( String col1, scala.collection.Seq<String> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.cube ( String col1, String... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.cube:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.DataFrame ( SQLContext sqlContext, catalyst.plans.logical.LogicalPlan logicalPlan )
[mangled: org/apache/spark/sql/DataFrame."<init>":(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)V]
DataFrame.DataFrame ( SQLContext sqlContext, SQLContext.QueryExecution queryExecution )
[mangled: org/apache/spark/sql/DataFrame."<init>":(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/SQLContext$QueryExecution;)V]
DataFrame.describe ( scala.collection.Seq<String> cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.describe:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.describe ( String... cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.describe:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.distinct ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.distinct:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.drop ( String colName ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.drop:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.dropDuplicates ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.dropDuplicates:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.dropDuplicates ( scala.collection.Seq<String> colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.dropDuplicates:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.dropDuplicates ( String[ ] colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.dropDuplicates:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.dtypes ( ) : scala.Tuple2<String,String>[ ]
[mangled: org/apache/spark/sql/DataFrame.dtypes:()[Lscala/Tuple2;]
DataFrame.except ( DataFrame other ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.except:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.explain ( ) : void
[mangled: org/apache/spark/sql/DataFrame.explain:()V]
DataFrame.explain ( boolean extended ) : void
[mangled: org/apache/spark/sql/DataFrame.explain:(Z)V]
DataFrame.explode ( scala.collection.Seq<Column> input, scala.Function1<Row,scala.collection.TraversableOnce<A>> f, scala.reflect.api.TypeTags.TypeTag<A> p3 ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.explode:(Lscala/collection/Seq;Lscala/Function1;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.explode ( String inputColumn, String outputColumn, scala.Function1<A,scala.collection.TraversableOnce<B>> f, scala.reflect.api.TypeTags.TypeTag<B> p4 ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.explode:(Ljava/lang/String;Ljava/lang/String;Lscala/Function1;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.filter ( Column condition ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.filter:(Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.filter ( String conditionExpr ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.filter:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.first ( ) : Object
[mangled: org/apache/spark/sql/DataFrame.first:()Ljava/lang/Object;]
DataFrame.first ( ) : Row
[mangled: org/apache/spark/sql/DataFrame.first:()Lorg/apache/spark/sql/Row;]
DataFrame.flatMap ( scala.Function1<Row,scala.collection.TraversableOnce<R>> f, scala.reflect.ClassTag<R> p2 ) : org.apache.spark.rdd.RDD<R>
[mangled: org/apache/spark/sql/DataFrame.flatMap:(Lscala/Function1;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
DataFrame.foreach ( scala.Function1<Row,scala.runtime.BoxedUnit> f ) : void
[mangled: org/apache/spark/sql/DataFrame.foreach:(Lscala/Function1;)V]
DataFrame.foreachPartition ( scala.Function1<scala.collection.Iterator<Row>,scala.runtime.BoxedUnit> f ) : void
[mangled: org/apache/spark/sql/DataFrame.foreachPartition:(Lscala/Function1;)V]
DataFrame.groupBy ( Column... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.groupBy ( scala.collection.Seq<Column> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:(Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.groupBy ( String col1, scala.collection.Seq<String> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.groupBy ( String col1, String... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.head ( ) : Row
[mangled: org/apache/spark/sql/DataFrame.head:()Lorg/apache/spark/sql/Row;]
DataFrame.head ( int n ) : Row[ ]
[mangled: org/apache/spark/sql/DataFrame.head:(I)[Lorg/apache/spark/sql/Row;]
DataFrame.intersect ( DataFrame other ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.intersect:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.isLocal ( ) : boolean
[mangled: org/apache/spark/sql/DataFrame.isLocal:()Z]
DataFrame.javaRDD ( ) : org.apache.spark.api.java.JavaRDD<Row>
[mangled: org/apache/spark/sql/DataFrame.javaRDD:()Lorg/apache/spark/api/java/JavaRDD;]
DataFrame.javaToPython ( ) : org.apache.spark.api.java.JavaRDD<byte[ ]>
[mangled: org/apache/spark/sql/DataFrame.javaToPython:()Lorg/apache/spark/api/java/JavaRDD;]
DataFrame.join ( DataFrame right ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, Column joinExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, Column joinExprs, String joinType ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Lorg/apache/spark/sql/Column;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, String usingColumn ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.limit ( int n ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.limit:(I)Lorg/apache/spark/sql/DataFrame;]
DataFrame.logicalPlan ( ) : catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/DataFrame.logicalPlan:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
DataFrame.map ( scala.Function1<Row,R> f, scala.reflect.ClassTag<R> p2 ) : org.apache.spark.rdd.RDD<R>
[mangled: org/apache/spark/sql/DataFrame.map:(Lscala/Function1;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
DataFrame.mapPartitions ( scala.Function1<scala.collection.Iterator<Row>,scala.collection.Iterator<R>> f, scala.reflect.ClassTag<R> p2 ) : org.apache.spark.rdd.RDD<R>
[mangled: org/apache/spark/sql/DataFrame.mapPartitions:(Lscala/Function1;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
DataFrame.na ( ) : DataFrameNaFunctions
[mangled: org/apache/spark/sql/DataFrame.na:()Lorg/apache/spark/sql/DataFrameNaFunctions;]
DataFrame.numericColumns ( ) : scala.collection.Seq<catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/DataFrame.numericColumns:()Lscala/collection/Seq;]
DataFrame.orderBy ( Column... sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.orderBy ( scala.collection.Seq<Column> sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.orderBy ( String sortCol, scala.collection.Seq<String> sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.orderBy ( String sortCol, String... sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.DataFrame..logicalPlanToDataFrame ( catalyst.plans.logical.LogicalPlan logicalPlan ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.org.apache.spark.sql.DataFrame..logicalPlanToDataFrame:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.persist ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.persist:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.persist ( ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.persist:()Lorg/apache/spark/sql/RDDApi;]
DataFrame.persist ( org.apache.spark.storage.StorageLevel newLevel ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.persist:(Lorg/apache/spark/storage/StorageLevel;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.persist ( org.apache.spark.storage.StorageLevel newLevel ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.persist:(Lorg/apache/spark/storage/StorageLevel;)Lorg/apache/spark/sql/RDDApi;]
DataFrame.printSchema ( ) : void
[mangled: org/apache/spark/sql/DataFrame.printSchema:()V]
DataFrame.queryExecution ( ) : SQLContext.QueryExecution
[mangled: org/apache/spark/sql/DataFrame.queryExecution:()Lorg/apache/spark/sql/SQLContext$QueryExecution;]
DataFrame.randomSplit ( double[ ] weights ) : DataFrame[ ]
[mangled: org/apache/spark/sql/DataFrame.randomSplit:([D)[Lorg/apache/spark/sql/DataFrame;]
DataFrame.randomSplit ( double[ ] weights, long seed ) : DataFrame[ ]
[mangled: org/apache/spark/sql/DataFrame.randomSplit:([DJ)[Lorg/apache/spark/sql/DataFrame;]
DataFrame.randomSplit ( scala.collection.immutable.List<Object> weights, long seed ) : DataFrame[ ]
[mangled: org/apache/spark/sql/DataFrame.randomSplit:(Lscala/collection/immutable/List;J)[Lorg/apache/spark/sql/DataFrame;]
DataFrame.rdd ( ) : org.apache.spark.rdd.RDD<Row>
[mangled: org/apache/spark/sql/DataFrame.rdd:()Lorg/apache/spark/rdd/RDD;]
DataFrame.registerTempTable ( String tableName ) : void
[mangled: org/apache/spark/sql/DataFrame.registerTempTable:(Ljava/lang/String;)V]
DataFrame.repartition ( int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.repartition:(I)Lorg/apache/spark/sql/DataFrame;]
DataFrame.resolve ( String colName ) : catalyst.expressions.NamedExpression
[mangled: org/apache/spark/sql/DataFrame.resolve:(Ljava/lang/String;)Lorg/apache/spark/sql/catalyst/expressions/NamedExpression;]
DataFrame.rollup ( Column... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.rollup ( scala.collection.Seq<Column> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:(Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.rollup ( String col1, scala.collection.Seq<String> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.rollup ( String col1, String... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.rollup:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.sample ( boolean withReplacement, double fraction ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sample:(ZD)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sample ( boolean withReplacement, double fraction, long seed ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sample:(ZDJ)Lorg/apache/spark/sql/DataFrame;]
DataFrame.schema ( ) : types.StructType
[mangled: org/apache/spark/sql/DataFrame.schema:()Lorg/apache/spark/sql/types/StructType;]
DataFrame.select ( Column... cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.select ( scala.collection.Seq<Column> cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.select ( String col, scala.collection.Seq<String> cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.select ( String col, String... cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.selectExpr ( scala.collection.Seq<String> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.selectExpr:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.selectExpr ( String... exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.selectExpr:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.show ( ) : void
[mangled: org/apache/spark/sql/DataFrame.show:()V]
DataFrame.show ( int numRows ) : void
[mangled: org/apache/spark/sql/DataFrame.show:(I)V]
DataFrame.showString ( int numRows ) : String
[mangled: org/apache/spark/sql/DataFrame.showString:(I)Ljava/lang/String;]
DataFrame.sort ( Column... sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sort ( scala.collection.Seq<Column> sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sort ( String sortCol, scala.collection.Seq<String> sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sort ( String sortCol, String... sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sqlContext ( ) : SQLContext
[mangled: org/apache/spark/sql/DataFrame.sqlContext:()Lorg/apache/spark/sql/SQLContext;]
DataFrame.stat ( ) : DataFrameStatFunctions
[mangled: org/apache/spark/sql/DataFrame.stat:()Lorg/apache/spark/sql/DataFrameStatFunctions;]
DataFrame.take ( int n ) : Object
[mangled: org/apache/spark/sql/DataFrame.take:(I)Ljava/lang/Object;]
DataFrame.take ( int n ) : Row[ ]
[mangled: org/apache/spark/sql/DataFrame.take:(I)[Lorg/apache/spark/sql/Row;]
DataFrame.toDF ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.toDF:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.toDF ( scala.collection.Seq<String> colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.toDF:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.toDF ( String... colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.toDF:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.toJavaRDD ( ) : org.apache.spark.api.java.JavaRDD<Row>
[mangled: org/apache/spark/sql/DataFrame.toJavaRDD:()Lorg/apache/spark/api/java/JavaRDD;]
DataFrame.toJSON ( ) : org.apache.spark.rdd.RDD<String>
[mangled: org/apache/spark/sql/DataFrame.toJSON:()Lorg/apache/spark/rdd/RDD;]
DataFrame.toString ( ) : String
[mangled: org/apache/spark/sql/DataFrame.toString:()Ljava/lang/String;]
DataFrame.unionAll ( DataFrame other ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.unionAll:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.unpersist ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.unpersist:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.unpersist ( ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.unpersist:()Lorg/apache/spark/sql/RDDApi;]
DataFrame.unpersist ( boolean blocking ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.unpersist:(Z)Lorg/apache/spark/sql/DataFrame;]
DataFrame.unpersist ( boolean blocking ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.unpersist:(Z)Lorg/apache/spark/sql/RDDApi;]
DataFrame.where ( Column condition ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.where:(Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.withColumn ( String colName, Column col ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.withColumn:(Ljava/lang/String;Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.withColumnRenamed ( String existingName, String newName ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.withColumnRenamed:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.write ( ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrame.write:()Lorg/apache/spark/sql/DataFrameWriter;]
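All of the DataFrame entry points above exist only on the 1.4.0 side of this comparison; 1.0.0 has no DataFrame class at all (see the type problems below), so a client compiled against 1.4.0 that touches any of them cannot run on 1.0.0. A minimal usage sketch, assuming a local master and an illustrative users.json input that are not part of this report:

```scala
// A minimal sketch, assuming a local master and an illustrative
// users.json input; neither is part of this report.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(new SparkConf().setAppName("df-demo").setMaster("local[2]"))
val sqlContext = new SQLContext(sc)

val users = sqlContext.read.json("users.json")   // DataFrameReader.json(String)
val adults = users
  .select("name", "age")                         // select(String, String...)
  .where(users("age") > 21)                      // where(Column); apply(String) builds the Column
  .orderBy("age")                                // orderBy(String, String...)
  .withColumn("isSenior", users("age") > 65)     // withColumn(String, Column)
adults.persist()                                 // persist(): DataFrame
adults.show(10)                                  // show(int)
```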
spark-sql_2.10-1.4.0.jar, DataFrameReader.class
package org.apache.spark.sql
DataFrameReader.DataFrameReader ( SQLContext sqlContext )
[mangled: org/apache/spark/sql/DataFrameReader."<init>":(Lorg/apache/spark/sql/SQLContext;)V]
DataFrameReader.format ( String source ) : DataFrameReader
[mangled: org/apache/spark/sql/DataFrameReader.format:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrameReader;]
DataFrameReader.jdbc ( String url, String table, java.util.Properties properties ) : DataFrame
[mangled: org/apache/spark/sql/DataFrameReader.jdbc:(Ljava/lang/String;Ljava/lang/String;Ljava/util/Properties;)Lorg/apache/spark/sql/DataFrame;]
DataFrameReader.jdbc ( String url, String table, String columnName, long lowerBound, long upperBound, int numPartitions, java.util.Properties connectionProperties ) : DataFrame
[mangled: org/apache/spark/sql/DataFrameReader.jdbc:(Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;JJILjava/util/Properties;)Lorg/apache/spark/sql/DataFrame;]
DataFrameReader.jdbc ( String url, String table, String[ ] predicates, java.util.Properties connectionProperties ) : DataFrame
[mangled: org/apache/spark/sql/DataFrameReader.jdbc:(Ljava/lang/String;Ljava/lang/String;[Ljava/lang/String;Ljava/util/Properties;)Lorg/apache/spark/sql/DataFrame;]
DataFrameReader.json ( org.apache.spark.api.java.JavaRDD<String> jsonRDD ) : DataFrame
[mangled: org/apache/spark/sql/DataFrameReader.json:(Lorg/apache/spark/api/java/JavaRDD;)Lorg/apache/spark/sql/DataFrame;]
DataFrameReader.json ( org.apache.spark.rdd.RDD<String> jsonRDD ) : DataFrame
[mangled: org/apache/spark/sql/DataFrameReader.json:(Lorg/apache/spark/rdd/RDD;)Lorg/apache/spark/sql/DataFrame;]
DataFrameReader.json ( String path ) : DataFrame
[mangled: org/apache/spark/sql/DataFrameReader.json:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrameReader.load ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrameReader.load:()Lorg/apache/spark/sql/DataFrame;]
DataFrameReader.load ( String path ) : DataFrame
[mangled: org/apache/spark/sql/DataFrameReader.load:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrameReader.option ( String key, String value ) : DataFrameReader
[mangled: org/apache/spark/sql/DataFrameReader.option:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrameReader;]
DataFrameReader.options ( java.util.Map<String,String> options ) : DataFrameReader
[mangled: org/apache/spark/sql/DataFrameReader.options:(Ljava/util/Map;)Lorg/apache/spark/sql/DataFrameReader;]
DataFrameReader.options ( scala.collection.Map<String,String> options ) : DataFrameReader
[mangled: org/apache/spark/sql/DataFrameReader.options:(Lscala/collection/Map;)Lorg/apache/spark/sql/DataFrameReader;]
DataFrameReader.parquet ( scala.collection.Seq<String> paths ) : DataFrame
[mangled: org/apache/spark/sql/DataFrameReader.parquet:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrameReader.parquet ( String... paths ) : DataFrame
[mangled: org/apache/spark/sql/DataFrameReader.parquet:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrameReader.schema ( types.StructType schema ) : DataFrameReader
[mangled: org/apache/spark/sql/DataFrameReader.schema:(Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrameReader;]
DataFrameReader.table ( String tableName ) : DataFrame
[mangled: org/apache/spark/sql/DataFrameReader.table:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
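The DataFrameReader methods compose as a builder: format, schema, and option configure the source, while load, json, jdbc, parquet, and table materialize a DataFrame. A hedged sketch reusing the sqlContext from the previous example; the path, JDBC URL, and option key/value are placeholders:

```scala
// Placeholders: the path, JDBC URL, and option key/value are not
// taken from the report.
import java.util.Properties
import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

val eventSchema = StructType(Seq(
  StructField("id", LongType, nullable = false),
  StructField("name", StringType, nullable = true)))

val events = sqlContext.read
  .format("parquet")                    // format(String)
  .schema(eventSchema)                  // schema(types.StructType)
  .option("exampleKey", "exampleValue") // option(String, String)
  .load("/data/events")                 // load(String)

val props = new Properties()
props.setProperty("user", "spark")
val orders = sqlContext.read.jdbc(      // jdbc(String, String, Properties)
  "jdbc:postgresql://dbhost/shop", "orders", props)
```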
spark-sql_2.10-1.4.0.jar, DataFrameWriter.class
package org.apache.spark.sql
DataFrameWriter.DataFrameWriter ( DataFrame df )
[mangled: org/apache/spark/sql/DataFrameWriter."<init>":(Lorg/apache/spark/sql/DataFrame;)V]
DataFrameWriter.format ( String source ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrameWriter.format:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrameWriter;]
DataFrameWriter.insertInto ( String tableName ) : void
[mangled: org/apache/spark/sql/DataFrameWriter.insertInto:(Ljava/lang/String;)V]
DataFrameWriter.jdbc ( String url, String table, java.util.Properties connectionProperties ) : void
[mangled: org/apache/spark/sql/DataFrameWriter.jdbc:(Ljava/lang/String;Ljava/lang/String;Ljava/util/Properties;)V]
DataFrameWriter.json ( String path ) : void
[mangled: org/apache/spark/sql/DataFrameWriter.json:(Ljava/lang/String;)V]
DataFrameWriter.mode ( SaveMode saveMode ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrameWriter.mode:(Lorg/apache/spark/sql/SaveMode;)Lorg/apache/spark/sql/DataFrameWriter;]
DataFrameWriter.mode ( String saveMode ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrameWriter.mode:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrameWriter;]
DataFrameWriter.option ( String key, String value ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrameWriter.option:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrameWriter;]
DataFrameWriter.options ( java.util.Map<String,String> options ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrameWriter.options:(Ljava/util/Map;)Lorg/apache/spark/sql/DataFrameWriter;]
DataFrameWriter.options ( scala.collection.Map<String,String> options ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrameWriter.options:(Lscala/collection/Map;)Lorg/apache/spark/sql/DataFrameWriter;]
DataFrameWriter.parquet ( String path ) : void
[mangled: org/apache/spark/sql/DataFrameWriter.parquet:(Ljava/lang/String;)V]
DataFrameWriter.partitionBy ( scala.collection.Seq<String> colNames ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrameWriter.partitionBy:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrameWriter;]
DataFrameWriter.partitionBy ( String... colNames ) : DataFrameWriter
[mangled: org/apache/spark/sql/DataFrameWriter.partitionBy:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrameWriter;]
DataFrameWriter.save ( ) : void
[mangled: org/apache/spark/sql/DataFrameWriter.save:()V]
DataFrameWriter.save ( String path ) : void
[mangled: org/apache/spark/sql/DataFrameWriter.save:(Ljava/lang/String;)V]
DataFrameWriter.saveAsTable ( String tableName ) : void
[mangled: org/apache/spark/sql/DataFrameWriter.saveAsTable:(Ljava/lang/String;)V]
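DataFrameWriter mirrors the reader on the output side. A hedged sketch continuing from the events DataFrame above; the output path and table names are again placeholders:

```scala
// Continues from `events` above; the output path and table name are
// placeholders.
import org.apache.spark.sql.SaveMode

events.write                  // DataFrame.write(): DataFrameWriter
  .format("parquet")          // format(String)
  .mode(SaveMode.Overwrite)   // mode(SaveMode)
  .partitionBy("name")        // partitionBy(String...)
  .save("/data/events_out")   // save(String)

// String-typed mode plus the metastore-backed variant:
events.write.mode("append").saveAsTable("events_archive")
```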
spark-sql_2.10-1.4.0.jar, EqualTo.class
package org.apache.spark.sql.sources
EqualTo.attribute ( ) : String
[mangled: org/apache/spark/sql/sources/EqualTo.attribute:()Ljava/lang/String;]
EqualTo.value ( ) : Object
[mangled: org/apache/spark/sql/sources/EqualTo.value:()Ljava/lang/Object;]
spark-sql_2.10-1.4.0.jar, Filter.class
package org.apache.spark.sql.sources
Filter.Filter ( )
[mangled: org/apache/spark/sql/sources/Filter."<init>":()V]
spark-sql_2.10-1.4.0.jar, GreaterThan.class
package org.apache.spark.sql.sources
GreaterThan.attribute ( ) : String
[mangled: org/apache/spark/sql/sources/GreaterThan.attribute:()Ljava/lang/String;]
GreaterThan.value ( ) : Object
[mangled: org/apache/spark/sql/sources/GreaterThan.value:()Ljava/lang/Object;]
spark-sql_2.10-1.4.0.jar, GreaterThanOrEqual.class
package org.apache.spark.sql.sources
GreaterThanOrEqual.attribute ( ) : String
[mangled: org/apache/spark/sql/sources/GreaterThanOrEqual.attribute:()Ljava/lang/String;]
GreaterThanOrEqual.value ( ) : Object
[mangled: org/apache/spark/sql/sources/GreaterThanOrEqual.value:()Ljava/lang/Object;]
spark-sql_2.10-1.4.0.jar, IsNotNull.class
package org.apache.spark.sql.sources
IsNotNull.attribute ( ) : String
[mangled: org/apache/spark/sql/sources/IsNotNull.attribute:()Ljava/lang/String;]
spark-sql_2.10-1.4.0.jar, IsNull.class
package org.apache.spark.sql.sources
IsNull.attribute ( ) : String
[mangled: org/apache/spark/sql/sources/IsNull.attribute:()Ljava/lang/String;]
spark-sql_2.10-1.4.0.jar, LessThan.class
package org.apache.spark.sql.sources
LessThan.attribute ( ) : String
[mangled: org/apache/spark/sql/sources/LessThan.attribute:()Ljava/lang/String;]
LessThan.value ( ) : Object
[mangled: org/apache/spark/sql/sources/LessThan.value:()Ljava/lang/Object;]
spark-sql_2.10-1.4.0.jar, LessThanOrEqual.class
package org.apache.spark.sql.sources
LessThanOrEqual.attribute ( ) : String
[mangled: org/apache/spark/sql/sources/LessThanOrEqual.attribute:()Ljava/lang/String;]
LessThanOrEqual.value ( ) : Object
[mangled: org/apache/spark/sql/sources/LessThanOrEqual.value:()Ljava/lang/Object;]
spark-sql_2.10-1.4.0.jar, PrunedFilteredScan.class
package org.apache.spark.sql.sources
PrunedFilteredScan.buildScan ( String[ ] p1, Filter[ ] p2 ) [abstract] : org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>
[mangled: org/apache/spark/sql/sources/PrunedFilteredScan.buildScan:([Ljava/lang/String;[Lorg/apache/spark/sql/sources/Filter;)Lorg/apache/spark/rdd/RDD;]
spark-sql_2.10-1.4.0.jar, RelationProvider.class
package org.apache.spark.sql.sources
RelationProvider.createRelation ( org.apache.spark.sql.SQLContext p1, scala.collection.immutable.Map<String,String> p2 ) [abstract] : BaseRelation
[mangled: org/apache/spark/sql/sources/RelationProvider.createRelation:(Lorg/apache/spark/sql/SQLContext;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/sources/BaseRelation;]
spark-sql_2.10-1.4.0.jar, SaveMode.class
package org.apache.spark.sql
SaveMode.valueOf ( String name ) [static] : SaveMode
[mangled: org/apache/spark/sql/SaveMode.valueOf:(Ljava/lang/String;)Lorg/apache/spark/sql/SaveMode;]
SaveMode.values ( ) [static] : SaveMode[ ]
[mangled: org/apache/spark/sql/SaveMode.values:()[Lorg/apache/spark/sql/SaveMode;]
spark-sql_2.10-1.4.0.jar, SchemaRelationProvider.class
package org.apache.spark.sql.sources
SchemaRelationProvider.createRelation ( org.apache.spark.sql.SQLContext p1, scala.collection.immutable.Map<String,String> p2, org.apache.spark.sql.types.StructType p3 ) [abstract] : BaseRelation
[mangled: org/apache/spark/sql/sources/SchemaRelationProvider.createRelation:(Lorg/apache/spark/sql/SQLContext;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/sources/BaseRelation;]
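Together, RelationProvider (or SchemaRelationProvider) plus BaseRelation with PrunedFilteredScan form the minimal external data source contract at this API level: the provider constructs the relation, and buildScan receives the pruned column list plus pushed-down Filter instances such as EqualTo or GreaterThan. A hedged sketch; DemoProvider, DemoRelation, and the in-memory rows are invented for illustration:

```scala
// DemoProvider, DemoRelation, and the rows below are invented for
// illustration; only the traits and Filter case classes are from the report.
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{Row, SQLContext}
import org.apache.spark.sql.sources._
import org.apache.spark.sql.types._

class DemoProvider extends RelationProvider {
  override def createRelation(
      sqlContext: SQLContext,
      parameters: Map[String, String]): BaseRelation =
    new DemoRelation(sqlContext)
}

class DemoRelation(val sqlContext: SQLContext)
    extends BaseRelation with PrunedFilteredScan {

  override def schema: StructType =
    StructType(Seq(StructField("id", IntegerType), StructField("name", StringType)))

  override def buildScan(requiredColumns: Array[String], filters: Array[Filter]): RDD[Row] = {
    val data = Seq((1, "a"), (2, "b"), (3, "c"))
    val kept = data.filter { case (id, _) =>
      filters.forall {
        case EqualTo("id", v)              => id == v
        case GreaterThan("id", v: Int)     => id > v
        case LessThanOrEqual("id", v: Int) => id <= v
        case IsNotNull(_)                  => true
        case _                             => true // unhandled filters: Spark re-applies them
      }
    }
    sqlContext.sparkContext.parallelize(kept.map { case (id, name) =>
      Row.fromSeq(requiredColumns.map {
        case "id"   => id
        case "name" => name
      })
    })
  }
}
```

A caller would load such a source via its provider class name, e.g. sqlContext.read.format("com.example.DemoProvider").load() (hypothetical name); filters the source leaves unevaluated are simply re-applied by Spark afterwards, so returning extra rows for unhandled cases is safe.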
spark-sql_2.10-1.4.0.jar, SQLContext.class
package org.apache.spark.sql
SQLContext.applySchemaToPythonRDD ( org.apache.spark.rdd.RDD<Object[ ]> rdd, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.applySchemaToPythonRDD:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.applySchemaToPythonRDD ( org.apache.spark.rdd.RDD<Object[ ]> rdd, String schemaString ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.applySchemaToPythonRDD:(Lorg/apache/spark/rdd/RDD;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.baseRelationToDataFrame ( sources.BaseRelation baseRelation ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.baseRelationToDataFrame:(Lorg/apache/spark/sql/sources/BaseRelation;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.cacheManager ( ) : execution.CacheManager
[mangled: org/apache/spark/sql/SQLContext.cacheManager:()Lorg/apache/spark/sql/execution/CacheManager;]
SQLContext.clearCache ( ) : void
[mangled: org/apache/spark/sql/SQLContext.clearCache:()V]
SQLContext.conf ( ) : SQLConf
[mangled: org/apache/spark/sql/SQLContext.conf:()Lorg/apache/spark/sql/SQLConf;]
SQLContext.createDataFrame ( org.apache.spark.api.java.JavaRDD<?> rdd, Class<?> beanClass ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/api/java/JavaRDD;Ljava/lang/Class;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.api.java.JavaRDD<Row> rowRDD, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/api/java/JavaRDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.rdd.RDD<?> rdd, Class<?> beanClass ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/rdd/RDD;Ljava/lang/Class;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.rdd.RDD<A> rdd, scala.reflect.api.TypeTags.TypeTag<A> p2 ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.rdd.RDD<Row> rowRDD, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.rdd.RDD<Row> rowRDD, types.StructType schema, boolean needsConversion ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;Z)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( scala.collection.Seq<A> data, scala.reflect.api.TypeTags.TypeTag<A> p2 ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lscala/collection/Seq;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String path ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String path, String source ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String source, java.util.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String source, types.StructType schema, java.util.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/types/StructType;Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String source, types.StructType schema, scala.collection.immutable.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/types/StructType;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String source, scala.collection.immutable.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.createSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.currentSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.currentSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.ddlParser ( ) : sources.DDLParser
[mangled: org/apache/spark/sql/SQLContext.ddlParser:()Lorg/apache/spark/sql/sources/DDLParser;]
SQLContext.defaultSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.defaultSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.detachSession ( ) : void
[mangled: org/apache/spark/sql/SQLContext.detachSession:()V]
SQLContext.dialectClassName ( ) : String
[mangled: org/apache/spark/sql/SQLContext.dialectClassName:()Ljava/lang/String;]
SQLContext.dropTempTable ( String tableName ) : void
[mangled: org/apache/spark/sql/SQLContext.dropTempTable:(Ljava/lang/String;)V]
SQLContext.emptyDataFrame ( ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.emptyDataFrame:()Lorg/apache/spark/sql/DataFrame;]
SQLContext.emptyResult ( ) : org.apache.spark.rdd.RDD<Row>
[mangled: org/apache/spark/sql/SQLContext.emptyResult:()Lorg/apache/spark/rdd/RDD;]
SQLContext.experimental ( ) : ExperimentalMethods
[mangled: org/apache/spark/sql/SQLContext.experimental:()Lorg/apache/spark/sql/ExperimentalMethods;]
SQLContext.functionRegistry ( ) : catalyst.analysis.FunctionRegistry
[mangled: org/apache/spark/sql/SQLContext.functionRegistry:()Lorg/apache/spark/sql/catalyst/analysis/FunctionRegistry;]
SQLContext.getAllConfs ( ) : scala.collection.immutable.Map<String,String>
[mangled: org/apache/spark/sql/SQLContext.getAllConfs:()Lscala/collection/immutable/Map;]
SQLContext.getConf ( String key ) : String
[mangled: org/apache/spark/sql/SQLContext.getConf:(Ljava/lang/String;)Ljava/lang/String;]
SQLContext.getConf ( String key, String defaultValue ) : String
[mangled: org/apache/spark/sql/SQLContext.getConf:(Ljava/lang/String;Ljava/lang/String;)Ljava/lang/String;]
SQLContext.getOrCreate ( org.apache.spark.SparkContext p1 ) [static] : SQLContext
[mangled: org/apache/spark/sql/SQLContext.getOrCreate:(Lorg/apache/spark/SparkContext;)Lorg/apache/spark/sql/SQLContext;]
SQLContext.getSchema ( Class<?> beanClass ) : scala.collection.Seq<catalyst.expressions.AttributeReference>
[mangled: org/apache/spark/sql/SQLContext.getSchema:(Ljava/lang/Class;)Lscala/collection/Seq;]
SQLContext.getSQLDialect ( ) : catalyst.ParserDialect
[mangled: org/apache/spark/sql/SQLContext.getSQLDialect:()Lorg/apache/spark/sql/catalyst/ParserDialect;]
SQLContext.implicits ( ) : SQLContext.implicits.
[mangled: org/apache/spark/sql/SQLContext.implicits:()Lorg/apache/spark/sql/SQLContext$implicits$;]
SQLContext.isCached ( String tableName ) : boolean
[mangled: org/apache/spark/sql/SQLContext.isCached:(Ljava/lang/String;)Z]
SQLContext.isTraceEnabled ( ) : boolean
[mangled: org/apache/spark/sql/SQLContext.isTraceEnabled:()Z]
SQLContext.log ( ) : org.slf4j.Logger
[mangled: org/apache/spark/sql/SQLContext.log:()Lorg/slf4j/Logger;]
SQLContext.logDebug ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/SQLContext.logDebug:(Lscala/Function0;)V]
SQLContext.logDebug ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/SQLContext.logDebug:(Lscala/Function0;Ljava/lang/Throwable;)V]
SQLContext.logError ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/SQLContext.logError:(Lscala/Function0;)V]
SQLContext.logError ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/SQLContext.logError:(Lscala/Function0;Ljava/lang/Throwable;)V]
SQLContext.logInfo ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/SQLContext.logInfo:(Lscala/Function0;)V]
SQLContext.logInfo ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/SQLContext.logInfo:(Lscala/Function0;Ljava/lang/Throwable;)V]
SQLContext.logName ( ) : String
[mangled: org/apache/spark/sql/SQLContext.logName:()Ljava/lang/String;]
SQLContext.logTrace ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/SQLContext.logTrace:(Lscala/Function0;)V]
SQLContext.logTrace ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/SQLContext.logTrace:(Lscala/Function0;Ljava/lang/Throwable;)V]
SQLContext.logWarning ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/SQLContext.logWarning:(Lscala/Function0;)V]
SQLContext.logWarning ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/SQLContext.logWarning:(Lscala/Function0;Ljava/lang/Throwable;)V]
SQLContext.openSession ( ) : SQLContext.SQLSession
[mangled: org/apache/spark/sql/SQLContext.openSession:()Lorg/apache/spark/sql/SQLContext$SQLSession;]
SQLContext.optimizer ( ) : catalyst.optimizer.Optimizer
[mangled: org/apache/spark/sql/SQLContext.optimizer:()Lorg/apache/spark/sql/catalyst/optimizer/Optimizer;]
SQLContext.org.apache.spark.Logging..log_ ( ) : org.slf4j.Logger
[mangled: org/apache/spark/sql/SQLContext.org.apache.spark.Logging..log_:()Lorg/slf4j/Logger;]
SQLContext.org.apache.spark.Logging..log__.eq ( org.slf4j.Logger p1 ) : void
[mangled: org/apache/spark/sql/SQLContext.org.apache.spark.Logging..log__.eq:(Lorg/slf4j/Logger;)V]
SQLContext.parquetFile ( String... paths ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.parquetFile:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.parseDataType ( String dataTypeString ) : types.DataType
[mangled: org/apache/spark/sql/SQLContext.parseDataType:(Ljava/lang/String;)Lorg/apache/spark/sql/types/DataType;]
SQLContext.range ( long start, long end ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.range:(JJ)Lorg/apache/spark/sql/DataFrame;]
SQLContext.range ( long start, long end, long step, int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.range:(JJJI)Lorg/apache/spark/sql/DataFrame;]
SQLContext.read ( ) : DataFrameReader
[mangled: org/apache/spark/sql/SQLContext.read:()Lorg/apache/spark/sql/DataFrameReader;]
SQLContext.registerDataFrameAsTable ( DataFrame df, String tableName ) : void
[mangled: org/apache/spark/sql/SQLContext.registerDataFrameAsTable:(Lorg/apache/spark/sql/DataFrame;Ljava/lang/String;)V]
SQLContext.setConf ( java.util.Properties props ) : void
[mangled: org/apache/spark/sql/SQLContext.setConf:(Ljava/util/Properties;)V]
SQLContext.setConf ( String key, String value ) : void
[mangled: org/apache/spark/sql/SQLContext.setConf:(Ljava/lang/String;Ljava/lang/String;)V]
SQLContext.sql ( String sqlText ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.sql:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.SQLContext ( org.apache.spark.api.java.JavaSparkContext sparkContext )
[mangled: org/apache/spark/sql/SQLContext."<init>":(Lorg/apache/spark/api/java/JavaSparkContext;)V]
SQLContext.sqlParser ( ) : SparkSQLParser
[mangled: org/apache/spark/sql/SQLContext.sqlParser:()Lorg/apache/spark/sql/SparkSQLParser;]
SQLContext.table ( String tableName ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.table:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.tableNames ( ) : String[ ]
[mangled: org/apache/spark/sql/SQLContext.tableNames:()[Ljava/lang/String;]
SQLContext.tableNames ( String databaseName ) : String[ ]
[mangled: org/apache/spark/sql/SQLContext.tableNames:(Ljava/lang/String;)[Ljava/lang/String;]
SQLContext.tables ( ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.tables:()Lorg/apache/spark/sql/DataFrame;]
SQLContext.tables ( String databaseName ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.tables:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.tlSession ( ) : ThreadLocal<SQLContext.SQLSession>
[mangled: org/apache/spark/sql/SQLContext.tlSession:()Ljava/lang/ThreadLocal;]
SQLContext.udf ( ) : UDFRegistration
[mangled: org/apache/spark/sql/SQLContext.udf:()Lorg/apache/spark/sql/UDFRegistration;]
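A short sketch chaining the SQLContext additions end to end; sc is the SparkContext from the first sketch and the table name is illustrative:

```scala
// `sc` is the SparkContext from the first sketch; the table name is
// illustrative.
import org.apache.spark.sql.SQLContext

val ctx = SQLContext.getOrCreate(sc)       // getOrCreate(SparkContext)
val ids = ctx.range(0, 1000)               // range(long, long): one "id" column
ids.registerTempTable("ids")               // DataFrame.registerTempTable(String)
val big = ctx.sql("SELECT id FROM ids WHERE id > 500")      // sql(String)
println(ctx.tableNames().mkString(", "))   // tableNames()
println(ctx.getConf("spark.sql.shuffle.partitions", "200")) // getConf(String, String)
ctx.dropTempTable("ids")                   // dropTempTable(String)
```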
spark-streaming_2.10-1.4.0.jar, StreamingContext.class
package org.apache.spark.streaming
StreamingContext.awaitTerminationOrTimeout ( long timeout ) : boolean
[mangled: org/apache/spark/streaming/StreamingContext.awaitTerminationOrTimeout:(J)Z]
StreamingContext.binaryRecordsStream ( String directory, int recordLength ) : dstream.DStream<byte[ ]>
[mangled: org/apache/spark/streaming/StreamingContext.binaryRecordsStream:(Ljava/lang/String;I)Lorg/apache/spark/streaming/dstream/DStream;]
StreamingContext.fileStream ( String directory, scala.Function1<org.apache.hadoop.fs.Path,Object> filter, boolean newFilesOnly, org.apache.hadoop.conf.Configuration conf, scala.reflect.ClassTag<K> p5, scala.reflect.ClassTag<V> p6, scala.reflect.ClassTag<F> p7 ) : dstream.InputDStream<scala.Tuple2<K,V>>
[mangled: org/apache/spark/streaming/StreamingContext.fileStream:(Ljava/lang/String;Lscala/Function1;ZLorg/apache/hadoop/conf/Configuration;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/streaming/dstream/InputDStream;]
StreamingContext.getActive ( ) [static] : scala.Option<StreamingContext>
[mangled: org/apache/spark/streaming/StreamingContext.getActive:()Lscala/Option;]
StreamingContext.getActiveOrCreate ( scala.Function0<StreamingContext> p1 ) [static] : StreamingContext
[mangled: org/apache/spark/streaming/StreamingContext.getActiveOrCreate:(Lscala/Function0;)Lorg/apache/spark/streaming/StreamingContext;]
StreamingContext.getActiveOrCreate ( String p1, scala.Function0<StreamingContext> p2, org.apache.hadoop.conf.Configuration p3, boolean p4 ) [static] : StreamingContext
[mangled: org/apache/spark/streaming/StreamingContext.getActiveOrCreate:(Ljava/lang/String;Lscala/Function0;Lorg/apache/hadoop/conf/Configuration;Z)Lorg/apache/spark/streaming/StreamingContext;]
StreamingContext.getNewInputStreamId ( ) : int
[mangled: org/apache/spark/streaming/StreamingContext.getNewInputStreamId:()I]
StreamingContext.getState ( ) : StreamingContextState
[mangled: org/apache/spark/streaming/StreamingContext.getState:()Lorg/apache/spark/streaming/StreamingContextState;]
StreamingContext.isCheckpointingEnabled ( ) : boolean
[mangled: org/apache/spark/streaming/StreamingContext.isCheckpointingEnabled:()Z]
StreamingContext.logName ( ) : String
[mangled: org/apache/spark/streaming/StreamingContext.logName:()Ljava/lang/String;]
StreamingContext.StreamingContext..startSite ( ) : java.util.concurrent.atomic.AtomicReference<org.apache.spark.util.CallSite>
[mangled: org/apache/spark/streaming/StreamingContext.org.apache.spark.streaming.StreamingContext..startSite:()Ljava/util/concurrent/atomic/AtomicReference;]
StreamingContext.StreamingContext..stopOnShutdown ( ) : void
[mangled: org/apache/spark/streaming/StreamingContext.org.apache.spark.streaming.StreamingContext..stopOnShutdown:()V]
StreamingContext.progressListener ( ) : ui.StreamingJobProgressListener
[mangled: org/apache/spark/streaming/StreamingContext.progressListener:()Lorg/apache/spark/streaming/ui/StreamingJobProgressListener;]
StreamingContext.StreamingContext ( String path )
[mangled: org/apache/spark/streaming/StreamingContext."<init>":(Ljava/lang/String;)V]
StreamingContext.StreamingContext ( String path, org.apache.spark.SparkContext sparkContext )
[mangled: org/apache/spark/streaming/StreamingContext."<init>":(Ljava/lang/String;Lorg/apache/spark/SparkContext;)V]
StreamingContext.uiTab ( ) : scala.Option<ui.StreamingTab>
[mangled: org/apache/spark/streaming/StreamingContext.uiTab:()Lscala/Option;]
StreamingContext.withNamedScope ( String name, scala.Function0<U> body ) : U
[mangled: org/apache/spark/streaming/StreamingContext.withNamedScope:(Ljava/lang/String;Lscala/Function0;)Ljava/lang/Object;]
StreamingContext.withScope ( scala.Function0<U> body ) : U
[mangled: org/apache/spark/streaming/StreamingContext.withScope:(Lscala/Function0;)Ljava/lang/Object;]
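The StreamingContext additions center on lifecycle introspection (getState, getActive, getActiveOrCreate) and bounded shutdown (awaitTerminationOrTimeout). A hedged sketch; the checkpoint directory and batch interval are placeholders:

```scala
// The checkpoint directory and batch interval are placeholders.
import org.apache.spark.streaming.{Seconds, StreamingContext, StreamingContextState}

def create(): StreamingContext = {
  val ssc = new StreamingContext(sc, Seconds(10))
  ssc.checkpoint("/tmp/checkpoints")
  ssc
}

val ssc = StreamingContext.getActiveOrCreate(create _)  // reuse the active context if any
if (ssc.getState() == StreamingContextState.INITIALIZED) ssc.start()
ssc.awaitTerminationOrTimeout(60 * 1000)  // bounded wait; returns whether the context stopped
```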
Problems with Data Types, High Severity (33)
spark-core_2.10-1.4.0.jar
package org.apache.spark
Logging (1)
| | Change | Effect |
|---|---|---|
| 1 | Abstract method logName ( ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError. |
affected methods (14)
isTraceEnabled ( ) This abstract method is from 'Logging' interface.
log ( ) This abstract method is from 'Logging' interface.
logDebug ( scala.Function0<java.lang.String> ) This abstract method is from 'Logging' interface.
logDebug ( scala.Function0<java.lang.String>, java.lang.Throwable ) This abstract method is from 'Logging' interface.
logError ( scala.Function0<java.lang.String> ) This abstract method is from 'Logging' interface.
logError ( scala.Function0<java.lang.String>, java.lang.Throwable ) This abstract method is from 'Logging' interface.
logInfo ( scala.Function0<java.lang.String> ) This abstract method is from 'Logging' interface.
logInfo ( scala.Function0<java.lang.String>, java.lang.Throwable ) This abstract method is from 'Logging' interface.
logTrace ( scala.Function0<java.lang.String> ) This abstract method is from 'Logging' interface.
logTrace ( scala.Function0<java.lang.String>, java.lang.Throwable ) This abstract method is from 'Logging' interface.
logWarning ( scala.Function0<java.lang.String> ) This abstract method is from 'Logging' interface.
logWarning ( scala.Function0<java.lang.String>, java.lang.Throwable ) This abstract method is from 'Logging' interface.
Logging..log_ ( ) This abstract method is from 'Logging' interface.
Logging..log__.eq ( org.slf4j.Logger ) This abstract method is from 'Logging' interface.
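Concretely, client code compiled against 1.4.0 that mixes in Logging binds call sites to logName ( ); run against 1.0.0, where that method does not exist, linkage fails. A hedged sketch with an invented MyJob class:

```scala
// MyJob is an invented client class; Logging is the trait named above.
import org.apache.spark.Logging

class MyJob extends Logging {
  def run(): Unit = {
    // Compiled against 1.4.0, this call site binds to Logging.logName();
    // on a 1.0.0 classpath the method is missing, so the call can fail
    // with NoSuchMethodError at runtime.
    logInfo(s"starting ${logName}")
  }
}
```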
SparkContext (1)
| | Change | Effect |
|---|---|---|
| 1 | Removed super-interface ExecutorAllocationClient. | A client program may be interrupted by a NoSuchMethodError. |
affected methods (156)
fromSparkContext ( SparkContext ) 1st parameter 'p1' of this method has type 'SparkContext'.
JavaSparkContext ( SparkContext ) 1st parameter 'sc' of this method has type 'SparkContext'.
sc ( ) Return value of this method has type 'SparkContext'.
toSparkContext ( api.java.JavaSparkContext ) Return value of this method has type 'SparkContext'.
context ( ) Return value of this method has type 'SparkContext'.
rdd.RDD ( SparkContext, scala.collection.Seq<Dependency<?>>, scala.reflect.ClassTag<T> ) 1st parameter '_sc' of this method has type 'SparkContext'.
sparkContext ( ) Return value of this method has type 'SparkContext'.
accumulable ( R, AccumulableParam<R,T> ) This method is from 'SparkContext' class.
accumulableCollection ( R, scala.Function1<R,scala.collection.generic.Growable<T>>, scala.reflect.ClassTag<R> ) This method is from 'SparkContext' class.
accumulator ( T, AccumulatorParam<T> ) This method is from 'SparkContext' class.
addedFiles ( ) This method is from 'SparkContext' class.
addedJars ( ) This method is from 'SparkContext' class.
addFile ( java.lang.String ) This method is from 'SparkContext' class.
addJar ( java.lang.String ) This method is from 'SparkContext' class.
addSparkListener ( scheduler.SparkListener ) This method is from 'SparkContext' class.
appName ( ) This method is from 'SparkContext' class.
booleanWritableConverter ( ) This method is from 'SparkContext' class.
boolToBoolWritable ( boolean ) This method is from 'SparkContext' class.
broadcast ( T, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
bytesToBytesWritable ( byte[ ] ) This method is from 'SparkContext' class.
bytesWritableConverter ( ) This method is from 'SparkContext' class.
cancelAllJobs ( ) This method is from 'SparkContext' class.
cancelJob ( int ) This method is from 'SparkContext' class.
cancelJobGroup ( java.lang.String ) This method is from 'SparkContext' class.
cancelStage ( int ) This method is from 'SparkContext' class.
checkpointDir ( ) This method is from 'SparkContext' class.
checkpointDir_.eq ( scala.Option<java.lang.String> ) This method is from 'SparkContext' class.
checkpointFile ( java.lang.String, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
cleaner ( ) This method is from 'SparkContext' class.
cleanup ( long ) This method is from 'SparkContext' class.
clearCallSite ( ) This method is from 'SparkContext' class.
clearJobGroup ( ) This method is from 'SparkContext' class.
conf ( ) This method is from 'SparkContext' class.
dagScheduler ( ) This method is from 'SparkContext' class.
dagScheduler_.eq ( scheduler.DAGScheduler ) This method is from 'SparkContext' class.
defaultMinPartitions ( ) This method is from 'SparkContext' class.
defaultParallelism ( ) This method is from 'SparkContext' class.
doubleRDDToDoubleRDDFunctions ( rdd.RDD<java.lang.Object> ) This method is from 'SparkContext' class.
doubleToDoubleWritable ( double ) This method is from 'SparkContext' class.
doubleWritableConverter ( ) This method is from 'SparkContext' class.
emptyRDD ( scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
env ( ) This method is from 'SparkContext' class.
eventLogger ( ) This method is from 'SparkContext' class.
executorEnvs ( ) This method is from 'SparkContext' class.
executorMemory ( ) This method is from 'SparkContext' class.
files ( ) This method is from 'SparkContext' class.
floatToFloatWritable ( float ) This method is from 'SparkContext' class.
floatWritableConverter ( ) This method is from 'SparkContext' class.
getAllPools ( ) This method is from 'SparkContext' class.
getCheckpointDir ( ) This method is from 'SparkContext' class.
getConf ( ) This method is from 'SparkContext' class.
getExecutorMemoryStatus ( ) This method is from 'SparkContext' class.
getExecutorStorageStatus ( ) This method is from 'SparkContext' class.
getLocalProperties ( ) This method is from 'SparkContext' class.
getLocalProperty ( java.lang.String ) This method is from 'SparkContext' class.
getPersistentRDDs ( ) This method is from 'SparkContext' class.
getPoolForName ( java.lang.String ) This method is from 'SparkContext' class.
getPreferredLocs ( rdd.RDD<?>, int ) This method is from 'SparkContext' class.
getRDDStorageInfo ( ) This method is from 'SparkContext' class.
getSchedulingMode ( ) This method is from 'SparkContext' class.
getSparkHome ( ) This method is from 'SparkContext' class.
hadoopConfiguration ( ) This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, int, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, java.lang.Class<? extends org.apache.hadoop.mapred.InputFormat<K,V>>, java.lang.Class<K>, java.lang.Class<V>, int ) This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) This method is from 'SparkContext' class.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<? extends org.apache.hadoop.mapred.InputFormat<K,V>>, java.lang.Class<K>, java.lang.Class<V>, int ) This method is from 'SparkContext' class.
intToIntWritable ( int ) This method is from 'SparkContext' class.
intWritableConverter ( ) This method is from 'SparkContext' class.
isLocal ( ) This method is from 'SparkContext' class.
isTraceEnabled ( ) This method is from 'SparkContext' class.
jarOfClass ( java.lang.Class<?> ) This method is from 'SparkContext' class.
jarOfObject ( java.lang.Object ) This method is from 'SparkContext' class.
jars ( ) This method is from 'SparkContext' class.
listenerBus ( ) This method is from 'SparkContext' class.
log ( ) This method is from 'SparkContext' class.
logDebug ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logDebug ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
logError ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logError ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
logInfo ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logInfo ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
logTrace ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logTrace ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
logWarning ( scala.Function0<java.lang.String> ) This method is from 'SparkContext' class.
logWarning ( scala.Function0<java.lang.String>, java.lang.Throwable ) This method is from 'SparkContext' class.
longToLongWritable ( long ) This method is from 'SparkContext' class.
longWritableConverter ( ) This method is from 'SparkContext' class.
makeRDD ( scala.collection.Seq<scala.Tuple2<T,scala.collection.Seq<java.lang.String>>>, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
makeRDD ( scala.collection.Seq<T>, int, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
master ( ) This method is from 'SparkContext' class.
metadataCleaner ( ) This method is from 'SparkContext' class.
newAPIHadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, org.apache.hadoop.conf.Configuration ) This method is from 'SparkContext' class.
newAPIHadoopFile ( java.lang.String, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) This method is from 'SparkContext' class.
newAPIHadoopRDD ( org.apache.hadoop.conf.Configuration, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) This method is from 'SparkContext' class.
newRddId ( ) This method is from 'SparkContext' class.
newShuffleId ( ) This method is from 'SparkContext' class.
numericRDDToDoubleRDDFunctions ( rdd.RDD<T>, scala.math.Numeric<T> ) This method is from 'SparkContext' class.
objectFile ( java.lang.String, int, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
Logging..log_ ( ) This method is from 'SparkContext' class.
Logging..log__.eq ( org.slf4j.Logger ) This method is from 'SparkContext' class.
SparkContext..warnSparkMem ( java.lang.String ) This method is from 'SparkContext' class.
parallelize ( scala.collection.Seq<T>, int, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
persistentRdds ( ) This method is from 'SparkContext' class.
persistRDD ( rdd.RDD<?> ) This method is from 'SparkContext' class.
preferredNodeLocationData ( ) This method is from 'SparkContext' class.
preferredNodeLocationData_.eq ( scala.collection.Map<java.lang.String,scala.collection.Set<scheduler.SplitInfo>> ) This method is from 'SparkContext' class.
rddToAsyncRDDActions ( rdd.RDD<T>, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
rddToOrderedRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.math.Ordering<K>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V> ) This method is from 'SparkContext' class.
rddToPairRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.math.Ordering<K> ) This method is from 'SparkContext' class.
rddToSequenceFileRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.Function1<K,org.apache.hadoop.io.Writable>, scala.reflect.ClassTag<K>, scala.Function1<V,org.apache.hadoop.io.Writable>, scala.reflect.ClassTag<V> ) This method is from 'SparkContext' class.
runApproximateJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, partial.ApproximateEvaluator<U,R>, long ) This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, boolean, scala.reflect.ClassTag<U> ) This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> ) This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.reflect.ClassTag<U> ) This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, boolean, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> ) This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, boolean, scala.reflect.ClassTag<U> ) This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> ) This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.reflect.ClassTag<U> ) This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, int, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.Function0<WritableConverter<K>>, scala.Function0<WritableConverter<V>> ) This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V> ) This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V>, int ) This method is from 'SparkContext' class.
setCallSite ( java.lang.String ) This method is from 'SparkContext' class.
setCheckpointDir ( java.lang.String ) This method is from 'SparkContext' class.
setJobDescription ( java.lang.String ) This method is from 'SparkContext' class.
setJobGroup ( java.lang.String, java.lang.String, boolean ) This method is from 'SparkContext' class.
setLocalProperties ( java.util.Properties ) This method is from 'SparkContext' class.
setLocalProperty ( java.lang.String, java.lang.String ) This method is from 'SparkContext' class.
SparkContext ( ) This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String ) This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String ) This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String, scala.collection.Seq<java.lang.String> ) This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String, scala.collection.Seq<java.lang.String>, scala.collection.Map<java.lang.String,java.lang.String>, scala.collection.Map<java.lang.String,scala.collection.Set<scheduler.SplitInfo>> ) This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, SparkConf ) This constructor is from 'SparkContext' class.
SparkContext ( SparkConf ) This constructor is from 'SparkContext' class.
SparkContext ( SparkConf, scala.collection.Map<java.lang.String,scala.collection.Set<scheduler.SplitInfo>> ) This constructor is from 'SparkContext' class.
sparkUser ( ) This method is from 'SparkContext' class.
startTime ( ) This method is from 'SparkContext' class.
stop ( ) This method is from 'SparkContext' class.
stringToText ( java.lang.String ) This method is from 'SparkContext' class.
stringWritableConverter ( ) This method is from 'SparkContext' class.
submitJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.Function0<R> ) This method is from 'SparkContext' class.
taskScheduler ( ) This method is from 'SparkContext' class.
taskScheduler_.eq ( scheduler.TaskScheduler ) This method is from 'SparkContext' class.
textFile ( java.lang.String, int ) This method is from 'SparkContext' class.
union ( rdd.RDD<T>, scala.collection.Seq<rdd.RDD<T>>, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
union ( scala.collection.Seq<rdd.RDD<T>>, scala.reflect.ClassTag<T> ) This method is from 'SparkContext' class.
unpersistRDD ( int, boolean ) This method is from 'SparkContext' class.
version ( ) This method is from 'SparkContext' class.
wholeTextFiles ( java.lang.String, int ) This method is from 'SparkContext' class.
writableWritableConverter ( ) This method is from 'SparkContext' class.
sparkContext ( ) Return value of this method has type 'SparkContext'.
SQLContext ( SparkContext ) 1st parameter 'sparkContext' of this method has type 'SparkContext'.
sc ( ) Return value of this method has type 'SparkContext'.
sparkContext ( ) Return value of this method has type 'SparkContext'.
StreamingContext ( SparkContext, streaming.Checkpoint, streaming.Duration ) 1st parameter 'sc_' of this method has type 'SparkContext'.
StreamingContext ( SparkContext, streaming.Duration ) 1st parameter 'sparkContext' of this method has type 'SparkContext'.
TaskContext (9)
| | Change | Effect |
|---|---|---|
| 1 | Abstract method addTaskCompletionListener ( util.TaskCompletionListener ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError. |
| 2 | Abstract method addTaskCompletionListener ( scala.Function1<TaskContext,scala.runtime.BoxedUnit> ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError. |
| 3 | Abstract method attemptNumber ( ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError. |
| 4 | Abstract method isCompleted ( ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError. |
| 5 | Abstract method isInterrupted ( ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError. |
| 6 | Abstract method isRunningLocally ( ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError. |
| 7 | Abstract method taskAttemptId ( ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError. |
| 8 | Abstract method taskMemoryManager ( ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError. |
| 9 | Removed super-interface java.io.Serializable. | A client program may be interrupted by a NoSuchMethodError. |
affected methods (6)
compute ( Partition, TaskContext ) 2nd parameter 'p2' of this abstract method has type 'TaskContext'.
computeOrReadCheckpoint ( Partition, TaskContext ) 2nd parameter 'context' of this method has type 'TaskContext'.
iterator ( Partition, TaskContext ) 2nd parameter 'context' of this method has type 'TaskContext'.
partitionId ( ) This abstract method is from 'TaskContext' abstract class.
stageId ( ) This abstract method is from 'TaskContext' abstract class.
taskMetrics ( ) This abstract method is from 'TaskContext' abstract class.
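The same pattern applies inside task bodies. A hedged sketch; the RDD contents are illustrative, and TaskContext.get ( ) is assumed from the 1.4.0 API rather than taken from this report:

```scala
// The RDD contents are illustrative; TaskContext.get() is assumed
// from the 1.4.0 API.
import org.apache.spark.TaskContext

val tagged = sc.parallelize(1 to 4, 2).mapPartitions { it =>
  val ctx = TaskContext.get()
  // Both calls below bind to 1.4.0 methods that the 1.0.0 TaskContext
  // no longer declares, so they can raise NoSuchMethodError there.
  ctx.addTaskCompletionListener((_: TaskContext) => println("partition done"))
  val tag = s"stage=${ctx.stageId()} attempt=${ctx.attemptNumber()}"
  it.map(x => s"$tag value=$x")
}
println(tagged.collect().mkString("\n"))
```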
package org.apache.spark.api.java
[+] JavaSparkContext (1)
| Change | Effect |
---|
1 | Removed super-interface java.io.Closeable. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (70)
accumulable ( T, org.apache.spark.AccumulableParam<T,R> )This method is from 'JavaSparkContext' class.
accumulator ( double )This method is from 'JavaSparkContext' class.
accumulator ( int )This method is from 'JavaSparkContext' class.
accumulator ( T, org.apache.spark.AccumulatorParam<T> )This method is from 'JavaSparkContext' class.
addFile ( java.lang.String )This method is from 'JavaSparkContext' class.
addJar ( java.lang.String )This method is from 'JavaSparkContext' class.
appName ( )This method is from 'JavaSparkContext' class.
broadcast ( T )This method is from 'JavaSparkContext' class.
cancelAllJobs ( )This method is from 'JavaSparkContext' class.
cancelJobGroup ( java.lang.String )This method is from 'JavaSparkContext' class.
checkpointFile ( java.lang.String )This method is from 'JavaSparkContext' class.
clearCallSite ( )This method is from 'JavaSparkContext' class.
clearJobGroup ( )This method is from 'JavaSparkContext' class.
defaultMinPartitions ( )This method is from 'JavaSparkContext' class.
defaultParallelism ( )This method is from 'JavaSparkContext' class.
doubleAccumulator ( double )This method is from 'JavaSparkContext' class.
env ( )This method is from 'JavaSparkContext' class.
fromSparkContext ( org.apache.spark.SparkContext )Return value of this method has type 'JavaSparkContext'.
getCheckpointDir ( )This method is from 'JavaSparkContext' class.
getConf ( )This method is from 'JavaSparkContext' class.
getLocalProperty ( java.lang.String )This method is from 'JavaSparkContext' class.
getSparkHome ( )This method is from 'JavaSparkContext' class.
hadoopConfiguration ( )This method is from 'JavaSparkContext' class.
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> )This method is from 'JavaSparkContext' class.
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int )This method is from 'JavaSparkContext' class.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> )This method is from 'JavaSparkContext' class.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int ) - This method is from the 'JavaSparkContext' class.
intAccumulator ( int ) - This method is from the 'JavaSparkContext' class.
isLocal ( ) - This method is from the 'JavaSparkContext' class.
jarOfClass ( java.lang.Class<?> ) - This method is from the 'JavaSparkContext' class.
jarOfObject ( java.lang.Object ) - This method is from the 'JavaSparkContext' class.
jars ( ) - This method is from the 'JavaSparkContext' class.
JavaSparkContext ( ) - This constructor is from the 'JavaSparkContext' class.
JavaSparkContext ( java.lang.String, java.lang.String ) - This constructor is from the 'JavaSparkContext' class.
JavaSparkContext ( java.lang.String, java.lang.String, java.lang.String, java.lang.String ) - This constructor is from the 'JavaSparkContext' class.
JavaSparkContext ( java.lang.String, java.lang.String, java.lang.String, java.lang.String[ ] ) - This constructor is from the 'JavaSparkContext' class.
JavaSparkContext ( java.lang.String, java.lang.String, java.lang.String, java.lang.String[ ], java.util.Map<java.lang.String,java.lang.String> ) - This constructor is from the 'JavaSparkContext' class.
JavaSparkContext ( java.lang.String, java.lang.String, org.apache.spark.SparkConf ) - This constructor is from the 'JavaSparkContext' class.
JavaSparkContext ( org.apache.spark.SparkConf ) - This constructor is from the 'JavaSparkContext' class.
JavaSparkContext ( org.apache.spark.SparkContext ) - This constructor is from the 'JavaSparkContext' class.
master ( ) - This method is from the 'JavaSparkContext' class.
newAPIHadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, org.apache.hadoop.conf.Configuration ) - This method is from the 'JavaSparkContext' class.
newAPIHadoopRDD ( org.apache.hadoop.conf.Configuration, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) - This method is from the 'JavaSparkContext' class.
objectFile ( java.lang.String ) - This method is from the 'JavaSparkContext' class.
objectFile ( java.lang.String, int ) - This method is from the 'JavaSparkContext' class.
parallelize ( java.util.List<T> ) - This method is from the 'JavaSparkContext' class.
parallelize ( java.util.List<T>, int ) - This method is from the 'JavaSparkContext' class.
parallelizeDoubles ( java.util.List<java.lang.Double> ) - This method is from the 'JavaSparkContext' class.
parallelizeDoubles ( java.util.List<java.lang.Double>, int ) - This method is from the 'JavaSparkContext' class.
parallelizePairs ( java.util.List<scala.Tuple2<K,V>> ) - This method is from the 'JavaSparkContext' class.
parallelizePairs ( java.util.List<scala.Tuple2<K,V>>, int ) - This method is from the 'JavaSparkContext' class.
sc ( ) - This method is from the 'JavaSparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V> ) - This method is from the 'JavaSparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V>, int ) - This method is from the 'JavaSparkContext' class.
setCallSite ( java.lang.String ) - This method is from the 'JavaSparkContext' class.
setCheckpointDir ( java.lang.String ) - This method is from the 'JavaSparkContext' class.
setJobGroup ( java.lang.String, java.lang.String ) - This method is from the 'JavaSparkContext' class.
setJobGroup ( java.lang.String, java.lang.String, boolean ) - This method is from the 'JavaSparkContext' class.
setLocalProperty ( java.lang.String, java.lang.String ) - This method is from the 'JavaSparkContext' class.
sparkUser ( ) - This method is from the 'JavaSparkContext' class.
startTime ( ) - This method is from the 'JavaSparkContext' class.
stop ( ) - This method is from the 'JavaSparkContext' class.
textFile ( java.lang.String ) - This method is from the 'JavaSparkContext' class.
textFile ( java.lang.String, int ) - This method is from the 'JavaSparkContext' class.
toSparkContext ( JavaSparkContext ) - The 1st parameter 'p1' of this method has type 'JavaSparkContext'.
union ( JavaDoubleRDD, java.util.List<JavaDoubleRDD> ) - This method is from the 'JavaSparkContext' class.
union ( JavaPairRDD<K,V>, java.util.List<JavaPairRDD<K,V>> ) - This method is from the 'JavaSparkContext' class.
union ( JavaRDD<T>, java.util.List<JavaRDD<T>> ) - This method is from the 'JavaSparkContext' class.
wholeTextFiles ( java.lang.String ) - This method is from the 'JavaSparkContext' class.
wholeTextFiles ( java.lang.String, int ) - This method is from the 'JavaSparkContext' class.
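For orientation, here is a minimal sketch, not taken from the report, of how client code typically reaches a few of the JavaSparkContext members listed above; the local master URL and the "data.txt" path are assumptions for illustration only. Call sites of exactly this shape are what the checker is flagging.

```scala
// A minimal sketch; master URL and input path are placeholders.
import org.apache.spark.SparkConf
import org.apache.spark.api.java.JavaSparkContext

object JavaSparkContextUsage {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("jsc-demo")
    val jsc = new JavaSparkContext(conf)                   // JavaSparkContext ( SparkConf )
    val lines = jsc.textFile("data.txt")                   // textFile ( String )
    val nums = jsc.parallelize(
      java.util.Arrays.asList[java.lang.Integer](1, 2, 3)) // parallelize ( List<T> )
    println("lines=" + lines.count() + " nums=" + nums.count())
    jsc.stop()                                             // stop ( )
  }
}
```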
package org.apache.spark.broadcast
Broadcast<T> (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.Logging. | A client program may be interrupted by a NoSuchMethodError exception. |

Affected methods (2):
broadcast ( T ) - The return value of this method has type 'Broadcast<T>'.
broadcast ( T, scala.reflect.ClassTag<T> ) - The return value of this method has type 'Broadcast<T>'.
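To make the failure mode concrete: the broadcast value itself keeps working on 1.0.0; what breaks is any 1.4.0-compiled call site that dispatches one of the members Broadcast inherited from Logging. A hedged sketch, assuming an existing SparkContext `sc`:

```scala
// A minimal sketch, assuming an existing SparkContext `sc`.
val bc = sc.broadcast(Map("answer" -> 42))
println(bc.value("answer"))   // plain Broadcast usage: fine on both versions

// Not fine: bytecode compiled against 1.4.0 that invokes a method Broadcast
// inherited from org.apache.spark.Logging. On 1.0.0, where that
// super-interface is gone, the JVM rejects the dispatch with
// NoSuchMethodError (or a related IncompatibleClassChangeError).
```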
package org.apache.spark.rdd
PairRDDFunctions<K,V> (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.mapreduce.SparkHadoopMapReduceUtil. | A client program may be interrupted by a NoSuchMethodError exception. |

Affected methods (1):
rddToPairRDDFunctions ( RDD<scala.Tuple2<K,V>>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.math.Ordering<K> ) - The return value of this method has type 'PairRDDFunctions<K,V>'.
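For context, a sketch of the implicit-conversion path through which clients reach PairRDDFunctions. The conversion and ordinary aggregations still link; calls that bottom out in the removed SparkHadoopMapReduceUtil super-interface (typically the saveAsNewAPIHadoop* paths, which used its helpers) are the ones at risk. The SparkContext `sc` is assumed.

```scala
// A minimal sketch, assuming an existing SparkContext `sc`.
import org.apache.spark.SparkContext._   // supplies rddToPairRDDFunctions in 1.x

val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3)))
val counts = pairs.reduceByKey(_ + _)    // dispatched through PairRDDFunctions
counts.collect().foreach(println)
```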
spark-sql_2.10-1.4.0.jar
package org.apache.spark.sql
DataFrame (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
Affected methods (120), all from the removed 'DataFrame' class:
agg ( java.util.Map<java.lang.String,java.lang.String> )
agg ( Column, Column... )
agg ( Column, scala.collection.Seq<Column> )
agg ( scala.collection.immutable.Map<java.lang.String,java.lang.String> )
agg ( scala.Tuple2<java.lang.String,java.lang.String>, scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.String>> )
apply ( java.lang.String )
as ( java.lang.String )
as ( scala.Symbol )
cache ( )
cache ( )
coalesce ( int )
col ( java.lang.String )
collect ( )
collect ( )
collectAsList ( )
columns ( )
count ( )
cube ( java.lang.String, java.lang.String... )
cube ( java.lang.String, scala.collection.Seq<java.lang.String> )
cube ( Column... )
cube ( scala.collection.Seq<Column> )
DataFrame ( SQLContext, catalyst.plans.logical.LogicalPlan ) (constructor)
DataFrame ( SQLContext, SQLContext.QueryExecution ) (constructor)
describe ( java.lang.String... )
describe ( scala.collection.Seq<java.lang.String> )
distinct ( )
drop ( java.lang.String )
dropDuplicates ( )
dropDuplicates ( java.lang.String[ ] )
dropDuplicates ( scala.collection.Seq<java.lang.String> )
dtypes ( )
except ( DataFrame )
explain ( )
explain ( boolean )
explode ( java.lang.String, java.lang.String, scala.Function1<A,scala.collection.TraversableOnce<B>>, scala.reflect.api.TypeTags.TypeTag<B> )
explode ( scala.collection.Seq<Column>, scala.Function1<Row,scala.collection.TraversableOnce<A>>, scala.reflect.api.TypeTags.TypeTag<A> )
filter ( java.lang.String )
filter ( Column )
first ( )
first ( )
flatMap ( scala.Function1<Row,scala.collection.TraversableOnce<R>>, scala.reflect.ClassTag<R> )
foreach ( scala.Function1<Row,scala.runtime.BoxedUnit> )
foreachPartition ( scala.Function1<scala.collection.Iterator<Row>,scala.runtime.BoxedUnit> )
groupBy ( java.lang.String, java.lang.String... )
groupBy ( java.lang.String, scala.collection.Seq<java.lang.String> )
groupBy ( Column... )
groupBy ( scala.collection.Seq<Column> )
head ( )
head ( int )
intersect ( DataFrame )
isLocal ( )
javaRDD ( )
javaToPython ( )
join ( DataFrame )
join ( DataFrame, java.lang.String )
join ( DataFrame, Column )
join ( DataFrame, Column, java.lang.String )
limit ( int )
logicalPlan ( )
map ( scala.Function1<Row,R>, scala.reflect.ClassTag<R> )
mapPartitions ( scala.Function1<scala.collection.Iterator<Row>,scala.collection.Iterator<R>>, scala.reflect.ClassTag<R> )
na ( )
numericColumns ( )
orderBy ( java.lang.String, java.lang.String... )
orderBy ( java.lang.String, scala.collection.Seq<java.lang.String> )
orderBy ( Column... )
orderBy ( scala.collection.Seq<Column> )
DataFrame..logicalPlanToDataFrame ( catalyst.plans.logical.LogicalPlan )
persist ( )
persist ( )
persist ( org.apache.spark.storage.StorageLevel )
persist ( org.apache.spark.storage.StorageLevel )
printSchema ( )
queryExecution ( )
randomSplit ( double[ ] )
randomSplit ( double[ ], long )
randomSplit ( scala.collection.immutable.List<java.lang.Object>, long )
rdd ( )
registerTempTable ( java.lang.String )
repartition ( int )
resolve ( java.lang.String )
rollup ( java.lang.String, java.lang.String... )
rollup ( java.lang.String, scala.collection.Seq<java.lang.String> )
rollup ( Column... )
rollup ( scala.collection.Seq<Column> )
sample ( boolean, double )
sample ( boolean, double, long )
schema ( )
select ( java.lang.String, java.lang.String... )
select ( java.lang.String, scala.collection.Seq<java.lang.String> )
select ( Column... )
select ( scala.collection.Seq<Column> )
selectExpr ( java.lang.String... )
selectExpr ( scala.collection.Seq<java.lang.String> )
show ( )
show ( int )
showString ( int )
sort ( java.lang.String, java.lang.String... )
sort ( java.lang.String, scala.collection.Seq<java.lang.String> )
sort ( Column... )
sort ( scala.collection.Seq<Column> )
sqlContext ( )
stat ( )
take ( int )
take ( int )
toDF ( )
toDF ( java.lang.String... )
toDF ( scala.collection.Seq<java.lang.String> )
toJavaRDD ( )
toJSON ( )
toString ( )
unionAll ( DataFrame )
unpersist ( )
unpersist ( )
unpersist ( boolean )
unpersist ( boolean )
where ( Column )
withColumn ( java.lang.String, Column )
withColumnRenamed ( java.lang.String, java.lang.String )
write ( )
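A sketch of ordinary 1.4.0-style client code (the file name "people.json" and its columns are assumptions). Because the compiled class references org.apache.spark.sql.DataFrame, merely loading it on 1.0.0 fails, before any of these calls run:

```scala
// A minimal sketch, assuming an existing SQLContext `sqlContext` and a JSON
// input with `name` and `age` fields (both assumptions).
import org.apache.spark.sql.DataFrame

val people: DataFrame = sqlContext.read.json("people.json")
people.filter(people("age") > 21)
  .select("name")
  .show()
// On 1.0.0 the JVM cannot resolve org.apache.spark.sql.DataFrame while
// loading this class, so it fails up front with NoClassDefFoundError.
```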
DataFrameReader (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (17), all from the removed 'DataFrameReader' class:
DataFrameReader ( SQLContext ) (constructor)
format ( java.lang.String )
jdbc ( java.lang.String, java.lang.String, java.lang.String, long, long, int, java.util.Properties )
jdbc ( java.lang.String, java.lang.String, java.lang.String[ ], java.util.Properties )
jdbc ( java.lang.String, java.lang.String, java.util.Properties )
json ( java.lang.String )
json ( org.apache.spark.api.java.JavaRDD<java.lang.String> )
json ( org.apache.spark.rdd.RDD<java.lang.String> )
load ( )
load ( java.lang.String )
option ( java.lang.String, java.lang.String )
options ( java.util.Map<java.lang.String,java.lang.String> )
options ( scala.collection.Map<java.lang.String,java.lang.String> )
parquet ( java.lang.String... )
parquet ( scala.collection.Seq<java.lang.String> )
schema ( types.StructType )
table ( java.lang.String )
DataFrameWriter (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (16), all from the removed 'DataFrameWriter' class:
DataFrameWriter ( DataFrame ) (constructor)
format ( java.lang.String )
insertInto ( java.lang.String )
jdbc ( java.lang.String, java.lang.String, java.util.Properties )
json ( java.lang.String )
mode ( java.lang.String )
mode ( SaveMode )
option ( java.lang.String, java.lang.String )
options ( java.util.Map<java.lang.String,java.lang.String> )
options ( scala.collection.Map<java.lang.String,java.lang.String> )
parquet ( java.lang.String )
partitionBy ( java.lang.String... )
partitionBy ( scala.collection.Seq<java.lang.String> )
save ( )
save ( java.lang.String )
saveAsTable ( java.lang.String )
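The reader and writer are used as a fluent pair; a sketch (the paths are placeholders) that touches DataFrameReader, DataFrameWriter, and the SaveMode enum covered next, all of which the report lists as absent from 1.0.0:

```scala
// A minimal sketch; "in.json" and "out.parquet" are placeholder paths,
// and an existing SQLContext `sqlContext` is assumed.
import org.apache.spark.sql.SaveMode

val df = sqlContext.read    // DataFrameReader ( SQLContext )
  .format("json")           // format ( java.lang.String )
  .load("in.json")          // load ( java.lang.String )

df.write                    // DataFrameWriter ( DataFrame )
  .mode(SaveMode.Overwrite) // mode ( SaveMode )
  .parquet("out.parquet")   // parquet ( java.lang.String )
```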
SaveMode (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (2), both from the removed 'SaveMode' class:
valueOf ( java.lang.String )
values ( )
SQLContext (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.Logging. | A client program may be interrupted by a NoSuchMethodError exception. |

Affected methods (11), all from the 'SQLContext' class:
analyzer ( )
cacheTable ( java.lang.String )
catalog ( )
executePlan ( catalyst.plans.logical.LogicalPlan )
executeSql ( java.lang.String )
parseSql ( java.lang.String )
planner ( )
prepareForExecution ( )
sparkContext ( )
SQLContext ( org.apache.spark.SparkContext ) (constructor)
uncacheTable ( java.lang.String )
SQLContext.QueryExecution (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class became abstract. | A client program may be interrupted by an InstantiationError exception. |

Affected methods (2):
executePlan ( catalyst.plans.logical.LogicalPlan ) - The return value of this method has type 'SQLContext.QueryExecution'.
executeSql ( java.lang.String ) - The return value of this method has type 'SQLContext.QueryExecution'.
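A hedged sketch of the distinction: obtaining the execution through SQLContext's factory methods still links on 1.0.0, whereas bytecode that instantiates the class directly trips over its having become abstract. The LogicalPlan `plan` is assumed to come from elsewhere.

```scala
// A minimal sketch, assuming an existing SQLContext `sqlContext` and a
// catalyst LogicalPlan `plan` obtained elsewhere.
val qe = sqlContext.executePlan(plan)  // factory path: links on both versions
println(qe)

// By contrast, 1.4.0-compiled bytecode that performs a direct
//   new ...QueryExecution(...)
// fails on 1.0.0 with InstantiationError, because the class is abstract there.
```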
package org.apache.spark.sql.sources
BaseRelation (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (5), all from the removed 'BaseRelation' abstract class:
BaseRelation ( ) (constructor)
needConversion ( )
schema ( ) (abstract)
sizeInBytes ( )
sqlContext ( ) (abstract)
CreatableRelationProvider (1)

| # | Change | Effect |
|---|---|---|
| 1 | This interface has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (1):
createRelation ( org.apache.spark.sql.SQLContext, org.apache.spark.sql.SaveMode, scala.collection.immutable.Map<java.lang.String,java.lang.String>, org.apache.spark.sql.DataFrame ) - This abstract method is from the removed 'CreatableRelationProvider' interface.
EqualTo (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (2), both from the removed 'EqualTo' class:
attribute ( )
value ( )
Filter (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (1):
Filter ( ) - This constructor is from the removed 'Filter' abstract class.
GreaterThan (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (2), both from the removed 'GreaterThan' class:
attribute ( )
value ( )
GreaterThanOrEqual (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (2), both from the removed 'GreaterThanOrEqual' class:
attribute ( )
value ( )
IsNotNull (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (1):
attribute ( ) - This method is from the removed 'IsNotNull' class.
IsNull (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (1):
attribute ( ) - This method is from the removed 'IsNull' class.
LessThan (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (2), both from the removed 'LessThan' class:
attribute ( )
value ( )
LessThanOrEqual (1)

| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (2), both from the removed 'LessThanOrEqual' class:
attribute ( )
value ( )
PrunedFilteredScan (1)

| # | Change | Effect |
|---|---|---|
| 1 | This interface has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (1):
buildScan ( java.lang.String[ ], Filter[ ] ) - This abstract method is from the removed 'PrunedFilteredScan' interface.
RelationProvider (1)

| # | Change | Effect |
|---|---|---|
| 1 | This interface has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (1):
createRelation ( org.apache.spark.sql.SQLContext, scala.collection.immutable.Map<java.lang.String,java.lang.String> ) - This abstract method is from the removed 'RelationProvider' interface.
SchemaRelationProvider (1)

| # | Change | Effect |
|---|---|---|
| 1 | This interface has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |

Affected methods (1):
createRelation ( org.apache.spark.sql.SQLContext, scala.collection.immutable.Map<java.lang.String,java.lang.String>, org.apache.spark.sql.types.StructType ) - This abstract method is from the removed 'SchemaRelationProvider' interface.
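These interfaces are precisely the extension points a custom data source implements, so the removals hit third-party sources hardest. A sketch of a minimal 1.4.0-style provider (all names here are illustrative); on 1.0.0, which predates the external data sources API, the class cannot even be loaded:

```scala
// A minimal sketch of a 1.4.0-style data source; names are illustrative.
// Loading this class on 1.0.0 raises NoClassDefFoundError for the missing
// sources-API types.
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.sources.{BaseRelation, RelationProvider}
import org.apache.spark.sql.types.{StringType, StructField, StructType}

class DefaultSource extends RelationProvider {
  override def createRelation(ctx: SQLContext,
                              parameters: Map[String, String]): BaseRelation =
    new BaseRelation {
      override val sqlContext: SQLContext = ctx
      override val schema: StructType =
        StructType(StructField("value", StringType, nullable = true) :: Nil)
    }
}
```

Such a provider would be resolved by name through sqlContext.read.format(...); the sources machinery that performs that lookup does not exist in 1.0.0 at all.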
Problems with Data Types, Medium Severity (4)
spark-core_2.10-1.4.0.jar
package org.apache.spark.api.java
JavaDoubleRDD (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaRDDLike<java.lang.Double,JavaDoubleRDD>. | Client access to fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |

Affected methods (3):
parallelizeDoubles ( java.util.List<java.lang.Double> ) - The return value of this method has type 'JavaDoubleRDD'.
parallelizeDoubles ( java.util.List<java.lang.Double>, int ) - The return value of this method has type 'JavaDoubleRDD'.
union ( JavaDoubleRDD, java.util.List<JavaDoubleRDD> ) - The 1st parameter 'first' of this method has type 'JavaDoubleRDD'.
JavaPairRDD<K,V> (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaRDDLike<scala.Tuple2<K,V>,JavaPairRDD<K,V>>. | Client access to fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |

Affected methods (11). The return value of each method below has type 'JavaPairRDD<K,V>', except union, whose 1st parameter 'first' has that type:
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> )
hadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int )
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> )
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, int )
newAPIHadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, org.apache.hadoop.conf.Configuration )
newAPIHadoopRDD ( org.apache.hadoop.conf.Configuration, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> )
parallelizePairs ( java.util.List<scala.Tuple2<K,V>> )
parallelizePairs ( java.util.List<scala.Tuple2<K,V>>, int )
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V> )
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V>, int )
union ( JavaPairRDD<K,V>, java.util.List<JavaPairRDD<K,V>> )
JavaRDD<T> (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class AbstractJavaRDDLike<T,JavaRDD<T>>. | Client access to fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |

Affected methods (33). Unless noted, the return value of the method has type 'JavaRDD<T>':
cache ( )
classTag ( ) - a method of the 'JavaRDD<T>' class
coalesce ( int )
coalesce ( int, boolean )
distinct ( )
distinct ( int )
filter ( function.Function<T,java.lang.Boolean> )
fromRDD ( org.apache.spark.rdd.RDD<T>, scala.reflect.ClassTag<T> )
intersection ( JavaRDD<T> )
JavaRDD ( org.apache.spark.rdd.RDD<T>, scala.reflect.ClassTag<T> ) - the constructor of the 'JavaRDD<T>' class
persist ( org.apache.spark.storage.StorageLevel )
rdd ( ) - a method of the 'JavaRDD<T>' class
repartition ( int )
sample ( boolean, double )
sample ( boolean, double, long )
setName ( java.lang.String )
subtract ( JavaRDD<T> )
subtract ( JavaRDD<T>, int )
subtract ( JavaRDD<T>, org.apache.spark.Partitioner )
toRDD ( JavaRDD<T> ) - the 1st parameter 'p1' has type 'JavaRDD<T>'
toString ( ) - a method of the 'JavaRDD<T>' class
union ( JavaRDD<T> )
unpersist ( )
unpersist ( boolean )
wrapRDD ( org.apache.spark.rdd.RDD ) - a method of the 'JavaRDD<T>' class
wrapRDD ( org.apache.spark.rdd.RDD<T> )
checkpointFile ( java.lang.String )
objectFile ( java.lang.String )
objectFile ( java.lang.String, int )
parallelize ( java.util.List<T> )
parallelize ( java.util.List<T>, int )
union ( JavaRDD<T>, java.util.List<JavaRDD<T>> ) - the 1st parameter 'first' has type 'JavaRDD<T>'
toJavaRDD ( )
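Why a missing super-class matters even though JavaRDD itself survives: most of the RDD-like surface (map, count, collect, and friends) is inherited, so call sites compiled against the 1.4.0 hierarchy may resolve those members against AbstractJavaRDDLike and then fail to link on 1.0.0. A sketch, assuming an existing JavaSparkContext `jsc`:

```scala
// A minimal sketch, assuming an existing JavaSparkContext `jsc`.
import java.lang.{Integer => JInt}
import org.apache.spark.api.java.function.Function

val rdd = jsc.parallelize(java.util.Arrays.asList[JInt](1, 2, 3))
val doubled = rdd.map[JInt](new Function[JInt, JInt] {
  override def call(x: JInt): JInt = x * 2
})
println(doubled.count())
// map and count are inherited RDD-like members; with the 1.4.0 super-class
// AbstractJavaRDDLike absent from 1.0.0, call sites like these can raise
// NoSuchMethodError (or NoSuchFieldError for inherited fields).
```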
package org.apache.spark.scheduler
LiveListenerBus (1)

| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class org.apache.spark.util.AsynchronousListenerBus<SparkListener,SparkListenerEvent>. | Client access to fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. |

Affected methods (1):
listenerBus ( ) - The return value of this method has type 'LiveListenerBus'.
Problems with Data Types, Low Severity (3)
spark-core_2.10-1.4.0.jar
package org.apache.spark
TaskContext (3)

| # | Change | Effect |
|---|---|---|
| 1 | Abstract method partitionId ( ) became non-abstract. | Some methods in this class may change behavior. |
| 2 | Abstract method stageId ( ) became non-abstract. | Some methods in this class may change behavior. |
| 3 | Abstract method taskMetrics ( ) became non-abstract. | Some methods in this class may change behavior. |

Affected methods (3), all abstract methods of the 'TaskContext' abstract class:
partitionId ( )
stageId ( )
taskMetrics ( )
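These are the mildest findings in the report: the three methods keep linking in both versions, and the risk is behavioral, because a subclass that relied on them being abstract may now inherit concrete defaults. A sketch of the call sites in question (TaskContext.get() is a 1.4.0-era accessor, used here only to reach the three methods; `sc` is assumed):

```scala
// A minimal sketch, assuming an existing SparkContext `sc`. The three methods
// below link against both versions; only their abstract/concrete status
// changed, which is why the report grades this as low severity.
import org.apache.spark.TaskContext

sc.parallelize(1 to 4, 2).foreachPartition { _ =>
  val ctx = TaskContext.get()   // 1.4.0-era static accessor (an assumption here)
  println("stage=" + ctx.stageId() +
    " partition=" + ctx.partitionId() +
    " metrics=" + ctx.taskMetrics())
}
```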
Problems with Methods, Low Severity (3)
spark-core_2.10-1.4.0.jar, TaskContext
package org.apache.spark
TaskContext.partitionId ( ) [abstract] : int (1)
[mangled: org/apache/spark/TaskContext.partitionId:()I]

| # | Change | Effect |
|---|---|---|
| 1 | Method became non-abstract. | A client program may change behavior. |

TaskContext.stageId ( ) [abstract] : int (1)
[mangled: org/apache/spark/TaskContext.stageId:()I]

| # | Change | Effect |
|---|---|---|
| 1 | Method became non-abstract. | A client program may change behavior. |

TaskContext.taskMetrics ( ) [abstract] : executor.TaskMetrics (1)
[mangled: org/apache/spark/TaskContext.taskMetrics:()Lorg/apache/spark/executor/TaskMetrics;]

| # | Change | Effect |
|---|---|---|
| 1 | Method became non-abstract. | A client program may change behavior. |
Java ARchives (3)
spark-core_2.10-1.4.0.jar
spark-sql_2.10-1.4.0.jar
spark-streaming_2.10-1.4.0.jar
Generated on Tue Oct 20 09:53:46 2015 for spark-connector_2.10-1.0.0 by Java API Compliance Checker 1.4.1
A tool for checking backward compatibility of a Java library API