Binary compatibility report for the sparkling-water-core_2.10-1.3.3 library between versions 1.3.0 and 1.0.0 (relating to the portability of the client application sparkling-water-core_2.10-1.3.3.jar)
Test Info

| Field | Value |
|---|---|
| Library Name | sparkling-water-core_2.10-1.3.3 |
| Version #1 | 1.3.0 |
| Version #2 | 1.0.0 |
| Java Version | 1.7.0_75 |
Test Results

| Metric | Value |
|---|---|
| Total Java ARchives | 3 |
| Total Methods / Classes | 579 / 2606 |
| Verdict | Incompatible (75.3%) |
Problem Summary

| Category | Severity | Count |
|---|---|---|
| Added Methods | - | 48 |
| Removed Methods | High | 260 |
| Problems with Data Types | High | 19 |
| Problems with Data Types | Medium | 1 |
| Problems with Data Types | Low | 3 |
| Problems with Methods | High | 0 |
| Problems with Methods | Medium | 0 |
| Problems with Methods | Low | 3 |
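To make the verdict concrete: a client built against the 1.3.0 jars can link cleanly at build time and still fail at run time on a 1.0.0 classpath. A minimal sketch (the CompatDemo object and local master are illustrative, not taken from the report):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object CompatDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("demo").setMaster("local"))
    // applicationId() exists in 1.3.0 but not in 1.0.0 (see Removed Methods),
    // so this compiles and links at build time yet throws
    // java.lang.NoSuchMethodError when executed against the 1.0.0 jar.
    println(sc.applicationId)
    sc.stop()
  }
}
```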
Added Methods (48)
spark-core_2.10-1.0.0.jar, SparkConf.class
package org.apache.spark
SparkConf.SparkConf..settings ( ) : scala.collection.mutable.HashMap<String,String>
[mangled: org/apache/spark/SparkConf.org.apache.spark.SparkConf..settings:()Lscala/collection/mutable/HashMap;]
spark-core_2.10-1.0.0.jar, SparkContext.class
package org.apache.spark
SparkContext.clean ( F f ) : F
[mangled: org/apache/spark/SparkContext.clean:(Ljava/lang/Object;)Ljava/lang/Object;]
SparkContext.getCallSite ( ) : String
[mangled: org/apache/spark/SparkContext.getCallSite:()Ljava/lang/String;]
SparkContext.ui ( ) : ui.SparkUI
[mangled: org/apache/spark/SparkContext.ui:()Lorg/apache/spark/ui/SparkUI;]
spark-core_2.10-1.0.0.jar, SparkEnv.class
package org.apache.spark
SparkEnv.connectionManager ( ) : network.ConnectionManager
[mangled: org/apache/spark/SparkEnv.connectionManager:()Lorg/apache/spark/network/ConnectionManager;]
SparkEnv.destroyPythonWorker ( String pythonExec, scala.collection.immutable.Map<String,String> envVars ) : void
[mangled: org/apache/spark/SparkEnv.destroyPythonWorker:(Ljava/lang/String;Lscala/collection/immutable/Map;)V]
SparkEnv.shuffleFetcher ( ) : ShuffleFetcher
[mangled: org/apache/spark/SparkEnv.shuffleFetcher:()Lorg/apache/spark/ShuffleFetcher;]
SparkEnv.shuffleMemoryMap ( ) : scala.collection.mutable.HashMap<Object,Object>
[mangled: org/apache/spark/SparkEnv.shuffleMemoryMap:()Lscala/collection/mutable/HashMap;]
SparkEnv.SparkEnv ( String executorId, akka.actor.ActorSystem actorSystem, serializer.Serializer serializer, serializer.Serializer closureSerializer, CacheManager cacheManager, MapOutputTracker mapOutputTracker, ShuffleFetcher shuffleFetcher, broadcast.BroadcastManager broadcastManager, storage.BlockManager blockManager, network.ConnectionManager connectionManager, SecurityManager securityManager, HttpFileServer httpFileServer, String sparkFilesDir, metrics.MetricsSystem metricsSystem, SparkConf conf )
[mangled: org/apache/spark/SparkEnv."<init>":(Ljava/lang/String;Lakka/actor/ActorSystem;Lorg/apache/spark/serializer/Serializer;Lorg/apache/spark/serializer/Serializer;Lorg/apache/spark/CacheManager;Lorg/apache/spark/MapOutputTracker;Lorg/apache/spark/ShuffleFetcher;Lorg/apache/spark/broadcast/BroadcastManager;Lorg/apache/spark/storage/BlockManager;Lorg/apache/spark/network/ConnectionManager;Lorg/apache/spark/SecurityManager;Lorg/apache/spark/HttpFileServer;Ljava/lang/String;Lorg/apache/spark/metrics/MetricsSystem;Lorg/apache/spark/SparkConf;)V]
spark-core_2.10-1.0.0.jar, SparkListenerBlockManagerAdded.class
package org.apache.spark.scheduler
SparkListenerBlockManagerAdded.copy ( org.apache.spark.storage.BlockManagerId blockManagerId, long maxMem ) : SparkListenerBlockManagerAdded
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded.copy:(Lorg/apache/spark/storage/BlockManagerId;J)Lorg/apache/spark/scheduler/SparkListenerBlockManagerAdded;]
SparkListenerBlockManagerAdded.SparkListenerBlockManagerAdded ( org.apache.spark.storage.BlockManagerId blockManagerId, long maxMem )
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded."<init>":(Lorg/apache/spark/storage/BlockManagerId;J)V]
spark-core_2.10-1.0.0.jar, SparkListenerBlockManagerRemoved.class
package org.apache.spark.scheduler
SparkListenerBlockManagerRemoved.andThen ( scala.Function1<SparkListenerBlockManagerRemoved,A> p1 ) [static] : scala.Function1<org.apache.spark.storage.BlockManagerId,A>
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.andThen:(Lscala/Function1;)Lscala/Function1;]
SparkListenerBlockManagerRemoved.compose ( scala.Function1<A,org.apache.spark.storage.BlockManagerId> p1 ) [static] : scala.Function1<A,SparkListenerBlockManagerRemoved>
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.compose:(Lscala/Function1;)Lscala/Function1;]
SparkListenerBlockManagerRemoved.copy ( org.apache.spark.storage.BlockManagerId blockManagerId ) : SparkListenerBlockManagerRemoved
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.copy:(Lorg/apache/spark/storage/BlockManagerId;)Lorg/apache/spark/scheduler/SparkListenerBlockManagerRemoved;]
SparkListenerBlockManagerRemoved.SparkListenerBlockManagerRemoved ( org.apache.spark.storage.BlockManagerId blockManagerId )
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved."<init>":(Lorg/apache/spark/storage/BlockManagerId;)V]
spark-core_2.10-1.0.0.jar, TaskContext.class
package org.apache.spark
TaskContext.completed ( ) : boolean
[mangled: org/apache/spark/TaskContext.completed:()Z]
TaskContext.completed_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/TaskContext.completed_.eq:(Z)V]
TaskContext.executeOnCompleteCallbacks ( ) : void
[mangled: org/apache/spark/TaskContext.executeOnCompleteCallbacks:()V]
TaskContext.interrupted ( ) : boolean
[mangled: org/apache/spark/TaskContext.interrupted:()Z]
TaskContext.interrupted_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/TaskContext.interrupted_.eq:(Z)V]
TaskContext.TaskContext ( int stageId, int partitionId, long attemptId, boolean runningLocally, executor.TaskMetrics taskMetrics )
[mangled: org/apache/spark/TaskContext."<init>":(IIJZLorg/apache/spark/executor/TaskMetrics;)V]
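The TaskContext surface above is the 1.0.0 shape: a concrete class with completed()/interrupted() flag accessors. A minimal sketch of code that resolves only against 1.0.0 (the TaskState helper is hypothetical):

```scala
import org.apache.spark.TaskContext

object TaskState {
  // completed() and interrupted() exist only in spark-core 1.0.0; against
  // 1.3.0 both calls fail with NoSuchMethodError because the accessors were
  // renamed to isCompleted()/isInterrupted() (see Removed Methods).
  def describe(ctx: TaskContext): String =
    s"completed=${ctx.completed}, interrupted=${ctx.interrupted}"
}
```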
spark-sql_2.10-1.0.0.jar, SQLContext.class
package org.apache.spark.sql
SQLContext.binaryToLiteral ( byte[ ] a ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.binaryToLiteral:([B)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.booleanToLiteral ( boolean b ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.booleanToLiteral:(Z)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.byteToLiteral ( byte b ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.byteToLiteral:(B)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.createParquetFile ( String path, boolean allowExisting, org.apache.hadoop.conf.Configuration conf, scala.reflect.api.TypeTags.TypeTag<A> p4 ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.createParquetFile:(Ljava/lang/String;ZLorg/apache/hadoop/conf/Configuration;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.createSchemaRDD ( org.apache.spark.rdd.RDD<A> rdd, scala.reflect.api.TypeTags.TypeTag<A> p2 ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.createSchemaRDD:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.decimalToLiteral ( scala.math.BigDecimal d ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.decimalToLiteral:(Lscala/math/BigDecimal;)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.doubleToLiteral ( double d ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.doubleToLiteral:(D)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.DslAttribute ( catalyst.expressions.AttributeReference a ) : catalyst.dsl.package.ExpressionConversions.DslAttribute
[mangled: org/apache/spark/sql/SQLContext.DslAttribute:(Lorg/apache/spark/sql/catalyst/expressions/AttributeReference;)Lorg/apache/spark/sql/catalyst/dsl/package$ExpressionConversions$DslAttribute;]
SQLContext.DslExpression ( catalyst.expressions.Expression e ) : catalyst.dsl.package.ExpressionConversions.DslExpression
[mangled: org/apache/spark/sql/SQLContext.DslExpression:(Lorg/apache/spark/sql/catalyst/expressions/Expression;)Lorg/apache/spark/sql/catalyst/dsl/package$ExpressionConversions$DslExpression;]
SQLContext.DslString ( String s ) : catalyst.dsl.package.ExpressionConversions.DslString
[mangled: org/apache/spark/sql/SQLContext.DslString:(Ljava/lang/String;)Lorg/apache/spark/sql/catalyst/dsl/package$ExpressionConversions$DslString;]
SQLContext.DslSymbol ( scala.Symbol sym ) : catalyst.dsl.package.ExpressionConversions.DslSymbol
[mangled: org/apache/spark/sql/SQLContext.DslSymbol:(Lscala/Symbol;)Lorg/apache/spark/sql/catalyst/dsl/package$ExpressionConversions$DslSymbol;]
SQLContext.floatToLiteral ( float f ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.floatToLiteral:(F)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.inferSchema ( org.apache.spark.rdd.RDD<scala.collection.immutable.Map<String,Object>> rdd ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.inferSchema:(Lorg/apache/spark/rdd/RDD;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.intToLiteral ( int i ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.intToLiteral:(I)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.logger ( ) : com.typesafe.scalalogging.slf4j.Logger
[mangled: org/apache/spark/sql/SQLContext.logger:()Lcom/typesafe/scalalogging/slf4j/Logger;]
SQLContext.logicalPlanToSparkQuery ( catalyst.plans.logical.LogicalPlan plan ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.logicalPlanToSparkQuery:(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.longToLiteral ( long l ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.longToLiteral:(J)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.optimizer ( ) : catalyst.optimizer.Optimizer.
[mangled: org/apache/spark/sql/SQLContext.optimizer:()Lorg/apache/spark/sql/catalyst/optimizer/Optimizer$;]
SQLContext.parquetFile ( String path ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.parquetFile:(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.parser ( ) : catalyst.SqlParser
[mangled: org/apache/spark/sql/SQLContext.parser:()Lorg/apache/spark/sql/catalyst/SqlParser;]
SQLContext.registerRDDAsTable ( SchemaRDD rdd, String tableName ) : void
[mangled: org/apache/spark/sql/SQLContext.registerRDDAsTable:(Lorg/apache/spark/sql/SchemaRDD;Ljava/lang/String;)V]
SQLContext.shortToLiteral ( short s ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.shortToLiteral:(S)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.sql ( String sqlText ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.sql:(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.stringToLiteral ( String s ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.stringToLiteral:(Ljava/lang/String;)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
SQLContext.symbolToUnresolvedAttribute ( scala.Symbol s ) : catalyst.analysis.UnresolvedAttribute
[mangled: org/apache/spark/sql/SQLContext.symbolToUnresolvedAttribute:(Lscala/Symbol;)Lorg/apache/spark/sql/catalyst/analysis/UnresolvedAttribute;]
SQLContext.table ( String tableName ) : SchemaRDD
[mangled: org/apache/spark/sql/SQLContext.table:(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;]
SQLContext.timestampToLiteral ( java.sql.Timestamp t ) : catalyst.expressions.Literal
[mangled: org/apache/spark/sql/SQLContext.timestampToLiteral:(Ljava/sql/Timestamp;)Lorg/apache/spark/sql/catalyst/expressions/Literal;]
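The createSchemaRDD/registerRDDAsTable pair above is the 1.0.0 way to turn an RDD into a queryable table; both disappear in 1.3.0, where DataFrame replaces SchemaRDD (see Removed Methods). A sketch, assuming a hypothetical Person case class:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

case class Person(name: String, age: Int)

object SchemaRddDemo {
  // Both calls resolve only against spark-sql 1.0.0: createSchemaRDD and
  // registerRDDAsTable have no counterparts in 1.3.0.
  def register(sc: SparkContext, sqlContext: SQLContext): Unit = {
    val people = sqlContext.createSchemaRDD(sc.parallelize(Seq(Person("ada", 36))))
    sqlContext.registerRDDAsTable(people, "people")
  }
}
```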
Removed Methods (260)
spark-core_2.10-1.3.0.jar, Logging.class
package org.apache.spark
Logging.logName ( ) [abstract] : String
[mangled: org/apache/spark/Logging.logName:()Ljava/lang/String;]
spark-core_2.10-1.3.0.jar, SparkConf.class
package org.apache.spark
SparkConf.getAppId ( ) : String
[mangled: org/apache/spark/SparkConf.getAppId:()Ljava/lang/String;]
SparkConf.getenv ( String name ) : String
[mangled: org/apache/spark/SparkConf.getenv:(Ljava/lang/String;)Ljava/lang/String;]
SparkConf.isAkkaConf ( String p1 ) [static] : boolean
[mangled: org/apache/spark/SparkConf.isAkkaConf:(Ljava/lang/String;)Z]
SparkConf.isExecutorStartupConf ( String p1 ) [static] : boolean
[mangled: org/apache/spark/SparkConf.isExecutorStartupConf:(Ljava/lang/String;)Z]
SparkConf.isSparkPortConf ( String p1 ) [static] : boolean
[mangled: org/apache/spark/SparkConf.isSparkPortConf:(Ljava/lang/String;)Z]
SparkConf.logName ( ) : String
[mangled: org/apache/spark/SparkConf.logName:()Ljava/lang/String;]
SparkConf.registerKryoClasses ( Class<?>[ ] classes ) : SparkConf
[mangled: org/apache/spark/SparkConf.registerKryoClasses:([Ljava/lang/Class;)Lorg/apache/spark/SparkConf;]
SparkConf.translateConfKey ( String p1, boolean p2 ) [static] : String
[mangled: org/apache/spark/SparkConf.translateConfKey:(Ljava/lang/String;Z)Ljava/lang/String;]
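registerKryoClasses is typical of the 1.3.0-only SparkConf additions: the builder chain below compiles against 1.3.0 but has no matching descriptor in 1.0.0, so it dies with NoSuchMethodError there. A minimal sketch (the KryoConf object is illustrative):

```scala
import org.apache.spark.SparkConf

object KryoConf {
  // registerKryoClasses was added after 1.0.0; running this against a
  // 1.0.0 classpath raises NoSuchMethodError at the call site.
  val conf: SparkConf = new SparkConf()
    .setAppName("kryo-demo")
    .registerKryoClasses(Array(classOf[String], classOf[Array[Int]]))
}
```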
spark-core_2.10-1.3.0.jar, SparkContext.class
package org.apache.spark
SparkContext.accumulable ( R initialValue, String name, AccumulableParam<R,T> param ) : Accumulable<R,T>
[mangled: org/apache/spark/SparkContext.accumulable:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulableParam;)Lorg/apache/spark/Accumulable;]
SparkContext.accumulator ( T initialValue, String name, AccumulatorParam<T> param ) : Accumulator<T>
[mangled: org/apache/spark/SparkContext.accumulator:(Ljava/lang/Object;Ljava/lang/String;Lorg/apache/spark/AccumulatorParam;)Lorg/apache/spark/Accumulator;]
SparkContext.addFile ( String path, boolean recursive ) : void
[mangled: org/apache/spark/SparkContext.addFile:(Ljava/lang/String;Z)V]
SparkContext.applicationId ( ) : String
[mangled: org/apache/spark/SparkContext.applicationId:()Ljava/lang/String;]
SparkContext.binaryFiles ( String path, int minPartitions ) : rdd.RDD<scala.Tuple2<String,input.PortableDataStream>>
[mangled: org/apache/spark/SparkContext.binaryFiles:(Ljava/lang/String;I)Lorg/apache/spark/rdd/RDD;]
SparkContext.binaryRecords ( String path, int recordLength, org.apache.hadoop.conf.Configuration conf ) : rdd.RDD<byte[ ]>
[mangled: org/apache/spark/SparkContext.binaryRecords:(Ljava/lang/String;ILorg/apache/hadoop/conf/Configuration;)Lorg/apache/spark/rdd/RDD;]
SparkContext.clean ( F f, boolean checkSerializable ) : F
[mangled: org/apache/spark/SparkContext.clean:(Ljava/lang/Object;Z)Ljava/lang/Object;]
SparkContext.createSparkEnv ( SparkConf conf, boolean isLocal, scheduler.LiveListenerBus listenerBus ) : SparkEnv
[mangled: org/apache/spark/SparkContext.createSparkEnv:(Lorg/apache/spark/SparkConf;ZLorg/apache/spark/scheduler/LiveListenerBus;)Lorg/apache/spark/SparkEnv;]
SparkContext.eventLogCodec ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.eventLogCodec:()Lscala/Option;]
SparkContext.eventLogDir ( ) : scala.Option<String>
[mangled: org/apache/spark/SparkContext.eventLogDir:()Lscala/Option;]
SparkContext.executorAllocationManager ( ) : scala.Option<ExecutorAllocationManager>
[mangled: org/apache/spark/SparkContext.executorAllocationManager:()Lscala/Option;]
SparkContext.getCallSite ( ) : util.CallSite
[mangled: org/apache/spark/SparkContext.getCallSite:()Lorg/apache/spark/util/CallSite;]
SparkContext.getExecutorThreadDump ( String executorId ) : scala.Option<util.ThreadStackTrace[ ]>
[mangled: org/apache/spark/SparkContext.getExecutorThreadDump:(Ljava/lang/String;)Lscala/Option;]
SparkContext.isEventLogEnabled ( ) : boolean
[mangled: org/apache/spark/SparkContext.isEventLogEnabled:()Z]
SparkContext.jobProgressListener ( ) : ui.jobs.JobProgressListener
[mangled: org/apache/spark/SparkContext.jobProgressListener:()Lorg/apache/spark/ui/jobs/JobProgressListener;]
SparkContext.killExecutor ( String executorId ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutor:(Ljava/lang/String;)Z]
SparkContext.killExecutors ( scala.collection.Seq<String> executorIds ) : boolean
[mangled: org/apache/spark/SparkContext.killExecutors:(Lscala/collection/Seq;)Z]
SparkContext.logName ( ) : String
[mangled: org/apache/spark/SparkContext.logName:()Ljava/lang/String;]
SparkContext.metricsSystem ( ) : metrics.MetricsSystem
[mangled: org/apache/spark/SparkContext.metricsSystem:()Lorg/apache/spark/metrics/MetricsSystem;]
SparkContext.SparkContext..creationSite ( ) : util.CallSite
[mangled: org/apache/spark/SparkContext.org.apache.spark.SparkContext..creationSite:()Lorg/apache/spark/util/CallSite;]
SparkContext.progressBar ( ) : scala.Option<ui.ConsoleProgressBar>
[mangled: org/apache/spark/SparkContext.progressBar:()Lscala/Option;]
SparkContext.requestExecutors ( int numAdditionalExecutors ) : boolean
[mangled: org/apache/spark/SparkContext.requestExecutors:(I)Z]
SparkContext.requestTotalExecutors ( int numExecutors ) : boolean
[mangled: org/apache/spark/SparkContext.requestTotalExecutors:(I)Z]
SparkContext.schedulerBackend ( ) : scheduler.SchedulerBackend
[mangled: org/apache/spark/SparkContext.schedulerBackend:()Lorg/apache/spark/scheduler/SchedulerBackend;]
SparkContext.schedulerBackend_.eq ( scheduler.SchedulerBackend p1 ) : void
[mangled: org/apache/spark/SparkContext.schedulerBackend_.eq:(Lorg/apache/spark/scheduler/SchedulerBackend;)V]
SparkContext.setCallSite ( util.CallSite callSite ) : void
[mangled: org/apache/spark/SparkContext.setCallSite:(Lorg/apache/spark/util/CallSite;)V]
SparkContext.statusTracker ( ) : SparkStatusTracker
[mangled: org/apache/spark/SparkContext.statusTracker:()Lorg/apache/spark/SparkStatusTracker;]
SparkContext.ui ( ) : scala.Option<ui.SparkUI>
[mangled: org/apache/spark/SparkContext.ui:()Lscala/Option;]
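Not all of these are pure removals: getCallSite() and ui() keep their names but change return types (String and ui.SparkUI in 1.0.0, util.CallSite and scala.Option<ui.SparkUI> here). JVM linkage keys on the full descriptor, so a same-named 1.0.0 method does not satisfy a 1.3.0-compiled call. A sketch using the public statusTracker() accessor (the StatusDemo object is hypothetical):

```scala
import org.apache.spark.SparkContext

object StatusDemo {
  // statusTracker() was introduced after 1.0.0; compiled against 1.3.0 this
  // resolves, but a 1.0.0 classpath raises NoSuchMethodError at the call.
  def activeJobs(sc: SparkContext): Array[Int] =
    sc.statusTracker.getActiveJobIds()
}
```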
spark-core_2.10-1.3.0.jar, SparkEnv.class
package org.apache.spark
SparkEnv.blockTransferService ( ) : network.BlockTransferService
[mangled: org/apache/spark/SparkEnv.blockTransferService:()Lorg/apache/spark/network/BlockTransferService;]
SparkEnv.destroyPythonWorker ( String pythonExec, scala.collection.immutable.Map<String,String> envVars, java.net.Socket worker ) : void
[mangled: org/apache/spark/SparkEnv.destroyPythonWorker:(Ljava/lang/String;Lscala/collection/immutable/Map;Ljava/net/Socket;)V]
SparkEnv.isStopped ( ) : boolean
[mangled: org/apache/spark/SparkEnv.isStopped:()Z]
SparkEnv.isStopped_.eq ( boolean p1 ) : void
[mangled: org/apache/spark/SparkEnv.isStopped_.eq:(Z)V]
SparkEnv.logName ( ) : String
[mangled: org/apache/spark/SparkEnv.logName:()Ljava/lang/String;]
SparkEnv.outputCommitCoordinator ( ) : scheduler.OutputCommitCoordinator
[mangled: org/apache/spark/SparkEnv.outputCommitCoordinator:()Lorg/apache/spark/scheduler/OutputCommitCoordinator;]
SparkEnv.releasePythonWorker ( String pythonExec, scala.collection.immutable.Map<String,String> envVars, java.net.Socket worker ) : void
[mangled: org/apache/spark/SparkEnv.releasePythonWorker:(Ljava/lang/String;Lscala/collection/immutable/Map;Ljava/net/Socket;)V]
SparkEnv.shuffleManager ( ) : shuffle.ShuffleManager
[mangled: org/apache/spark/SparkEnv.shuffleManager:()Lorg/apache/spark/shuffle/ShuffleManager;]
SparkEnv.shuffleMemoryManager ( ) : shuffle.ShuffleMemoryManager
[mangled: org/apache/spark/SparkEnv.shuffleMemoryManager:()Lorg/apache/spark/shuffle/ShuffleMemoryManager;]
SparkEnv.SparkEnv ( String executorId, akka.actor.ActorSystem actorSystem, serializer.Serializer serializer, serializer.Serializer closureSerializer, CacheManager cacheManager, MapOutputTracker mapOutputTracker, shuffle.ShuffleManager shuffleManager, broadcast.BroadcastManager broadcastManager, network.BlockTransferService blockTransferService, storage.BlockManager blockManager, SecurityManager securityManager, HttpFileServer httpFileServer, String sparkFilesDir, metrics.MetricsSystem metricsSystem, shuffle.ShuffleMemoryManager shuffleMemoryManager, scheduler.OutputCommitCoordinator outputCommitCoordinator, SparkConf conf )
[mangled: org/apache/spark/SparkEnv."<init>":(Ljava/lang/String;Lakka/actor/ActorSystem;Lorg/apache/spark/serializer/Serializer;Lorg/apache/spark/serializer/Serializer;Lorg/apache/spark/CacheManager;Lorg/apache/spark/MapOutputTracker;Lorg/apache/spark/shuffle/ShuffleManager;Lorg/apache/spark/broadcast/BroadcastManager;Lorg/apache/spark/network/BlockTransferService;Lorg/apache/spark/storage/BlockManager;Lorg/apache/spark/SecurityManager;Lorg/apache/spark/HttpFileServer;Ljava/lang/String;Lorg/apache/spark/metrics/MetricsSystem;Lorg/apache/spark/shuffle/ShuffleMemoryManager;Lorg/apache/spark/scheduler/OutputCommitCoordinator;Lorg/apache/spark/SparkConf;)V]
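The SparkEnv constructor grew from 15 parameters in 1.0.0 to 17 here, and the shuffle accessors changed shape (shuffleFetcher()/shuffleMemoryMap() versus shuffleManager()/shuffleMemoryManager()), so no constructor or shuffle descriptor matches across the two versions. A sketch against the 1.3.0 accessor (the ShuffleInfo object is illustrative):

```scala
import org.apache.spark.SparkEnv

object ShuffleInfo {
  // shuffleManager() exists only in 1.3.0; the 1.0.0 surface is
  // shuffleFetcher() (see Added Methods), so this fails with
  // NoSuchMethodError when run against 1.0.0.
  def shuffleImpl(env: SparkEnv): String = env.shuffleManager.getClass.getName
}
```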
spark-core_2.10-1.3.0.jar, SparkListenerBlockManagerAdded.class
package org.apache.spark.scheduler
SparkListenerBlockManagerAdded.copy ( long time, org.apache.spark.storage.BlockManagerId blockManagerId, long maxMem ) : SparkListenerBlockManagerAdded
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded.copy:(JLorg/apache/spark/storage/BlockManagerId;J)Lorg/apache/spark/scheduler/SparkListenerBlockManagerAdded;]
SparkListenerBlockManagerAdded.SparkListenerBlockManagerAdded ( long time, org.apache.spark.storage.BlockManagerId blockManagerId, long maxMem )
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded."<init>":(JLorg/apache/spark/storage/BlockManagerId;J)V]
SparkListenerBlockManagerAdded.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerAdded.time:()J]
spark-core_2.10-1.3.0.jar, SparkListenerBlockManagerRemoved.class
package org.apache.spark.scheduler
SparkListenerBlockManagerRemoved.copy ( long time, org.apache.spark.storage.BlockManagerId blockManagerId ) : SparkListenerBlockManagerRemoved
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.copy:(JLorg/apache/spark/storage/BlockManagerId;)Lorg/apache/spark/scheduler/SparkListenerBlockManagerRemoved;]
SparkListenerBlockManagerRemoved.curried ( ) [static] : scala.Function1<Object,scala.Function1<org.apache.spark.storage.BlockManagerId,SparkListenerBlockManagerRemoved>>
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.curried:()Lscala/Function1;]
SparkListenerBlockManagerRemoved.SparkListenerBlockManagerRemoved ( long time, org.apache.spark.storage.BlockManagerId blockManagerId )
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved."<init>":(JLorg/apache/spark/storage/BlockManagerId;)V]
SparkListenerBlockManagerRemoved.time ( ) : long
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.time:()J]
SparkListenerBlockManagerRemoved.tupled ( ) [static] : scala.Function1<scala.Tuple2<Object,org.apache.spark.storage.BlockManagerId>,SparkListenerBlockManagerRemoved>
[mangled: org/apache/spark/scheduler/SparkListenerBlockManagerRemoved.tupled:()Lscala/Function1;]
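Both SparkListenerBlockManager* events gained a leading time field in 1.3.0, which changes the constructor, copy, curried and tupled descriptors in one stroke. A sketch touching the new field (the RemovalTime object is hypothetical):

```scala
import org.apache.spark.scheduler.SparkListenerBlockManagerRemoved

object RemovalTime {
  // time() exists only on the 1.3.0 case class; the 1.0.0 shape carries just
  // the BlockManagerId, so this accessor raises NoSuchMethodError there.
  def removedAt(ev: SparkListenerBlockManagerRemoved): Long = ev.time
}
```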
spark-core_2.10-1.3.0.jar, TaskContext.class
package org.apache.spark
TaskContext.addTaskCompletionListener ( util.TaskCompletionListener p1 ) [abstract] : TaskContext
[mangled: org/apache/spark/TaskContext.addTaskCompletionListener:(Lorg/apache/spark/util/TaskCompletionListener;)Lorg/apache/spark/TaskContext;]
TaskContext.addTaskCompletionListener ( scala.Function1<TaskContext,scala.runtime.BoxedUnit> p1 ) [abstract] : TaskContext
[mangled: org/apache/spark/TaskContext.addTaskCompletionListener:(Lscala/Function1;)Lorg/apache/spark/TaskContext;]
TaskContext.attemptNumber ( ) [abstract] : int
[mangled: org/apache/spark/TaskContext.attemptNumber:()I]
TaskContext.get ( ) [static] : TaskContext
[mangled: org/apache/spark/TaskContext.get:()Lorg/apache/spark/TaskContext;]
TaskContext.isCompleted ( ) [abstract] : boolean
[mangled: org/apache/spark/TaskContext.isCompleted:()Z]
TaskContext.isInterrupted ( ) [abstract] : boolean
[mangled: org/apache/spark/TaskContext.isInterrupted:()Z]
TaskContext.isRunningLocally ( ) [abstract] : boolean
[mangled: org/apache/spark/TaskContext.isRunningLocally:()Z]
TaskContext.taskAttemptId ( ) [abstract] : long
[mangled: org/apache/spark/TaskContext.taskAttemptId:()J]
TaskContext.TaskContext ( )
[mangled: org/apache/spark/TaskContext."<init>":()V]
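The 1.3.0 replacement for executeOnCompleteCallbacks() is callback registration via the static TaskContext.get() and addTaskCompletionListener, none of which exists in 1.0.0. A sketch of 1.3.0-style code (the TaskHooks object is illustrative):

```scala
import org.apache.spark.TaskContext

object TaskHooks {
  // Every call below is 1.3.0-only: get(), addTaskCompletionListener,
  // taskAttemptId and attemptNumber all raise NoSuchMethodError on 1.0.0.
  def register(): Unit = {
    val ctx = TaskContext.get()
    ctx.addTaskCompletionListener { (c: TaskContext) =>
      println(s"task ${c.taskAttemptId} attempt ${c.attemptNumber} finished")
    }
  }
}
```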
spark-sql_2.10-1.3.0.jar, DataFrame.class
package org.apache.spark.sql
DataFrame.agg ( java.util.Map<String,String> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( Column expr, Column... exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lorg/apache/spark/sql/Column;[Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( Column expr, scala.collection.Seq<Column> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lorg/apache/spark/sql/Column;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( scala.collection.immutable.Map<String,String> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.agg ( scala.Tuple2<String,String> aggExpr, scala.collection.Seq<scala.Tuple2<String,String>> aggExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.agg:(Lscala/Tuple2;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.apply ( String colName ) : Column
[mangled: org/apache/spark/sql/DataFrame.apply:(Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
DataFrame.as ( scala.Symbol alias ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.as:(Lscala/Symbol;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.as ( String alias ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.as:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.cache ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.cache:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.cache ( ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.cache:()Lorg/apache/spark/sql/RDDApi;]
DataFrame.col ( String colName ) : Column
[mangled: org/apache/spark/sql/DataFrame.col:(Ljava/lang/String;)Lorg/apache/spark/sql/Column;]
DataFrame.collect ( ) : Object
[mangled: org/apache/spark/sql/DataFrame.collect:()Ljava/lang/Object;]
DataFrame.collect ( ) : Row[ ]
[mangled: org/apache/spark/sql/DataFrame.collect:()[Lorg/apache/spark/sql/Row;]
DataFrame.collectAsList ( ) : java.util.List<Row>
[mangled: org/apache/spark/sql/DataFrame.collectAsList:()Ljava/util/List;]
DataFrame.columns ( ) : String[ ]
[mangled: org/apache/spark/sql/DataFrame.columns:()[Ljava/lang/String;]
DataFrame.count ( ) : long
[mangled: org/apache/spark/sql/DataFrame.count:()J]
DataFrame.createJDBCTable ( String url, String table, boolean allowExisting ) : void
[mangled: org/apache/spark/sql/DataFrame.createJDBCTable:(Ljava/lang/String;Ljava/lang/String;Z)V]
DataFrame.DataFrame ( SQLContext sqlContext, catalyst.plans.logical.LogicalPlan logicalPlan )
[mangled: org/apache/spark/sql/DataFrame."<init>":(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)V]
DataFrame.DataFrame ( SQLContext sqlContext, SQLContext.QueryExecution queryExecution )
[mangled: org/apache/spark/sql/DataFrame."<init>":(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/SQLContext$QueryExecution;)V]
DataFrame.distinct ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.distinct:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.dtypes ( ) : scala.Tuple2<String,String>[ ]
[mangled: org/apache/spark/sql/DataFrame.dtypes:()[Lscala/Tuple2;]
DataFrame.except ( DataFrame other ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.except:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.explain ( ) : void
[mangled: org/apache/spark/sql/DataFrame.explain:()V]
DataFrame.explain ( boolean extended ) : void
[mangled: org/apache/spark/sql/DataFrame.explain:(Z)V]
DataFrame.explode ( scala.collection.Seq<Column> input, scala.Function1<Row,scala.collection.TraversableOnce<A>> f, scala.reflect.api.TypeTags.TypeTag<A> p3 ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.explode:(Lscala/collection/Seq;Lscala/Function1;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.explode ( String inputColumn, String outputColumn, scala.Function1<A,scala.collection.TraversableOnce<B>> f, scala.reflect.api.TypeTags.TypeTag<B> p4 ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.explode:(Ljava/lang/String;Ljava/lang/String;Lscala/Function1;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.filter ( Column condition ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.filter:(Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.filter ( String conditionExpr ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.filter:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.first ( ) : Object
[mangled: org/apache/spark/sql/DataFrame.first:()Ljava/lang/Object;]
DataFrame.first ( ) : Row
[mangled: org/apache/spark/sql/DataFrame.first:()Lorg/apache/spark/sql/Row;]
DataFrame.flatMap ( scala.Function1<Row,scala.collection.TraversableOnce<R>> f, scala.reflect.ClassTag<R> p2 ) : org.apache.spark.rdd.RDD<R>
[mangled: org/apache/spark/sql/DataFrame.flatMap:(Lscala/Function1;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
DataFrame.foreach ( scala.Function1<Row,scala.runtime.BoxedUnit> f ) : void
[mangled: org/apache/spark/sql/DataFrame.foreach:(Lscala/Function1;)V]
DataFrame.foreachPartition ( scala.Function1<scala.collection.Iterator<Row>,scala.runtime.BoxedUnit> f ) : void
[mangled: org/apache/spark/sql/DataFrame.foreachPartition:(Lscala/Function1;)V]
DataFrame.groupBy ( Column... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.groupBy ( scala.collection.Seq<Column> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:(Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.groupBy ( String col1, scala.collection.Seq<String> cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.groupBy ( String col1, String... cols ) : GroupedData
[mangled: org/apache/spark/sql/DataFrame.groupBy:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/GroupedData;]
DataFrame.head ( ) : Row
[mangled: org/apache/spark/sql/DataFrame.head:()Lorg/apache/spark/sql/Row;]
DataFrame.head ( int n ) : Row[ ]
[mangled: org/apache/spark/sql/DataFrame.head:(I)[Lorg/apache/spark/sql/Row;]
DataFrame.insertInto ( String tableName ) : void
[mangled: org/apache/spark/sql/DataFrame.insertInto:(Ljava/lang/String;)V]
DataFrame.insertInto ( String tableName, boolean overwrite ) : void
[mangled: org/apache/spark/sql/DataFrame.insertInto:(Ljava/lang/String;Z)V]
DataFrame.insertIntoJDBC ( String url, String table, boolean overwrite ) : void
[mangled: org/apache/spark/sql/DataFrame.insertIntoJDBC:(Ljava/lang/String;Ljava/lang/String;Z)V]
DataFrame.intersect ( DataFrame other ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.intersect:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.isLocal ( ) : boolean
[mangled: org/apache/spark/sql/DataFrame.isLocal:()Z]
DataFrame.javaRDD ( ) : org.apache.spark.api.java.JavaRDD<Row>
[mangled: org/apache/spark/sql/DataFrame.javaRDD:()Lorg/apache/spark/api/java/JavaRDD;]
DataFrame.javaToPython ( ) : org.apache.spark.api.java.JavaRDD<byte[ ]>
[mangled: org/apache/spark/sql/DataFrame.javaToPython:()Lorg/apache/spark/api/java/JavaRDD;]
DataFrame.join ( DataFrame right ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, Column joinExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.join ( DataFrame right, Column joinExprs, String joinType ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.join:(Lorg/apache/spark/sql/DataFrame;Lorg/apache/spark/sql/Column;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.limit ( int n ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.limit:(I)Lorg/apache/spark/sql/DataFrame;]
DataFrame.logicalPlan ( ) : catalyst.plans.logical.LogicalPlan
[mangled: org/apache/spark/sql/DataFrame.logicalPlan:()Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;]
DataFrame.map ( scala.Function1<Row,R> f, scala.reflect.ClassTag<R> p2 ) : org.apache.spark.rdd.RDD<R>
[mangled: org/apache/spark/sql/DataFrame.map:(Lscala/Function1;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
DataFrame.mapPartitions ( scala.Function1<scala.collection.Iterator<Row>,scala.collection.Iterator<R>> f, scala.reflect.ClassTag<R> p2 ) : org.apache.spark.rdd.RDD<R>
[mangled: org/apache/spark/sql/DataFrame.mapPartitions:(Lscala/Function1;Lscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;]
DataFrame.numericColumns ( ) : scala.collection.Seq<catalyst.expressions.Expression>
[mangled: org/apache/spark/sql/DataFrame.numericColumns:()Lscala/collection/Seq;]
DataFrame.orderBy ( Column... sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.orderBy ( scala.collection.Seq<Column> sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.orderBy ( String sortCol, scala.collection.Seq<String> sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.orderBy ( String sortCol, String... sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.orderBy:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.persist ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.persist:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.persist ( ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.persist:()Lorg/apache/spark/sql/RDDApi;]
DataFrame.persist ( org.apache.spark.storage.StorageLevel newLevel ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.persist:(Lorg/apache/spark/storage/StorageLevel;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.persist ( org.apache.spark.storage.StorageLevel newLevel ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.persist:(Lorg/apache/spark/storage/StorageLevel;)Lorg/apache/spark/sql/RDDApi;]
DataFrame.printSchema ( ) : void
[mangled: org/apache/spark/sql/DataFrame.printSchema:()V]
DataFrame.queryExecution ( ) : SQLContext.QueryExecution
[mangled: org/apache/spark/sql/DataFrame.queryExecution:()Lorg/apache/spark/sql/SQLContext$QueryExecution;]
DataFrame.rdd ( ) : org.apache.spark.rdd.RDD<Row>
[mangled: org/apache/spark/sql/DataFrame.rdd:()Lorg/apache/spark/rdd/RDD;]
DataFrame.registerTempTable ( String tableName ) : void
[mangled: org/apache/spark/sql/DataFrame.registerTempTable:(Ljava/lang/String;)V]
DataFrame.repartition ( int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.repartition:(I)Lorg/apache/spark/sql/DataFrame;]
DataFrame.resolve ( String colName ) : catalyst.expressions.NamedExpression
[mangled: org/apache/spark/sql/DataFrame.resolve:(Ljava/lang/String;)Lorg/apache/spark/sql/catalyst/expressions/NamedExpression;]
DataFrame.sample ( boolean withReplacement, double fraction ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sample:(ZD)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sample ( boolean withReplacement, double fraction, long seed ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sample:(ZDJ)Lorg/apache/spark/sql/DataFrame;]
DataFrame.save ( String path ) : void
[mangled: org/apache/spark/sql/DataFrame.save:(Ljava/lang/String;)V]
DataFrame.save ( String path, SaveMode mode ) : void
[mangled: org/apache/spark/sql/DataFrame.save:(Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;)V]
DataFrame.save ( String path, String source ) : void
[mangled: org/apache/spark/sql/DataFrame.save:(Ljava/lang/String;Ljava/lang/String;)V]
DataFrame.save ( String path, String source, SaveMode mode ) : void
[mangled: org/apache/spark/sql/DataFrame.save:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;)V]
DataFrame.save ( String source, SaveMode mode, java.util.Map<String,String> options ) : void
[mangled: org/apache/spark/sql/DataFrame.save:(Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Ljava/util/Map;)V]
DataFrame.save ( String source, SaveMode mode, scala.collection.immutable.Map<String,String> options ) : void
[mangled: org/apache/spark/sql/DataFrame.save:(Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;)V]
DataFrame.saveAsParquetFile ( String path ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsParquetFile:(Ljava/lang/String;)V]
DataFrame.saveAsTable ( String tableName ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsTable:(Ljava/lang/String;)V]
DataFrame.saveAsTable ( String tableName, SaveMode mode ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsTable:(Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;)V]
DataFrame.saveAsTable ( String tableName, String source ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsTable:(Ljava/lang/String;Ljava/lang/String;)V]
DataFrame.saveAsTable ( String tableName, String source, SaveMode mode ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsTable:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;)V]
DataFrame.saveAsTable ( String tableName, String source, SaveMode mode, java.util.Map<String,String> options ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsTable:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Ljava/util/Map;)V]
DataFrame.saveAsTable ( String tableName, String source, SaveMode mode, scala.collection.immutable.Map<String,String> options ) : void
[mangled: org/apache/spark/sql/DataFrame.saveAsTable:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;)V]
DataFrame.schema ( ) : types.StructType
[mangled: org/apache/spark/sql/DataFrame.schema:()Lorg/apache/spark/sql/types/StructType;]
DataFrame.select ( Column... cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.select ( scala.collection.Seq<Column> cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.select ( String col, scala.collection.Seq<String> cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.select ( String col, String... cols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.select:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.selectExpr ( scala.collection.Seq<String> exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.selectExpr:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.selectExpr ( String... exprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.selectExpr:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.show ( ) : void
[mangled: org/apache/spark/sql/DataFrame.show:()V]
DataFrame.show ( int numRows ) : void
[mangled: org/apache/spark/sql/DataFrame.show:(I)V]
DataFrame.showString ( int numRows ) : String
[mangled: org/apache/spark/sql/DataFrame.showString:(I)Ljava/lang/String;]
DataFrame.sort ( Column... sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:([Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sort ( scala.collection.Seq<Column> sortExprs ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sort ( String sortCol, scala.collection.Seq<String> sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:(Ljava/lang/String;Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sort ( String sortCol, String... sortCols ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.sort:(Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.sqlContext ( ) : SQLContext
[mangled: org/apache/spark/sql/DataFrame.sqlContext:()Lorg/apache/spark/sql/SQLContext;]
DataFrame.take ( int n ) : Object
[mangled: org/apache/spark/sql/DataFrame.take:(I)Ljava/lang/Object;]
DataFrame.take ( int n ) : Row[ ]
[mangled: org/apache/spark/sql/DataFrame.take:(I)[Lorg/apache/spark/sql/Row;]
DataFrame.toDF ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.toDF:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.toDF ( scala.collection.Seq<String> colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.toDF:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.toDF ( String... colNames ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.toDF:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.toJavaRDD ( ) : org.apache.spark.api.java.JavaRDD<Row>
[mangled: org/apache/spark/sql/DataFrame.toJavaRDD:()Lorg/apache/spark/api/java/JavaRDD;]
DataFrame.toJSON ( ) : org.apache.spark.rdd.RDD<String>
[mangled: org/apache/spark/sql/DataFrame.toJSON:()Lorg/apache/spark/rdd/RDD;]
DataFrame.toString ( ) : String
[mangled: org/apache/spark/sql/DataFrame.toString:()Ljava/lang/String;]
DataFrame.unionAll ( DataFrame other ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.unionAll:(Lorg/apache/spark/sql/DataFrame;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.unpersist ( ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.unpersist:()Lorg/apache/spark/sql/DataFrame;]
DataFrame.unpersist ( ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.unpersist:()Lorg/apache/spark/sql/RDDApi;]
DataFrame.unpersist ( boolean blocking ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.unpersist:(Z)Lorg/apache/spark/sql/DataFrame;]
DataFrame.unpersist ( boolean blocking ) : RDDApi
[mangled: org/apache/spark/sql/DataFrame.unpersist:(Z)Lorg/apache/spark/sql/RDDApi;]
DataFrame.where ( Column condition ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.where:(Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.withColumn ( String colName, Column col ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.withColumn:(Ljava/lang/String;Lorg/apache/spark/sql/Column;)Lorg/apache/spark/sql/DataFrame;]
DataFrame.withColumnRenamed ( String existingName, String newName ) : DataFrame
[mangled: org/apache/spark/sql/DataFrame.withColumnRenamed:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
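org.apache.spark.sql.DataFrame does not exist in spark-sql 1.0.0 at all, which is why the whole class surface is listed here. A client that references the type can fail with NoClassDefFoundError as soon as the referencing class is resolved, before any individual method call. A sketch (the FirstRow object is hypothetical):

```scala
import org.apache.spark.sql.{DataFrame, Row}

object FirstRow {
  // Against spark-sql 1.0.0 the DataFrame reference itself fails to resolve
  // (NoClassDefFoundError); head() is never reached.
  def firstRow(df: DataFrame): Row = df.head()
}
```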
spark-sql_2.10-1.3.0.jar, LogicalRDD.class
package org.apache.spark.sql.execution
LogicalRDD.LogicalRDD ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output, org.apache.spark.rdd.RDD<org.apache.spark.sql.Row> rdd, org.apache.spark.sql.SQLContext sqlContext )
[mangled: org/apache/spark/sql/execution/LogicalRDD."<init>":(Lscala/collection/Seq;Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/SQLContext;)V]
spark-sql_2.10-1.3.0.jar, SQLContext.class
package org.apache.spark.sql
SQLContext.applySchemaToPythonRDD ( org.apache.spark.rdd.RDD<Object[ ]> rdd, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.applySchemaToPythonRDD:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.applySchemaToPythonRDD ( org.apache.spark.rdd.RDD<Object[ ]> rdd, String schemaString ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.applySchemaToPythonRDD:(Lorg/apache/spark/rdd/RDD;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.baseRelationToDataFrame ( sources.BaseRelation baseRelation ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.baseRelationToDataFrame:(Lorg/apache/spark/sql/sources/BaseRelation;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.cacheManager ( ) : CacheManager
[mangled: org/apache/spark/sql/SQLContext.cacheManager:()Lorg/apache/spark/sql/CacheManager;]
SQLContext.checkAnalysis ( ) : catalyst.analysis.CheckAnalysis
[mangled: org/apache/spark/sql/SQLContext.checkAnalysis:()Lorg/apache/spark/sql/catalyst/analysis/CheckAnalysis;]
SQLContext.clearCache ( ) : void
[mangled: org/apache/spark/sql/SQLContext.clearCache:()V]
SQLContext.conf ( ) : SQLConf
[mangled: org/apache/spark/sql/SQLContext.conf:()Lorg/apache/spark/sql/SQLConf;]
SQLContext.createDataFrame ( org.apache.spark.api.java.JavaRDD<?> rdd, Class<?> beanClass ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/api/java/JavaRDD;Ljava/lang/Class;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.api.java.JavaRDD<Row> rowRDD, java.util.List<String> columns ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/api/java/JavaRDD;Ljava/util/List;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.api.java.JavaRDD<Row> rowRDD, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/api/java/JavaRDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.rdd.RDD<?> rdd, Class<?> beanClass ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/rdd/RDD;Ljava/lang/Class;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.rdd.RDD<A> rdd, scala.reflect.api.TypeTags.TypeTag<A> p2 ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/rdd/RDD;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( org.apache.spark.rdd.RDD<Row> rowRDD, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createDataFrame ( scala.collection.Seq<A> data, scala.reflect.api.TypeTags.TypeTag<A> p2 ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createDataFrame:(Lscala/collection/Seq;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String path ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String path, String source ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String source, java.util.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String source, types.StructType schema, java.util.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/types/StructType;Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String source, types.StructType schema, scala.collection.immutable.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Lorg/apache/spark/sql/types/StructType;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.createExternalTable ( String tableName, String source, scala.collection.immutable.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.createExternalTable:(Ljava/lang/String;Ljava/lang/String;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.ddlParser ( ) : sources.DDLParser
[mangled: org/apache/spark/sql/SQLContext.ddlParser:()Lorg/apache/spark/sql/sources/DDLParser;]
SQLContext.dropTempTable ( String tableName ) : void
[mangled: org/apache/spark/sql/SQLContext.dropTempTable:(Ljava/lang/String;)V]
SQLContext.emptyDataFrame ( ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.emptyDataFrame:()Lorg/apache/spark/sql/DataFrame;]
SQLContext.emptyResult ( ) : org.apache.spark.rdd.RDD<Row>
[mangled: org/apache/spark/sql/SQLContext.emptyResult:()Lorg/apache/spark/rdd/RDD;]
SQLContext.experimental ( ) : ExperimentalMethods
[mangled: org/apache/spark/sql/SQLContext.experimental:()Lorg/apache/spark/sql/ExperimentalMethods;]
SQLContext.functionRegistry ( ) : catalyst.analysis.FunctionRegistry
[mangled: org/apache/spark/sql/SQLContext.functionRegistry:()Lorg/apache/spark/sql/catalyst/analysis/FunctionRegistry;]
SQLContext.getAllConfs ( ) : scala.collection.immutable.Map<String,String>
[mangled: org/apache/spark/sql/SQLContext.getAllConfs:()Lscala/collection/immutable/Map;]
SQLContext.getConf ( String key ) : String
[mangled: org/apache/spark/sql/SQLContext.getConf:(Ljava/lang/String;)Ljava/lang/String;]
SQLContext.getConf ( String key, String defaultValue ) : String
[mangled: org/apache/spark/sql/SQLContext.getConf:(Ljava/lang/String;Ljava/lang/String;)Ljava/lang/String;]
SQLContext.getSchema ( Class<?> beanClass ) : scala.collection.Seq<catalyst.expressions.AttributeReference>
[mangled: org/apache/spark/sql/SQLContext.getSchema:(Ljava/lang/Class;)Lscala/collection/Seq;]
SQLContext.implicits ( ) : SQLContext.implicits.
[mangled: org/apache/spark/sql/SQLContext.implicits:()Lorg/apache/spark/sql/SQLContext$implicits$;]
SQLContext.isCached ( String tableName ) : boolean
[mangled: org/apache/spark/sql/SQLContext.isCached:(Ljava/lang/String;)Z]
SQLContext.isTraceEnabled ( ) : boolean
[mangled: org/apache/spark/sql/SQLContext.isTraceEnabled:()Z]
SQLContext.jdbc ( String url, String table ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jdbc:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jdbc ( String url, String table, String columnName, long lowerBound, long upperBound, int numPartitions ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jdbc:(Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;JJI)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jdbc ( String url, String table, String[ ] theParts ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jdbc:(Ljava/lang/String;Ljava/lang/String;[Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonFile ( String path ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonFile:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonFile ( String path, double samplingRatio ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonFile:(Ljava/lang/String;D)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonFile ( String path, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonFile:(Ljava/lang/String;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonRDD ( org.apache.spark.api.java.JavaRDD<String> json ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/api/java/JavaRDD;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonRDD ( org.apache.spark.api.java.JavaRDD<String> json, double samplingRatio ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/api/java/JavaRDD;D)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonRDD ( org.apache.spark.api.java.JavaRDD<String> json, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/api/java/JavaRDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonRDD ( org.apache.spark.rdd.RDD<String> json ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/rdd/RDD;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonRDD ( org.apache.spark.rdd.RDD<String> json, double samplingRatio ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/rdd/RDD;D)Lorg/apache/spark/sql/DataFrame;]
SQLContext.jsonRDD ( org.apache.spark.rdd.RDD<String> json, types.StructType schema ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.jsonRDD:(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.load ( String path ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.load:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.load ( String path, String source ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.load:(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.load ( String source, java.util.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.load:(Ljava/lang/String;Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.load ( String source, types.StructType schema, java.util.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.load:(Ljava/lang/String;Lorg/apache/spark/sql/types/StructType;Ljava/util/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.load ( String source, types.StructType schema, scala.collection.immutable.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.load:(Ljava/lang/String;Lorg/apache/spark/sql/types/StructType;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.load ( String source, scala.collection.immutable.Map<String,String> options ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.load:(Ljava/lang/String;Lscala/collection/immutable/Map;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.log ( ) : org.slf4j.Logger
[mangled: org/apache/spark/sql/SQLContext.log:()Lorg/slf4j/Logger;]
SQLContext.logDebug ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/SQLContext.logDebug:(Lscala/Function0;)V]
SQLContext.logDebug ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/SQLContext.logDebug:(Lscala/Function0;Ljava/lang/Throwable;)V]
SQLContext.logError ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/SQLContext.logError:(Lscala/Function0;)V]
SQLContext.logError ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/SQLContext.logError:(Lscala/Function0;Ljava/lang/Throwable;)V]
SQLContext.logInfo ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/SQLContext.logInfo:(Lscala/Function0;)V]
SQLContext.logInfo ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/SQLContext.logInfo:(Lscala/Function0;Ljava/lang/Throwable;)V]
SQLContext.logName ( ) : String
[mangled: org/apache/spark/sql/SQLContext.logName:()Ljava/lang/String;]
SQLContext.logTrace ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/SQLContext.logTrace:(Lscala/Function0;)V]
SQLContext.logTrace ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/SQLContext.logTrace:(Lscala/Function0;Ljava/lang/Throwable;)V]
SQLContext.logWarning ( scala.Function0<String> msg ) : void
[mangled: org/apache/spark/sql/SQLContext.logWarning:(Lscala/Function0;)V]
SQLContext.logWarning ( scala.Function0<String> msg, Throwable throwable ) : void
[mangled: org/apache/spark/sql/SQLContext.logWarning:(Lscala/Function0;Ljava/lang/Throwable;)V]
SQLContext.optimizer ( ) : catalyst.optimizer.Optimizer
[mangled: org/apache/spark/sql/SQLContext.optimizer:()Lorg/apache/spark/sql/catalyst/optimizer/Optimizer;]
SQLContext.org.apache.spark.Logging..log_ ( ) : org.slf4j.Logger
[mangled: org/apache/spark/sql/SQLContext.org.apache.spark.Logging..log_:()Lorg/slf4j/Logger;]
SQLContext.org.apache.spark.Logging..log__.eq ( org.slf4j.Logger p1 ) : void
[mangled: org/apache/spark/sql/SQLContext.org.apache.spark.Logging..log__.eq:(Lorg/slf4j/Logger;)V]
SQLContext.parquetFile ( scala.collection.Seq<String> paths ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.parquetFile:(Lscala/collection/Seq;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.parquetFile ( String... paths ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.parquetFile:([Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.parseDataType ( String dataTypeString ) : types.DataType
[mangled: org/apache/spark/sql/SQLContext.parseDataType:(Ljava/lang/String;)Lorg/apache/spark/sql/types/DataType;]
SQLContext.registerDataFrameAsTable ( DataFrame df, String tableName ) : void
[mangled: org/apache/spark/sql/SQLContext.registerDataFrameAsTable:(Lorg/apache/spark/sql/DataFrame;Ljava/lang/String;)V]
SQLContext.setConf ( java.util.Properties props ) : void
[mangled: org/apache/spark/sql/SQLContext.setConf:(Ljava/util/Properties;)V]
SQLContext.setConf ( String key, String value ) : void
[mangled: org/apache/spark/sql/SQLContext.setConf:(Ljava/lang/String;Ljava/lang/String;)V]
SQLContext.sql ( String sqlText ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.sql:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.SQLContext ( org.apache.spark.api.java.JavaSparkContext sparkContext )
[mangled: org/apache/spark/sql/SQLContext."<init>":(Lorg/apache/spark/api/java/JavaSparkContext;)V]
SQLContext.sqlParser ( ) : SparkSQLParser
[mangled: org/apache/spark/sql/SQLContext.sqlParser:()Lorg/apache/spark/sql/SparkSQLParser;]
SQLContext.table ( String tableName ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.table:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.tableNames ( ) : String[ ]
[mangled: org/apache/spark/sql/SQLContext.tableNames:()[Ljava/lang/String;]
SQLContext.tableNames ( String databaseName ) : String[ ]
[mangled: org/apache/spark/sql/SQLContext.tableNames:(Ljava/lang/String;)[Ljava/lang/String;]
SQLContext.tables ( ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.tables:()Lorg/apache/spark/sql/DataFrame;]
SQLContext.tables ( String databaseName ) : DataFrame
[mangled: org/apache/spark/sql/SQLContext.tables:(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;]
SQLContext.udf ( ) : UDFRegistration
[mangled: org/apache/spark/sql/SQLContext.udf:()Lorg/apache/spark/sql/UDFRegistration;]
Problems with Data Types, High Severity (19)
spark-core_2.10-1.3.0.jar
package org.apache.spark
[+] Logging (1)
| # | Change | Effect |
|---|---|---|
| 1 | Abstract method logName ( ) has been removed from this interface. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (14)
isTraceEnabled ( ) - This abstract method is from 'Logging' interface.
log ( ) - This abstract method is from 'Logging' interface.
logDebug ( scala.Function0<java.lang.String> ) - This abstract method is from 'Logging' interface.
logDebug ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This abstract method is from 'Logging' interface.
logError ( scala.Function0<java.lang.String> ) - This abstract method is from 'Logging' interface.
logError ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This abstract method is from 'Logging' interface.
logInfo ( scala.Function0<java.lang.String> ) - This abstract method is from 'Logging' interface.
logInfo ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This abstract method is from 'Logging' interface.
logTrace ( scala.Function0<java.lang.String> ) - This abstract method is from 'Logging' interface.
logTrace ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This abstract method is from 'Logging' interface.
logWarning ( scala.Function0<java.lang.String> ) - This abstract method is from 'Logging' interface.
logWarning ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This abstract method is from 'Logging' interface.
Logging..log_ ( ) - This abstract method is from 'Logging' interface.
Logging..log__.eq ( org.slf4j.Logger ) - This abstract method is from 'Logging' interface.
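For illustration, a minimal hypothetical sketch (the class and object names below are invented, not from the report) of how this surfaces: a client compiled against spark-core 1.3.0, where Logging declares logName, compiles cleanly but fails at run time on a 1.0.0 classpath.

```scala
import org.apache.spark.Logging

// Hypothetical client compiled against spark-core 1.3.0.
class JobRunner extends Logging {
  // Compiles to a reference to Logging's logName member, which
  // spark-core 1.0.0 never shipped.
  def describe(): String = logName
}

object LoggingDemo {
  def main(args: Array[String]): Unit = {
    // On a 1.0.0 classpath the JVM cannot resolve logName here and
    // throws java.lang.NoSuchMethodError.
    println(new JobRunner().describe())
  }
}
```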
[+] SecurityManager (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface network.sasl.SecretKeyHolder. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (1)
securityManager ( ) - Return value of this method has type 'SecurityManager'.
[+] SparkContext (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface ExecutorAllocationClient. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (146)
accumulable ( R, AccumulableParam<R,T> ) - This method is from 'SparkContext' class.
accumulableCollection ( R, scala.Function1<R,scala.collection.generic.Growable<T>>, scala.reflect.ClassTag<R> ) - This method is from 'SparkContext' class.
accumulator ( T, AccumulatorParam<T> ) - This method is from 'SparkContext' class.
addedFiles ( ) - This method is from 'SparkContext' class.
addedJars ( ) - This method is from 'SparkContext' class.
addFile ( java.lang.String ) - This method is from 'SparkContext' class.
addJar ( java.lang.String ) - This method is from 'SparkContext' class.
addSparkListener ( scheduler.SparkListener ) - This method is from 'SparkContext' class.
appName ( ) - This method is from 'SparkContext' class.
booleanWritableConverter ( ) - This method is from 'SparkContext' class.
boolToBoolWritable ( boolean ) - This method is from 'SparkContext' class.
broadcast ( T, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
bytesToBytesWritable ( byte[ ] ) - This method is from 'SparkContext' class.
bytesWritableConverter ( ) - This method is from 'SparkContext' class.
cancelAllJobs ( ) - This method is from 'SparkContext' class.
cancelJob ( int ) - This method is from 'SparkContext' class.
cancelJobGroup ( java.lang.String ) - This method is from 'SparkContext' class.
cancelStage ( int ) - This method is from 'SparkContext' class.
checkpointDir ( ) - This method is from 'SparkContext' class.
checkpointDir_.eq ( scala.Option<java.lang.String> ) - This method is from 'SparkContext' class.
checkpointFile ( java.lang.String, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
cleaner ( ) - This method is from 'SparkContext' class.
cleanup ( long ) - This method is from 'SparkContext' class.
clearCallSite ( ) - This method is from 'SparkContext' class.
clearJobGroup ( ) - This method is from 'SparkContext' class.
conf ( ) - This method is from 'SparkContext' class.
dagScheduler ( ) - This method is from 'SparkContext' class.
dagScheduler_.eq ( scheduler.DAGScheduler ) - This method is from 'SparkContext' class.
defaultMinPartitions ( ) - This method is from 'SparkContext' class.
defaultParallelism ( ) - This method is from 'SparkContext' class.
doubleRDDToDoubleRDDFunctions ( rdd.RDD<java.lang.Object> ) - This method is from 'SparkContext' class.
doubleToDoubleWritable ( double ) - This method is from 'SparkContext' class.
doubleWritableConverter ( ) - This method is from 'SparkContext' class.
emptyRDD ( scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
env ( ) - This method is from 'SparkContext' class.
eventLogger ( ) - This method is from 'SparkContext' class.
executorEnvs ( ) - This method is from 'SparkContext' class.
executorMemory ( ) - This method is from 'SparkContext' class.
files ( ) - This method is from 'SparkContext' class.
floatToFloatWritable ( float ) - This method is from 'SparkContext' class.
floatWritableConverter ( ) - This method is from 'SparkContext' class.
getAllPools ( ) - This method is from 'SparkContext' class.
getCheckpointDir ( ) - This method is from 'SparkContext' class.
getConf ( ) - This method is from 'SparkContext' class.
getExecutorMemoryStatus ( ) - This method is from 'SparkContext' class.
getExecutorStorageStatus ( ) - This method is from 'SparkContext' class.
getLocalProperties ( ) - This method is from 'SparkContext' class.
getLocalProperty ( java.lang.String ) - This method is from 'SparkContext' class.
getPersistentRDDs ( ) - This method is from 'SparkContext' class.
getPoolForName ( java.lang.String ) - This method is from 'SparkContext' class.
getPreferredLocs ( rdd.RDD<?>, int ) - This method is from 'SparkContext' class.
getRDDStorageInfo ( ) - This method is from 'SparkContext' class.
getSchedulingMode ( ) - This method is from 'SparkContext' class.
getSparkHome ( ) - This method is from 'SparkContext' class.
hadoopConfiguration ( ) - This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, int, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) - This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, java.lang.Class<? extends org.apache.hadoop.mapred.InputFormat<K,V>>, java.lang.Class<K>, java.lang.Class<V>, int ) - This method is from 'SparkContext' class.
hadoopFile ( java.lang.String, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) - This method is from 'SparkContext' class.
hadoopRDD ( org.apache.hadoop.mapred.JobConf, java.lang.Class<? extends org.apache.hadoop.mapred.InputFormat<K,V>>, java.lang.Class<K>, java.lang.Class<V>, int ) - This method is from 'SparkContext' class.
intToIntWritable ( int ) - This method is from 'SparkContext' class.
intWritableConverter ( ) - This method is from 'SparkContext' class.
isLocal ( ) - This method is from 'SparkContext' class.
isTraceEnabled ( ) - This method is from 'SparkContext' class.
jarOfClass ( java.lang.Class<?> ) - This method is from 'SparkContext' class.
jarOfObject ( java.lang.Object ) - This method is from 'SparkContext' class.
jars ( ) - This method is from 'SparkContext' class.
listenerBus ( ) - This method is from 'SparkContext' class.
log ( ) - This method is from 'SparkContext' class.
logDebug ( scala.Function0<java.lang.String> ) - This method is from 'SparkContext' class.
logDebug ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This method is from 'SparkContext' class.
logError ( scala.Function0<java.lang.String> ) - This method is from 'SparkContext' class.
logError ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This method is from 'SparkContext' class.
logInfo ( scala.Function0<java.lang.String> ) - This method is from 'SparkContext' class.
logInfo ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This method is from 'SparkContext' class.
logTrace ( scala.Function0<java.lang.String> ) - This method is from 'SparkContext' class.
logTrace ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This method is from 'SparkContext' class.
logWarning ( scala.Function0<java.lang.String> ) - This method is from 'SparkContext' class.
logWarning ( scala.Function0<java.lang.String>, java.lang.Throwable ) - This method is from 'SparkContext' class.
longToLongWritable ( long ) - This method is from 'SparkContext' class.
longWritableConverter ( ) - This method is from 'SparkContext' class.
makeRDD ( scala.collection.Seq<scala.Tuple2<T,scala.collection.Seq<java.lang.String>>>, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
makeRDD ( scala.collection.Seq<T>, int, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
master ( ) - This method is from 'SparkContext' class.
metadataCleaner ( ) - This method is from 'SparkContext' class.
newAPIHadoopFile ( java.lang.String, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V>, org.apache.hadoop.conf.Configuration ) - This method is from 'SparkContext' class.
newAPIHadoopFile ( java.lang.String, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.reflect.ClassTag<F> ) - This method is from 'SparkContext' class.
newAPIHadoopRDD ( org.apache.hadoop.conf.Configuration, java.lang.Class<F>, java.lang.Class<K>, java.lang.Class<V> ) - This method is from 'SparkContext' class.
newRddId ( ) - This method is from 'SparkContext' class.
newShuffleId ( ) - This method is from 'SparkContext' class.
numericRDDToDoubleRDDFunctions ( rdd.RDD<T>, scala.math.Numeric<T> ) - This method is from 'SparkContext' class.
objectFile ( java.lang.String, int, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
Logging..log_ ( ) - This method is from 'SparkContext' class.
Logging..log__.eq ( org.slf4j.Logger ) - This method is from 'SparkContext' class.
SparkContext..warnSparkMem ( java.lang.String ) - This method is from 'SparkContext' class.
parallelize ( scala.collection.Seq<T>, int, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
persistentRdds ( ) - This method is from 'SparkContext' class.
persistRDD ( rdd.RDD<?> ) - This method is from 'SparkContext' class.
preferredNodeLocationData ( ) - This method is from 'SparkContext' class.
preferredNodeLocationData_.eq ( scala.collection.Map<java.lang.String,scala.collection.Set<scheduler.SplitInfo>> ) - This method is from 'SparkContext' class.
rddToAsyncRDDActions ( rdd.RDD<T>, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
rddToOrderedRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.math.Ordering<K>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V> ) - This method is from 'SparkContext' class.
rddToPairRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.math.Ordering<K> ) - This method is from 'SparkContext' class.
rddToSequenceFileRDDFunctions ( rdd.RDD<scala.Tuple2<K,V>>, scala.Function1<K,org.apache.hadoop.io.Writable>, scala.reflect.ClassTag<K>, scala.Function1<V,org.apache.hadoop.io.Writable>, scala.reflect.ClassTag<V> ) - This method is from 'SparkContext' class.
runApproximateJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, partial.ApproximateEvaluator<U,R>, long ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, boolean, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, boolean, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, boolean, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
runJob ( rdd.RDD<T>, scala.Function2<TaskContext,scala.collection.Iterator<T>,U>, scala.reflect.ClassTag<U> ) - This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, int, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.Function0<WritableConverter<K>>, scala.Function0<WritableConverter<V>> ) - This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V> ) - This method is from 'SparkContext' class.
sequenceFile ( java.lang.String, java.lang.Class<K>, java.lang.Class<V>, int ) - This method is from 'SparkContext' class.
setCallSite ( java.lang.String ) - This method is from 'SparkContext' class.
setCheckpointDir ( java.lang.String ) - This method is from 'SparkContext' class.
setJobDescription ( java.lang.String ) - This method is from 'SparkContext' class.
setJobGroup ( java.lang.String, java.lang.String, boolean ) - This method is from 'SparkContext' class.
setLocalProperties ( java.util.Properties ) - This method is from 'SparkContext' class.
setLocalProperty ( java.lang.String, java.lang.String ) - This method is from 'SparkContext' class.
SparkContext ( ) - This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String ) - This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String ) - This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String, scala.collection.Seq<java.lang.String> ) - This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, java.lang.String, scala.collection.Seq<java.lang.String>, scala.collection.Map<java.lang.String,java.lang.String>, scala.collection.Map<java.lang.String,scala.collection.Set<scheduler.SplitInfo>> ) - This constructor is from 'SparkContext' class.
SparkContext ( java.lang.String, java.lang.String, SparkConf ) - This constructor is from 'SparkContext' class.
SparkContext ( SparkConf ) - This constructor is from 'SparkContext' class.
SparkContext ( SparkConf, scala.collection.Map<java.lang.String,scala.collection.Set<scheduler.SplitInfo>> ) - This constructor is from 'SparkContext' class.
sparkUser ( ) - This method is from 'SparkContext' class.
startTime ( ) - This method is from 'SparkContext' class.
stop ( ) - This method is from 'SparkContext' class.
stringToText ( java.lang.String ) - This method is from 'SparkContext' class.
stringWritableConverter ( ) - This method is from 'SparkContext' class.
submitJob ( rdd.RDD<T>, scala.Function1<scala.collection.Iterator<T>,U>, scala.collection.Seq<java.lang.Object>, scala.Function2<java.lang.Object,U,scala.runtime.BoxedUnit>, scala.Function0<R> ) - This method is from 'SparkContext' class.
tachyonFolderName ( ) - This method is from 'SparkContext' class.
taskScheduler ( ) - This method is from 'SparkContext' class.
taskScheduler_.eq ( scheduler.TaskScheduler ) - This method is from 'SparkContext' class.
textFile ( java.lang.String, int ) - This method is from 'SparkContext' class.
union ( rdd.RDD<T>, scala.collection.Seq<rdd.RDD<T>>, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
union ( scala.collection.Seq<rdd.RDD<T>>, scala.reflect.ClassTag<T> ) - This method is from 'SparkContext' class.
unpersistRDD ( int, boolean ) - This method is from 'SparkContext' class.
version ( ) - This method is from 'SparkContext' class.
wholeTextFiles ( java.lang.String, int ) - This method is from 'SparkContext' class.
writableWritableConverter ( ) - This method is from 'SparkContext' class.
sparkContext ( ) - Return value of this method has type 'SparkContext'.
SQLContext ( SparkContext ) - 1st parameter 'sparkContext' of this method has type 'SparkContext'.
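Any SparkContext member that 1.3.0 inherits from ExecutorAllocationClient disappears together with the interface. A minimal sketch, assuming the 1.3.0 developer-API method requestExecutors as the example call (ElasticDemo is an invented name; illustrative client code, not from the report):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ElasticDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("elastic-demo").setMaster("local[2]"))
    // requestExecutors(Int) is declared via ExecutorAllocationClient in
    // 1.3.0. Run against 1.0.0, where the interface and method do not
    // exist, this call throws java.lang.NoSuchMethodError.
    sc.requestExecutors(2)
    sc.stop()
  }
}
```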
[+] TaskContext (8)
| # | Change | Effect |
|---|---|---|
| 1 | Abstract method addTaskCompletionListener ( util.TaskCompletionListener ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError exception. |
| 2 | Abstract method addTaskCompletionListener ( scala.Function1<TaskContext,scala.runtime.BoxedUnit> ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError exception. |
| 3 | Abstract method attemptNumber ( ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError exception. |
| 4 | Abstract method isCompleted ( ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError exception. |
| 5 | Abstract method isInterrupted ( ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError exception. |
| 6 | Abstract method isRunningLocally ( ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError exception. |
| 7 | Abstract method taskAttemptId ( ) has been removed from this class. | A client program may be interrupted by a NoSuchMethodError exception. |
| 8 | Removed super-interface java.io.Serializable. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (3)
partitionId ( ) - This abstract method is from 'TaskContext' abstract class.
stageId ( ) - This abstract method is from 'TaskContext' abstract class.
taskMetrics ( ) - This abstract method is from 'TaskContext' abstract class.
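Each removed abstract method is a latent NoSuchMethodError for task-side code, and dropping the java.io.Serializable super-interface additionally breaks closures that capture a TaskContext. A minimal hypothetical sketch (TaskInfo and describe are invented names):

```scala
import org.apache.spark.TaskContext

object TaskInfo {
  // Hypothetical helper compiled against spark-core 1.3.0.
  def describe(ctx: TaskContext): String =
    // stageId() and partitionId() resolve on both versions, but
    // attemptNumber() exists only in 1.3.0; on a 1.0.0 classpath this
    // call throws java.lang.NoSuchMethodError.
    s"stage=${ctx.stageId()} partition=${ctx.partitionId()} attempt=${ctx.attemptNumber()}"
}
```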
package org.apache.spark.broadcast
[+] Broadcast<T> (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.Logging. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (1)
broadcast ( T, scala.reflect.ClassTag<T> ) - Return value of this method has type 'Broadcast<T>'.
package org.apache.spark.rdd
[+] PairRDDFunctions<K,V> (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.mapreduce.SparkHadoopMapReduceUtil. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (1)
rddToPairRDDFunctions ( RDD<scala.Tuple2<K,V>>, scala.reflect.ClassTag<K>, scala.reflect.ClassTag<V>, scala.math.Ordering<K> ) - Return value of this method has type 'PairRDDFunctions<K,V>'.
package org.apache.spark.serializer
[+] Serializer (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class became an interface. | A client program may be interrupted by an IncompatibleClassChangeError or InstantiationError exception, depending on the usage of this class. |
[+] affected methods (2)
closureSerializer ( ) - Return value of this method has type 'Serializer'.
serializer ( ) - Return value of this method has type 'Serializer'.
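Because the type changed kind rather than losing a member, the failure occurs at class-loading time. A hypothetical custom serializer compiled against 1.3.0, where Serializer is an abstract class, illustrates the IncompatibleClassChangeError case (NoopSerializer is an invented name):

```scala
import org.apache.spark.serializer.{Serializer, SerializerInstance}

// Compiled against spark-core 1.3.0: `extends Serializer` emits a
// super-class reference in the bytecode. On a 1.0.0 classpath, where
// Serializer is an interface, merely loading this class makes the JVM
// throw java.lang.IncompatibleClassChangeError.
class NoopSerializer extends Serializer {
  override def newInstance(): SerializerInstance =
    throw new UnsupportedOperationException("sketch only")
}
```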
package org.apache.spark.storage
[+] BlockManager (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.network.BlockDataManager. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (1)
blockManager ( ) - Return value of this method has type 'BlockManager'.
spark-sql_2.10-1.3.0.jar
package org.apache.spark.sql
[+] DataFrame (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (114)
agg ( java.util.Map<java.lang.String,java.lang.String> ) - This method is from 'DataFrame' class.
agg ( Column, Column... ) - This method is from 'DataFrame' class.
agg ( Column, scala.collection.Seq<Column> ) - This method is from 'DataFrame' class.
agg ( scala.collection.immutable.Map<java.lang.String,java.lang.String> ) - This method is from 'DataFrame' class.
agg ( scala.Tuple2<java.lang.String,java.lang.String>, scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.String>> ) - This method is from 'DataFrame' class.
apply ( java.lang.String ) - This method is from 'DataFrame' class.
as ( java.lang.String ) - This method is from 'DataFrame' class.
as ( scala.Symbol ) - This method is from 'DataFrame' class.
cache ( ) - This method is from 'DataFrame' class.
cache ( ) - This method is from 'DataFrame' class.
col ( java.lang.String ) - This method is from 'DataFrame' class.
collect ( ) - This method is from 'DataFrame' class.
collect ( ) - This method is from 'DataFrame' class.
collectAsList ( ) - This method is from 'DataFrame' class.
columns ( ) - This method is from 'DataFrame' class.
count ( ) - This method is from 'DataFrame' class.
createJDBCTable ( java.lang.String, java.lang.String, boolean ) - This method is from 'DataFrame' class.
DataFrame ( SQLContext, catalyst.plans.logical.LogicalPlan ) - This constructor is from 'DataFrame' class.
DataFrame ( SQLContext, SQLContext.QueryExecution ) - This constructor is from 'DataFrame' class.
distinct ( ) - This method is from 'DataFrame' class.
dtypes ( ) - This method is from 'DataFrame' class.
except ( DataFrame ) - This method is from 'DataFrame' class.
explain ( ) - This method is from 'DataFrame' class.
explain ( boolean ) - This method is from 'DataFrame' class.
explode ( java.lang.String, java.lang.String, scala.Function1<A,scala.collection.TraversableOnce<B>>, scala.reflect.api.TypeTags.TypeTag<B> ) - This method is from 'DataFrame' class.
explode ( scala.collection.Seq<Column>, scala.Function1<Row,scala.collection.TraversableOnce<A>>, scala.reflect.api.TypeTags.TypeTag<A> ) - This method is from 'DataFrame' class.
filter ( java.lang.String ) - This method is from 'DataFrame' class.
filter ( Column ) - This method is from 'DataFrame' class.
first ( ) - This method is from 'DataFrame' class.
first ( ) - This method is from 'DataFrame' class.
flatMap ( scala.Function1<Row,scala.collection.TraversableOnce<R>>, scala.reflect.ClassTag<R> ) - This method is from 'DataFrame' class.
foreach ( scala.Function1<Row,scala.runtime.BoxedUnit> ) - This method is from 'DataFrame' class.
foreachPartition ( scala.Function1<scala.collection.Iterator<Row>,scala.runtime.BoxedUnit> ) - This method is from 'DataFrame' class.
groupBy ( java.lang.String, java.lang.String... ) - This method is from 'DataFrame' class.
groupBy ( java.lang.String, scala.collection.Seq<java.lang.String> ) - This method is from 'DataFrame' class.
groupBy ( Column... ) - This method is from 'DataFrame' class.
groupBy ( scala.collection.Seq<Column> ) - This method is from 'DataFrame' class.
head ( ) - This method is from 'DataFrame' class.
head ( int ) - This method is from 'DataFrame' class.
insertInto ( java.lang.String ) - This method is from 'DataFrame' class.
insertInto ( java.lang.String, boolean ) - This method is from 'DataFrame' class.
insertIntoJDBC ( java.lang.String, java.lang.String, boolean ) - This method is from 'DataFrame' class.
intersect ( DataFrame ) - This method is from 'DataFrame' class.
isLocal ( ) - This method is from 'DataFrame' class.
javaRDD ( ) - This method is from 'DataFrame' class.
javaToPython ( ) - This method is from 'DataFrame' class.
join ( DataFrame ) - This method is from 'DataFrame' class.
join ( DataFrame, Column ) - This method is from 'DataFrame' class.
join ( DataFrame, Column, java.lang.String ) - This method is from 'DataFrame' class.
limit ( int ) - This method is from 'DataFrame' class.
logicalPlan ( ) - This method is from 'DataFrame' class.
map ( scala.Function1<Row,R>, scala.reflect.ClassTag<R> ) - This method is from 'DataFrame' class.
mapPartitions ( scala.Function1<scala.collection.Iterator<Row>,scala.collection.Iterator<R>>, scala.reflect.ClassTag<R> ) - This method is from 'DataFrame' class.
numericColumns ( ) - This method is from 'DataFrame' class.
orderBy ( java.lang.String, java.lang.String... ) - This method is from 'DataFrame' class.
orderBy ( java.lang.String, scala.collection.Seq<java.lang.String> ) - This method is from 'DataFrame' class.
orderBy ( Column... ) - This method is from 'DataFrame' class.
orderBy ( scala.collection.Seq<Column> ) - This method is from 'DataFrame' class.
persist ( ) - This method is from 'DataFrame' class.
persist ( ) - This method is from 'DataFrame' class.
persist ( org.apache.spark.storage.StorageLevel ) - This method is from 'DataFrame' class.
persist ( org.apache.spark.storage.StorageLevel ) - This method is from 'DataFrame' class.
printSchema ( ) - This method is from 'DataFrame' class.
queryExecution ( ) - This method is from 'DataFrame' class.
rdd ( ) - This method is from 'DataFrame' class.
registerTempTable ( java.lang.String ) - This method is from 'DataFrame' class.
repartition ( int ) - This method is from 'DataFrame' class.
resolve ( java.lang.String ) - This method is from 'DataFrame' class.
sample ( boolean, double ) - This method is from 'DataFrame' class.
sample ( boolean, double, long ) - This method is from 'DataFrame' class.
save ( java.lang.String ) - This method is from 'DataFrame' class.
save ( java.lang.String, java.lang.String ) - This method is from 'DataFrame' class.
save ( java.lang.String, java.lang.String, SaveMode ) - This method is from 'DataFrame' class.
save ( java.lang.String, SaveMode ) - This method is from 'DataFrame' class.
save ( java.lang.String, SaveMode, java.util.Map<java.lang.String,java.lang.String> ) - This method is from 'DataFrame' class.
save ( java.lang.String, SaveMode, scala.collection.immutable.Map<java.lang.String,java.lang.String> ) - This method is from 'DataFrame' class.
saveAsParquetFile ( java.lang.String ) - This method is from 'DataFrame' class.
saveAsTable ( java.lang.String ) - This method is from 'DataFrame' class.
saveAsTable ( java.lang.String, java.lang.String ) - This method is from 'DataFrame' class.
saveAsTable ( java.lang.String, java.lang.String, SaveMode ) - This method is from 'DataFrame' class.
saveAsTable ( java.lang.String, java.lang.String, SaveMode, java.util.Map<java.lang.String,java.lang.String> ) - This method is from 'DataFrame' class.
saveAsTable ( java.lang.String, java.lang.String, SaveMode, scala.collection.immutable.Map<java.lang.String,java.lang.String> ) - This method is from 'DataFrame' class.
saveAsTable ( java.lang.String, SaveMode ) - This method is from 'DataFrame' class.
schema ( ) - This method is from 'DataFrame' class.
select ( java.lang.String, java.lang.String... ) - This method is from 'DataFrame' class.
select ( java.lang.String, scala.collection.Seq<java.lang.String> ) - This method is from 'DataFrame' class.
select ( Column... ) - This method is from 'DataFrame' class.
select ( scala.collection.Seq<Column> ) - This method is from 'DataFrame' class.
selectExpr ( java.lang.String... ) - This method is from 'DataFrame' class.
selectExpr ( scala.collection.Seq<java.lang.String> ) - This method is from 'DataFrame' class.
show ( ) - This method is from 'DataFrame' class.
show ( int ) - This method is from 'DataFrame' class.
showString ( int ) - This method is from 'DataFrame' class.
sort ( java.lang.String, java.lang.String... ) - This method is from 'DataFrame' class.
sort ( java.lang.String, scala.collection.Seq<java.lang.String> ) - This method is from 'DataFrame' class.
sort ( Column... ) - This method is from 'DataFrame' class.
sort ( scala.collection.Seq<Column> ) - This method is from 'DataFrame' class.
sqlContext ( ) - This method is from 'DataFrame' class.
take ( int ) - This method is from 'DataFrame' class.
take ( int ) - This method is from 'DataFrame' class.
toDF ( ) - This method is from 'DataFrame' class.
toDF ( java.lang.String... ) - This method is from 'DataFrame' class.
toDF ( scala.collection.Seq<java.lang.String> ) - This method is from 'DataFrame' class.
toJavaRDD ( ) - This method is from 'DataFrame' class.
toJSON ( ) - This method is from 'DataFrame' class.
toString ( ) - This method is from 'DataFrame' class.
unionAll ( DataFrame ) - This method is from 'DataFrame' class.
unpersist ( ) - This method is from 'DataFrame' class.
unpersist ( ) - This method is from 'DataFrame' class.
unpersist ( boolean ) - This method is from 'DataFrame' class.
unpersist ( boolean ) - This method is from 'DataFrame' class.
where ( Column ) - This method is from 'DataFrame' class.
withColumn ( java.lang.String, Column ) - This method is from 'DataFrame' class.
withColumnRenamed ( java.lang.String, java.lang.String ) - This method is from 'DataFrame' class.
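Since org.apache.spark.sql.DataFrame was only introduced in Spark 1.3, any client class that references it fails to load on a 1.0.0 classpath before a single method is invoked. A minimal hypothetical sketch (FrameDemo and firstRows are invented names):

```scala
import org.apache.spark.sql.{DataFrame, SQLContext}

object FrameDemo {
  // Hypothetical helper compiled against spark-sql 1.3.0.
  def firstRows(sqlContext: SQLContext): Unit = {
    // The DataFrame reference is baked into this class's constant pool,
    // so on a 1.0.0 classpath loading FrameDemo already fails with
    // java.lang.NoClassDefFoundError: org/apache/spark/sql/DataFrame.
    val df: DataFrame = sqlContext.sql("SELECT 1")
    df.show()
  }
}
```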
[+] SQLContext (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-interface org.apache.spark.Logging. | A client program may be interrupted by a NoSuchMethodError exception. |
[+] affected methods (11)
analyzer ( ) - This method is from 'SQLContext' class.
cacheTable ( java.lang.String ) - This method is from 'SQLContext' class.
catalog ( ) - This method is from 'SQLContext' class.
executePlan ( catalyst.plans.logical.LogicalPlan ) - This method is from 'SQLContext' class.
executeSql ( java.lang.String ) - This method is from 'SQLContext' class.
parseSql ( java.lang.String ) - This method is from 'SQLContext' class.
planner ( ) - This method is from 'SQLContext' class.
prepareForExecution ( ) - This method is from 'SQLContext' class.
sparkContext ( ) - This method is from 'SQLContext' class.
SQLContext ( org.apache.spark.SparkContext ) - This constructor is from 'SQLContext' class.
uncacheTable ( java.lang.String ) - This method is from 'SQLContext' class.
[+] SQLContext.QueryExecution (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class became abstract. | A client program may be interrupted by an InstantiationError exception. |
[+] affected methods (2)
executePlan ( catalyst.plans.logical.LogicalPlan ) - Return value of this method has type 'SQLContext.QueryExecution'.
executeSql ( java.lang.String ) - Return value of this method has type 'SQLContext.QueryExecution'.
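Concrete-to-abstract changes only affect code that instantiates the class. A hypothetical sketch (QueryExecDemo is an invented name; QueryExecution is protected[sql] in 1.3.0, so this assumes a caller compiled inside the org.apache.spark.sql namespace, as Spark-internal clients are):

```scala
package org.apache.spark.sql

import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan

object QueryExecDemo {
  // Compiled against spark-sql 1.3.0, where QueryExecution is concrete.
  def execute(sqlContext: SQLContext, plan: LogicalPlan) =
    // On 1.0.0 the class is abstract, so the JVM rejects this `new`
    // with java.lang.InstantiationError.
    new sqlContext.QueryExecution(plan)
}
```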
package org.apache.spark.sql.execution
[+] LogicalRDD (1)
| # | Change | Effect |
|---|---|---|
| 1 | This class has been removed. | A client program may be interrupted by a NoClassDefFoundError exception. |
[+] affected methods (1)
LogicalRDD ( scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute>, org.apache.spark.rdd.RDD<org.apache.spark.sql.Row>, org.apache.spark.sql.SQLContext ) - This constructor is from 'LogicalRDD' class.
Problems with Data Types, Medium Severity (1)
spark-core_2.10-1.3.0.jar
package org.apache.spark.scheduler
[+] LiveListenerBus (1)
| # | Change | Effect |
|---|---|---|
| 1 | Removed super-class org.apache.spark.util.AsynchronousListenerBus<SparkListener,SparkListenerEvent>. | Access of a client program to the fields or methods of the old super-class may be interrupted by a NoSuchFieldError or NoSuchMethodError exception. |
[+] affected methods (1)
listenerBus ( ) - Return value of this method has type 'LiveListenerBus'.
Problems with Data Types, Low Severity (3)
spark-core_2.10-1.3.0.jar
package org.apache.spark
[+] TaskContext (3)
| # | Change | Effect |
|---|---|---|
| 1 | Abstract method partitionId ( ) became non-abstract. | Some methods in this class may change behavior. |
| 2 | Abstract method stageId ( ) became non-abstract. | Some methods in this class may change behavior. |
| 3 | Abstract method taskMetrics ( ) became non-abstract. | Some methods in this class may change behavior. |
[+] affected methods (3)
partitionId ( ) - This abstract method is from 'TaskContext' abstract class.
stageId ( ) - This abstract method is from 'TaskContext' abstract class.
taskMetrics ( ) - This abstract method is from 'TaskContext' abstract class.
Problems with Methods, Low Severity (3)
spark-core_2.10-1.3.0.jar, TaskContext
package org.apache.spark
[+] TaskContext.partitionId ( ) [abstract] : int (1)
[mangled: org/apache/spark/TaskContext.partitionId:()I]
| # | Change | Effect |
|---|---|---|
| 1 | Method became non-abstract. | A client program may change behavior. |
[+] TaskContext.stageId ( ) [abstract] : int (1)
[mangled: org/apache/spark/TaskContext.stageId:()I]
| # | Change | Effect |
|---|---|---|
| 1 | Method became non-abstract. | A client program may change behavior. |
[+] TaskContext.taskMetrics ( ) [abstract] : executor.TaskMetrics (1)
[mangled: org/apache/spark/TaskContext.taskMetrics:()Lorg/apache/spark/executor/TaskMetrics;]
| # | Change | Effect |
|---|---|---|
| 1 | Method became non-abstract. | A client program may change behavior. |
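These entries produce no linkage error, which is why they are rated low severity: the call sites resolve on both versions, but the method's origin changes. A minimal hypothetical sketch (MetricsProbe and metricsOf are invented names):

```scala
import org.apache.spark.TaskContext

object MetricsProbe {
  // Hypothetical call site; it links against both 1.3.0 and 1.0.0.
  def metricsOf(ctx: TaskContext) =
    // Abstract in 1.3.0 (always dispatches to the runtime subclass);
    // concrete in 1.0.0 (a base-class implementation exists and may be
    // the one that runs). Same bytecode, potentially different behavior.
    ctx.taskMetrics()
}
```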
Java ARchives (3)
spark-core_2.10-1.3.0.jar
spark-mllib_2.10-1.3.0.jar
spark-sql_2.10-1.3.0.jar
Generated on Thu Jun 4 22:47:54 2015 for sparkling-water-core_2.10-1.3.3 by Java API Compliance Checker 1.4.1
A tool for checking backward compatibility of a Java library API