Spark - Create an empty RDD
Configuration for a Spark application. Used to set various Spark parameters as key-value pairs.
Most of the time, you would create a SparkConf object with new SparkConf(), which will load values from any spark.* Java system properties set in your application as well. In this case, parameters you set directly on the SparkConf object take priority over system properties.

For unit tests, you can also call new SparkConf(false) to skip loading external settings and get the same configuration no matter what the system properties are.

All setter methods in this class support chaining. For example, you can write new SparkConf().setMaster("local").setAppName("My app").
Note that once a SparkConf object is passed to Spark, it is cloned and can no longer be modified by the user. Spark does not support modifying the configuration at runtime.
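For instance, a minimal SparkConf built with chained setters might look like this sketch (the master URL and application name are placeholder values):

import org.apache.spark.SparkConf

// each setter returns the SparkConf itself, so calls can be chained
val conf = new SparkConf()
  .setMaster("local[2]")           // run locally with 2 threads
  .setAppName("EmptyRDDExample")   // shown in the Spark UI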
Main entry point for Spark functionality. A SparkContext represents the connection to a Spark cluster, and can be used to create RDDs, accumulators and broadcast variables on that cluster.
Only one SparkContext may be active per JVM. You must stop() the active SparkContext before creating a new one. This limitation may eventually be removed.
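As a sketch of that lifecycle (the names here are illustrative), a program creates one SparkContext from a SparkConf, uses it, and stops it before another can be created:

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("demo"))
// ... use sc to create RDDs, accumulators and broadcast variables ...
sc.stop() // must be called before a new SparkContext may be created in this JVM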
SparkContext.emptyRDD: Get an RDD that has no partitions or elements.

RDD.getNumPartitions: Returns the number of partitions of this RDD.

Both methods are demonstrated in the example below.
1. Prerequisites

· JDK 1.7 or higher
· Scala 2.10.3
2. Example:

The following example illustrates how to create an empty RDD in Spark using Scala.
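A minimal sketch of such a program, assuming a local master (local[2]) and a placeholder app name; the line numbers printed in the output below depend on the file's exact layout:

import org.apache.spark.{SparkConf, SparkContext}

object CreateEmptyRDD {

  def main(args: Array[String]): Unit = {
    // configure and start Spark locally; app name and master are placeholders
    val conf = new SparkConf()
      .setAppName("Spark - Create empty RDD")
      .setMaster("local[2]")
    val sc = new SparkContext(conf)

    // empty RDD with no elements and no partitions (element type is Nothing)
    val rdd = sc.emptyRDD
    println("rdd: " + rdd)

    // empty RDD of an explicit element type, e.g. String
    val rddStr = sc.emptyRDD[String]
    println("rddStr: " + rddStr)

    // an empty RDD has zero partitions
    println("Num of Partitions: " + rdd.getNumPartitions)

    sc.stop()
  }
}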
Save the file as CreateEmptyRDD.scala.
Build the project:

mvn clean install

Then run the program as a Scala application. You should see output similar to the following:
rdd: EmptyRDD at emptyRDD at CreateEmptyRDD.scala:13
rddStr: EmptyRDD at emptyRDD at CreateEmptyRDD.scala:18
Num of Partitions: 0
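Note that creating an empty RDD with emptyRDD is not the same as parallelizing an empty collection: the latter also has no elements, but it is still split into the default number of partitions. A quick sketch, assuming the same SparkContext sc as above:

val rddEmptySeq = sc.parallelize(Seq.empty[String])
println(rddEmptySeq.getNumPartitions) // defaultParallelism, e.g. 2 on local[2]
println(rddEmptySeq.isEmpty)          // true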