case class SparkSimulation(local: Boolean = false, cores: Int = 4) extends Product with Serializable
A Spark session utility wrapper.
This session can be used to parallelize simulations. When used with local = true, it requires a Spark driver installed on the local machine. See the Apache Spark website https://spark.apache.org/downloads.html for installation instructions.
- local
  Flag to specify whether to run in local mode.
- cores
  Number of cores to use in local mode; the default is 4. Set to 0 to request the maximum number of cores.
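As a minimal construction sketch (assuming lanag-core and Spark are on the classpath; the import path is omitted here):

```scala
// Run Spark locally, using 2 cores.
val sim = SparkSimulation(local = true, cores = 2)

// The underlying SparkContext is available via `context`.
println(sim.context.appName)
```

With local = false (the default), the session is expected to attach to an externally configured Spark cluster instead.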
Instance Constructors
- new SparkSimulation(local: Boolean = false, cores: Int = 4)
Value Members
- def context: SparkContext
  Returns the local Spark context.
- val cores: Int
- val local: Boolean
- def parallelize[T](seq: Seq[T])(implicit arg0: ClassTag[T]): RDD[T]
  Parallelizes a sequence of type T.
  Use this to parallelize simulations and run them in a Spark session. All transformations applied to the resulting org.apache.spark.rdd.RDD are lazily executed in parallel.
  - T
    The element type of the sequence; usually inferred.
  - seq
    Sequence of type T to be parallelized.
  - returns
    A resilient distributed dataset org.apache.spark.rdd.RDD.
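A sketch of how parallelize might be used for a parameter sweep (assuming lanag-core and Spark are on the classpath; the squaring step is a stand-in for a real simulation run):

```scala
val sim = SparkSimulation(local = true, cores = 4)

// Distribute a parameter space over the Spark workers.
val parameterSpace: Seq[Int] = 1 to 100
val rdd = sim.parallelize(parameterSpace)

// Transformations are lazy; nothing executes until an action such as collect().
val results = rdd.map(p => p * p).collect()
```

Because RDD transformations are lazy, several map/filter steps can be chained before triggering a single distributed computation with an action.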
- def shutdown(): Unit
  Shuts down the Spark session.
  Use this to cleanly close the Spark session and ensure all jobs have finished properly. After shutdown, no further sequences can be parallelized.
- val spark: SparkSession
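A sketch of a full session lifecycle, shutting down even if a job fails (assuming lanag-core and Spark are on the classpath):

```scala
val sim = SparkSimulation(local = true)
try {
  val results = sim.parallelize(Seq(1, 2, 3)).map(_ * 2).collect()
  println(results.mkString(", "))
} finally {
  sim.shutdown() // no further sequences can be parallelized after this
}
```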
Language Agents Simulation framework
For detailed documentation on using the framework, see the README.md file at the GitHub repository: https://github.com/markblokpoel/lanag-core.