Spark development in IntelliJ IDEA without sbt

Source: Internet | Editor: 程序博客网 | Date: 2024/05/11 18:36
  • Create a non-sbt Scala project
  • Add the Spark jar
File -> Project Structure -> Libraries, then add spark-assembly-1.5.2-hadoop2.6.0.jar
  • Write the code
import scala.math.random
import org.apache.spark._

/**
 * Created by code-pc on 16/3/2.
 */
object test1 {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Spark Pi").setMaster("local")
    val spark = new SparkContext(conf)
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = 100000 * slices
    val count = spark.parallelize(1 to n, slices).map { i =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / n)
    spark.stop()
  }
}
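The job above estimates Pi by Monte Carlo sampling: a point drawn uniformly from the square [-1, 1) x [-1, 1) lands inside the unit circle with probability π/4, so 4 × (hits / samples) approximates π. As a minimal sketch of the same estimator in plain Scala (no Spark, single process; `PiSketch` is a hypothetical name used only for illustration):

```scala
import scala.math.random

object PiSketch {
  def main(args: Array[String]): Unit = {
    val n = 200000
    // Draw n points uniformly in [-1, 1) x [-1, 1) and count those inside the unit circle.
    val count = (1 to n).count { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      x * x + y * y < 1
    }
    val pi = 4.0 * count / n
    println("Pi is roughly " + pi)
    // Monte Carlo error shrinks roughly as 1/sqrt(n), so this loose check should hold.
    assert(math.abs(pi - math.Pi) < 0.1)
  }
}
```

The Spark version distributes exactly this sampling loop across `slices` partitions via `parallelize` and sums the per-partition hits with `reduce(_ + _)`.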
  • Build the jar
File -> Project Structure -> Artifacts -> + -> Jars -> From modules with dependencies
Menu bar: Build -> Build Artifacts
  • Run
./spark-submit --class test1 --master local ~/IdeaProjects/test1/out/artifacts/Pi/test1.jar
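The command above uses the defaults for a local run. A sketch of the same submission with a few more common spark-submit options (the class name and jar path come from the steps above; the extra flags are standard spark-submit options, and the trailing 10 is the optional slices argument the program reads as args(0)):

```shell
# Run the Pi job on 4 local cores, with 1 GB of driver memory and 10 slices.
./spark-submit \
  --class test1 \
  --master "local[4]" \
  --driver-memory 1g \
  ~/IdeaProjects/test1/out/artifacts/Pi/test1.jar 10
```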