Packaging and Running Your Own Program with Spark 1.0.1 and sbt (with Troubleshooting)

Source: Internet | Editor: 程序博客网 | Time: 2024/06/16 03:40

1. Install sbt and add it to your PATH

http://www.scala-sbt.org/0.13/tutorial/Manual-Installation.html

2. Create simple.sbt

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.1"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

(In sbt 0.13, each setting in a .sbt file must be separated by a blank line.)

3. Create src/main/scala/SimpleApp.scala

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "../README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

4. The project layout should now look like this:

./simple.sbt
./src/main/scala/SimpleApp.scala
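Starting from an empty project directory, that layout can be created as follows (the file contents come from steps 2 and 3):

```shell
# Create the standard sbt source tree; simple.sbt sits at the project root.
mkdir -p src/main/scala
touch simple.sbt src/main/scala/SimpleApp.scala
```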


5. Run sbt package

6. After the build, two new directories appear: project/ and target/ (the packaged jar is written under target/scala-2.10/).
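The build step can be sketched as below; the exact jar file name follows sbt's `<name>_<scalaBinaryVersion>-<version>.jar` convention, so the name shown is an assumption derived from the simple.sbt above.

```shell
# Run the build only if sbt is available; requires simple.sbt and the
# src/ layout from the previous steps.
if command -v sbt >/dev/null 2>&1; then
  sbt package
  # Expected artifact per sbt's naming convention (an assumption):
  ls target/scala-2.10/simple-project_2.10-1.0.jar
  built=yes
else
  echo "sbt not on PATH; install it first (step 1)"
  built=no
fi
```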

7. Run the application from the command line
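The original post does not show the exact command, so here is a minimal sketch using spark-submit (shipped with Spark since 1.0). The `--class` name comes from the code above; the jar path and the `local[2]` master are assumptions, so adjust them to your setup.

```shell
# Submit the packaged app. Run from the project root so the relative
# jar path resolves; master URL and jar name are assumptions.
if command -v spark-submit >/dev/null 2>&1; then
  spark-submit \
    --class SimpleApp \
    --master local[2] \
    target/scala-2.10/simple-project_2.10-1.0.jar
  submitted=yes
else
  echo "spark-submit not on PATH; it lives in \$SPARK_HOME/bin"
  submitted=no
fi
```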


8. The run failed with an error; the logs show that the input file does not exist


9. Upload README.md to the correct path and run again
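When the job runs against a cluster, "the correct path" typically means HDFS. A hedged sketch of the upload is below; the destination directory is an assumption and must match whatever the `logFile` path in the code resolves to.

```shell
# Copy README.md into HDFS so sc.textFile can find it; the target
# directory here is a guess -- align it with the logFile path in the code.
if command -v hadoop >/dev/null 2>&1; then
  hadoop fs -put README.md /user/"$USER"/
  uploaded=yes
else
  echo "hadoop CLI not found; for local runs, place README.md at ../README.md"
  uploaded=no
fi
```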


10. On success, the program prints one line of the form "Lines with a: N, Lines with b: N" (the counts depend on the contents of README.md).

