sbt pom reader - reading a POM with sbt



To manage its dependencies in one place, Spark uses the sbt-pom-reader plugin: the sbt build compiles and publishes against the GAV (groupId/artifactId/version) coordinates read straight from the POM files. I recall that in older versions of Spark the dependencies were declared in project/SparkBuild.scala instead. This tool is broadly useful in Java/Scala projects.

The plugin lives at: https://github.com/sbt/sbt-pom-reader
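Spark pulls the plugin in as a source dependency (shown below), but for a typical project the more common route is the published plugin, added in project/plugins.sbt. A minimal sketch; the organization and version strings are assumptions for illustration, so check the plugin's README and release tags for the coordinates that match your sbt version:

```scala
// project/plugins.sbt
// Coordinates are an assumption for illustration; verify against the
// sbt-pom-reader releases for your sbt version.
addSbtPlugin("com.typesafe.sbt" % "sbt-pom-reader" % "1.0.1")
```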

1. Configure the plugin

/app/hadoop/shengli/spark1.3/project/project/SparkPluginBuild.scala

import sbt._
import sbt.Keys._

/**
 * This plugin project is there to define new scala style rules for spark. This is
 * a plugin project so that this gets compiled first and is put on the classpath and
 * becomes available for scalastyle sbt plugin.
 */
object SparkPluginDef extends Build {
  lazy val root = Project("plugins", file(".")) dependsOn(sparkStyle, sbtPomReader)
  lazy val sparkStyle = Project("spark-style", file("spark-style"), settings = styleSettings)
  lazy val sbtPomReader = uri("https://github.com/ScrapCodes/sbt-pom-reader.git#ignore_artifact_id")
  // There is actually no need to publish this artifact.
  def styleSettings = Defaults.defaultSettings ++ Seq (
    name                 :=  "spark-style",
    organization         :=  "org.apache.spark",
    scalaVersion         :=  "2.10.4",
    scalacOptions        :=  Seq("-unchecked", "-deprecation"),
    libraryDependencies  ++= Dependencies.scalaStyle
  )
  object Dependencies {
    val scalaStyle = Seq("org.scalastyle" %% "scalastyle" % "0.4.0")
  }
}

2. Define a custom build object extending PomBuild and configure the projects

Subprojects are referenced with ProjectRef. The general format:

// This is a heuristic, assuming we're running sbt in the same directory as the build.
val buildLocation = (file(".").getAbsoluteFile.getParentFile)

// Here we define a reference to a subproject. The string "subproject" refers to the
// artifact id of the subproject.
val subproject = ProjectRef(buildLocation, "subproject")

// Disable all scalac arguments when running the REPL.
scalacOptions in subproject in Compile in console := Seq.empty

Usage in Spark

In SparkBuild.scala:

...
import sbt._
import com.typesafe.sbt.pom.{loadEffectivePom, PomBuild, SbtPomKeys}
...
private val buildLocation = file(".").getAbsoluteFile.getParentFile

val allProjects@Seq(bagel, catalyst, core, graphx, hive, hiveThriftServer, mllib, repl,
    sql, networkCommon, networkShuffle, streaming, streamingFlumeSink, streamingFlume, streamingKafka,
    streamingMqtt, streamingTwitter, streamingZeromq) =
  Seq("bagel", "catalyst", "core", "graphx", "hive", "hive-thriftserver", "mllib", "repl",
    "sql", "network-common", "network-shuffle", "streaming", "streaming-flume-sink",
    "streaming-flume", "streaming-kafka", "streaming-mqtt", "streaming-twitter",
    "streaming-zeromq").map(ProjectRef(buildLocation, _))

val optionallyEnabledProjects@Seq(yarn, yarnStable, java8Tests, sparkGangliaLgpl,
    sparkKinesisAsl) = Seq("yarn", "yarn-stable", "java8-tests", "ganglia-lgpl",
    "kinesis-asl").map(ProjectRef(buildLocation, _))

val assemblyProjects@Seq(assembly, examples, networkYarn, streamingKafkaAssembly) =
  Seq("assembly", "examples", "network-yarn", "streaming-kafka-assembly")
    .map(ProjectRef(buildLocation, _))
...
object SparkBuild extends PomBuild {
...
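A side note on the allProjects@Seq(...) syntax above: Scala's `x @ pattern` binds the whole matched value to `x` while simultaneously destructuring it, which is how SparkBuild gets both the full list of project references and one named val per project out of a single expression. A standalone sketch of the idiom (the module names here are illustrative, not taken from Spark):

```scala
object PatternBindDemo extends App {
  // modules gets the whole Seq; core, sql and mllib each bind one element.
  // Note: the pattern throws a MatchError if the Seq does not have exactly
  // three elements, so this style is only safe for fixed-size sequences.
  val modules @ Seq(core, sql, mllib) = Seq("core", "sql", "mllib")
  println(modules.size) // 3
  println(sql)          // sql
}
```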

Original article; when reposting, please credit: http://blog.csdn.net/oopsoom/article/details/45148381
