Testing Spark 1.3.0




[jifeng@feng02 spark-1.3.0-bin-2.4.1]$ master=spark://feng02:7077 ./bin/spark-shellSpark assembly has been built with Hive, including Datanucleus jars on classpathlog4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).log4j:WARN Please initialize the log4j system properly.log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties15/03/03 15:30:26 INFO SecurityManager: Changing view acls to: jifeng15/03/03 15:30:26 INFO SecurityManager: Changing modify acls to: jifeng15/03/03 15:30:26 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jifeng); users with modify permissions: Set(jifeng)15/03/03 15:30:26 INFO HttpServer: Starting HTTP Server15/03/03 15:30:27 INFO Server: jetty-8.y.z-SNAPSHOT15/03/03 15:30:27 INFO AbstractConnector: Started SocketConnector@0.0.0.0:4984815/03/03 15:30:27 INFO Utils: Successfully started service 'HTTP class server' on port 49848.Welcome to      ____              __     / __/__  ___ _____/ /__    _\ \/ _ \/ _ `/ __/  '_/   /___/ .__/\_,_/_/ /_/\_\   version 1.3.0      /_/Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_45)Type in expressions to have them evaluated.Type :help for more information.15/03/03 15:30:33 INFO SparkContext: Running Spark version 1.3.015/03/03 15:30:33 INFO SecurityManager: Changing view acls to: jifeng15/03/03 15:30:33 INFO SecurityManager: Changing modify acls to: jifeng15/03/03 15:30:33 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jifeng); users with modify permissions: Set(jifeng)15/03/03 15:30:33 INFO Slf4jLogger: Slf4jLogger started15/03/03 15:30:34 INFO Remoting: Starting remoting15/03/03 15:30:34 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@feng02:36823]15/03/03 15:30:34 INFO Utils: Successfully started service 'sparkDriver' on port 36823.15/03/03 15:30:34 INFO SparkEnv: Registering MapOutputTracker15/03/03 15:30:34 INFO SparkEnv: Registering BlockManagerMaster15/03/03 15:30:34 INFO DiskBlockManager: Created local directory at /tmp/spark-303ca9ef-f76b-4c9c-a315-a10351265aa4/blockmgr-2c1283b6-9466-40f2-83af-bc408a5e4ec015/03/03 15:30:34 INFO MemoryStore: MemoryStore started with capacity 267.3 MB15/03/03 15:30:34 INFO HttpFileServer: HTTP File server directory is /tmp/spark-e7ee2d00-4c21-4221-9f65-3e86d509235a/httpd-729605bc-2cdf-4d64-9b83-842bfedc522315/03/03 15:30:34 INFO HttpServer: Starting HTTP Server15/03/03 15:30:34 INFO Server: jetty-8.y.z-SNAPSHOT15/03/03 15:30:34 INFO AbstractConnector: Started SocketConnector@0.0.0.0:3873415/03/03 15:30:34 INFO Utils: Successfully started service 'HTTP file server' on port 38734.15/03/03 15:30:34 INFO SparkEnv: Registering OutputCommitCoordinator15/03/03 15:30:34 INFO Server: jetty-8.y.z-SNAPSHOT15/03/03 15:30:35 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:404015/03/03 15:30:35 INFO Utils: Successfully started service 'SparkUI' on port 4040.15/03/03 15:30:35 INFO SparkUI: Started SparkUI at http://feng02:404015/03/03 15:30:35 INFO Executor: Starting executor ID <driver> on host localhost15/03/03 15:30:35 INFO Executor: Using REPL class URI: http://10.6.3.201:4984815/03/03 15:30:35 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@feng02:36823/user/HeartbeatReceiver15/03/03 15:30:35 INFO 
NettyBlockTransferService: Server created on 51265
15/03/03 15:30:35 INFO BlockManagerMaster: Trying to register BlockManager
15/03/03 15:30:35 INFO BlockManagerMasterActor: Registering block manager localhost:51265 with 267.3 MB RAM, BlockManagerId(<driver>, localhost, 51265)
15/03/03 15:30:35 INFO BlockManagerMaster: Registered BlockManager
15/03/03 15:30:35 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/03/03 15:30:36 INFO SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.

scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
sqlContext: org.apache.spark.sql.SQLContext = org.apache.spark.sql.SQLContext@7e6c0e3

scala> import sqlContext.createSchemaRDD
<console>:23: error: value createSchemaRDD is not a member of org.apache.spark.sql.SQLContext
       import sqlContext.createSchemaRDD
              ^

scala> case class Person(name: String, age: Int)
defined class Person

scala> val people = sc.textFile("examples/src/main/resources/people.txt").map(_.split(",")).map(p => Person(p(0), p(1).trim.toInt))
15/03/03 15:42:12 INFO MemoryStore: ensureFreeSpace(163705) called with curMem=0, maxMem=280248975
15/03/03 15:42:12 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 159.9 KB, free 267.1 MB)
15/03/03 15:42:12 INFO MemoryStore: ensureFreeSpace(22736) called with curMem=163705, maxMem=280248975
15/03/03 15:42:12 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 22.2 KB, free 267.1 MB)
15/03/03 15:42:12 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:51265 (size: 22.2 KB, free: 267.2 MB)
15/03/03 15:42:12 INFO BlockManagerMaster: Updated info of block broadcast_0_piece0
15/03/03 15:42:12 INFO SparkContext: Created broadcast 0 from textFile at <console>:23
people: org.apache.spark.rdd.RDD[Person] = MapPartitionsRDD[3] at map at <console>:23

scala> people.registerTempTable("people")
<console>:26: error: value registerTempTable is not a member of org.apache.spark.rdd.RDD[Person]
              people.registerTempTable("people")
                     ^

scala> val teenagers = sqlContext.sql("SELECT name FROM people WHERE age >= 13 AND age <= 19")
java.lang.RuntimeException: Table Not Found: people
        at scala.sys.package$.error(package.scala:27)
        at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:111)
        at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:111)
        at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
        at scala.collection.AbstractMap.getOrElse(Map.scala:58)
        at org.apache.spark.sql.catalyst.analysis.SimpleCatalog.lookupRelation(Catalog.scala:111)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:258)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$8.applyOrElse(Analyzer.scala:270)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$8.applyOrElse(Analyzer.scala:265)
        at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:187)
        at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:187)
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:50)
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:186)
        at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:207)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)        at scala.collection.Iterator$class.foreach(Iterator.scala:727)        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)        at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)        at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)        at scala.collection.AbstractIterator.to(Iterator.scala:1157)        at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)        at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)        at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)        at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)        at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenDown(TreeNode.scala:236)        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:192)        at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:207)        at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)        at scala.collection.Iterator$class.foreach(Iterator.scala:727)        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)        at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)        at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)        at scala.collection.AbstractIterator.to(Iterator.scala:1157)        at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)        at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)        at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)        at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)        at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenDown(TreeNode.scala:236)        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:192)        at org.apache.spark.sql.catalyst.trees.TreeNode.transform(TreeNode.scala:177)        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:265)        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:255)        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1$$anonfun$apply$2.apply(RuleExecutor.scala:61)        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1$$anonfun$apply$2.apply(RuleExecutor.scala:59)        at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)        at scala.collection.immutable.List.foldLeft(List.scala:84)        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1.apply(RuleExecutor.scala:59)        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1.apply(RuleExecutor.scala:51)        at scala.collection.immutable.List.foreach(List.scala:318)        at org.apache.spark.sql.catalyst.rules.RuleExecutor.apply(RuleExecutor.scala:51)        at org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:1063)        at org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:1063)    
    at org.apache.spark.sql.DataFrameImpl.<init>(DataFrameImpl.scala:63)        at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:39)        at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:915)        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:23)        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:28)        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:30)        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:32)        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:34)        at $iwC$$iwC$$iwC.<init>(<console>:36)        at $iwC$$iwC.<init>(<console>:38)        at $iwC.<init>(<console>:40)        at <init>(<console>:42)        at .<init>(<console>:46)        at .<clinit>(<console>)        at .<init>(<console>:7)        at .<clinit>(<console>)        at $print(<console>)        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)        at java.lang.reflect.Method.invoke(Method.java:606)        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)        at org.apache.spark.repl.Main$.main(Main.scala:31)        at org.apache.spark.repl.Main.main(Main.scala)        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)        at java.lang.reflect.Method.invoke(Method.java:606)        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)scala> val 
sqlContext = new org.apache.spark.sql.SQLContext(sc)
sqlContext: org.apache.spark.sql.SQLContext = org.apache.spark.sql.SQLContext@50f71e2b

scala> val people = sc.textFile("examples/src/main/resources/people.txt")val people = sc.textFile("examples/src/main/resources/people.txt")
<console>:1: error: ';' expected but 'val' found.
       val people = sc.textFile("examples/src/main/resources/people.txt")val people = sc.textFile("examples/src/main/resources/people.txt")
                                                                         ^

scala> val people = sc.textFile("examples/src/main/resources/people.txt")
15/03/03 16:16:00 INFO MemoryStore: ensureFreeSpace(163753) called with curMem=186441, maxMem=280248975
15/03/03 16:16:00 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 159.9 KB, free 266.9 MB)
15/03/03 16:16:00 INFO MemoryStore: ensureFreeSpace(22736) called with curMem=350194, maxMem=280248975
15/03/03 16:16:00 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 22.2 KB, free 266.9 MB)
15/03/03 16:16:00 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:51265 (size: 22.2 KB, free: 267.2 MB)
15/03/03 16:16:00 INFO BlockManagerMaster: Updated info of block broadcast_1_piece0
15/03/03 16:16:00 INFO SparkContext: Created broadcast 1 from textFile at <console>:21
people: org.apache.spark.rdd.RDD[String] = examples/src/main/resources/people.txt MapPartitionsRDD[5] at textFile at <console>:21

scala> val schemaString = "name age"
schemaString: String = name age

scala> import org.apache.spark.sql._
import org.apache.spark.sql._

scala> val schema =
     |   StructType(
     |     schemaString.split(" ").map(fieldName => StructField(fieldName, StringType, true)))
<console>:25: error: not found: value StructType
         StructType(
         ^

scala> val schema = StructType(schemaString.split(" ").map(fieldName => StructField(fieldName, StringType, true)))
<console>:24: error: not found: value StructType
       val schema = StructType(schemaString.split(" ").map(fieldName => StructField(fieldName, StringType, true)))
                    ^

scala> val rowRDD = people.map(_.split(",")).map(p => Row(p(0), p(1).trim))
rowRDD: org.apache.spark.rdd.RDD[org.apache.spark.sql.Row] = MapPartitionsRDD[7] at map at <console>:26

scala> val peopleSchemaRDD = sqlContext.applySchema(rowRDD, schema)
<console>:30: error: package schema is not a value
       val peopleSchemaRDD = sqlContext.applySchema(rowRDD, schema)
                                                            ^

scala> peopleSchemaRDD.registerTempTable("people")
<console>:23: error: not found: value peopleSchemaRDD
              peopleSchemaRDD.registerTempTable("people")
              ^

scala> val results = sqlContext.sql("SELECT name FROM people")
java.lang.RuntimeException: Table Not Found: people
        at scala.sys.package$.error(package.scala:27)
        at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:111)
        at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:111)
        at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
        at scala.collection.AbstractMap.getOrElse(Map.scala:58)
        at org.apache.spark.sql.catalyst.analysis.SimpleCatalog.lookupRelation(Catalog.scala:111)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:258)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$8.applyOrElse(Analyzer.scala:270)
  at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$8.applyOrElse(Analyzer.scala:265)        at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:187)        at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:187)        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:50)        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:186)        at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:207)        at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)        at scala.collection.Iterator$class.foreach(Iterator.scala:727)        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)        at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)        at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)        at scala.collection.AbstractIterator.to(Iterator.scala:1157)        at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)        at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)        at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)        at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)        at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenDown(TreeNode.scala:236)        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:192)        at org.apache.spark.sql.catalyst.trees.TreeNode.transform(TreeNode.scala:177)        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:265)        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:255)        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1$$anonfun$apply$2.apply(RuleExecutor.scala:61)        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1$$anonfun$apply$2.apply(RuleExecutor.scala:59)        at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)        at scala.collection.immutable.List.foldLeft(List.scala:84)        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1.apply(RuleExecutor.scala:59)        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$apply$1.apply(RuleExecutor.scala:51)        at scala.collection.immutable.List.foreach(List.scala:318)        at org.apache.spark.sql.catalyst.rules.RuleExecutor.apply(RuleExecutor.scala:51)        at org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:1063)        at org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:1063)        at org.apache.spark.sql.DataFrameImpl.<init>(DataFrameImpl.scala:63)        at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:39)        at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:915)        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:26)        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)   
     at $iwC$$iwC$$iwC$$iwC.<init>(<console>:41)        at $iwC$$iwC$$iwC.<init>(<console>:43)        at $iwC$$iwC.<init>(<console>:45)        at $iwC.<init>(<console>:47)        at <init>(<console>:49)        at .<init>(<console>:53)        at .<clinit>(<console>)        at .<init>(<console>:7)        at .<clinit>(<console>)        at $print(<console>)        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)        at java.lang.reflect.Method.invoke(Method.java:606)        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)        at org.apache.spark.repl.Main$.main(Main.scala:31)        at org.apache.spark.repl.Main.main(Main.scala)        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)        at java.lang.reflect.Method.invoke(Method.java:606)        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)scala> results.map(t => "Name: " + t(0)).collect().foreach(println)<console>:23: error: not found: value results              results.map(t => "Name: " + t(0)).collect().foreach(println)              ^scala> rowRDD.collect()15/03/03 16:19:12 INFO FileInputFormat: Total input paths to process : 115/03/03 16:19:12 INFO SparkContext: Starting job: collect at <console>:2915/03/03 16:19:12 INFO DAGScheduler: Got job 0 (collect at <console>:29) with 1 output partitions 
(allowLocal=false)
15/03/03 16:19:12 INFO DAGScheduler: Final stage: Stage 0(collect at <console>:29)
15/03/03 16:19:12 INFO DAGScheduler: Parents of final stage: List()
15/03/03 16:19:12 INFO DAGScheduler: Missing parents: List()
15/03/03 16:19:12 INFO DAGScheduler: Submitting Stage 0 (MapPartitionsRDD[7] at map at <console>:26), which has no missing parents
15/03/03 16:19:12 INFO MemoryStore: ensureFreeSpace(3056) called with curMem=372930, maxMem=280248975
15/03/03 16:19:12 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 3.0 KB, free 266.9 MB)
15/03/03 16:19:12 INFO MemoryStore: ensureFreeSpace(2122) called with curMem=375986, maxMem=280248975
15/03/03 16:19:12 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 2.1 KB, free 266.9 MB)
15/03/03 16:19:12 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on localhost:51265 (size: 2.1 KB, free: 267.2 MB)
15/03/03 16:19:12 INFO BlockManagerMaster: Updated info of block broadcast_2_piece0
15/03/03 16:19:12 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:839
15/03/03 16:19:12 INFO DAGScheduler: Submitting 1 missing tasks from Stage 0 (MapPartitionsRDD[7] at map at <console>:26)
15/03/03 16:19:12 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
15/03/03 16:19:12 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1349 bytes)
15/03/03 16:19:12 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
15/03/03 16:19:12 INFO HadoopRDD: Input split: file:/home/jifeng/hadoop/spark-1.3.0-bin-2.4.1/examples/src/main/resources/people.txt:0+32
15/03/03 16:19:12 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
15/03/03 16:19:12 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
15/03/03 16:19:12 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
15/03/03 16:19:12 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
15/03/03 16:19:12 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
15/03/03 16:19:12 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 2016 bytes result sent to driver
15/03/03 16:19:12 INFO DAGScheduler: Stage 0 (collect at <console>:29) finished in 0.196 s
15/03/03 16:19:12 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 147 ms on localhost (1/1)
15/03/03 16:19:12 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
15/03/03 16:19:12 INFO DAGScheduler: Job 0 finished: collect at <console>:29, took 0.348604 s
res3: Array[org.apache.spark.sql.Row] = Array([Michael,29], [Andy,30], [Justin,19])
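What the session above shows is not a broken installation but the SQL API change in Spark 1.3.0, where SchemaRDD was replaced by DataFrame. createSchemaRDD is no longer a member of SQLContext, a plain RDD of case classes no longer exposes registerTempTable (which is why the later query fails with "Table Not Found: people"), and StructType / StructField / StringType moved out of org.apache.spark.sql into org.apache.spark.sql.types, so the programmatic-schema attempt never managed to define the schema value. The final rowRDD.collect() returning Array([Michael,29], [Andy,30], [Justin,19]) confirms the data parsing itself was fine. As a side note, the "master=spark://feng02:7077" prefix only sets an unused shell variable; the log line "Starting executor ID <driver> on host localhost" suggests the shell actually ran in local mode, and "./bin/spark-shell --master spark://feng02:7077" would be needed to attach to the standalone master. Below is a minimal 1.3.0-style sketch of both approaches, based on the 1.3.0 SQL programming guide rather than on commands taken from the original session:

// Spark 1.3.0: infer the schema from a case class and convert the RDD to a DataFrame
import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc)
import sqlContext.implicits._                      // enables rdd.toDF()

case class Person(name: String, age: Int)

val people = sc.textFile("examples/src/main/resources/people.txt")
  .map(_.split(","))
  .map(p => Person(p(0), p(1).trim.toInt))
  .toDF()                                          // DataFrame, not a bare RDD

people.registerTempTable("people")                 // registerTempTable lives on DataFrame in 1.3

val teenagers = sqlContext.sql("SELECT name FROM people WHERE age >= 13 AND age <= 19")
teenagers.map(t => "Name: " + t(0)).collect().foreach(println)

// Spark 1.3.0: programmatic schema; the type classes now live in org.apache.spark.sql.types
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StructType, StructField, StringType}

val schemaString = "name age"
val schema = StructType(schemaString.split(" ").map(fieldName => StructField(fieldName, StringType, true)))

val rowRDD = sc.textFile("examples/src/main/resources/people.txt")
  .map(_.split(","))
  .map(p => Row(p(0), p(1).trim))

// applySchema is deprecated in 1.3.0; createDataFrame(rowRDD, schema) is the replacement
val peopleDF = sqlContext.createDataFrame(rowRDD, schema)
peopleDF.registerTempTable("people")               // re-registering replaces the earlier temp table

val results = sqlContext.sql("SELECT name FROM people")
results.map(t => "Name: " + t(0)).collect().foreach(println)

With the bundled people.txt shown above, the first query should print only "Name: Justin" (the only age in 13-19), and the second should print all three names.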


Hive Tables

[jifeng@feng02 spark-1.3.0-bin-2.4.1]$ master=spark://feng02:7077 ./bin/spark-shellSpark assembly has been built with Hive, including Datanucleus jars on classpathlog4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).log4j:WARN Please initialize the log4j system properly.log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties15/03/03 22:55:12 INFO SecurityManager: Changing view acls to: jifeng15/03/03 22:55:12 INFO SecurityManager: Changing modify acls to: jifeng15/03/03 22:55:12 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jifeng); users with modify permissions: Set(jifeng)15/03/03 22:55:12 INFO HttpServer: Starting HTTP Server15/03/03 22:55:12 INFO Server: jetty-8.y.z-SNAPSHOT15/03/03 22:55:12 INFO AbstractConnector: Started SocketConnector@0.0.0.0:5629815/03/03 22:55:12 INFO Utils: Successfully started service 'HTTP class server' on port 56298.Welcome to      ____              __     / __/__  ___ _____/ /__    _\ \/ _ \/ _ `/ __/  '_/   /___/ .__/\_,_/_/ /_/\_\   version 1.3.0      /_/Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_45)Type in expressions to have them evaluated.Type :help for more information.15/03/03 22:55:18 INFO SparkContext: Running Spark version 1.3.015/03/03 22:55:18 INFO SecurityManager: Changing view acls to: jifeng15/03/03 22:55:18 INFO SecurityManager: Changing modify acls to: jifeng15/03/03 22:55:18 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jifeng); users with modify permissions: Set(jifeng)15/03/03 22:55:19 INFO Slf4jLogger: Slf4jLogger started15/03/03 22:55:19 INFO Remoting: Starting remoting15/03/03 22:55:19 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@feng02:59702]15/03/03 22:55:19 INFO Utils: Successfully started service 'sparkDriver' on port 59702.15/03/03 22:55:19 INFO SparkEnv: Registering MapOutputTracker15/03/03 22:55:19 INFO SparkEnv: Registering BlockManagerMaster15/03/03 22:55:19 INFO DiskBlockManager: Created local directory at /tmp/spark-cbda6663-0eeb-46f3-942a-b02b202adb53/blockmgr-3cf465f3-5759-49a5-8d7e-4b35d57e9a4a15/03/03 22:55:19 INFO MemoryStore: MemoryStore started with capacity 267.3 MB15/03/03 22:55:19 INFO HttpFileServer: HTTP File server directory is /tmp/spark-529754e8-50f5-45cb-aa62-4f55a6035c61/httpd-d35daa35-8e66-41a9-aa84-dd40e227359915/03/03 22:55:19 INFO HttpServer: Starting HTTP Server15/03/03 22:55:19 INFO Server: jetty-8.y.z-SNAPSHOT15/03/03 22:55:19 INFO AbstractConnector: Started SocketConnector@0.0.0.0:4219715/03/03 22:55:19 INFO Utils: Successfully started service 'HTTP file server' on port 42197.15/03/03 22:55:19 INFO SparkEnv: Registering OutputCommitCoordinator15/03/03 22:55:20 INFO Server: jetty-8.y.z-SNAPSHOT15/03/03 22:55:20 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:404015/03/03 22:55:20 INFO Utils: Successfully started service 'SparkUI' on port 4040.15/03/03 22:55:20 INFO SparkUI: Started SparkUI at http://feng02:404015/03/03 22:55:20 INFO Executor: Starting executor ID <driver> on host localhost15/03/03 22:55:20 INFO Executor: Using REPL class URI: http://10.6.3.201:5629815/03/03 22:55:20 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@feng02:59702/user/HeartbeatReceiver15/03/03 22:55:20 INFO 
NettyBlockTransferService: Server created on 41403
15/03/03 22:55:20 INFO BlockManagerMaster: Trying to register BlockManager
15/03/03 22:55:20 INFO BlockManagerMasterActor: Registering block manager localhost:41403 with 267.3 MB RAM, BlockManagerId(<driver>, localhost, 41403)
15/03/03 22:55:20 INFO BlockManagerMaster: Registered BlockManager
15/03/03 22:55:20 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/03/03 22:55:21 INFO SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.

scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@3d9ece38

scala> sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
15/03/03 22:55:59 WARN HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect.  Use hive.hmshandler.retry.* instead
15/03/03 22:55:59 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/03/03 22:55:59 INFO ObjectStore: ObjectStore, initialize called
15/03/03 22:56:00 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
15/03/03 22:56:00 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/03/03 22:56:00 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
        at org.apache.spark.sql.hive.HiveContext$$anonfun$5.apply(HiveContext.scala:216)
        at org.apache.spark.sql.hive.HiveContext$$anonfun$5.apply(HiveContext.scala:212)
        at scala.Option.orElse(Option.scala:257)
        at org.apache.spark.sql.hive.HiveContext.x$4$lzycompute(HiveContext.scala:212)
        at org.apache.spark.sql.hive.HiveContext.x$4(HiveContext.scala:210)
        at org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:210)
        at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:210)
        at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:71)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
        at $iwC$$iwC$$iwC.<init>(<console>:37)
        at $iwC$$iwC.<init>(<console>:39)
        at $iwC.<init>(<console>:41)
        at <init>(<console>:43)
        at .<init>(<console>:47)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)        at org.apache.spark.repl.Main$.main(Main.scala:31)        at org.apache.spark.repl.Main.main(Main.scala)        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)        at java.lang.reflect.Method.invoke(Method.java:606)        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)        ... 54 moreCaused by: java.lang.reflect.InvocationTargetException        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)        ... 
59 moreCaused by: javax.jdo.JDOFatalInternalException: Error creating transactional connection factoryNestedThrowables:java.lang.reflect.InvocationTargetException        at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)        at java.lang.reflect.Method.invoke(Method.java:606)        at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)        at java.security.AccessController.doPrivileged(Native Method)        at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)        at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)        at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)        at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)        at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)        at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:497)        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:475)        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)        at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)        ... 
64 moreCaused by: java.lang.reflect.InvocationTargetException        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)        at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)        at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)        at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)        at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:240)        at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:286)        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)        at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)        at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)        at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)        at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)        ... 93 moreCaused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "dbcp-builtin" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:259)        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)        ... 111 moreCaused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.        at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:58)        at org.datanucleus.store.rdbms.connectionpool.DBCPBuiltinConnectionPoolFactory.createConnectionPool(DBCPBuiltinConnectionPoolFactory.java:49)        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)        ... 113 more
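The Hive test never reaches the CREATE TABLE itself. The last "Caused by" in the trace is the real problem: hive-site.xml points the metastore at a MySQL-backed database, but the MySQL JDBC driver (com.mysql.jdbc.Driver) is not on the classpath of the shell, so the HiveMetaStoreClient cannot be instantiated. A hedged fix, with a placeholder path and connector version rather than the ones actually used on this machine, is to start spark-shell with the connector jar on the driver and executor classpath:

# Path and version are placeholders; point them at the mysql-connector jar matching the metastore database.
./bin/spark-shell --master spark://feng02:7077 \
  --driver-class-path /path/to/mysql-connector-java-5.1.34-bin.jar \
  --jars /path/to/mysql-connector-java-5.1.34-bin.jar

Once the metastore client can be created, the CREATE TABLE IF NOT EXISTS src (key INT, value STRING) statement above should run, assuming the rest of hive-site.xml (connection URL, user, password) is correct.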


