Scala Exception: No TypeTag available for ***


Today, while writing a Spark program in Scala, I needed Spark SQL to create a table, so I ended up with the following code (it is only an example, not real project code):

import org.apache.spark.{SparkContext, SparkConf}

/**
 * Created by Utopia on 2016/7/13.
 */
object testforjob {
  def main(args: Array[String]) {
    val conf = new SparkConf()
    conf.setAppName("Test")
    conf.setMaster("local")
    val sc = new SparkContext(conf)

    // Read comma-separated lines of UserID,ItemID,Pref.
    val lines = sc.textFile("C://Users/802/Desktop/uip.txt").map(_.split(","))
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.implicits._

    // The case class is declared inside main -- this is what triggers the error below.
    case class UserItemPref(UserID: Int, ItemID: Int, Pref: Int)

    val useritempref = lines.map(k => UserItemPref(k(0).toInt, k(1).toInt, k(2).trim.toInt)).toDF()
    useritempref.registerTempTable("UIF")
    val userid = sqlContext.sql("SELECT UserID FROM UIF")
    userid.map(t => "Name: " + t).collect().foreach(println)
  }
}

After writing this and running it, the build failed with the error from the title: No TypeTag available for UserItemPref.

It turned out the problem was not with Spark but with the Scala code itself; a solution found on Stack Overflow is simply to move the case class out of the main method and into the enclosing object.
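The fix works because toDF() relies on an implicit conversion that demands a scala.reflect TypeTag for the row case class, and scalac cannot materialize a TypeTag for a class declared inside a method. The failure can be reproduced without Spark at all; a minimal sketch (TypeTagDemo, Outer, and Inner are illustrative names, not part of the original code):

import scala.reflect.runtime.universe._

object TypeTagDemo {
  // Declared on the object: the compiler can materialize its TypeTag.
  case class Outer(x: Int)

  def main(args: Array[String]) {
    println(typeTag[Outer]) // prints TypeTag[TypeTagDemo.Outer]

    // Declared inside a method: no TypeTag can be materialized.
    case class Inner(x: Int)
    // println(typeTag[Inner]) // does not compile: No TypeTag available for Inner
  }
}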
The fixed code:

import org.apache.spark.{SparkContext, SparkConf}

/**
 * Created by Utopia on 2016/7/13.
 */
object testforjob {
  // Moved out of main: the compiler can now materialize the TypeTag that toDF() needs.
  case class UserItemPref(UserID: Int, ItemID: Int, Pref: Int)

  def main(args: Array[String]) {
    val conf = new SparkConf()
    conf.setAppName("Test")
    conf.setMaster("local")
    val sc = new SparkContext(conf)
    val lines = sc.textFile("C://Users/802/Desktop/uip.txt").map(_.split(","))
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.implicits._

    val useritempref = lines.map(k => UserItemPref(k(0).toInt, k(1).toInt, k(2).trim.toInt)).toDF()
    useritempref.registerTempTable("UIF")
    val userid = sqlContext.sql("SELECT UserID FROM UIF")
    userid.map(t => "Name: " + t).collect().foreach(println)
  }
}

The result: the job now runs successfully and prints the query output.
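As a side note, SQLContext and registerTempTable are the Spark 1.x API; registerTempTable was deprecated in Spark 2.0. On Spark 2.x and later, the same job would be written against SparkSession, with the case class still kept outside main for the same TypeTag reason. A rough sketch under that assumption (same comma-separated UserID,ItemID,Pref input file):

import org.apache.spark.sql.SparkSession

object testforjob2 {
  // Still declared outside main, for the same TypeTag reason.
  case class UserItemPref(UserID: Int, ItemID: Int, Pref: Int)

  def main(args: Array[String]) {
    val spark = SparkSession.builder()
      .appName("Test")
      .master("local")
      .getOrCreate()
    import spark.implicits._

    val useritempref = spark.sparkContext
      .textFile("C://Users/802/Desktop/uip.txt")
      .map(_.split(","))
      .map(k => UserItemPref(k(0).toInt, k(1).toInt, k(2).trim.toInt))
      .toDF()

    // createOrReplaceTempView replaces the deprecated registerTempTable.
    useritempref.createOrReplaceTempView("UIF")
    spark.sql("SELECT UserID FROM UIF").collect().foreach(println)
  }
}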
