Processing a jsonFile with Spark


According to the Spark documentation, the file consumed by jsonFile is a special kind of file:

Note that the file that is offered as jsonFile is not a typical JSON file. Each line must contain a separate, self-contained valid JSON object. As a consequence, a regular multi-line JSON file will most often fail.

In other words, the file holds one JSON object per line; anything else will cause errors.
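To make the contrast concrete, here is a minimal sketch (plain Scala string literals, names chosen only for illustration) of the two shapes:

// Accepted by jsonFile: every line is a complete, self-contained JSON object.
val lineDelimited =
  """{"name":"Michael"}
    |{"name":"Andy", "age":30}
    |{"name":"Justin", "age":19}""".stripMargin

// A "regular" pretty-printed JSON file: each line is only a fragment of one object,
// so line-by-line parsing will most often fail.
val prettyPrinted =
  """{
    |  "name": "Andy",
    |  "age": 30
    |}""".stripMargin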

Here is the content of such a jsonFile:

scala> val path = "examples/src/main/resources/people.json"
path: String = examples/src/main/resources/people.json

scala> import scala.io.Source
import scala.io.Source

scala> Source.fromFile(path).foreach(print)
{"name":"Michael"}
{"name":"Andy", "age":30}
{"name":"Justin", "age":19}

From it we can obtain a SchemaRDD:

scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)

scala> val jsonFile = sqlContext.jsonFile(path)

scala> jsonFile.printSchema()
root
 |-- age: integer (nullable = true)
 |-- name: string (nullable = true)
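The same JSON support can also build a SchemaRDD without touching the filesystem: in Spark 1.x, SQLContext exposes jsonRDD, which parses an RDD of strings where each string is one self-contained JSON object. A minimal sketch, reusing the REPL's sc and the sqlContext created above (the value names are only for illustration):

// Each element is a complete JSON object, mirroring the one-object-per-line rule.
val jsonStrings = sc.parallelize(Seq(
  """{"name":"Michael"}""",
  """{"name":"Andy", "age":30}""",
  """{"name":"Justin", "age":19}"""
))

// The schema (age, name) is inferred the same way as it is for jsonFile.
val fromStrings = sqlContext.jsonRDD(jsonStrings)
fromStrings.printSchema()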

We can traverse this SchemaRDD like a regular RDD, for example filtering its rows:

jsonFile.filter(row => {
  val age = row(0).asInstanceOf[Int]
  age >= 13 && age <= 19
}).collect
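Note that in the schema above age comes first, so row(0) is the age column, and that Michael has no age at all: for his row, row(0) is null, and null.asInstanceOf[Int] silently unboxes to 0, which merely happens to fall outside the 13-19 range. A slightly more defensive sketch of the same filter, using Row's isNullAt and getInt accessors:

jsonFile.filter(row =>
  // skip rows where age is missing before reading it as an Int
  !row.isNullAt(0) && row.getInt(0) >= 13 && row.getInt(0) <= 19
).collect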

Since it is a SchemaRDD, we can also query it with SQL:

scala> jsonFile.registerTempTable("people")

scala> val teenagers = sqlContext.sql("SELECT name FROM people WHERE age >= 13 AND age <= 19")

scala> teenagers.foreach(println)
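The teenagers result is itself a SchemaRDD of single-column rows, so it can be collected and formatted like any other RDD. A small sketch of what that might look like for this data set (only Justin, age 19, satisfies the predicate):

// Pull the matching names back to the driver and print them.
teenagers.map(t => "Name: " + t(0)).collect().foreach(println)
// Name: Justin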