Error when using analytic (window) functions under Spark



[Author]: Julia

As the title says: when I run a SQL statement containing an analytic (window) function under Spark, the job fails with an error, yet the exact same statement runs fine in the Hive environment.

SQL: -- this is the statement I actually want to execute; it uses sum() over (partition by ...), and the subquery runs fine when executed on its own

select sum(r_count) over(partition by a.day) sum_day,
       a.r_count,
       a.datetime_v1,
       a.datetime_v2,
       a.day
  from (select count(*) r_count,
               SUBSTR(cast(fld_datetime as char(18)), 1, 10) AS DATETIME_V1,
               SUBSTR(cast(fld_datetime as char(18)), 12, 2) datetime_v2,
               day
          from ods.table_test
         where day = '20150521'
         group BY SUBSTR(cast(fld_datetime as char(18)), 1, 10),
                  SUBSTR(cast(fld_datetime as char(18)), 12, 2),
                  day) a;

The error reported is:

Error: org.apache.spark.sql.AnalysisException:
Unsupported language features in query:

Does this mean that Spark does not support analytic functions in SQL?

If I do want to use analytic functions, how should I handle this? Any help is appreciated!
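
(For reference, a minimal sketch of how the same statement could be submitted from Scala through HiveContext. Window functions only landed in Spark SQL 1.4, and on the 1.4/1.5 release line they are handled by HiveContext rather than the plain SQLContext, so this assumes that version or later. The object and app names are placeholders; the SQL, the table ods.table_test, and the column fld_datetime are taken from the query above.)

// Minimal sketch, assuming Spark 1.4+ with Hive support on the classpath.
// Analytic functions such as sum() over (partition by ...) require HiveContext
// on the 1.4/1.5 line; the plain SQLContext parser rejects them.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object WindowFunctionTest {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("WindowFunctionTest"))
    val hiveContext = new HiveContext(sc)

    // Same statement as in the question, submitted through the Hive-compatible parser.
    val df = hiveContext.sql(
      """select sum(r_count) over (partition by a.day) sum_day,
        |       a.r_count, a.datetime_v1, a.datetime_v2, a.day
        |  from (select count(*) r_count,
        |               SUBSTR(cast(fld_datetime as char(18)), 1, 10) AS DATETIME_V1,
        |               SUBSTR(cast(fld_datetime as char(18)), 12, 2) datetime_v2,
        |               day
        |          from ods.table_test
        |         where day = '20150521'
        |         group BY SUBSTR(cast(fld_datetime as char(18)), 1, 10),
        |                  SUBSTR(cast(fld_datetime as char(18)), 12, 2),
        |                  day) a""".stripMargin)

    df.show()
  }
}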
