
[Original] Troubleshooting notes (16): Spark fails writing to a Hive external table with ClassCastException: org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat cannot be cast to org.apache.hadoop.hive.ql.io.HiveOutputFormat

Date: 2018-12-18


Environment: Spark 2.1.1

When Spark writes data to a Hive external table whose underlying data lives in HBase, it fails with:

Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat cannot be cast to org.apache.hadoop.hive.ql.io.HiveOutputFormat
at org.apache.spark.sql.hive.SparkHiveWriterContainer.outputFormat$lzycompute(hiveWriterContainers.scala:82)
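For context, a setup that typically triggers this error looks like the following sketch. The table and column names are illustrative, not taken from the original post; only the storage handler and properties are standard Hive HBase-integration syntax.

```sql
-- Hypothetical Hive DDL: an external table whose data lives in HBase.
CREATE EXTERNAL TABLE hbase_table (key STRING, value STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:value')
TBLPROPERTIES ('hbase.table.name' = 'hbase_table');

-- Writing to it from Spark SQL then fails, because Spark casts the table's
-- OutputFormat to HiveOutputFormat, which HiveHBaseTableOutputFormat is not.
INSERT INTO hbase_table SELECT key, value FROM source_table;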

 

The cast happens in org.apache.spark.sql.hive.SparkHiveWriterContainer:

  @transient private lazy val outputFormat = conf.value.getOutputFormat.asInstanceOf[HiveOutputFormat[AnyRef, Writable]]

 

This is the statement that throws: HiveHBaseTableOutputFormat implements only the Hadoop OutputFormat interface, not HiveOutputFormat, so the asInstanceOf cast can never succeed for an HBase-backed table. Inspecting the code shows the variable is not actually used on this write path, so it can safely be set to null when the cast is impossible:

  @transient private lazy val outputFormat =
    // conf.value.getOutputFormat.asInstanceOf[HiveOutputFormat[AnyRef, Writable]]
    conf.value.getOutputFormat match {
      case format: HiveOutputFormat[_, _] =>
        format.asInstanceOf[HiveOutputFormat[AnyRef, Writable]]
      case _ => null
    }
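The essence of the patch is "cast when the runtime class matches, otherwise fall back to null instead of throwing". A minimal, self-contained sketch of that behavior, using stand-in classes (hypothetical; not the real Hive classes):

```scala
object CastOrNullDemo {
  // Stand-ins for the real Hive classes, for illustration only.
  trait OutputFormat
  class PlainHiveOutputFormat extends OutputFormat
  class HiveHBaseTableOutputFormat extends OutputFormat

  // Mirrors the patched logic: a pattern match returns the value when the
  // runtime class matches, and null otherwise, instead of an unconditional
  // asInstanceOf that would throw ClassCastException.
  def asHiveOutputFormat(f: OutputFormat): PlainHiveOutputFormat = f match {
    case h: PlainHiveOutputFormat => h
    case _ => null
  }

  def main(args: Array[String]): Unit = {
    assert(asHiveOutputFormat(new PlainHiveOutputFormat) != null)
    // No ClassCastException here, just null:
    assert(asHiveOutputFormat(new HiveHBaseTableOutputFormat) == null)
    println("ok")
  }
}
```

Note that because of JVM type erasure, the type parameters in the real patch's `case format: HiveOutputFormat[_, _]` are not checked at runtime; only the class itself is, which is exactly what is needed here.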

With this change the write succeeds. The upstream discussion is tracked at: https://issues.apache.org/jira/browse/SPARK-6628

 

 


Original article: https://www.cnblogs.com/barneywill/p/10137915.html
