
A strange Spark execution problem

Posted: 2014-08-28 15:00:30

Tags: spark, strange execution problem

Today I set up a Hadoop + Hive + Spark stack on CentOS 6.3. Running

/usr/local/spark-1.0.0/bin/spark-shell  


failed with errors about missing Hive-related classes, so I edited spark-env.sh


and added the Hive library path to SPARK_CLASSPATH:

export SPARK_CLASSPATH=/usr/local/spark-1.0.0/lib_managed/jars/spark-assembly-1.0.0-hadoop2.4.0.jar:/usr/local/spark-1.0.0/assembly/target/scala-2.10/mysql-connector-java-5.1.25-bin.jar:/usr/local/hive-0.13/lib/*
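To confirm that the wildcard actually picks up the Hive jars, a quick sanity check (illustrative; it assumes the usual Hive 0.13 layout, where hive-exec-*.jar under lib/ carries the org.apache.hadoop.hive.ql classes) is:

ls /usr/local/hive-0.13/lib/*.jar | wc -l     # should print a non-zero count
ls /usr/local/hive-0.13/lib/hive-exec-*.jar   # the jar that normally provides the ql.* classes

Note that spark-env.sh is only read when the launcher scripts start, so spark-shell has to be relaunched for the change to take effect.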


I ran

/usr/local/spark-1.0.0/bin/spark-shell

again, and this time it failed with:


java.lang.IllegalAccessError: tried to access field org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator.conf from class org.apache.hadoop.hive.ql.security.ProxyUserAuthenticator
    at org.apache.hadoop.hive.ql.security.ProxyUserAuthenticator.setConf(ProxyUserAuthenticator.java:40)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:365)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:278)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:166)
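This kind of IllegalAccessError usually means the same org.apache.hadoop.hive.ql.security classes are being provided more than once (for example by both the Spark assembly and the Hive 0.13 jars) and the copies don't match. A rough, illustrative way to check which archives on the classpath ship the class from the error (jar paths taken from the SPARK_CLASSPATH above):

for jar in /usr/local/spark-1.0.0/lib_managed/jars/spark-assembly-1.0.0-hadoop2.4.0.jar /usr/local/hive-0.13/lib/*.jar; do
    unzip -l "$jar" 2>/dev/null | grep -q 'hive/ql/security/HadoopDefaultAuthenticator.class' && echo "$jar"
done

If more than one jar is printed, or the class also exists as loose .class files under assembly/target, it is duplicated on the classpath.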



I then went into the /usr/local/spark-1.0.0/assembly/target/scala-2.10/org/apache/hadoop/hive/ql/security directory, renamed the authorization directory to something else, and the next run of spark-shell succeeded. The cause is presumably duplicated classes (just a guess).
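For reference, the workaround described above amounts to something like the following (illustrative; the backup name is arbitrary):

cd /usr/local/spark-1.0.0/assembly/target/scala-2.10/org/apache/hadoop/hive/ql/security
mv authorization authorization.bak   # keep these extracted .class files from being picked up

Once the directory is renamed, those loose class files no longer conflict with the copies coming in from the jars, which fits the duplicate-class guess above.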

This post is from the "决胜千里之外" blog; please keep this attribution: http://lubing.blog.51cto.com/5293532/1546130
