
Error Summary

Posted: 2020-12-17 12:52:27


20/12/12 15:49:47 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
    at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:216)
    at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:198)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:174)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
    at apple.TFIDF$.main(TFIDF.scala:11)
    at apple.TFIDF.main(TFIDF.scala)
20/12/12 15:49:47 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
    at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:216)
    at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:198)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:174)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
    at apple.TFIDF$.main(TFIDF.scala:11)
    at apple.TFIDF.main(TFIDF.scala)

The code that triggered the error:

val ss = SparkSession.builder().master("local").appName("hello").getOrCreate()
val sc = ss.sparkContext
sc.setLogLevel("ERROR")
This error means the heap the driver JVM requested is smaller than the minimum Spark requires, so the SparkContext cannot be initialized. Spark's UnifiedMemoryManager reserves 300 MB of system memory and requires the heap to be at least 1.5 × 300 MB = 450 MB (471859200 bytes); here the JVM only had about 247 MB (259522560 bytes).
For local testing, you can work around this directly in code by setting spark.testing.memory in the SparkConf:
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
// Report ~2 GB to Spark's memory check so the 450 MB minimum passes during local testing
val sparkConf = new SparkConf().set("spark.testing.memory", "2147480000")
val ss = SparkSession.builder().config(sparkConf).master("local").appName("tfidf").getOrCreate()
val sc = ss.sparkContext
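
For reference, here is a minimal, self-contained sketch of how that fix might sit in a complete program. The object name MemoryFixExample and the app name are illustrative, not the author's original apple.TFIDF code. Note that spark.testing.memory is intended for local tests; on a real cluster you would instead raise the driver heap via the --driver-memory option or spark.driver.memory, as the error message itself suggests.

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Illustrative standalone example (names are hypothetical, not from the original post)
object MemoryFixExample {
  def main(args: Array[String]): Unit = {
    // Tell Spark's memory check that ~2 GB is available, so the 450 MB minimum passes locally
    val sparkConf = new SparkConf().set("spark.testing.memory", "2147480000")

    val spark = SparkSession.builder()
      .config(sparkConf)
      .master("local")
      .appName("memory-fix-example")
      .getOrCreate()

    spark.sparkContext.setLogLevel("ERROR")

    // ... actual job logic would go here ...

    spark.stop()
  }
}

// When submitting to a cluster, prefer the error message's own advice instead:
//   spark-submit --driver-memory 1g ...
// or set spark.driver.memory in the Spark configuration.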



Original post: https://www.cnblogs.com/ShyPeanut/p/14125001.html
