
spark-maven packaging error

Posted: 2019-07-05 22:42:36


 

An error occurred when packaging the project with Maven:

Error message:

"D:\Program Files\Java\jdk1.8.0_131\bin\java" -Dmaven.multiModuleProjectDirectory=D:\Workspace\IDEA_work\Spark_Work\spark01\sparkCore "-Dmaven.home=D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\plugins\maven\lib\maven3" "-Dclassworlds.conf=D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\plugins\maven\lib\maven3\bin\m2.conf" "-javaagent:D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\lib\idea_rt.jar=61000:D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\bin" -Dfile.encoding=UTF-8 -classpath "D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\plugins\maven\lib\maven3\boot\plexus-classworlds-2.5.2.jar" org.codehaus.classworlds.Launcher -Didea.version=2017.3.1 -DskipTests=true package
[INFO] Scanning for projects...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building sparkCore 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ sparkCore ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 0 resource
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ sparkCore ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (default) @ sparkCore ---
[WARNING]  Expected all dependencies to require Scala version: 2.11.8
[WARNING]  com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.7
[WARNING] Multiple versions of scala libraries detected!
[INFO] D:\Workspace\IDEA_work\Spark_Work\spark01\sparkCore\src\main\java:-1: info: compiling
[INFO] D:\Workspace\IDEA_work\Spark_Work\spark01\sparkCore\src\main\scala:-1: info: compiling
[INFO] Compiling 1 source files to D:\Workspace\IDEA_work\Spark_Work\spark01\sparkCore\target\classes at 1562322123123
[ERROR] error: error while loading <root>, Error accessing C:\Users\67001\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar
[ERROR] error: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
[ERROR]     at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17)
[ERROR]     at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:53)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:105)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1394)
[ERROR]     at scala.tools.nsc.Global$Run.<init>(Global.scala:1215)
[ERROR]     at scala.tools.nsc.Driver.doCompile(Driver.scala:31)
[ERROR]     at scala.tools.nsc.MainClass.doCompile(Main.scala:23)
[ERROR]     at scala.tools.nsc.Driver.process(Driver.scala:51)
[ERROR]     at scala.tools.nsc.Driver.main(Driver.scala:64)
[ERROR]     at scala.tools.nsc.Main.main(Main.scala)
[ERROR]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[ERROR]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[ERROR]     at java.lang.reflect.Method.invoke(Method.java:498)
[ERROR]     at scala_maven_executions.MainHelper.runMain(MainHelper.java:164)
[ERROR]     at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.995 s
[INFO] Finished at: 2019-07-05T18:22:03+08:00
[INFO] Final Memory: 26M/698M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (default) on project sparkCore: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

Process finished with exit code 1

 

Solution:

The key line is "[ERROR] error: error while loading <root>, Error accessing C:\Users\67001\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar". The jackson-core-asl jar in the local Maven repository is corrupt (most likely an incomplete or interrupted download), so the Scala compiler cannot read it while assembling the classpath and then fails with the misleading "object java.lang.Object in compiler mirror not found" error. Delete the org\codehaus\jackson\jackson-core-asl\1.9.13 directory from the local repository and run the Maven package again so the dependency is downloaded afresh; the build should then succeed. The earlier warning that com.twitter:chill_2.11:0.8.0 requires Scala 2.11.7 instead of 2.11.8 is only a version-mismatch warning and is not the cause of this failure.
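To confirm the diagnosis before deleting anything, the sketch below (not part of the original post) simply tries to open the suspect jar with the standard java.util.jar API; if the file cannot be read as an archive, it is corrupt and re-downloading it is the fix. The path is the one from the error log and will need adjusting on another machine.

import java.util.jar.JarFile

object CheckJar {
  def main(args: Array[String]): Unit = {
    // Path taken from the error log above; adjust to your own local repository.
    val path = "C:\\Users\\67001\\.m2\\repository\\org\\codehaus\\jackson\\jackson-core-asl\\1.9.13\\jackson-core-asl-1.9.13.jar"
    try {
      val jar = new JarFile(path) // throws if the file is not a readable zip/jar archive
      println(s"Jar looks intact: ${jar.size()} entries")
      jar.close()
    } catch {
      case e: Exception =>
        println(s"Jar is corrupt or unreadable: ${e.getMessage}")
    }
  }
}

If the check fails, delete the 1.9.13 directory from the local repository and re-run the Maven package goal so that a fresh copy of the dependency is downloaded.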
Original post: https://www.cnblogs.com/LXL616/p/11140975.html
