DWQA Questions › Category: Development Tool › IntelliJ IDEA out of memory error
ningyuwhut asked 2 months ago

I run a Spark program in IntelliJ IDEA and hit an out-of-memory error every time. So I changed IntelliJ IDEA's JVM parameters: under /Applications/IntelliJ IDEA 15.app/Contents/bin there is an idea.vmoptions file, which I amended to read:

-Xms2g
-Xmx6g
-XX:MaxNewSize=256m
-XX:MaxPermSize=512m
-XX:ReservedCodeCacheSize=1024m
-XX:+UseCompressedOops

Then I restarted IntelliJ IDEA and ran the project again, but still got the OOM error.
Then I configured VM options under Run → Edit Configurations:

-XX:MaxNewSize=256m -XX:MaxPermSize=512m
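(For context: on JDK 7 the PermGen default is small, so -XX:MaxPermSize is the flag that matters for this error. A Run Configuration VM options line that also raises the heap might look like the following; the sizes are illustrative, not recommendations:)

```text
-Xmx2g -XX:MaxNewSize=256m -XX:MaxPermSize=512m
```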

Running the program again, there was no OOM error.
I use JDK 7; the Spark code is a very simple example program:

import org.apache.spark.{SparkContext, SparkConf}
import org.apache.spark.sql.hive.HiveContext

object SparkArchetype {
  def main(args: Array[String]) {
    // local mode is for local testing only; remove setMaster to run on a cluster
    val conf = new SparkConf().setAppName("TestSparkHQL").setMaster("local")
    val sc = new SparkContext(conf)
    val sqlContext = new HiveContext(sc) // creating the HiveContext is what triggers the OOM below

    val arr = sc.parallelize((1 to 8).toList).filter(_ % 2 == 0).take(3)
    println(arr.mkString(", "))

    sc.stop()
  }
}

The exception log is as follows:

17/08/03 16:47:08 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.liftedTree1$1(IsolatedClientLoader.scala:183)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.<init>(IsolatedClientLoader.scala:179)
    at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:226)
    at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:185)
    at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:392)
    at org.apache.spark.sql.hive.HiveContext.defaultOverrides(HiveContext.scala:174)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:177)
    at com.meituan.mthdp.sparktools.sometools.SparkArchetype$.main(SparkArchetype.scala:12)
    at com.meituan.mthdp.sparktools.sometools.SparkArchetype.main(SparkArchetype.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.OutOfMemoryError: PermGen space
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.doLoadClass(IsolatedClientLoader.scala:165)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.loadClass(IsolatedClientLoader.scala:153)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at org.apache.hadoop.hive.ql.metadata.HiveException.<init>(HiveException.java:31)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
    at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.liftedTree1$1(IsolatedClientLoader.scala:183)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.<init>(IsolatedClientLoader.scala:179)
    at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:226)
    at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:185)
    at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:392)
    at org.apache.spark.sql.hive.HiveContext.defaultOverrides(HiveContext.scala:174)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:177)
    at com.meituan.mthdp.sparktools.sometools.SparkArchetype$.main(SparkArchetype.scala:12)
    at com.meituan.mthdp.sparktools.sometools.SparkArchetype.main(SparkArchetype.scala)
Exception in thread "Thread-2" java.lang.OutOfMemoryError: PermGen space
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at org.apache.log4j.spi.LoggingEvent.<init>(LoggingEvent.java:165)
    at org.apache.log4j.Category.forcedLog(Category.java:391)
    at org.apache.log4j.Category.log(Category.java:856)
    at org.slf4j.impl.Log4jLoggerAdapter.log(Log4jLoggerAdapter.java:601)
    at org.apache.commons.logging.impl.SLF4JLocationAwareLog.warn(SLF4JLocationAwareLog.java:199)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:56)

Can anyone explain what's going on?

2 Answers
jpmonty answered 2 months ago

You are trying to change the JVM parameters of the program being run, but what you changed (idea.vmoptions) are the parameters used to start the IDE itself.

ningyuwhut replied 2 months ago

My question is: if no JVM parameters are set separately for the program, which parameters does the project use? My impression was that only the parameters in idea.vmoptions apply.

ningyuwhut replied 2 months ago

Strangely, I couldn't find where the JAVA_OPTS variable is set. If I never set this variable, which parameters does the JVM use?
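If no VM options are set in the Run Configuration, the launched JVM falls back to its built-in defaults; on JDK 7 the default MaxPermSize is well under 100 MB, which is why loading Hive's many classes blows PermGen. One way to see which flags a launched program actually received is to query the RuntimeMXBean from inside it. A minimal sketch (the object name JvmArgs is just illustrative):

```scala
import java.lang.management.ManagementFactory

object JvmArgs {
  def main(args: Array[String]): Unit = {
    val bean = ManagementFactory.getRuntimeMXBean
    // Flags the JVM was actually started with; an empty list if none were passed
    println("JVM args: " + bean.getInputArguments)
    // Effective max heap in MB (a platform-dependent default when -Xmx is absent)
    println("Max heap: " + Runtime.getRuntime.maxMemory / (1024 * 1024) + " MB")
  }
}
```

Running this from the same Run Configuration shows immediately whether the idea.vmoptions flags reached the program (they do not; only the Run Configuration's VM options appear).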

A fabricated belief answered 2 months ago

Run idea64.exe instead of idea.exe to use more memory.
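(On Windows, the 64-bit launcher reads its settings from idea64.exe.vmoptions rather than idea.exe.vmoptions. Note that either way this only raises the memory of the IDE process itself, not of programs started from a Run Configuration. Illustrative contents, values are examples only:)

```text
-Xms512m
-Xmx2048m
-XX:ReservedCodeCacheSize=240m
```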