Beeline Failed To Start With OOM Error When Calling getConsoleReader Method

If you get the following error when trying to start Beeline from the command line:
Exception in thread "main" java.lang.OutOfMemoryError: Requested array size exceeds VM limit 
at java.util.Arrays.copyOf(Arrays.java:2271) 
at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:113) 
at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93) 
at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:122) 
at org.apache.hive.beeline.BeeLine.getConsoleReader(BeeLine.java:854) 
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:766) 
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:480) 
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:463) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:606) 
at org.apache.hadoop.util.RunJar.run(RunJar.java:221) 
at org.apache.hadoop.util.RunJar.main(RunJar.java:136) 
Based on the stack trace, we can see that Beeline was in its startup phase and was initializing through the getConsoleReader method, which reads data from Beeline's history file:
    try {
      // now load in the previous history
      if (hist != null) {
        History h = consoleReader.getHistory();
        if (h instanceof FileHistory) {
          ((FileHistory) consoleReader.getHistory()).load(new ByteArrayInputStream(hist
              .toByteArray()));
        } else {
          consoleReader.getHistory().add(hist.toString());
        }
      }
    } catch (Exception e) {
      handleException(e);
    }
By default, the history file is located under ~/.beeline/history, and Beeline loads the latest 500 rows of it into memory. If those queries are very large, containing lots of characters, the history file can grow to several GBs. When Beeline tries to load such a big history file into memory, it will eventually fail with an OutOfMemoryError. I have reported this issue in the Hive upstream JIRA: HIVE-15166, and I am in the middle of submitting a patch for it. For the time being, the best workaround is to remove the ~/.beeline/history file before you fire up Beeline.
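
If you would rather keep some recent history instead of wiping it completely, a small standalone helper can trim the file to a bounded tail before you launch Beeline. The sketch below is my own illustration, not part of Hive or Beeline: the class name and the 500-line / ~1 MB limits are arbitrary assumptions. It simply streams through ~/.beeline/history and rewrites it with only the most recent entries, so Beeline never has to load gigabytes of old queries at startup.

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical helper (not part of Hive/Beeline): rewrites ~/.beeline/history
// with only a bounded tail so Beeline does not load a huge file at startup.
public class TrimBeelineHistory {

  private static final int MAX_LINES = 500;            // Beeline only replays the latest 500 entries anyway
  private static final long MAX_CHARS = 1024L * 1024L; // arbitrary ~1 MB budget for the kept tail

  public static void main(String[] args) throws IOException {
    Path history = Paths.get(System.getProperty("user.home"), ".beeline", "history");
    if (!Files.exists(history)) {
      System.out.println("No history file found at " + history);
      return;
    }

    Deque<String> tail = new ArrayDeque<>();
    long tailChars = 0;

    // Stream the file line by line so this helper itself cannot hit OOM on a huge history.
    try (BufferedReader reader = Files.newBufferedReader(history, StandardCharsets.UTF_8)) {
      String line;
      while ((line = reader.readLine()) != null) {
        tail.addLast(line);
        tailChars += line.length();
        // Evict old entries while we exceed either the line or the size budget.
        while (!tail.isEmpty() && (tail.size() > MAX_LINES || tailChars > MAX_CHARS)) {
          tailChars -= tail.removeFirst().length();
        }
      }
    }

    // Overwrite the history file with just the trimmed tail.
    Files.write(history, tail, StandardCharsets.UTF_8);
    System.out.println("Kept " + tail.size() + " lines (~" + tailChars + " chars) in " + history);
  }
}

Compile and run it (javac TrimBeelineHistory.java && java TrimBeelineHistory) right before starting Beeline. If the history file has already grown to several GBs and you do not need any of it, simply deleting the file as described above is just as effective.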
