HiveMetaStore Failed to Start in Cloudera Manager: cmf.service.config.ConfigGenException: Unable to generate config file creds.localjceks

Recently I was dealing with an issue where HiveMetaStore failed to start in a Cloudera Manager managed environment. It failed with the errors below:
Caused by: com.cloudera.cmf.service.config.ConfigGenException: Unable to generate config file creds.localjceks
        at com.cloudera.cmf.service.config.JceksConfigFileGenerator.generate(JceksConfigFileGenerator.java:63)
        at com.cloudera.cmf.service.HandlerUtil.emitConfigFiles(HandlerUtil.java:133)
        at com.cloudera.cmf.service.AbstractRoleHandler.generateConfiguration(AbstractRoleHandler.java:887)
This problem is very common if you have either of the following misconfigurations in your cluster:

1. The wrong version of Java is being used. For a list of Java versions supported by Cloudera, please refer to the link below: CDH and Cloudera Manager Supported JDK Versions
2. Different versions of Java are used across the cluster hosts.

So run:
java -version
and check the symlinks under /usr/java/jdk****-cloudera to confirm they are consistent across the whole cluster, as shown in the sketch below. After the above checks are done, try to restart the failed service; most likely the issue will be resolved. If not, please let me know in the comments below.
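If it helps, here is a rough sketch of a script that can be run from the Cloudera Manager host to compare the Java version and the /usr/java symlinks on every node. The hostnames (host1, host2, host3) are placeholders for your own cluster hosts, and it assumes passwordless SSH has been set up:

#!/bin/bash
# Quick consistency check of the Java setup on each cluster host.
# host1 host2 host3 are placeholders - replace with your own hosts.
for h in host1 host2 host3; do
    echo "===== $h ====="
    # "java -version" writes to stderr, so redirect it to stdout
    ssh "$h" 'java -version 2>&1 | head -n 1'
    # list the JDK entries/symlinks under /usr/java so targets can be compared
    ssh "$h" 'ls -ld /usr/java/* 2>/dev/null'
done

If any host reports a different Java version or a symlink pointing to a different JDK directory, fix that host so it matches the rest of the cluster before restarting the service.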

2 Comments

  1. THY

    java -version: /usr/java/jdk1.7.0_67-cloudera, restarted the failed service, failed again…
    sad…
    Completed only 0/1 steps. First failure: Could not create process: com.cloudera.cmf.service.config.ConfigGenException: Unable to generate config file creds.localjceks

    1. Eric Lin

      Hi Thy,

      Thanks for posting a comment on my blog. I am wondering if there are more details on the error? Does it mention why it was unable to generate the config file creds.localjceks?

      Cheers

