hbase:table Permission Error in Spark

If your spark-shell or pyspark session fails to start with the error message below:

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException): org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=@, scope=hbase:meta, params=[table=hbase:meta],action=EXEC)

You are not alone. This error is common in Spark 2 when you run spark-shell or pyspark on a host that has both the Spark and HBase gateway roles installed: Spark picks up the HBase configuration and tries to access the hbase:meta table on startup. If the user starting the Spark job does not have EXEC permission on the hbase:meta table, Spark fails with the error above.
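Before changing anything, you can confirm what access your user actually holds on hbase:meta from the hbase shell (`user_permission` is a standard HBase shell command; this is just a quick diagnostic sketch):

```
# from the hbase shell, list who holds which permissions on hbase:meta
user_permission 'hbase:meta'
# a user who needs to start Spark must have at least X (EXEC) in the output
```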

This check was introduced by SPARK-12523 – Support long-running of the Spark On HBase and hive meta store.

There are two options here.

  1. If you do not need to access HBase from Spark, simply disable the HBase check on the Spark side by adding spark.security.credentials.hbase.enabled=false to Spark's client configuration file at /etc/spark/conf/spark-defaults.conf. You can just append it to the end of the file.

    However, if you are using Cloudera Manager, a better solution is to update the setting at the cluster level so that all hosts pick up the change. Go to the CM > Spark > Configuration page, search for “Spark Client Advanced Configuration Snippet (Safety Valve) for spark-conf/spark-defaults.conf” (in CM6.x) or “Spark 2 Client Advanced Configuration Snippet (Safety Valve) for spark2-conf/spark-defaults.conf” (in CM5.x), and add “spark.security.credentials.hbase.enabled=false” to the textbox.

  2. Alternatively, if you do need HBase access from Spark, you will have to grant the permission on the HBase side. Enter the hbase shell and run:
    grant '<username>', 'X', 'hbase:meta'
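If you go with option 1 and manage the host by hand, the fix boils down to appending one property to spark-defaults.conf. A minimal sketch, assuming a local working copy of the file for illustration (on a real gateway host the file is /etc/spark/conf/spark-defaults.conf, and Cloudera Manager would normally manage it for you):

```shell
# Sketch: disable Spark's HBase credential check (option 1).
# CONF points at a local working copy; on a real host this would be
# /etc/spark/conf/spark-defaults.conf.
CONF=spark-defaults.conf
touch "$CONF"
# Append the property only if it is not already set, so the edit is idempotent
grep -q '^spark.security.credentials.hbase.enabled' "$CONF" || \
  echo 'spark.security.credentials.hbase.enabled=false' >> "$CONF"
```

After the change, start a new spark-shell or pyspark session; it should no longer try to contact the hbase:meta table.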

Hope the information above helps anyone who runs into the same issue.

If you have any suggestions or questions, please feel free to leave a comment on this post.



    1. Eric Lin

      Hi Vidhi,

      Thanks for visiting my site and post questions.

      You should not need to change anything else; the only issue here is permissions. Once you fix the permissions, your Spark job should work.

      Can you share the command you ran in option 2, and do you still see the EXACT same error message as before?


