$ sudo yum install snappy snappy-devel

This installs a file called libsnappy.so under the /usr/lib64 directory. We need to create a symbolic link to this file under $HADOOP_HOME/lib/native:
$ sudo ln -s /usr/lib64/libsnappy.so $HADOOP_HOME/lib/native/libsnappy.so

Then update the Hadoop configuration. In $HADOOP_HOME/etc/hadoop/mapred-site.xml, add the following properties to compress intermediate map output with Snappy:

<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>

And finally, add the following line to $HADOOP_HOME/etc/hadoop/hadoop-env.sh to tell Hadoop to load the native library from the exact location:

export JAVA_LIBRARY_PATH="/usr/local/hadoop/lib/native"

That's it. Just restart HDFS and YARN by running:
$HADOOP_HOME/sbin/stop-all.sh
$HADOOP_HOME/sbin/start-all.sh

Now you should be able to create Hive tables with Snappy compression.
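After the restart, you can confirm that Hadoop actually picked up the native Snappy library with the built-in checknative command. On a correctly configured node, the snappy line should report true along with the path to libsnappy.so (the exact paths shown will vary by installation):

```shell
# List the native libraries this Hadoop installation can load.
# With -a, the command exits non-zero if any check fails.
# Look for a line like: snappy: true /usr/lib64/libsnappy.so
$ hadoop checknative -a
```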
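As a quick smoke test, you can write Snappy-compressed output from the Hive shell. This is a minimal sketch: the table names (logs, logs_snappy) are placeholders, and it assumes a source table already exists; the resulting data files should carry a .snappy extension:

```shell
# Hypothetical smoke test: create a table whose output files are
# compressed with the Hadoop SnappyCodec. Table names are placeholders.
$ hive -e "
SET hive.exec.compress.output=true;
SET mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.SnappyCodec;
CREATE TABLE logs_snappy AS SELECT * FROM logs;
"
```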