[Linux] Compile Hadoop 2.2.0 (fix: Unable to load native-hadoop library)

EDIT: Updated for Ubuntu 14.04 and Hadoop 2.4.0

How to compile Hadoop 2.4.0 under Ubuntu Linux amd64 (13.10 or 14.04). You may want to compile Hadoop yourself in order to fix this warning:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  • Install a JDK: Oracle JDK (suggested) or the openjdk-7-jdk package.
  • Install maven, libssl-dev, build-essential, pkg-config, and cmake.
  • Install the protobuf library:
    • If you are running Ubuntu 13.10 or earlier, then download protobuf-2.5.0, build it locally, and put it in the PATH:
        cd protobuf-2.5.0/
        ./configure --prefix=`pwd`/inst && make && make install
        export PATH=`pwd`/inst/bin:$PATH
    • Otherwise (Ubuntu 14.04 or newer):
        sudo apt-get install libprotobuf8 protobuf-compiler
  • Download Hadoop sources.
  • Compile it (see the consolidated sketch after this list):
        tar xvf hadoop-2.4.0-src.tar.gz
        cd hadoop-2.4.0-src
        mvn package -Pdist,native -DskipTests -Dtar
  • Check hadoop-dist/target/hadoop-2.4.0.tar.gz (e.g., use this as your Hadoop binary) or hadoop-dist/target/hadoop-2.4.0. If you have already installed a 32bit Hadoop, then you only need to replace the native libs in $HADOOP/lib/native with the new native libs (from hadoop-dist/target/hadoop-2.4.0/lib/native) and remove (if present) from $HADOOP/etc/hadoop/hadoop-env.sh:
        export HADOOP_COMMON_LIB_NATIVE_DIR="~/hadoop/lib/"
        export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=~/hadoop/lib/"
  • Afterwards, you can delete the Hadoop sources (and the protobuf sources, if on Ubuntu 13.10 or earlier).
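
For reference, here is a minimal end-to-end sketch of the whole procedure on Ubuntu 14.04; the apt package list, the ~/hadoop install path, and the choice of OpenJDK are assumptions, so adjust them to your setup:

    sudo apt-get install openjdk-7-jdk maven libssl-dev build-essential pkg-config cmake libprotobuf8 protobuf-compiler
    tar xvf hadoop-2.4.0-src.tar.gz
    cd hadoop-2.4.0-src
    mvn package -Pdist,native -DskipTests -Dtar
    # either use the freshly built distribution as your Hadoop binary...
    tar xvf hadoop-dist/target/hadoop-2.4.0.tar.gz -C ~/
    # ...or, if you already run a 32bit Hadoop in ~/hadoop, replace only the native libs
    cp -r hadoop-dist/target/hadoop-2.4.0/lib/native/* ~/hadoop/lib/native/

On recent 2.x releases you can then run "hadoop checknative -a" to verify that the native library is loaded.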


Comments • Published 5 December 2013 • Last modified 27 August 2014
1. Jitendra - 8 March 2014 @ 16:25
I am getting the following exception while compiling the Hadoop source:

[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.2.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: protoc version is 'libprotoc 2.4.1', expected version is '2.5.0' -> [Help 1]

Please Help.
2. ercoppa - 8 March 2014 @ 16:31
Hi Jitendra,

You need to compile & install protobuf 2.5 (or later) and then put it in your PATH env var, as explained in this post.
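
A minimal sketch of that (assuming you have downloaded and extracted protobuf-2.5.0 into the current directory):

    cd protobuf-2.5.0/
    ./configure --prefix=`pwd`/inst && make && make install
    export PATH=`pwd`/inst/bin:$PATH
    protoc --version   # should now print: libprotoc 2.5.0

Then re-run the mvn command from that same shell, so the PATH change is still in effect.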
3. sreemanth - 20 March 2014 @ 12:36
Please make sure that your folder name does not contain spaces:
/home/hadoop2.2/hadoop2.2-src --- the build has no issues,

but if your folder is like
/home/hadoop 2.2/hadoop2.2-src --- you will get build issues.
4. srossetto - 3 April 2014 @ 23:45
Change export PATH=`pwd`/inst:$PATH to export PATH=`pwd`/inst/bin:$PATH
5. ercoppa - 4 April 2014 @ 00:14
@srossetto: you are right. Fixed. Thank you :)
6. Frank - 18 April 2014 @ 17:25
Hi Emilio

I was having problems starting the datanode. Java for some reason doesn't like the file permissions.

So I decided to recompile Hadoop so the native libraries will work.

I accidentally downloaded the wrong protobuf version, 2.4.1 instead of 2.5.0. Now that I have corrected that, I am trying to compile Hadoop.

It crashes out because it thinks it has found the wrong version of protobuf.

Any ideas how to clear the protobuf value?

Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.2.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: protoc version is 'libprotoc 2.4.1', expected version is '2.5.0' -> [Help 1]

Regards

Frank
7. Kaldr - 19 April 2014 @ 02:59
I'm installing Hadoop on Mac OS X. By following your helpful tips, I met this problem:
Failed to execute goal org.codehaus.mojo:native-maven-plugin:1.0-alpha-7
8. ercoppa - 22 April 2014 @ 11:25
@Frank,

You need to compile and install protobuf >=2.5.x. Remember to set the PATH variable with the export command (as discussed in the post).
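
A quick way to check which protoc your shell (and therefore Maven) will pick up once the PATH is set (the paths here are only examples):

    which protoc       # should point into your protobuf 2.5 install, e.g. .../inst/bin/protoc
    protoc --version   # should print: libprotoc 2.5.0
    hash -r            # if an old protoc is still found, clear bash's cached command locations

Then run mvn again from that same shell.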

@Kaldr

Did you try this? http://stackoverflow.com/questions/17201366/no-basedir-set-from-native-maven-plugin
9. Amaya - 22 May 2014 @ 11:49
mvn package -Pdist,native -DskipTests -Dtar

After doing this, what should I do? I couldn't understand what is to be done after compiling.
10. ercoppa - 28 May 2014 @ 11:57
Hi Amaya,

Use hadoop-dist/target/hadoop-2.2.0.tar.gz as your Hadoop binary.
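
For example (the ~/ destination is just an assumption):

    tar xvf hadoop-dist/target/hadoop-2.2.0.tar.gz -C ~/
    cd ~/hadoop-2.2.0
    bin/hadoop version

Then configure and start it as you would with the official binary release.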
11. CTO - 29 May 2014 @ 08:56
I ran through the process with Hadoop 2.4 but the problem still exists.
The build completed successfully.
I copied the native directory from my build directory to the native directory in the original 32bit Hadoop lib folder.
I started dfs and yarn and re-ran the test but am still getting:
4/05/29 09:53:47 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Btw, I am running Ubuntu 14.04.
Any ideas?
12. Matt - 29 May 2014 @ 23:44
Hello,

I am confused by this step... I'm not sure exactly what I should do.

Check hadoop-dist/target/hadoop-2.2.0.tar.gz or hadoop-dist/target/hadoop-2.2.0. If you have already installed a 32bit Hadoop, then you need only to replace the native libs in $HADOOP/lib/native with the new native libs and remove (if applicable) from $HADOOP/etc/hadoop-env.sh:
export HADOOP_COMMON_LIB_NATIVE_DIR="~/hadoop/lib/"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=~/hadoop/lib/"

How do I do this step?

replace the native libs in $HADOOP/lib/native with the new native libs and remove (if applicable) from $HADOOP/etc/hadoop-env.sh:

Thanks!