Build with existing libhdfs.a


I tried to build master with HDFS enabled on one of our production gateway machines. The build points libhdfs to /opt/yarn/binary/lib/native/libhdfs.a and then complains as below. Does this mean I need to recompile libhdfs from source?

/usr/bin/ld: /opt/yarn/binary/lib/native/libhdfs.a(hdfs.c.o): relocation R_X86_64_32 against `.rodata.str1.1' can not be used when making a shared object; recompile with -fPIC
/opt/yarn/binary/lib/native/libhdfs.a: error adding symbols: Bad value

Here are the changes in my code:

make/ USE_HDFS=1
dmlc-core/make/ USE_HDFS=1
jvm-packages/ "USE_HDFS": "ON"
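For reference, my understanding is that these switches amount to a configure step roughly like the following when run by hand from a build directory. This is a sketch, not the exact command the JVM build issues; the variable names and paths are taken from the configure log further down, assuming the dmlc-core HDFS find-module honours them:

```shell
cmake .. -DUSE_HDFS=ON \
         -DHDFS_LIB_PATHS=/opt/yarn/binary/lib/native \
         -DHDFS_INCLUDE_DIR=/opt/yarn/binary/include
```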

Here is the failing command in the maven script.

cq@hadoopgw02-dca1:~/xgboost/jvm-packages$ cmake --build . --config Release
[ 7%] Built target rabit
[ 71%] Built target objxgboost
[ 92%] Built target dmlc
[ 95%] Built target runxgboost
[ 96%] Built target xgboost
[ 98%] Linking CXX shared library …/…/lib/
/usr/bin/ld: /opt/yarn/binary/lib/native/libhdfs.a(hdfs.c.o): relocation R_X86_64_32 against `.rodata.str1.1' can not be used when making a shared object; recompile with -fPIC
/opt/yarn/binary/lib/native/libhdfs.a: error adding symbols: Bad value
collect2: error: ld returned 1 exit status
jvm-packages/CMakeFiles/xgboost4j.dir/build.make:172: recipe for target '…/lib/' failed
make[2]: *** […/lib/] Error 1
CMakeFiles/Makefile2:351: recipe for target 'jvm-packages/CMakeFiles/xgboost4j.dir/all' failed
make[1]: *** [jvm-packages/CMakeFiles/xgboost4j.dir/all] Error 2
Makefile:129: recipe for target 'all' failed
make: *** [all] Error 2


-- CMake version 3.12.3
-- Architecture: x64
-- HDFS_LIB_PATHS: /opt/yarn/binary/lib/native
-- Hadoop 2.6.0-cdh5.7.2
Subversion Unknown -r Unknown
Compiled by root on 2019-03-18T04:54Z
Compiled with protoc 2.5.0
From source with checksum 9e5ffedfeea2de4cd586f0865c569f8
This command was run using /opt/hadoop/hadoop-cdh5-2.6.0_5.7.2-57/share/hadoop/common/hadoop-common-2.6.0-cdh5.7.2.jar
-- HDFS_INCLUDE_DIR: /opt/yarn/binary/include
-- HDFS_LIBRARIES: /opt/yarn/binary/lib/native/
-- hdfs_static: /opt/yarn/binary/lib/native/libhdfs.a
-- /home/cq/xgboost/dmlc-core/cmake/ -> /home/cq/xgboost/dmlc-core/include/dmlc/build_config.h
-- Configuring done
-- Generating done
-- Build files have been written to: /home/cq/xgboost/jvm-packages


I found this from long ago; I'll dig into it and see if it can solve the problem. I wish we had better documentation.


@chenqin I agree that we need better documentation. I'm assuming you are using upstream Spark and not CDH or other 3rd-party distributions?


Actually, I am trying to run the YARN example directly, without Spark, at the moment.
It looks like we do need to recompile hadoop-common/hadoop-hdfs/src with the -fPIC flag to build with HDFS enabled.

It seems I am facing a similar issue as:

Container: container_e314_1557041842011_2389895_01_000001 on

Log Upload Time:Fri May 24 20:58:06 +0000 2019
Log Contents:
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/io/DataOutputBuffer
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(
at java.lang.Class.privateGetMethodRecursive(
at java.lang.Class.getMethod0(
at java.lang.Class.getMethod(
at sun.launcher.LauncherHelper.validateMainClass(
at sun.launcher.LauncherHelper.checkAndLoadMain(
Caused by: java.lang.ClassNotFoundException:
at java.lang.ClassLoader.loadClass(
at sun.misc.Launcher$AppClassLoader.loadClass(
at java.lang.ClassLoader.loadClass(
… 7 more
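Not certain it shares a root cause with the native link failure, but a NoClassDefFoundError for org/apache/hadoop/io/* at launch usually means hadoop-common is missing from the container's classpath rather than anything to do with libhdfs. A hedged sketch of the usual workaround, using the real `hadoop classpath` helper to expand the distribution's jar list:

```shell
# Prepend the installed Hadoop jars before launching the JVM job:
export CLASSPATH="$(hadoop classpath):${CLASSPATH}"
```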



@chenqin any resolution for this issue? I'm facing a similar issue now.


I gave up on this and used xgboost4j-spark instead: essentially, use the Spark HDFS reader and feed the trainer a DataFrame.
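For anyone landing here, a launch for that workaround looks roughly like the sketch below. The driver class, jar name, version, and data path are placeholders for your own job; the point is that xgboost4j-spark goes through Spark's own HDFS access, so the native libhdfs/-fPIC problem never arises:

```shell
spark-submit \
  --class com.example.TrainWithXGBoost \
  --packages ml.dmlc:xgboost4j-spark:0.90 \
  train-job.jar hdfs:///data/train.libsvm
```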


Got it. I gave up on this too. I was trying this because a job of mine that works with xgboost 0.82, Scala 2.11, and Spark 2.3.2 doesn't work with xgboost 0.9, Scala 2.12, and Spark 2.4.3. I ran into an issue that suggests it is caused by the HDFS build.