Go bindings for libhdfs, for manipulating files on the Hadoop distributed file system (HDFS).
- hdfs.Fs: file system handle
- hdfs.File: file handle
- hdfs.FileInfo: file metadata structure, represented within Go

See go doc for details.
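The following is a minimal usage sketch, not a definitive example. Only Connect (with the host/port form used below) appears elsewhere in this document; the two-value return of Connect, the import path, and the Disconnect and GetPathInfo method names are assumptions mirroring libhdfs's hdfsDisconnect and hdfsGetPathInfo, so check go doc for the actual API.

    package main

    import (
        "fmt"

        "hdfs" // import path is an assumption; adjust to where the package is installed
    )

    func main() {
        // Connect to a namenode; an empty host with port 0 selects the local file system.
        fs, err := hdfs.Connect("localhost", 9000)
        if err != nil {
            panic(err)
        }
        defer fs.Disconnect() // assumed method name, mirroring hdfsDisconnect

        // Fetch metadata for a path as an hdfs.FileInfo.
        info, err := fs.GetPathInfo("/tmp/hello.txt") // assumed method name, mirroring hdfsGetPathInfo
        if err != nil {
            panic(err)
        }
        fmt.Printf("%+v\n", info)
    }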
Dependencies:
- JVM
- HDFS: C bindings for libhdfs, Java binary packages
- HDFS: a configured cluster
To build libhdfs on Mac OS X (based on hadoop-1.0.4):
- change <error.h> to <err.h> in src/c++/libhdfs/hdfsJniHelper.c
- change md5sum to md5 in src/saveVersion.sh
- run chmod +x src/c++/libhdfs/install-sh
- run ant -Dcompile.c++=true -Dlibhdfs=true compile-c++-libhdfs to build libhdfs
- upon a successful build, the libraries are installed in build/c++/Mac_OS_X-x86_64-64/lib; the Makefile in build/c++-build/Mac_OS_X-x86_64-64/libhdfs is very helpful for later recompilation. It is OK if the build ends with installation errors, as long as you can already find the compiled libraries in build/c++-build/Mac_OS_X-x86_64-64/libhdfs/.libs or so
- change the install_name for libhdfs: install_name_tool -id /usr/lib/java/libhdfs.0.0.0.dylib libhdfs.0.0.0.dylib
- put libhdfs*.dylib in /usr/lib/java
To build libhadoop on Mac OS X (based on hadoop-1.0.1); libhadoop is loaded by util.NativeCodeLoader when accessing the local file system:
- java: change -ljvm to -framework JavaVM in both Makefile.am and Makefile.in
- libz: apply a patch to acinclude.m4:

      elif test ! -z "`which otool | grep -v 'no otool'`"; then ac_cv_libname_$1=\"`otool -L conftest | grep $1 | sed -e 's/^[ ]*//' -e 's/ .*//' -e 's/.*\/\(.*\)$/\1/'`\";

  and to configure:

      elif test ! -z "`which otool | grep -v 'no otool'`"; then ac_cv_libname_z=\"`otool -L conftest | grep z | sed -e 's/^ *//' -e 's/ .*//' -e 's/.*\/\(.*\)$/\1/'`\";

- apply the patch to the source file src/org/apache/hadoop/security/JniBasedUnixGroupsNetgroupMapping.c
- run ant compile-native
- put the compiled library libhadoop.1.0.0.dylib and its symbolic links in /usr/lib/java, which is one of the default elements of java.library.path
- change the install_name for libhadoop: sudo install_name_tool -id /usr/lib/java/libhadoop.1.dylib /usr/lib/java/libhadoop.1.0.0.dylib
Testing:
- put the .jar files from hadoop in .libs/javalibs and conf/ in .libs; see mktest.sh for details, or modify it to accommodate your environment
- set LD_LIBRARY_PATH on Linux: export LD_LIBRARY_PATH=./lib:/opt/jdk/jre/lib/amd64/server, and make sure libhdfs.so and libjvm are covered by LD_LIBRARY_PATH. You don't have to do this on OS X: you can always use install_name_tool to set or change a library's install name, and the JVM on OS X is a system framework, so adding the JVM's path is not necessary; the only thing left for step 3 is providing the hdfs.h header path for #cgo
- correct the #cgo header in hdfs.go according to your environment (see the sketch after this list)
- after the preparation, correct the constants in hdfs_test.go
- run ./mktest.sh
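For reference, the #cgo preamble in hdfs.go might look like the sketch below. The include path points at a local hadoop-1.0.4 checkout and the library directory is the /usr/lib/java location set up above; both paths are assumptions and must be adapted to your environment.

    package hdfs

    /*
    #cgo CFLAGS: -I/path/to/hadoop-1.0.4/src/c++/libhdfs
    #cgo LDFLAGS: -L/usr/lib/java -lhdfs
    #include "hdfs.h"
    */
    import "C"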
Notes:
- Connecting to the local file system used to be mishandled, so Connect("", 0) would lead to an error; it is okay now to access the local file system.
- errno in libhdfs is not handled precisely. For example, invokeMethod() would probably set errno to 2 in a lot of routines.
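A hedged sketch of the local file system case noted above; the two-value return of Connect and the import path are assumptions, so consult go doc for the real signature.

    package main

    import (
        "log"

        "hdfs" // import path is an assumption
    )

    func main() {
        // An empty host and port 0 select the local file system.
        fs, err := hdfs.Connect("", 0)
        if err != nil {
            // errno reporting in libhdfs is imprecise (invokeMethod often leaves it at 2,
            // i.e. ENOENT), so the error text may not reflect the real cause.
            log.Fatal(err)
        }
        _ = fs
    }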