Building HBase
The instructions provided below specify the steps to build HBase version 2.2.4 on Linux on IBM Z for the following distributions:
- RHEL (7.6, 7.7, 7.8, 8.1, 8.2)
- SLES (12 SP4, 12 SP5, 15 SP1)
- Ubuntu (16.04, 18.04, 20.04)
General Notes:
- When following the steps below, please use a standard permission user unless otherwise specified.
- A directory `/<source_root>/` will be referred to in these instructions; this is a temporary writable directory anywhere you'd like to place it.

  ```bash
  export SOURCE_ROOT=/<source_root>/
  ```
- RHEL (7.6, 7.7, 7.8)
  - With OpenJDK

    ```bash
    sudo yum install -y git wget tar make gcc java-1.8.0-openjdk.s390x java-1.8.0-openjdk-devel.s390x ant ant-junit.noarch unzip hostname gcc-c++
    ```

  - With AdoptOpenJDK

    ```bash
    sudo yum install -y git wget tar make gcc unzip hostname gcc-c++
    ```

    Download and install AdoptOpenJDK 8 with OpenJ9 from here (a download-and-extract sketch follows this list).
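The AdoptOpenJDK page provides per-release tarballs; as an illustrative sketch only (the archive and directory names below are hypothetical and depend on the release you pick), installing one amounts to extracting it and pointing JAVA_HOME at the result:

```bash
# Hypothetical archive name: substitute the AdoptOpenJDK 8 (OpenJ9) s390x
# tarball you actually downloaded from the releases page.
cd $SOURCE_ROOT
tar -xzf OpenJDK8U-jdk_s390x_linux_openj9_<build>.tar.gz
export JAVA_HOME=$SOURCE_ROOT/<extracted-jdk-directory>   # directory created by the tarball
export PATH=$JAVA_HOME/bin:$PATH
java -version   # should report an OpenJ9 build of Java 8
```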
- RHEL (8.1, 8.2)
  - With OpenJDK

    ```bash
    sudo yum install -y git wget tar make gcc maven java-1.8.0-openjdk.s390x java-1.8.0-openjdk-devel.s390x ant unzip hostname diffutils gcc-c++
    ```

  - With AdoptOpenJDK

    ```bash
    sudo yum install -y git wget tar make gcc maven ant unzip hostname diffutils gcc-c++
    ```

    Download and install AdoptOpenJDK 8 with OpenJ9 from here.
- SLES (12 SP4, 12 SP5, 15 SP1)
  - With OpenJDK

    ```bash
    sudo zypper install git wget tar make gcc java-1_8_0-openjdk java-1_8_0-openjdk-devel ant ant-junit ant-nodeps net-tools gcc-c++ unzip awk gzip curl
    ```

  - With AdoptOpenJDK

    ```bash
    sudo zypper install git wget tar make gcc ant ant-junit ant-nodeps net-tools gcc-c++ unzip awk gzip curl
    ```

    Download and install AdoptOpenJDK 8 with OpenJ9 from here.
- Ubuntu (16.04, 18.04, 20.04)
  - With OpenJDK

    ```bash
    sudo apt-get update
    sudo apt-get install -y git openjdk-8-jdk wget maven tar make gcc g++ ant curl unzip
    ```

  - With AdoptOpenJDK

    ```bash
    sudo apt-get update
    sudo apt-get install -y git wget maven tar make gcc g++ ant curl unzip
    ```

    Download and install AdoptOpenJDK 8 with OpenJ9 from here.
Install Apache Maven 3.3.9 (needed on SLES and RHEL 7.x, where a suitable Maven package is not installed above):

```bash
cd $SOURCE_ROOT
wget https://archive.apache.org/dist/maven/maven-3/3.3.9/binaries/apache-maven-3.3.9-bin.tar.gz
tar zxf apache-maven-3.3.9-bin.tar.gz
```

Set the build environment:

```bash
export JAVA_HOME=<path to java>
export PATH=$JAVA_HOME/bin:$PATH
export MAVEN_OPTS="-Xms1024m -Xmx1024m -XX:MaxPermSize=1024m"
export PATH=$SOURCE_ROOT/apache-maven-3.3.9/bin:$PATH   # for SLES and RHEL 7.x
```
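As a quick sanity check, Maven should now report the expected version (3.3.9 here, or the distribution package version on RHEL 8.x and Ubuntu) and the JDK that JAVA_HOME points to:

```bash
mvn -version   # verify the Maven version and that it picked up $JAVA_HOME
```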
Clone HBase and check out version 2.2.4:

```bash
cd $SOURCE_ROOT
git clone git://github.com/apache/hbase.git
cd hbase
git checkout rel/2.2.4
```

Build Protobuf 2.5.0 (following the linked instructions up to step 5) and install the Maven artifact as follows:
```bash
mvn install:install-file -DgroupId=com.google.protobuf -DartifactId=protoc -Dversion=2.5.0 -Dclassifier=linux-s390_64 -Dpackaging=exe -Dfile=/<v2.5.0 build location>/protobuf/src/.libs/protoc
```

For installing the dependencies needed to build Protobuf 3.x, please follow Step 2 (Install the build dependencies) from Building 3.x Protobuf.
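If the protoc 2.5.0 install-file step above succeeded, the artifact should be visible in the local Maven repository; a quick way to confirm (the path is derived from the coordinates above):

```bash
ls ~/.m2/repository/com/google/protobuf/protoc/2.5.0/
# expect protoc-2.5.0-linux-s390_64.exe alongside the generated .pom
```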
```bash
cd $SOURCE_ROOT
git clone https://github.com/protocolbuffers/protobuf.git
cd protobuf
git checkout v3.5.1
git submodule update --init --recursive
```

Edit the following file:
```diff
--- a/src/google/protobuf/stubs/atomicops_internals_generic_gcc.h
+++ b/src/google/protobuf/stubs/atomicops_internals_generic_gcc.h
@@ -146,6 +146,14 @@ inline Atomic64 NoBarrier_Load(volatile const Atomic64* ptr) {
   return __atomic_load_n(ptr, __ATOMIC_RELAXED);
 }
 
+inline Atomic64 Release_CompareAndSwap(volatile Atomic64* ptr,
+                                       Atomic64 old_value,
+                                       Atomic64 new_value) {
+  __atomic_compare_exchange_n(ptr, &old_value, new_value, false,
+                              __ATOMIC_RELEASE, __ATOMIC_ACQUIRE);
+  return old_value;
+}
+
 #endif  // defined(__LP64__)
 }  // namespace internal
```

Build Protobuf 3.5.1:

```bash
./autogen.sh
./configure
make
```
Install the Maven artifact and make both Protobuf builds visible at runtime:

```bash
mvn install:install-file -DgroupId=com.google.protobuf -DartifactId=protoc -Dversion=3.5.1-1 -Dclassifier=linux-s390_64 -Dpackaging=exe -Dfile=/<SOURCE_ROOT>/protobuf/src/.libs/protoc
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:<v3.5.1_source_path>/src/.libs:<v2.5.0_source_path>/src/.libs
```
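Optionally, confirm the freshly built compiler before moving on to HBase:

```bash
$SOURCE_ROOT/protobuf/src/protoc --version   # expected output: libprotoc 3.5.1
```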
Build HBase and run the test suite:

```bash
cd $SOURCE_ROOT/hbase
mvn package -DskipTests
mvn test -fn
```

Note: The following test failures are observed on s390x.
- Tests that fail consistently with both OpenJDK and AdoptOpenJDK:
  - org.apache.hadoop.hbase.filter.TestFuzzyRowFilter: fails because `java.nio.Bits.unaligned()` does not return true on s390x. This issue is being tracked here.
  - org.apache.hadoop.hbase.procedure2.store.TestProcedureStoreTracker: fails due to a timeout. Increasing the `surefire.timeout` value in pom.xml to around 9000 resolves this test case:

    ```diff
    --- a/pom.xml
    +++ b/pom.xml
    @@ -1254,7 +1254,7 @@
         <test.output.tofile>true</test.output.tofile>
    -    <surefire.timeout>900</surefire.timeout>
    +    <surefire.timeout>9000</surefire.timeout>
         <test.exclude.pattern></test.exclude.pattern>
    ```
- Tests that fail with the error `SUREFIRE-859: JVMDUMP013I Processed dump event "systhrow", detail "java/lang/OutOfMemoryError"` can be resolved by increasing the stack size:

  ```bash
  ulimit -s 9999999
  ```

  The Maven heap can also be increased to address the OutOfMemoryError:

  ```bash
  export MAVEN_OPTS="-Xmx4000m"
  ```
- Tests that fail intermittently. Note: the following tests passed after either increasing the timeout value or rerunning the individual test (see the rerun sketch after this list):
  - org.apache.hadoop.hbase.TestChoreService
  - org.apache.hadoop.hbase.util.TestWeakObjectPool
  - org.apache.hadoop.hbase.master.snapshot.TestSnapshotManager
  - org.apache.hadoop.hbase.coprocessor.example.TestValueReplacingCompaction
  - org.apache.hadoop.hbase.filter.TestMultipleColumnPrefixFilter
  - org.apache.hadoop.hbase.coprocessor.TestSecureExport
  - org.apache.hadoop.hbase.regionserver.TestStripeStoreFileManager
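To rerun one of these classes individually, the Maven Surefire `-Dtest` switch can be used; for example, with the first class from the list above:

```bash
cd $SOURCE_ROOT/hbase
mvn test -Dtest=TestChoreService -fn
```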
Some stubborn test cases time out despite the increased values suggested above. These are:

- org.apache.hadoop.hbase.types.TestCopyOnWriteMaps
- org.apache.hadoop.hbase.procedure2.store.TestProcedureStoreTracker
- org.apache.hadoop.hbase.snapshot.TestExportSnapshotWithTemporaryDirectory
- org.apache.hadoop.hbase.regionserver.TestMemStoreLAB
- org.apache.hadoop.hbase.util.TestAvlUtil
- org.apache.hadoop.hbase.regionserver.TestMetricsTableAggregate
- org.apache.hadoop.hbase.snapshot.TestSnapshotManifest
Such test cases can be modified to run as IntegrationTests to allow for more resources. As an example:

```diff
--- a/hbase-common/src/test/java/org/apache/hadoop/hbase/types/TestCopyOnWriteMaps.java
+++ b/hbase-common/src/test/java/org/apache/hadoop/hbase/types/TestCopyOnWriteMaps.java
@@ -30,12 +30,13 @@ import java.util.concurrent.ThreadLocalRandom;
 import org.apache.hadoop.hbase.HBaseClassTestRule;
 import org.apache.hadoop.hbase.testclassification.MiscTests;
 import org.apache.hadoop.hbase.testclassification.SmallTests;
+import org.apache.hadoop.hbase.testclassification.IntegrationTests;
 import org.junit.Before;
 import org.junit.ClassRule;
 import org.junit.Test;
 import org.junit.experimental.categories.Category;
 
-@Category({ MiscTests.class, SmallTests.class })
+@Category({ MiscTests.class, IntegrationTests.class })
 public class TestCopyOnWriteMaps {
   @ClassRule
   public static final HBaseClassTestRule CLASS_RULE =
```
- Tests that fail consistently with AdoptOpenJDK (also observed on x86):
  - org.apache.hadoop.hbase.snapshot.TestExportSnapshotNoCluster
  - org.apache.hadoop.hbase.io.TestHeapSize
  - org.apache.hadoop.hbase.tool.coprocessor.CoprocessorValidatorTest
  - org.apache.hadoop.hbase.util.TestJSONMetricUtil
- Tests that fail consistently with OpenJDK (also observed on x86):
  - org.apache.hadoop.hbase.snapshot.TestExportSnapshotNoCluster
  - org.apache.hadoop.hbase.io.TestHeapSize: fails due to the Zero variant of OpenJDK 8 that is supported on s390x; it is also observed on x86 when the Zero variant is enabled.
Note: HBase needs a native library (libjffi-1.2.so, the Java Foreign Function Interface). Build it as follows:

```bash
cd $SOURCE_ROOT
wget https://github.com/jnr/jffi/archive/1.2.0.tar.gz
tar -xvf 1.2.0.tar.gz
cd jffi-1.2.0/
```

Apply the following changes:
- `$SOURCE_ROOT/jffi-1.2.0/jni/GNUmakefile`

  ```diff
  --- a/jni/GNUmakefile
  +++ b/jni/GNUmakefile
  @@ -68,7 +68,7 @@
   WERROR = -Werror
   ifneq ($(OS),darwin)
    WFLAGS += -Wundef $(WERROR)
   endif
  -WFLAGS += -W -Wall -Wno-unused -Wno-parentheses
  +WFLAGS += -W -Wall -Wno-unused -Wno-parentheses -Wno-unused-parameter
   PICFLAGS = -fPIC
   SOFLAGS = # Filled in for each OS specifically
   FFI_MMAP_EXEC = -DFFI_MMAP_EXEC_WRIT
  @@ -183,7 +183,7 @@
   ifeq ($(OS), darwin)
   endif
   ifeq ($(OS), linux)
  -  SOFLAGS = -shared -mimpure-text -static-libgcc -Wl,-soname,$(@F) -Wl,-O1
  +  SOFLAGS = -shared -static-libgcc -Wl,-soname,$(@F) -Wl,-O1
    CFLAGS += -pthread
   endif
  ```

- `$SOURCE_ROOT/jffi-1.2.0/libtest/GNUmakefile`

  ```diff
  --- a/libtest/GNUmakefile
  +++ b/libtest/GNUmakefile
  @@ -45,9 +45,9 @@
   TEST_OBJS := $(patsubst $(SRC_DIR)/%.c, $(TEST_BUILD_DIR)/%.o, $(TEST_SRCS))
   # http://weblogs.java.net/blog/kellyohair/archive/2006/01/compilation_of_1.html
   JFLAGS = -fno-omit-frame-pointer -fno-strict-aliasing
   OFLAGS = -O2 $(JFLAGS)
  -WFLAGS = -W -Werror -Wall -Wno-unused -Wno-parentheses
  +WFLAGS = -W -Werror -Wall -Wno-unused -Wno-parentheses -Wno-unused-parameter
   PICFLAGS = -fPIC
  -SOFLAGS = -shared -mimpure-text -Wl,-O1
  +SOFLAGS = -shared -Wl,-O1
   LDFLAGS += $(SOFLAGS)
   IFLAGS = -I"$(BUILD_DIR)"
  ```
Build the library:

```bash
ant jar
```

Note: There are some test case failures. The library file is created in `/<source_root>/jffi-1.2.0/build/jni/libjffi-1.2.so`.
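Optionally, confirm the library was built for the right architecture (the exact wording of the output varies with the file(1) version):

```bash
file $SOURCE_ROOT/jffi-1.2.0/build/jni/libjffi-1.2.so
# expect something like: ELF 64-bit MSB shared object, IBM S/390 ...
```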
Copy the library into the jruby-complete jar in the local Maven repository:

```bash
mkdir $SOURCE_ROOT/jar_tmp
cp ~/.m2/repository/org/jruby/jruby-complete/9.1.13.0/jruby-complete-9.1.13.0.jar $SOURCE_ROOT/jar_tmp
cd $SOURCE_ROOT/jar_tmp
jar xf jruby-complete-9.1.13.0.jar
mkdir jni/s390x-Linux
cp $SOURCE_ROOT/jffi-1.2.0/build/jni/libjffi-1.2.so jni/s390x-Linux
jar uf jruby-complete-9.1.13.0.jar jni/s390x-Linux/libjffi-1.2.so
mv jruby-complete-9.1.13.0.jar ~/.m2/repository/org/jruby/jruby-complete/9.1.13.0/jruby-complete-9.1.13.0.jar
```
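To confirm that the patched jar now carries the native library:

```bash
jar tf ~/.m2/repository/org/jruby/jruby-complete/9.1.13.0/jruby-complete-9.1.13.0.jar | grep s390x-Linux
# expect: jni/s390x-Linux/libjffi-1.2.so
```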
Start HBase and launch the shell:

```bash
cd $SOURCE_ROOT/hbase
bin/start-hbase.sh
bin/hbase shell
```
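Once the shell is up, a quick smoke test verifies the build end to end (the table name `t1` and column family `cf` are arbitrary examples):

```
create 't1', 'cf'
put 't1', 'r1', 'cf:a', 'value1'
scan 't1'
disable 't1'
drop 't1'
```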
Note: On SLES you may see the following messages during shell startup:

```
[INFO] Unable to bind key for unsupported operation: up-history
[INFO] Unable to bind key for unsupported operation: down-history
```

To avoid these messages, modify /etc/inputrc and replace up-history with previous-history and down-history with next-history (a one-line sketch follows).
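A one-line way to perform that replacement (it edits a system file, so a backup copy is written first):

```bash
sudo sed -i.bak -e 's/up-history/previous-history/g' -e 's/down-history/next-history/g' /etc/inputrc
```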
References:
- https://hbase.apache.org/
- https://github.com/apache/hbase
The information provided in this article is accurate at the time of writing, but ongoing development in the open-source projects involved may make the information incorrect or obsolete. Please open an issue or contact us on IBM Z Community if you have any questions or feedback.