Releases

September 9, 2020 version 1.5.4 (released by saudet)
 * Bundle `libpostal_data` program, executable via `Loader.load()` for convenience (issue #939)
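A bundled program like this can be extracted and run directly from Java. A minimal sketch, assuming the libpostal preset and JavaCPP are on the classpath; the program class name `org.bytedeco.libpostal.program.libpostal_data` and the command-line arguments shown are assumptions for illustration:

```java
import org.bytedeco.javacpp.Loader;

public class DownloadPostalData {
    public static void main(String[] args) throws Exception {
        // Loader.load() extracts the bundled executable from the jar and
        // returns its path on disk (class name here is an assumption).
        String libpostalData = Loader.load(org.bytedeco.libpostal.program.libpostal_data.class);
        // Run it like any external process, forwarding its output to the console.
        Process p = new ProcessBuilder(libpostalData, "download", "all", "/tmp/libpostal_data")
                .inheritIO().start();
        System.exit(p.waitFor());
    }
}
```

The same pattern applies to any program bundled by the presets: `Loader.load()` takes care of picking the right binary for the current platform before handing back a runnable path.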
 * Enable all stable target architectures in the presets for LLVM (pull #937)
 * Virtualize `QObject` and its subclasses from Qt to allow customization (issue bytedeco/javacpp#419)
 * Bundle programs from Clang and LLVM, executable via `Loader.load()` for convenience (issue #833)
 * Include `nnvm/c_api.h` header file in presets for MXNet (issue #912)
 * Enable OpenMP for DNNL on Mac using same library name as MKL to prevent conflicts (issue #907)
 * Fix loading issue with `opencv_ximgproc` (issue #911)
 * Build LibTIFF after WebP to make sure they link correctly in presets for Leptonica
 * Virtualize `IInt8Calibrator` plus subclasses from TensorRT to allow customization (issue #902)
 * Replace `requires` with `requires static` in JPMS `.platform` modules (pull #900)
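With `requires static`, a `.platform` module's per-platform dependences are mandatory at compile time but optional at run time, so an application only needs the artifacts for its own platform on the module path. An approximate sketch of what a generated `.platform` descriptor looks like after this change; the module names follow the presets' `org.bytedeco.<name>.<platform>` convention, but the exact list varies per preset and is written here from memory:

```java
// Approximate shape of a generated .platform module descriptor (pull #900);
// "requires static" means each platform artifact is resolved only if present.
module org.bytedeco.opencv.platform {
    requires transitive org.bytedeco.opencv;
    requires static org.bytedeco.opencv.linux.x86_64;
    requires static org.bytedeco.opencv.macosx.x86_64;
    requires static org.bytedeco.opencv.windows.x86_64;
}
```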
 * Add presets for OpenPose 1.6.0 (pull #898)
 * Add comparison against MKL in `llvm/samples/polly/MatMulBenchmark.java`
 * Add `requires org.bytedeco.javacpp.${javacpp.platform.module}` to load `jnijavacpp` with JPMS (pull #893)
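In a modular application built with Maven, that requirement goes in `module-info.java`, with Maven substituting the property at build time. A sketch, where the application module name `com.example.app` is an assumption:

```java
// Sketch of an application's module-info.java; Maven filtering replaces
// ${javacpp.platform.module} with the current platform, e.g. linux.x86_64.
module com.example.app {
    requires org.bytedeco.javacpp;
    // Pulls in the jnijavacpp native library for the current platform:
    requires org.bytedeco.javacpp.${javacpp.platform.module};
}
```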
 * Bundle configuration files required by AOT compilation with GraalVM (issue eclipse/deeplearning4j#7362)
 * Add support for Windows to presets for Qt (issue #862)
 * Fix JPMS modules for CUDA, ARPACK-NG, GSL, SciPy, Gym, MXNet (pull #880 and pull #881), OpenCV, CPython, LLVM, Tesseract, Qt (pull #928)
 * Build OpenBLAS with a `TARGET` even for `DYNAMIC_ARCH` to avoid `SIGILL` (issue eclipse/deeplearning4j#8747)
 * Upgrade presets for OpenCV 4.4.0, FFmpeg 4.3.1 (pull #891), Arrow 1.0.1, Hyperscan 5.3.0, MKL 2020.3, MKL-DNN 0.21.5, DNNL 1.6.2, OpenBLAS 0.3.10, CPython 3.7.9, NumPy 1.19.1, SciPy 1.5.2, Gym 0.17.2, LLVM 10.0.1, Leptonica 1.80.0, CUDA 11.0.3, cuDNN 8.0.3, NCCL 2.7.8, MXNet 1.7.0, TensorFlow 1.15.3, TensorRT 7.1, ONNX 1.7.0 (pull #882), ONNX Runtime 1.4.0 (pull #887), Qt 5.15.0, Skia 2.80.1, and their dependencies
 * Add `FullOptimization.h` allowing users to fully optimize LLVM modules (pull #869)