
HDFS-17630. Avoid PacketReceiver#MAX_PACKET_SIZE Initialized to 0 #7063

Open · wants to merge 4 commits into base: trunk

Conversation

@cxzl25 (Contributor) commented Sep 23, 2024

Description of PR

Nested calls during class initialization can cause PacketReceiver#MAX_PACKET_SIZE to be observed as 0.

Related HDFS-15469

java.io.IOException: Incorrect value for packet payload size: 1014776
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:167)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:112)
    at org.apache.hadoop.hdfs.client.impl.BlockReaderRemote.readNextPacket(BlockReaderRemote.java:187)
    at org.apache.hadoop.hdfs.client.impl.BlockReaderRemote.read(BlockReaderRemote.java:146)
    at org.apache.hadoop.hdfs.ByteArrayStrategy.readFromBlock(ReaderStrategy.java:118)
    at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:789)
    at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:855)
    at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:919)
    at java.base/java.io.DataInputStream.read(DataInputStream.java:158)
    at java.base/java.io.InputStream.transferTo(InputStream.java:796)
    at java.base/java.nio.file.Files.copy(Files.java:3151)
    at java.base/sun.net.www.protocol.jar.URLJarFile$1.run(URLJarFile.java:216)
    at java.base/sun.net.www.protocol.jar.URLJarFile$1.run(URLJarFile.java:212)
    at java.base/java.security.AccessController.doPrivileged(AccessController.java:571)
    ...
    at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1319)
    at org.apache.hadoop.conf.Configuration.getInt(Configuration.java:1545)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.<clinit>(PacketReceiver.java:82)
    at org.apache.hadoop.hdfs.client.impl.BlockReaderRemote.<init>(BlockReaderRemote.java:101) 
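The cycle in the trace above can be reproduced with a minimal, self-contained sketch (class and field names here are illustrative, not Hadoop's): when code invoked from a class's static initializer re-enters that same class on the same thread, the JVM does not re-run `<clinit>`, so the nested caller observes the static field at its default value of 0.

```java
// Minimal sketch of re-entrant static initialization (hypothetical names).
// Helper.touch() runs while RecursiveInit.<clinit> is still in progress, so it
// sees MAX_PACKET_SIZE at its default value (0), mirroring the PacketReceiver bug.
public class RecursiveInit {
    static final int MAX_PACKET_SIZE;
    // Sentinel so "saw the default 0" can be told apart from "never ran".
    static int observedDuringInit = Integer.MIN_VALUE;

    static {
        Helper.touch();                     // nested call back into this class
        MAX_PACKET_SIZE = 16 * 1024 * 1024; // assigned only after the nested call
    }

    static class Helper {
        static void touch() {
            // The JVM permits this access because the current thread is the one
            // initializing RecursiveInit; the blank final still holds 0 here.
            observedDuringInit = RecursiveInit.MAX_PACKET_SIZE;
        }
    }

    public static void main(String[] args) {
        System.out.println("seen during <clinit>: " + observedDuringInit); // 0
        System.out.println("after <clinit>:       " + MAX_PACKET_SIZE);
    }
}
```

Note that MAX_PACKET_SIZE is not a compile-time constant (it is assigned in the static block), so the nested read really goes through the field rather than an inlined value.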

How was this patch tested?

For code changes:

  • Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
  • Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
  • If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under ASF 2.0?
  • If applicable, have you updated the LICENSE, LICENSE-binary, NOTICE-binary files?

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 11m 52s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
-1 ❌ test4tests 0m 0s The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch.
_ trunk Compile Tests _
+1 💚 mvninstall 44m 43s trunk passed
+1 💚 compile 1m 2s trunk passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 compile 0m 54s trunk passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 checkstyle 0m 33s trunk passed
+1 💚 mvnsite 0m 59s trunk passed
+1 💚 javadoc 0m 51s trunk passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 javadoc 0m 42s trunk passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 spotbugs 2m 37s trunk passed
+1 💚 shadedclient 36m 37s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 0m 49s the patch passed
+1 💚 compile 0m 54s the patch passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 javac 0m 54s the patch passed
+1 💚 compile 0m 46s the patch passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 javac 0m 46s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
-0 ⚠️ checkstyle 0m 20s /results-checkstyle-hadoop-hdfs-project_hadoop-hdfs-client.txt hadoop-hdfs-project/hadoop-hdfs-client: The patch generated 3 new + 1 unchanged - 0 fixed = 4 total (was 1)
+1 💚 mvnsite 0m 49s the patch passed
+1 💚 javadoc 0m 38s the patch passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 javadoc 0m 33s the patch passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
-1 ❌ spotbugs 2m 37s /new-spotbugs-hadoop-hdfs-project_hadoop-hdfs-client.html hadoop-hdfs-project/hadoop-hdfs-client generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0)
+1 💚 shadedclient 36m 23s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 2m 27s hadoop-hdfs-client in the patch passed.
+1 💚 asflicense 0m 37s The patch does not generate ASF License warnings.
147m 16s
Reason Tests
SpotBugs module:hadoop-hdfs-project/hadoop-hdfs-client
org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.MAX_PACKET_SIZE isn't final but should be refactored to be so. At PacketReceiver.java:[line 51]
Subsystem Report/Notes
Docker ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7063/1/artifact/out/Dockerfile
GITHUB PR #7063
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux dcb310fda747 5.15.0-119-generic #129-Ubuntu SMP Fri Aug 2 19:25:20 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 749a854
Default Java Private Build-1.8.0_422-8u422-b05-1~20.04-b05
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_422-8u422-b05-1~20.04-b05
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7063/1/testReport/
Max. process+thread count 697 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs-client U: hadoop-hdfs-project/hadoop-hdfs-client
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7063/1/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 30s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 1 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 15m 1s Maven dependency ordering for branch
+1 💚 mvninstall 32m 35s trunk passed
+1 💚 compile 5m 34s trunk passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 compile 5m 26s trunk passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 checkstyle 1m 26s trunk passed
+1 💚 mvnsite 2m 25s trunk passed
+1 💚 javadoc 2m 1s trunk passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 javadoc 2m 31s trunk passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 spotbugs 5m 53s trunk passed
+1 💚 shadedclient 37m 50s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 33s Maven dependency ordering for patch
+1 💚 mvninstall 2m 4s the patch passed
+1 💚 compile 5m 28s the patch passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 javac 5m 28s the patch passed
+1 💚 compile 5m 22s the patch passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 javac 5m 22s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 1m 14s hadoop-hdfs-project: The patch generated 0 new + 32 unchanged - 2 fixed = 32 total (was 34)
+1 💚 mvnsite 2m 7s the patch passed
+1 💚 javadoc 1m 41s the patch passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 javadoc 2m 15s the patch passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 spotbugs 5m 53s the patch passed
+1 💚 shadedclient 36m 49s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 2m 29s hadoop-hdfs-client in the patch passed.
+1 💚 unit 224m 3s hadoop-hdfs in the patch passed.
+1 💚 asflicense 0m 48s The patch does not generate ASF License warnings.
400m 44s
Subsystem Report/Notes
Docker ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7063/2/artifact/out/Dockerfile
GITHUB PR #7063
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux 40d2434a6c71 5.15.0-119-generic #129-Ubuntu SMP Fri Aug 2 19:25:20 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 577163b
Default Java Private Build-1.8.0_422-8u422-b05-1~20.04-b05
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_422-8u422-b05-1~20.04-b05
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7063/2/testReport/
Max. process+thread count 3290 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs-client hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7063/2/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@cxzl25 (Contributor, Author) commented Sep 24, 2024

Spark uses FsUrlStreamHandlerFactory to support loading jars from HDFS, but in some scenarios PacketReceiver is called re-entrantly during its own static initialization, causing Spark to fail to start.

cc @sunchao

https://github.com/apache/spark/blob/982028ea7fc61d7aa84756aa46860ebb49bfe9d1/sql/core/src/main/scala/org/apache/spark/sql/internal/SharedState.scala#L201

PacketReceiver Exception
java.lang.Exception
	at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:166)
	at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:112)
	at org.apache.hadoop.hdfs.client.impl.BlockReaderRemote.readNextPacket(BlockReaderRemote.java:187)
	at org.apache.hadoop.hdfs.client.impl.BlockReaderRemote.read(BlockReaderRemote.java:146)
	at org.apache.hadoop.hdfs.ByteArrayStrategy.readFromBlock(ReaderStrategy.java:118)
	at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:789)
	at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:855)
	at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:919)
	at java.base/java.io.DataInputStream.read(DataInputStream.java:158)
	at java.base/java.io.InputStream.transferTo(InputStream.java:796)
	at java.base/java.nio.file.Files.copy(Files.java:3151)
	at java.base/sun.net.www.protocol.jar.URLJarFile$1.run(URLJarFile.java:216)
	at java.base/sun.net.www.protocol.jar.URLJarFile$1.run(URLJarFile.java:212)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:571)
	at java.base/sun.net.www.protocol.jar.URLJarFile.retrieve(URLJarFile.java:211)
	at java.base/sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:71)
	at java.base/sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:153)
	at java.base/sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:109)
	at java.base/sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:70)
	at java.base/jdk.internal.loader.URLClassPath$JarLoader.getJarFile(URLClassPath.java:814)
	at java.base/jdk.internal.loader.URLClassPath$JarLoader$1.run(URLClassPath.java:774)
	at java.base/jdk.internal.loader.URLClassPath$JarLoader$1.run(URLClassPath.java:768)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:714)
	at java.base/jdk.internal.loader.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:767)
	at java.base/jdk.internal.loader.URLClassPath$JarLoader.<init>(URLClassPath.java:734)
	at java.base/jdk.internal.loader.URLClassPath$3.run(URLClassPath.java:497)
	at java.base/jdk.internal.loader.URLClassPath$3.run(URLClassPath.java:479)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:714)
	at java.base/jdk.internal.loader.URLClassPath.getLoader(URLClassPath.java:478)
	at java.base/jdk.internal.loader.URLClassPath.getLoader(URLClassPath.java:446)
	at java.base/jdk.internal.loader.URLClassPath.findResource(URLClassPath.java:292)
	at java.base/java.net.URLClassLoader$2.run(URLClassLoader.java:629)
	at java.base/java.net.URLClassLoader$2.run(URLClassLoader.java:627)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:400)
	at java.base/java.net.URLClassLoader.findResource(URLClassLoader.java:626)
	at java.base/java.lang.ClassLoader.getResource(ClassLoader.java:1418)
	at org.apache.hadoop.conf.Configuration.getResource(Configuration.java:2861)
	at org.apache.hadoop.conf.Configuration.getStreamReader(Configuration.java:3135)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3094)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:3067)
	at org.apache.hadoop.conf.Configuration.loadProps(Configuration.java:2945)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2927)
	at org.apache.hadoop.conf.Configuration.get(Configuration.java:1265)
	at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1319)
	at org.apache.hadoop.conf.Configuration.getInt(Configuration.java:1545)
	at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.<clinit>(PacketReceiver.java:82)
	at org.apache.hadoop.hdfs.client.impl.BlockReaderRemote.<init>(BlockReaderRemote.java:101)
	at org.apache.hadoop.hdfs.client.impl.BlockReaderRemote.newBlockReader(BlockReaderRemote.java:437)
	at org.apache.hadoop.hdfs.client.impl.BlockReaderFactory.getRemoteBlockReader(BlockReaderFactory.java:861)
	at org.apache.hadoop.hdfs.client.impl.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:757)
	at org.apache.hadoop.hdfs.client.impl.BlockReaderFactory.build(BlockReaderFactory.java:381)
	at org.apache.hadoop.hdfs.DFSInputStream.getBlockReader(DFSInputStream.java:715)
	at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:645)
	at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:845)
	at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:919)
	at java.base/java.io.DataInputStream.read(DataInputStream.java:158)
	at java.base/java.io.InputStream.transferTo(InputStream.java:796)
	at java.base/java.nio.file.Files.copy(Files.java:3151)
	at java.base/sun.net.www.protocol.jar.URLJarFile$1.run(URLJarFile.java:216)
	at java.base/sun.net.www.protocol.jar.URLJarFile$1.run(URLJarFile.java:212)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:571)
	at java.base/sun.net.www.protocol.jar.URLJarFile.retrieve(URLJarFile.java:211)
	at java.base/sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:71)
	at java.base/sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:153)
	at java.base/sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:109)
	at java.base/sun.net.www.protocol.jar.JarURLConnection.getJarFile(JarURLConnection.java:70)
	at java.base/jdk.internal.loader.URLClassPath$JarLoader.getJarFile(URLClassPath.java:814)
	at java.base/jdk.internal.loader.URLClassPath$JarLoader$1.run(URLClassPath.java:774)
	at java.base/jdk.internal.loader.URLClassPath$JarLoader$1.run(URLClassPath.java:768)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:714)
	at java.base/jdk.internal.loader.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:767)
	at java.base/jdk.internal.loader.URLClassPath$JarLoader.<init>(URLClassPath.java:734)
	at java.base/jdk.internal.loader.URLClassPath$3.run(URLClassPath.java:497)
	at java.base/jdk.internal.loader.URLClassPath$3.run(URLClassPath.java:479)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:714)
	at java.base/jdk.internal.loader.URLClassPath.getLoader(URLClassPath.java:478)
	at java.base/jdk.internal.loader.URLClassPath.getLoader(URLClassPath.java:446)
	at java.base/jdk.internal.loader.URLClassPath.findResource(URLClassPath.java:292)
	at java.base/java.net.URLClassLoader$2.run(URLClassLoader.java:629)
	at java.base/java.net.URLClassLoader$2.run(URLClassLoader.java:627)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:400)
	at java.base/java.net.URLClassLoader.findResource(URLClassLoader.java:626)
	at java.base/java.lang.ClassLoader.getResource(ClassLoader.java:1418)
	at org.apache.hadoop.conf.Configuration.getResource(Configuration.java:2861)
	at org.apache.hadoop.conf.Configuration.getStreamReader(Configuration.java:3135)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3094)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:3067)
	at org.apache.hadoop.conf.Configuration.loadProps(Configuration.java:2945)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2927)
	at org.apache.hadoop.conf.Configuration.get(Configuration.java:1265)
	at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1319)
	at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1726)
	at org.apache.hadoop.fs.statistics.impl.IOStatisticsContextIntegration.<clinit>(IOStatisticsContextIntegration.java:79)
	at org.apache.hadoop.fs.statistics.IOStatisticsContext.getCurrentIOStatisticsContext(IOStatisticsContext.java:75)
	at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileInputStream.<init>(RawLocalFileSystem.java:173)
	at org.apache.hadoop.fs.RawLocalFileSystem.open(RawLocalFileSystem.java:393)
	at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:189)
	at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:581)
	at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:995)
	at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:489)
	at org.apache.hadoop.hive.cli.CliDriver.processInitFiles(CliDriver.java:524)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:204)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:75)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:52)
	at java.base/java.lang.reflect.Method.invoke(Method.java:580)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1031)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:226)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:100)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1136)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1145)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

@cxzl25 (Contributor, Author) commented Oct 16, 2024

Hadoop 3.4.1 is being released; without this fix, Spark may not work properly with Hadoop 3.4.x.

cc @mukund-thakur @steveloughran @Hexiaoqiao @dongjoon-hyun

@steveloughran (Contributor) left a comment:

Style review only. Someone who knows the HDFS code must review this, not me.

@steveloughran (Contributor) commented

I don't normally go near hdfs, so had missed this. It also means: I don't review their patches.

I think this should target 3.4.2 and we focus on getting that out

@dongjoon-hyun (Member) commented

I also agree with Steve's comment. +1 for targeting this for 3.4.2 instead of the on-going 3.4.1 RC3 vote.

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 30s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 1 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 15m 39s Maven dependency ordering for branch
+1 💚 mvninstall 31m 47s trunk passed
+1 💚 compile 5m 14s trunk passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 compile 5m 8s trunk passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 checkstyle 1m 17s trunk passed
+1 💚 mvnsite 2m 19s trunk passed
+1 💚 javadoc 1m 54s trunk passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 javadoc 2m 25s trunk passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 spotbugs 5m 42s trunk passed
+1 💚 shadedclient 36m 37s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 31s Maven dependency ordering for patch
+1 💚 mvninstall 1m 59s the patch passed
+1 💚 compile 5m 13s the patch passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 javac 5m 13s the patch passed
+1 💚 compile 4m 58s the patch passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 javac 4m 58s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 1m 4s the patch passed
+1 💚 mvnsite 2m 1s the patch passed
+1 💚 javadoc 1m 33s the patch passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 javadoc 2m 14s the patch passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 spotbugs 5m 43s the patch passed
+1 💚 shadedclient 36m 49s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 2m 28s hadoop-hdfs-client in the patch passed.
-1 ❌ unit 222m 50s /patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt hadoop-hdfs in the patch failed.
+1 💚 asflicense 0m 43s The patch does not generate ASF License warnings.
395m 14s
Reason Tests
Failed junit tests hadoop.hdfs.TestDecommission
hadoop.hdfs.TestDecommissionWithBackoffMonitor
Subsystem Report/Notes
Docker ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7063/3/artifact/out/Dockerfile
GITHUB PR #7063
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux f2600d371847 5.15.0-119-generic #129-Ubuntu SMP Fri Aug 2 19:25:20 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / e4fc75c
Default Java Private Build-1.8.0_422-8u422-b05-1~20.04-b05
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_422-8u422-b05-1~20.04-b05
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7063/3/testReport/
Max. process+thread count 3658 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs-client hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7063/3/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

  int totalLen = payloadLen + headerLen;
- if (totalLen < 0 || totalLen > MAX_PACKET_SIZE) {
+ if (totalLen < 0 || totalLen > maxPacketSize) {
Review comment (Member):

Why do we check this static final variable, MAX_PACKET_SIZE, at this runtime layer instead of the initialization ?

static {
  Configuration conf = new HdfsConfiguration();
  MAX_PACKET_SIZE = conf.getInt(
      HdfsClientConfigKeys.DFS_DATA_TRANSFER_MAX_PACKET_SIZE,
      HdfsClientConfigKeys.DFS_DATA_TRANSFER_MAX_PACKET_SIZE_DEFAULT);
}

Reply (Contributor, Author):

#7063 (comment)

PacketReceiver may be called re-entrantly while reading the conf, before its static block has finished initializing.

	at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:166)
	at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:112)

...

	at org.apache.hadoop.conf.Configuration.getInt(Configuration.java:1545)
	at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.<clinit>(PacketReceiver.java:82)
	at org.apache.hadoop.hdfs.client.impl.BlockReaderRemote.<init>(BlockReaderRemote.java:101)
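One way to break this cycle (a sketch of the general approach suggested by the diff above, not the exact patch; names and the constructor argument are illustrative) is to replace the class-level static lookup with an instance field resolved at construction time, so the packet-size check never depends on `<clinit>` having completed:

```java
import java.io.IOException;

// Hypothetical sketch: the packet-size limit becomes per-instance state set at
// construction, so a re-entrant call during class initialization can no longer
// observe an unassigned static field.
public class PacketSizeCheck {
    private final int maxPacketSize;

    public PacketSizeCheck(int maxPacketSize) {
        // In the real code this value would come from the HDFS client conf;
        // it is a constructor argument here to keep the sketch self-contained.
        this.maxPacketSize = maxPacketSize;
    }

    void checkLength(int payloadLen, int headerLen) throws IOException {
        int totalLen = payloadLen + headerLen;
        if (totalLen < 0 || totalLen > maxPacketSize) {
            throw new IOException(
                "Incorrect value for packet payload size: " + payloadLen);
        }
    }
}
```

With a limit of 0 (the bug's symptom), any non-empty packet fails the check; with a properly resolved limit, the same packet passes.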

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 20s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 1 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 14m 53s Maven dependency ordering for branch
+1 💚 mvninstall 19m 7s trunk passed
+1 💚 compile 2m 58s trunk passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 compile 2m 55s trunk passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 checkstyle 0m 44s trunk passed
+1 💚 mvnsite 1m 20s trunk passed
+1 💚 javadoc 1m 12s trunk passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 javadoc 1m 37s trunk passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 spotbugs 3m 7s trunk passed
+1 💚 shadedclient 21m 7s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 22s Maven dependency ordering for patch
+1 💚 mvninstall 1m 4s the patch passed
+1 💚 compile 2m 49s the patch passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 javac 2m 49s the patch passed
+1 💚 compile 2m 44s the patch passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 javac 2m 44s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 37s the patch passed
+1 💚 mvnsite 1m 5s the patch passed
+1 💚 javadoc 0m 57s the patch passed with JDK Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04
+1 💚 javadoc 1m 27s the patch passed with JDK Private Build-1.8.0_422-8u422-b05-1~20.04-b05
+1 💚 spotbugs 3m 13s the patch passed
+1 💚 shadedclient 21m 25s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 1m 58s hadoop-hdfs-client in the patch passed.
-1 ❌ unit 202m 14s /patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt hadoop-hdfs in the patch failed.
+1 💚 asflicense 0m 32s The patch does not generate ASF License warnings.
309m 20s
Reason Tests
Failed junit tests hadoop.hdfs.TestDecommissionWithBackoffMonitor
Subsystem Report/Notes
Docker ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7063/4/artifact/out/Dockerfile
GITHUB PR #7063
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux b77ee8416479 5.15.0-124-generic #134-Ubuntu SMP Fri Sep 27 20:20:17 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 6295fcb
Default Java Private Build-1.8.0_422-8u422-b05-1~20.04-b05
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.24+8-post-Ubuntu-1ubuntu320.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_422-8u422-b05-1~20.04-b05
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7063/4/testReport/
Max. process+thread count 4156 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs-client hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7063/4/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

4 participants