Describe the bug
It looks like uploading a run for certain tests that have a large number of datasets (10k or more) causes the active transaction to be forcefully killed (did we hit a timeout?)
2024-09-09 17:16:26,946 797ee630d871 quarkus-run.jar[7] WARN [com.net.sch.UnknownKeywordFactory] (executor-thread-17) Unknown keyword $id - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2024-09-09 17:18:15,573 797ee630d871 quarkus-run.jar[7] WARN [com.arj.ats.arjuna] (Transaction Reaper) ARJUNA012117: TransactionReaper::check processing TX 0:ffff0a000264:9bc3:66df1d30:104 in state RUN
2024-09-09 17:18:15,575 797ee630d871 quarkus-run.jar[7] WARN [com.arj.ats.arjuna] (Transaction Reaper Worker 0) ARJUNA012095: Abort of action id 0:ffff0a000264:9bc3:66df1d30:104 invoked while multiple threads active within it.
2024-09-09 17:18:15,576 797ee630d871 quarkus-run.jar[7] WARN [com.arj.ats.arjuna] (Transaction Reaper Worker 0) ARJUNA012381: Action id 0:ffff0a000264:9bc3:66df1d30:104 completed with multiple threads - thread executor-thread-17 was in progress with java.base@17.0.11/sun.nio.ch.Net.poll(Native Method)
java.base@17.0.11/sun.nio.ch.NioSocketImpl.park(NioSocketImpl.java:186)
java.base@17.0.11/sun.nio.ch.NioSocketImpl.park(NioSocketImpl.java:195)
...
org.hibernate.query.spi.AbstractQuery.executeUpdate(AbstractQuery.java:651)
io.hyperfoil.tools.horreum.svc.DatasetServiceImpl.calculateLabelValues(DatasetServiceImpl.java:453)
io.hyperfoil.tools.horreum.svc.DatasetServiceImpl_Subclass.calculateLabelValues$$superforward(Unknown Source)
io.hyperfoil.tools.horreum.svc.DatasetServiceImpl_Subclass$$function$$6.apply(Unknown Source)
It seems the transaction got killed while calling a Postgres procedure:
Horreum/horreum-backend/src/main/java/io/hyperfoil/tools/horreum/svc/DatasetServiceImpl.java, line 459 in 69fdfe8
And this forces the run upload to fail.
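If the Transaction Reaper is aborting the transaction because it runs past the Narayana timeout (the Quarkus default, quarkus.transaction-manager.default-transaction-timeout, is 60 seconds), one possible mitigation would be to raise the timeout on the transactional entry point that drives the label-value calculation. A minimal sketch, assuming the work starts in a @Transactional bean method; the class and method names below are hypothetical stand-ins, not the actual Horreum code:

```java
import io.quarkus.narayana.jta.runtime.TransactionConfiguration;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.transaction.Transactional;

@ApplicationScoped
public class RunUploadService { // hypothetical stand-in for the Horreum service

    // Raise the per-method transaction timeout so the Transaction Reaper does
    // not abort the transaction while the Postgres procedure that computes
    // label values is still running. 600 seconds is an arbitrary value chosen
    // for illustration; @TransactionConfiguration must sit on the method that
    // actually starts the transaction.
    @Transactional
    @TransactionConfiguration(timeout = 600)
    public void uploadRun(String runJson) {
        // ... persist the run and calculate label values for each dataset ...
    }
}
```

Alternatively, the default could be raised globally via quarkus.transaction-manager.default-transaction-timeout in application.properties, or the per-dataset label calculation could be split into smaller separate transactions so that no single transaction stays open long enough to hit the reaper.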
To Reproduce
I was not able to reproduce this locally yet.
Version
What is the version of Horreum?
If you are using a development branch, what is the commit id?