Hi,
I ran 2 Cromwell jobs and noticed that each job keeps a large number of open file handles. I checked with the command "lsof | grep username | awk '{print $1"\t"$2}' | sort | uniq -c | sort -nr". If I run 50 jobs, I can no longer log into the machine over SSH. How can I reduce the number of file handles each job opens? Thanks.
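To see the per-process counts more directly, this is a sketch that reads the standard Linux /proc filesystem instead of running lsof over the whole system (it assumes pgrep matches the Cromwell JVM by the "cromwell" jar name):

```shell
# Count open file descriptors for each running Cromwell JVM.
# /proc/<pid>/fd holds one entry per open descriptor on Linux.
for pid in $(pgrep -f cromwell); do
    printf '%s\t%s\n' "$pid" "$(ls "/proc/$pid/fd" 2>/dev/null | wc -l)"
done
```

This avoids the system-wide lsof scan, which is itself slow and file-handle hungry on a loaded machine.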
This is my Java command:
java -Xms10M -Xmx125M -Dconfig.file=SGE.conf -jar cromwell-86.jar run xxx.wdl --inputs xxx.json
My SGE.conf file:
# Documentation:
# https://cromwell.readthedocs.io/en/stable/backends/SGE
backend {
  default = SGE
  providers {
    SGE {
      actor-factory = "cromwell.backend.impl.sfs.config.ConfigBackendLifecycleActorFactory"
      config {
        # Limits the number of concurrent jobs
        concurrent-job-limit = 5

        # If an 'exit-code-timeout-seconds' value is specified:
        # - check-alive will be run at this interval for every job
        # - if a job is found to be not alive, and no RC file appears after this interval,
        #   then it will be marked as Failed
        # Warning: if set, Cromwell will run 'check-alive' for every job at this interval
        exit-code-timeout-seconds = 120

        runtime-attributes = """
        Int cpu = 1
        Float? memory_gb
        String? sge_queue = "xxx"
        String? sge_project = "xxx"
        """

        submit = """
        qsub \
        -terse \
        -V \
        -b y \
        -N ${job_name} \
        -wd ${cwd} \
        -o ${out}.qsub \
        -e ${err}.qsub \
        ${"-l num_proc=" + cpu + ",virtual_free=" + memory_gb + "g"} \
        ${"-q " + sge_queue} \
        ${"-P " + sge_project} \
        -binding ${"linear:" + cpu} \
        /usr/bin/env bash ${script}
        """

        kill = "qdel ${job_id}"
        check-alive = "qstat -j ${job_id}"
        job-id-regex = "(\\d+)"
      }
    }
  }
}

call-caching {
  enabled = true
  invalidate-bad-cache-results = true
}
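For reference, this is how I check the open-file limits on the machine (a sketch using standard Linux tools; the PID shown is hypothetical):

```shell
# Soft limit on open files for the current shell (what new processes inherit).
# May print a number or "unlimited".
ulimit -n

# Limits of an already-running process, e.g. a Cromwell JVM with
# hypothetical PID 12345 (uncomment and substitute a real PID):
# grep 'open files' /proc/12345/limits
```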