fetch() only works for the first task, returning None for subsequent task IDs #99

Closed
schnitzelbub opened this issue Oct 21, 2015 · 7 comments

@schnitzelbub

First of all, a big thank you for this awesome project. I have had a lot of trouble working with Celery before and am very glad that I stumbled upon your project a couple of months ago.
Django Q seems clean, both code- and feature-wise, and very promising!

Unfortunately, I am currently stuck on fetching offloaded tasks: after fetching the first task, I am not able to retrieve any further tasks.

I have set up a test project including a unit test that reproduces this issue.

The tests cover combinations of caching and spawning external processes, as I first suspected the issue lay in that area, but to no avail.

Could you please have a look at this?

Thanks in advance!

@Koed00
Owner

Koed00 commented Oct 21, 2015

Hi. Thanks for testing and reporting this. I will definitely look at it as soon as I'm back from traveling this weekend or when I have access to a computer.

@schnitzelbub
Author

I refactored the unit tests and was able to narrow the issue down.

It appears that only subsequent calls to fetch which use cached=True fail, so I suspect the issue resides in tasks.fetch_cached or in the broker.

By the way, as a broker I am using Redis 2.8.17 on Debian 8.1, but I also double-checked on an Ubuntu 14.04 system running Redis 2.8.4.
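
In other words, the failing pattern looks roughly like this (a schematic of the reported symptom only, with a stand-in task rather than the test project's own code):

from django_q.tasks import async, fetch

q_opts = {'cached': True}

t1 = async('math.copysign', 1, -1, q_options=q_opts)
t2 = async('math.copysign', 2, -1, q_options=q_opts)

print(fetch(t1, cached=True))  # returns the finished Task
print(fetch(t2, cached=True))  # None, although the task ran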

@Koed00
Owner

Koed00 commented Oct 25, 2015

Ok, I finally had time to run your tests and I'm seeing the same results. I haven't had enough time to dissect your code, though. If I run a simple script with a bunch of fetches from the cache, it seems to work fine:

from django_q.tasks import async, fetch

# queue five cached and five database-backed tasks
t1 = async('math.copysign', 1, -1, cached=True)
t2 = async('math.copysign', 2, -1, cached=True)
t3 = async('math.copysign', 3, -1, cached=True)
t4 = async('math.copysign', 4, -1, cached=True)
t5 = async('math.copysign', 5, -1, cached=True)
t6 = async('math.copysign', 6, -1, cached=False)
t7 = async('math.copysign', 7, -1, cached=False)
t8 = async('math.copysign', 8, -1, cached=False)
t9 = async('math.copysign', 9, -1, cached=False)
t10 = async('math.copysign', 10, -1, cached=False)

# fetch every result; wait=-1 blocks until the result is available
print(fetch(t1, cached=True, wait=-1).result)
print(fetch(t2, cached=True, wait=-1).result)
print(fetch(t3, cached=True, wait=-1).result)
print(fetch(t4, cached=True, wait=-1).result)
print(fetch(t5, cached=True, wait=-1).result)
print(fetch(t6, cached=False, wait=-1).result)
print(fetch(t7, cached=False, wait=-1).result)
print(fetch(t8, cached=False, wait=-1).result)
print(fetch(t9, cached=False, wait=-1).result)
print(fetch(t10, cached=False, wait=-1).result)

# queue and fetch a second batch of cached tasks
t11 = async('math.copysign', 11, -1, cached=True)
t12 = async('math.copysign', 12, -1, cached=True)
t13 = async('math.copysign', 13, -1, cached=True)
t14 = async('math.copysign', 14, -1, cached=True)
t15 = async('math.copysign', 15, -1, cached=True)

print(fetch(t11, cached=True, wait=-1).result)
print(fetch(t12, cached=True, wait=-1).result)
print(fetch(t13, cached=True, wait=-1).result)
print(fetch(t14, cached=True, wait=-1).result)
print(fetch(t15, cached=True, wait=-1).result)
Output:

-1.0
-2.0
-3.0
-4.0
-5.0
-6.0
-7.0
-8.0
-9.0
-10.0
-11.0
-12.0
-13.0
-14.0
-15.0

The wait=-1 option is something I just added to dev; it makes the result function wait indefinitely for a result. This way I could at least tell whether your results weren't just timing out.

I put a break in your test where the fetch was failing and tried to fetch the result manually from another console, and it was just there. Yet it failed when I continued the test.

@Koed00
Owner

Koed00 commented Oct 26, 2015

I think I may have tracked it down. The problem lies in q_options: its items are popped during execution. So when you then do a fetch with cached=q_options['cached'], cached will be None and fetch looks in the db for the result.
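
A minimal sketch of that side effect (a stand-in task again; exactly which keys get popped depends on the django-q version, so treat this as illustrative rather than a dump of the internals):

from django_q.tasks import async, fetch

# one shared options dict, reused for every call
q_opts = {'cached': True}

t1 = async('math.copysign', 1, -1, q_options=q_opts)
# async() pops the options it consumes, so q_opts has now lost 'cached'

t2 = async('math.copysign', 2, -1, q_options=q_opts)
# t2 therefore runs without cached=True and its result never reaches the cache

# assuming both tasks have finished:
print(fetch(t1, cached=True))  # the first result is in the cache
print(fetch(t2, cached=True))  # None, because fetch only looks in the cache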

@schnitzelbub
Author

Sorry for my delayed feedback.

Following your explanation of this behavior, I debugged the code and looked at the q_options dict in the async function. You are absolutely right about items being popped during the first call.

In order to circumvent this side effect in my example unit tests, I only had to change from

tid = django_q.async(
        "djq.work.do_work",
        st=sleep_time,
        external=ext_process,
        q_options=q_opts)

to

tid = django_q.async(
        "djq.work.do_work",
        st=sleep_time,
        external=ext_process,
        q_options=q_opts.copy())

and all tests ran green.
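
The same idea generalizes to global configuration: keep one read-only defaults dict and hand async() a fresh copy on every call. A sketch with illustrative names (DEFAULT_Q_OPTIONS and enqueue_work are not part of django-q):

from django_q.tasks import async

# shared defaults, defined once for the whole project
DEFAULT_Q_OPTIONS = {'cached': True}

def enqueue_work(sleep_time, ext_process):
    # pass a copy so async() never mutates the shared defaults
    return async('djq.work.do_work',
                 st=sleep_time,
                 external=ext_process,
                 q_options=DEFAULT_Q_OPTIONS.copy())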

Are you considering changing this behavior in the future, or keeping it as is? What would be your proposal for using q_options for global configuration of tasks?

Thank you for your support!

@Koed00
Owner

Koed00 commented Oct 26, 2015

I'm already testing several different versions in dev that shouldn't have this problem anymore.
I just haven't decided yet on which approach is best.
This also led me to add an Async class to wrap the async function, to make it easier to re-run tasks with similar settings and to keep the tasks and results together. I'll probably release this sometime in the next few days, when I'm satisfied that I'm on the right track.
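
Once that lands, such a wrapper might be used roughly like this (a sketch based only on the description above; the released API may well differ):

from django_q.tasks import Async

# keep the call and its settings together in one object
task = Async('math.copysign', 1, -1, cached=True)
task.run()
print(task.result(wait=-1))

# re-run the same call with the same settings
task.run()
print(task.result(wait=-1))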

@schnitzelbub
Author

Great, I am very excited about the upcoming release.

panhaoyu pushed a commit to panhaoyu/django-q that referenced this issue Sep 16, 2023
* Move logic that resubmits a task and then deletes the original into an action function that can be used for both failure and success resubmission.

* Use the resubmit_task action in the FailAdmin.

* Add the resubmit_task action to the TaskAdmin class.

* Update resubmit failure test to reflect using the new resubmit action.

* Add a test for resubmitting successful tasks.

---------

Co-authored-by: Scott Pashley