
help getting redis + gunicorn running #475

Closed
fredzannarbor opened this issue May 10, 2021 · 8 comments

fredzannarbor commented May 10, 2021

I am running a very slightly modified version of the gunicorn + redis app and hitting a wall when dtale tries to get the list of instances.

file: nimble_dtale.py

import dtale
import pandas as pd

from dtale.app import build_app

from dtale.views import startup
from flask import redirect, jsonify

app = build_app(reaper_on=False)

dtale.global_state.use_redis_store('/home/bitnami/redis')

@app.route("/create-df")
def create_df():
    df = pd.DataFrame(dict(a=[1, 2, 3], b=[4, 5, 6]))
    instance = startup(data=df, ignore_duplicate=True)

    return redirect(f"/dtale/main/{instance._data_id}", code=302)

@app.route("/")
@app.route("/active-instances")
def get_all_dtale_servers():
    instances = []
    for data_id in dtale.global_state.keys():
        print(data_id)
        data_obj = dtale.get_instance(data_id)
        metadata = dtale.global_state.get_name(data_id)
        name = dtale.global_state.get_data_inst(data_id).name
        # convert pandas timestamp to python dateTime
        time = pd.Timestamp(metadata.get("start"), tz=None).to_pydatetime()
        datetime = time.strftime("%Y-%m-%d %H:%M:%S")
        instances.append(
            {
                'id': data_id,
                'name': name,
                'url': data_obj.main_url(),
                'datetime': datetime
            }
        )
    return jsonify(instances)


if __name__ == '__main__':
    app.run(host="0.0.0.0", port=5003)

gunicorn --workers=10 --preload -b 0.0.0.0:5003 'nimble_dtale:app'

[2021-05-10 22:58:57 +0000] [13530] [INFO] Starting gunicorn 20.0.4
[2021-05-10 22:58:57 +0000] [13530] [INFO] Listening at: http://0.0.0.0:5003 (13530)
[2021-05-10 22:58:57 +0000] [13530] [INFO] Using worker: sync
[2021-05-10 22:58:57 +0000] [13537] [INFO] Booting worker with pid: 13537
[2021-05-10 22:58:57 +0000] [13538] [INFO] Booting worker with pid: 13538
[2021-05-10 22:58:57 +0000] [13539] [INFO] Booting worker with pid: 13539
[2021-05-10 22:58:57 +0000] [13540] [INFO] Booting worker with pid: 13540
[2021-05-10 22:58:57 +0000] [13541] [INFO] Booting worker with pid: 13541
[2021-05-10 22:58:58 +0000] [13542] [INFO] Booting worker with pid: 13542
[2021-05-10 22:58:58 +0000] [13543] [INFO] Booting worker with pid: 13543
[2021-05-10 22:58:58 +0000] [13544] [INFO] Booting worker with pid: 13544
[2021-05-10 22:58:58 +0000] [13545] [INFO] Booting worker with pid: 13545
[2021-05-10 22:58:58 +0000] [13546] [INFO] Booting worker with pid: 13546

Going to the server at nimblebooks.com:5003/create-df, I get this:


Traceback (most recent call last):
  File "/home/bitnami/.virtualenvs/nimbleAI/lib/python3.8/site-packages/dtale/views.py", line 100, in _handle_exceptions
    return func(*args, **kwargs)
  File "/home/bitnami/.virtualenvs/nimbleAI/lib/python3.8/site-packages/dtale/views.py", line 2244, in get_data
    curr_dtypes = [c["name"] for c in global_state.get_dtypes(data_id)]
TypeError: 'NoneType' object is not iterable
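The TypeError means `global_state.get_dtypes(data_id)` returned `None` for that `data_id`. A minimal stdlib-only stand-in (not dtale's actual internals) illustrates the failure mode: a lookup that returns `None` when the data never reached this process's store, as can happen when gunicorn workers don't share state.

```python
# Stand-in illustrating the failure above: the lookup returns None when the
# data_id is absent from this process's store, and iterating None raises the
# TypeError shown in the traceback. "get_dtypes" here is a hypothetical
# stand-in, not dtale's real implementation.
def get_dtypes(data_id, store={}):
    return store.get(data_id)

dtypes = get_dtypes("1")  # None: nothing was stored under "1"
try:
    curr_dtypes = [c["name"] for c in dtypes]
except TypeError as e:
    print(e)  # 'NoneType' object is not iterable
```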

Going to nimblebooks.com:5003/active-instances, I get this:

Traceback (most recent call last):
  File "/home/bitnami/.virtualenvs/nimbleAI/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/home/bitnami/.virtualenvs/nimbleAI/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/home/bitnami/.virtualenvs/nimbleAI/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/home/bitnami/.virtualenvs/nimbleAI/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/home/bitnami/.virtualenvs/nimbleAI/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/home/bitnami/.virtualenvs/nimbleAI/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/home/bitnami/nimbleAI/dtale_standalone/nimble_dtale.py", line 31, in get_all_dtale_servers
    time = pd.Timestamp(metadata.get("start"), tz=None).to_pydatetime()
AttributeError: 'str' object has no attribute 'get'

Any ideas? It seems something is wrong with creating or listing instances, but I'm stumped.

@fredzannarbor (Author) commented:

I found the other thread on this problem (#307) and followed the steps beginning with the Nov. 2 comment (#307 (comment)). I can run my Flask app and it loads data as I intend, but when I look for the instances:


>>> import dtale
>>> dtale.global_state.use_redis_store('/home/bitnami/redis')
>>> dtale.global_state.DATA.keys()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: module 'dtale.global_state' has no attribute 'DATA'

And then when I tried to run a "preloaded" dataframe as in the Nov. 11 comment (#307 (comment)), I get this:

python nimble_dtale.py
Traceback (most recent call last):
  File "nimble_dtale.py", line 13, in <module>
    if '1' not in dtale.global_state.get_data():
TypeError: get_data() missing 1 required positional argument: 'data_id'

Help!
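The second traceback suggests that under the refactored API, `get_data()` takes a `data_id` rather than returning the whole store, so a membership check like `'1' not in get_data()` no longer works. A sketch of the per-id pattern, using a plain dict as a hypothetical stand-in for dtale's store (an assumption inferred from the TypeError, not confirmed API):

```python
# Stand-in store; in the real app this would be dtale's global state.
store = {}

def get_data(data_id):
    # stand-in for a per-id lookup that returns None when the id is absent
    return store.get(data_id)

# old-style check: if '1' not in get_data():   <- now raises a TypeError
# new-style check, one id at a time:
if get_data("1") is None:
    # in the real app this is where the preload/startup call would go
    store["1"] = {"a": [1, 2, 3]}
```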

@aschonfeld (Collaborator) commented:

@fredzannarbor sorry about the issues. I had global_state refactored a while back, and it looks like there were definitely some issues around how data gets flushed to Redis. I have fixes ready; I just need to cut a release.

Once again, apologies for the issues.

@fredzannarbor (Author) commented:

That's actually fabulous news as I spent most of last night butting my head against a wall convinced that I was missing something!

aschonfeld added a commit that referenced this issue May 12, 2021
@fredzannarbor (Author) commented:

I tried replacing global_state.py in my site-packages with this version (475), but I am still experiencing an error.

I can create new instances using create-df but when I hit active-instances I get the traceback below.

Traceback (most recent call last):
  File "/Users/fred/.virtualenvs/nimbleAI/lib/python3.8/site-packages/flask/app.py", line 2446, in wsgi_app
    response = self.full_dispatch_request()
  File "/Users/fred/.virtualenvs/nimbleAI/lib/python3.8/site-packages/flask/app.py", line 1951, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/Users/fred/.virtualenvs/nimbleAI/lib/python3.8/site-packages/flask/app.py", line 1820, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/Users/fred/.virtualenvs/nimbleAI/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/Users/fred/.virtualenvs/nimbleAI/lib/python3.8/site-packages/flask/app.py", line 1949, in full_dispatch_request
    rv = self.dispatch_request()
  File "/Users/fred/.virtualenvs/nimbleAI/lib/python3.8/site-packages/flask/app.py", line 1935, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "nimble_dtale.py", line 84, in get_all_dtale_servers
AttributeError: 'str' object has no attribute 'get'

import dtale
import pandas as pd
import os

from flask import redirect, jsonify

from dtale.app import build_app

from dtale.views import startup

import dtale.predefined_filters as predefined_filters

app = build_app(reaper_on=False)

configfilepath = os.getcwd()  + '/' + 'dtale_local_settings.py'
print(configfilepath)
app.config.from_pyfile(configfilepath)
local_dtale_value = app.config['LOCAL_DTALE_VALUE']
local_dtale_stub = app.config['LOCAL_DTALE_STUB']
local_redis_store = app.config['LOCAL_REDIS_STORE']
print(local_dtale_stub, local_dtale_value)


dtale.global_state.use_redis_store(local_redis_store)


@app.route("/create-df")
def create_df2():
    df2 = pd.DataFrame(dict(a=[1, 2, 3], b=[4, 5, 6]))
    instance = startup(data=df2, ignore_duplicate=True)
    print(df2)
    return redirect(f"/dtale/main/{instance._data_id}", code=302)

@app.route("/create-dkdp/<userdirnumber>")
def create_df(userdirnumber):
    predefined_filters.set_filters([
        {
            "name": "Author is ",
            "column": "Author",
            "description": "filter by author name",
            "handler": lambda dkdp, val: dkdp[(dkdp["Author"] == val)],
            "input_type": "select"
        },
        {
            "name": "strong sellers",
            "column": "Units Sold",
            "description": "filter by units sold",
            "handler": lambda dkdp, val: dkdp[(dkdp["Units Sold"] == val) & (dkdp["Units Sold"] >= 10)],
            "input_type": "input"
        },
        {
            "name": "market",
            "column": "Currency",
            "description": "filter by currency and sales",
            "handler": lambda dkdp, val: dkdp[(dkdp["Currency"] == val) & (dkdp["Units Sold"] >= 10)],
            "input_type": "select"
        }
    ])
    print(os.getcwd())
    dkdppicklepath = local_dtale_stub + '/' + "app/userdocs/" + userdirnumber + '/dataframes/dkdp.pkl'
    print(dkdppicklepath)
    dkdp = pd.read_pickle(dkdppicklepath)
    instance = startup(data=dkdp, ignore_duplicate=True)
    return redirect(f"/dtale/main/{instance._data_id}", code=302)

@app.route("/")
@app.route("/active-instances")
def get_all_dtale_servers():
    instances = []
    for data_id in dtale.global_state.keys():
        print(data_id)
        data_obj = dtale.get_instance(data_id)
        metadata = dtale.global_state.get_name(data_id)
        name = dtale.global_state.get_data_inst(data_id).name
        # convert pandas timestamp to python dateTime
        time = pd.Timestamp(metadata.get("start"), tz=None).to_pydatetime()
        datetime = time.strftime("%Y-%m-%d %H:%M:%S")
        instances.append(
            {
                'id': data_id,
                'name': name,
                'url': data_obj.main_url(),
                'datetime': datetime
            }
        )
    return jsonify(instances) 

@app.route("/hello")
def hello_world():
    return "Hello World!"

if __name__ == '__main__':
    app.run(host="0.0.0.0", port=5003)

@aschonfeld (Collaborator) commented:

You need to change:
metadata = dtale.global_state.get_name(data_id)

to
metadata = dtale.global_state.get_metadata(data_id)
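Why the swap matters: `get_name()` returns the instance's name (a plain string), so calling `.get("start")` on its result raises the AttributeError from the traceback, whereas `get_metadata()` returns a dict-like object with keys such as `"start"`. A stdlib-only demonstration with stand-in values (the exact shapes are assumptions based on the error and the route's usage):

```python
import datetime

# Shape of get_name()'s return value: a plain string.
name = "my_instance"
try:
    name.get("start")
except AttributeError as e:
    print(e)  # 'str' object has no attribute 'get'

# Shape of get_metadata()'s return value: a dict, so .get("start") works
# and the route's timestamp formatting succeeds.
metadata = {"start": datetime.datetime(2021, 5, 10, 22, 58, 57)}
print(metadata.get("start").strftime("%Y-%m-%d %H:%M:%S"))  # 2021-05-10 22:58:57
```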

aschonfeld added a commit that referenced this issue May 12, 2021
aschonfeld added a commit that referenced this issue May 12, 2021
@fredzannarbor (Author) commented:

gunicorn is working like a champ in dev. Thanks!

Can you let me know when the pip package is updated with these changes, so I can upgrade my production environment without patching?

@aschonfeld (Collaborator) commented:

Glad to hear it! Yeah, I wanted to get a release out yesterday, but then there were a ton of package changes and I've been battling those. I can probably cut a release tomorrow with your changes and leave off some additional package upgrades until the next release.

@aschonfeld (Collaborator) commented:

@fredzannarbor just released v1.46.0 to PyPI; the conda-forge release is in flight.
