TimeOut when running on Colab #31
Comments
@jeromemassot I don't think you can run it on Colab as-is. The library starts a big Java server (the CoreNLP server) locally and then sends requests to it. I don't know the details of how Colab works, but chances are you can't do it because the ports are closed. One way to make it work would be to host the Java server on a remote machine (for example AWS or GCP) and pass the IP:port to CoreNLPClient (argument …). Right now the wrapper is responsible both for starting the Java server and for converting requests from Python to Java. If we can decouple the two and only run the "converter" on Colab, the chances that it works are fairly high imo. Related issue: #21.
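A minimal sketch of that decoupling, assuming a CoreNLP server is already running on a remote host you control. The helper below only builds the endpoint URL; the commented-out stanza usage (`endpoint`, `start_server`) is illustrative and has not been verified against this wrapper's exact API:

```python
def corenlp_endpoint(host: str, port: int = 9000) -> str:
    """Build the endpoint URL for a CoreNLP server that is already running.

    9000 is CoreNLP's conventional default port; adjust to match the
    remote machine's configuration and firewall rules.
    """
    return f"http://{host}:{port}"


# Hypothetical usage with stanza's CoreNLPClient (assumes `pip install stanza`
# and a reachable server; parameter names are an assumption, check the docs):
#
# from stanza.server import CoreNLPClient
# with CoreNLPClient(endpoint=corenlp_endpoint("my-aws-host"),
#                    start_server=False) as client:      # don't launch Java locally
#     ann = client.annotate("Barack Obama was born in Hawaii.")
```

With this split, Colab only ever runs the lightweight Python "converter" side and never needs to open local ports for a Java process.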
Hi Philippe!!
Hi @philipperemy, I have been running OpenIE from my local Jupyter notebook for the past month and it worked fine. I'm getting timeout errors today and was wondering if the server is down or something. If so, can you please provide a fix or workaround? Thanks.
PermanentlyFailedException: Timed out waiting for service to come alive.
Hi Philippe,
I am trying to use the wrapper from Colab, but I hit a permanent issue when it tries to start the remote server:
PermanentlyFailedException: Timed out waiting for service to come alive.
I have installed the CoreNLP library with the English .jar and updated the CLASSPATH, so if I can point to my local Java module, it may work more easily than pointing to a remote server.
Is it possible to customize the Stanford-OpenIE-Python wrapper to use a local install?
Thanks
Best regards
Jerome
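One way to diagnose this "Timed out waiting for service to come alive" error is to check whether anything is actually listening on the server's port before the client gives up. A minimal sketch, assuming the default CoreNLP port 9000 (stdlib only, no wrapper code involved):

```python
import socket
import time


def wait_for_port(host: str, port: int, timeout: float = 15.0) -> bool:
    """Poll until a TCP connection to host:port succeeds, or give up.

    Mirrors the client's wait-for-service loop, but returns False on
    failure instead of raising PermanentlyFailedException, so you can
    tell "server never started" apart from other client errors.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True  # something accepted the connection
        except OSError:
            time.sleep(0.5)  # server not up yet; retry until deadline
    return False


if __name__ == "__main__":
    # If this prints False, the Java server never came up on that port,
    # which on Colab usually means the JVM process failed to start at all.
    print(wait_for_port("localhost", 9000, timeout=15.0))
```

If the port never opens, the problem is the Java server launch itself (missing jars, CLASSPATH, blocked ports) rather than the Python side of the wrapper.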