Segmentation fault in spaCy 2.0.5 / python 3.5 #1757
Comments
Thanks! Passing
@honnibal I don't think it has solved the issue, sadly. Also could be related:
Same problem here.
I'm experiencing a similar problem training the NER on anything but a very small set of examples. Training on anything over 1000 examples throws the following error. Is this a memory error?
Info about spaCy: I note that I got the same error when trying to train using each of (a) the Prodigy ner.batch-train recipe and (b) the regular spaCy train_ner.py script. Example error messages when running Prodigy:

```
line 1: 41665 Segmentation fault: 11 python -m prodigy "$@"
line 1: 49673 Segmentation fault: 11 python -m prodigy "$@"
```
I'm also experiencing the same issue when training the English NER model. When training on about 100 examples there were no problems, but with 500+ I also get the error: "Segmentation fault: 11" Environment
The error occurs on `nlp.update` after 2 or 3 iterations.

```python
other_pipes = [pipe for pipe in nlp.pipe_names if pipe != 'ner']
with nlp.disable_pipes(*other_pipes):  # train only the NER model
    optimizer = nlp.begin_training()
    for itn in range(n_iter):
        random.shuffle(train)
        losses = {}
        for text, annotations in train:
            nlp.update(
                [text],            # batch of texts
                [annotations],     # batch of annotations
                drop=dropout,      # make it harder to memorize data
                sgd=optimizer,     # update weights
                losses=losses)
        print(losses)
```
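Since the question above asks whether this is a memory error, one common way to reduce per-update pressure is to pass small batches to `nlp.update` instead of one example at a time. As a hedged sketch, the batching helper itself is trivial in plain Python (this mirrors what `spacy.util.minibatch` provides; the `train` data below is hypothetical placeholder data, not from this thread):

```python
import random

def minibatch(items, size=8):
    """Yield successive fixed-size batches from a list of items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Hypothetical training data: (text, annotations) pairs.
train = [("text %d" % i, {"entities": []}) for i in range(20)]
random.shuffle(train)

for batch in minibatch(train, size=8):
    texts = [text for text, _ in batch]
    annotations = [ann for _, ann in batch]
    # In real training, a single batched call would replace the
    # per-example loop: nlp.update(texts, annotations, ...)
    print(len(texts))  # prints 8, 8, 4 for 20 examples
```

Whether batching avoids the segfault is untested here; it simply reduces the number of `nlp.update` calls per epoch.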
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
spaCy 2.0.5 is throwing a core dump. The same code produced no core dump under spaCy 1.6, 1.7.5, or 1.8.2. The core dump happens whether or not I am running the debugger.
Here is the complete code that causes the core dump:
I see that if I replace `!=` with `is not` then the core dump does not happen.
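For context on that substitution: `!=` tests value equality while `is not` tests object identity, so the two expressions are not interchangeable in general. A small self-contained illustration (plain Python, no spaCy involved):

```python
a = "ner"
b = "".join(["n", "e", "r"])  # equal in value, but built as a separate object

print(a != b)      # False: the two strings have equal values
print(a is not b)  # identity check; often True in CPython, since b is a new
                   # object, though string interning makes this unreliable
```

That the identity comparison happens to avoid the crash suggests the segfault is triggered somewhere downstream of the comparison result, not by the comparison itself.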
Info about spaCy