SymmetricDeleteModel.load("//spellcheck_sd_en_2.0.2_2.4_1556604489934")
.setInputCols(Array(context.inputField))
.setOutputCol(context.outputField + "_token")
.setDupsLimit(2) //error
.setFrequencyThreshold(0)
.setMaxEditDistance(3)
.setDeletesThreshold(0)
.setMaxFrequency(10) //error
.setMinFrequency(1) //error
Unless I set values for setDupsLimit/setMaxFrequency/setMinFrequency, it fails with an error complaining that a default value is missing. For the rest of the parameters I am using the default values I discovered in the source code. How should I set the parameters that cause the error? Should Spark NLP have default values for them?
Expected Behavior
setDupsLimit/setMaxFrequency/setMinFrequency should only be needed to fine-tune the defaults; SymmetricDeleteModel should work using default values for them.
Current Behavior
Failures because of missing default values.
Possible Solution
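In Spark ML, an annotator's parameters normally receive their defaults via a setDefault(...) call in the annotator's body, so a likely fix is to add the missing declarations to SymmetricDeleteModel. A minimal sketch of what that might look like; the parameter names are inferred from the setters above, and the values are placeholders rather than the library's actual defaults:

```scala
// Hypothetical sketch inside SymmetricDeleteModel (Spark NLP source):
// declare defaults for the three parameters that currently raise
// "default value is missing". Values shown are placeholders, not
// the defaults the library should actually ship.
setDefault(
  dupsLimit -> 2,
  maxFrequency -> 10,
  minFrequency -> 1
)
```

With defaults declared this way, calling the three setters becomes optional, matching the expected behavior above.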
Steps to Reproduce
Context
Your Environment
Spark NLP version: 2.6.3
Apache NLP version:
Java version (java -version):1.8
Setup and installation (Pypi, Conda, Maven, etc.):
Operating System and version:
Link to your project (if any):
Thanks for reporting this. Every annotator should work without requiring any parameters, so these default values are indeed missing even though they are required. I'll take a look for the next release.