Question about memory reclamation #53

For example, we currently initialize two wordData instances, one per category. After one category is deleted, how can the memory occupied by the wordData built for that category be reclaimed? Could a cleanup method be exposed?

Comments
Would re-running init() meet the need?

Our current usage calls init() on multiple SensitiveWordBs instances, so re-initializing does not meet the need.

What is the concrete use case here?

Second, creating multiple SensitiveWordBs instances is actually not recommended; use a single shared instance. Could handling the differences by tag cover your scenario?

My scenario is that different modules need different filtering logic. For example, party-building content needs a larger set of sensitive words, while other areas do not need that much filtering. From what I can see, tags won't cover this.
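For illustration, a minimal sketch of the per-module setup described in that comment: one SensitiveWordBs per module, each with its own extra deny words. The IWordDeny extension point, the wordDeny(...) builder call, contains(...), and the package paths follow the library's documented customization pattern but should be verified against the version in use; the module names, class names, and example words are made up.

```java
import java.util.Arrays;
import java.util.List;

import com.github.houbb.sensitive.word.api.IWordDeny;
import com.github.houbb.sensitive.word.bs.SensitiveWordBs;

public class ModuleWordFilters {

    /** Hypothetical extra deny words used only by the party-building module. */
    static class PartyBuildingWordDeny implements IWordDeny {
        @Override
        public List<String> deny() {
            return Arrays.asList("exampleWordA", "exampleWordB");
        }
    }

    public static void main(String[] args) {
        // Stricter module: its words come from a custom IWordDeny
        // (chaining with the built-in list is omitted for brevity).
        SensitiveWordBs partyBs = SensitiveWordBs.newInstance()
                .wordDeny(new PartyBuildingWordDeny())
                .init();

        // Other modules only need the default behaviour.
        SensitiveWordBs defaultBs = SensitiveWordBs.newInstance()
                .init();

        System.out.println(partyBs.contains("exampleWordA"));   // expected: true
        System.out.println(defaultBs.contains("exampleWordA")); // expected: false
    }
}
```

Each module holds its own instance, which is exactly the situation that later raises the memory-reclamation question when one module's dictionary is no longer needed.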
What exactly should the reclamation mechanism look like? Destroying the in-memory map built for the word dictionary?

Yes, exactly, that's what I want.
Supported as of v0.16.1:

```java
SensitiveWordBs wordBs = SensitiveWordBs.newInstance()
        .init();

// Later, the corresponding words are removed for some reason and the memory should be released.
wordBs.destroy();
```
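Building on that snippet, here is a hypothetical sketch of how destroy() could be wired into the per-category setup from the original question: keep one SensitiveWordBs per category in a registry, and when a category is removed, call destroy() and drop the reference so its dictionary map becomes collectible. The registry class and category keys are invented for illustration; only newInstance(), init(), and destroy() are taken from this thread.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import com.github.houbb.sensitive.word.bs.SensitiveWordBs;

/** Hypothetical per-category registry built on the v0.16.1 destroy() support. */
public class CategoryWordRegistry {

    private final Map<String, SensitiveWordBs> byCategory = new ConcurrentHashMap<>();

    /** Lazily build one SensitiveWordBs per category. */
    public SensitiveWordBs get(String category) {
        return byCategory.computeIfAbsent(category,
                key -> SensitiveWordBs.newInstance().init());
    }

    /** Drop a category and release the memory held by its word dictionary. */
    public void remove(String category) {
        SensitiveWordBs wordBs = byCategory.remove(category);
        if (wordBs != null) {
            // Per the maintainer's reply, destroy() releases the dictionary map built by init().
            wordBs.destroy();
        }
    }
}
```

Removing a category this way frees its dictionary while the other categories keep serving lookups, which matches the cleanup method asked for in the issue.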