"A nice, online, stand-alone estimator of logistic regression"
Features (planned, at least):

- Logistic regression through a REST API (with dynamic learning rate and regularization)
- Anomaly detection and drift detection
- Model rollbacks, version handling, and a stop-learning mode
- Classification and probability estimation by repeated testing
Logistic regression (sort of invented in the 1830s) is one of the most widely used ML tools for classification, with good reason.
> The world is divided into two classes, those who believe the incredible, and those who do the improbable.
>
> Oscar Wilde
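The core idea can be sketched in a few lines (function names here are illustrative, not the service's API): a linear score squashed through the sigmoid yields a class probability, which is exactly what an online estimator would serve.

```python
import math

def sigmoid(z):
    # Map a raw linear score to a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def predict_proba(weights, bias, x):
    # P(y = 1 | x) under the logistic model.
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return sigmoid(z)

# Toy two-feature model: score = 2.0 * 1.0 - 1.0 * 1.0 + 0.5 = 1.5
p = predict_proba([2.0, -1.0], 0.5, [1.0, 1.0])
```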
Stochastic gradient descent (with some finesse) might not be optimal, but we can try to do one thing well. As a consequence, the hedgehog will be adaptive to parameter changes.
> A fox knows many things, but a hedgehog one important thing.
>
> Ancient Greek proverb
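A minimal sketch of what one online update could look like (all names, the L2 term, and the decay schedule are illustrative assumptions, not the final design): a single SGD step on the logistic loss, with a learning rate that changes over time.

```python
import math

def sgd_step(weights, bias, x, y, lr, l2=0.0):
    # One online SGD update on the logistic loss for a single (x, y), y in {0, 1},
    # with optional L2 regularization on the weights.
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    p = 1.0 / (1.0 + math.exp(-z))   # model's current P(y = 1 | x)
    err = p - y                      # gradient of the log loss w.r.t. the score z
    weights = [w - lr * (err * xi + l2 * w) for w, xi in zip(weights, x)]
    bias = bias - lr * err
    return weights, bias

def lr_at(step, lr0=0.1, decay=0.01):
    # A simple decaying schedule: one trivially "dynamic" learning rate.
    return lr0 / (1.0 + decay * step)

# Feed a tiny stream where y is always 1 for x = [1.0]; the score should grow.
w, b = [0.0], 0.0
for t in range(200):
    w, b = sgd_step(w, b, [1.0], 1, lr_at(t))
```

In the real service this schedule could be swapped for something like Adam (see the links below), which keeps per-parameter moment estimates instead of a single global decay.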
Even simple models (like logistic regression), applied with curiosity, will give insights into a problem. I call on both hedgehogs and foxes to help me, now and in the future.
> I meant it as a kind of enjoyable intellectual game, but it was taken seriously. Every classification throws light on something.
>
> Isaiah Berlin
- Adam original paper: https://arxiv.org/abs/1412.6980
- https://link.medium.com/QyWDEMtAQX
- https://www.kdnuggets.com/2019/06/gradient-descent-algorithms-cheat-sheet.html
- https://towardsdatascience.com/adam-latest-trends-in-deep-learning-optimization-6be9a291375c
- https://machinelearningmastery.com/adam-optimization-algorithm-for-deep-learning/
- http://ruder.io/optimizing-gradient-descent/
- https://www.internalpointers.com/post/cost-function-logistic-regression
- https://arxiv.org/pdf/1704.04289.pdf
- https://ml-cheatsheet.readthedocs.io/en/latest/gradient_descent.html
- https://link.medium.com/ZXR9Gd0JlX
- https://github.com/jasoncapehart/go-sgd/blob/master/sgd.go
- Outliers: https://www.kdnuggets.com/2019/06/overview-outlier-detection-methods-pyod.html
- Instead of YAML for input data? Maybe TOML: https://github.com/toml-lang/toml/blob/master/README.md
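If TOML does end up replacing YAML for input, a single labeled observation posted to the service might look something like this (the schema and every field name are purely illustrative):

```toml
# Hypothetical input record for the online estimator.
model = "demo"
label = 1        # binary target, 0 or 1

[features]
x1 = 0.5
x2 = -1.2
```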